[OpenDroneMap-dev] Ideas for speeding up the Point Matching

Alex Mandel tech_dev at wildintellect.com
Mon Mar 16 13:32:01 PDT 2015


On 03/16/2015 01:31 PM, Alex Mandel wrote:
> I think the answer to that is yes, which is fine for data that is
> relatively evenly scattered across the landscape. It works better for
> more linear coverage; the closer the area gets to square, the worse the
> split gets. But I'm thinking we can pick a length:width ratio at which
> we divide the area not just into slices but into tiles, so instead of
> 10x1 rectangles we do 2x5 tiles. It means calculating more overlaps but
> ensures good coverage.
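
A rough sketch (Python; the function name and defaults are invented for
illustration) of how a tile grid could be picked from the bbox aspect
ratio, so a long strip stays as slices but a squarer area gets cut both
ways:

    import math

    def grid_dims(width, height, target_tiles=10):
        """Pick how many tiles to cut along each bbox axis, keeping the
        tiles roughly square: a 5:1 strip comes out about 1x10, a 2.5:1
        box 2x5, a near-square area 3x4 or so."""
        aspect = max(width, height) / min(width, height)
        n_short = max(1, round(math.sqrt(target_tiles / aspect)))
        n_long = math.ceil(target_tiles / n_short)
        # returned as (tiles across x, tiles across y)
        return (n_long, n_short) if width >= height else (n_short, n_long)
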
> 
> I've been contemplating other methods like density kernels or distance
> matrices, but those are computationally intensive enough that I don't
> think they'd help, and deciding on overlaps between clumps is tricky.
> Even building regions up additively until each reaches 100 images seems
> like it could take a while compared to just dividing the region evenly.
> 
> I see these ideas falling apart when you have the majority of the data
> lumped to one side. So maybe we do a 90% or 95% minimum convex polygon
> to ignore outliers when coming up with the BBOX, divide it into tiles,
> and then expand the outside limits to grab the 5-10% of outliers.
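
A true minimum convex polygon may not even be needed for the BBOX step;
a cheap stand-in is to trim the coordinate percentiles before taking the
box (sketch only; assumes photo centers are already a list of (x, y)
tuples):

    def trimmed_bbox(points, keep=0.95):
        """BBOX of the central `keep` fraction of photo centers,
        dropping (1 - keep) / 2 off each end of x and y separately."""
        xs = sorted(x for x, _ in points)
        ys = sorted(y for _, y in points)
        cut = int(len(points) * (1 - keep) / 2)
        hi = len(points) - 1 - cut
        return xs[cut], ys[cut], xs[hi], ys[hi]   # minx, miny, maxx, maxy

The outermost tiles can then be stretched back out to the true min/max
to pick up the trimmed 5-10%.
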
> 
> Thanks,
> Alex
> 
> On 03/16/2015 10:13 AM, Stephen Mather wrote:
>> Ok -- this is deep. So this approach reduces the factorial load
>> independently of geography, with the exception that the subset locations
>> are chosen by geography?
>>
>> Thanks,
>> Best,
>> Steve
>>
>>
>>
>> On Sun, Mar 15, 2015 at 11:00 PM, Alex Mandel <tech_dev at wildintellect.com>
>> wrote:
>>
>>> So, 2 days into running ~1000 images, I started pondering how to speed
>>> up the point matching step.
>>>
>>> Here's the basic idea: split the list of images into related chunks,
>>> process those chunks, and then process a slice of overlap from each
>>> pair of adjacent chunks.
>>>
>>> The nice part is that this idea will work for either time/numeric
>>> ordered or GeoExif grouping.
>>>
>>> Here's how it works: pick a number of photos that is reasonable to
>>> process, say 100. Now take your set of photos, 1000, and divide it by
>>> 100, so you first do 10 sets of 100. Then you take 1/2 (or 1/3, or 1/4)
>>> of two adjacent sets, e.g. the last 50 of set 1 and the first 50 of
>>> set 2 for the 1/2 case. That gives you 9 additional sets of 100, but
>>> in those you only have to do roughly half of the matches: only the
>>> pairs that include a photo from each of the two original sets.
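
For the time/numeric-ordered case that splitting could be as simple as
this (a sketch; the chunk size and overlap fraction are hard-coded just
for the example):

    def make_sets(images, size=100, overlap=0.5):
        """Split an ordered image list into primary sets of `size`, plus
        bridging sets straddling each pair of adjacent primary sets."""
        primary = [images[i:i + size] for i in range(0, len(images), size)]
        half = int(size * overlap)
        bridges = [primary[i][-half:] + primary[i + 1][:half]
                   for i in range(len(primary) - 1)]
        return primary, bridges

    # 1000 images -> 10 primary sets of 100 plus 9 bridging sets of 100
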
>>>
>>> Here's the math:
>>> n!/(r!(n-r)!)
>>>
>>> http://www.calculatorsoup.com/calculators/discretemathematics/combinations.php
>>>
>>> n is the number of photos in a set
>>> r is 2, the number of photos in a pair
>>>
>>> 100 photos is 4,950 pairs. If you do all 1000 at once you get 499,500
>>> pairs to check. If you do what I said above, groups of 100 with 50%
>>> overlap, you get 71,775 pairs to check ((10*4950) + 9*(4950/2)).
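
Quick sanity check of those numbers (the last line uses the same
half-of-a-set approximation as above; counting only the cross-set pairs
exactly would give 10*4950 + 9*50*50 = 72,000 instead):

    from math import comb

    pairs_all_at_once = comb(1000, 2)      # 499,500
    pairs_per_set     = comb(100, 2)       # 4,950
    pairs_chunked     = 10 * pairs_per_set + 9 * pairs_per_set // 2
    print(pairs_chunked)                   # 71,775, roughly 1/7 of 499,500
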
>>>
>>> Now for the fancy part. The dumb application of this to GeoExif is to
>>> take the bbox of all the photos, divide its longer side by 10 so you
>>> get 10 new rectangles, run each of those as a set, and then run the 9
>>> overlap regions. Long term we can get smarter about how we pick the
>>> regions to ensure ~100 images per region.
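
The dumb geographic version might look something like this (sketch;
assumes a lon/lat center per image has already been pulled from the
EXIF, and takes each overlap region as half a strip on either side of an
internal boundary):

    def strip_sets(photo_coords, n_strips=10):
        """Bucket photos into strips along the longer bbox axis, plus
        overlap strips straddling each of the internal boundaries."""
        xs = [x for x, _ in photo_coords]
        ys = [y for _, y in photo_coords]
        along_x = (max(xs) - min(xs)) >= (max(ys) - min(ys))
        coords = xs if along_x else ys
        lo = min(coords)
        step = (max(coords) - lo) / n_strips

        strips = [[] for _ in range(n_strips)]
        overlaps = [[] for _ in range(n_strips - 1)]
        for p, c in zip(photo_coords, coords):
            t = (c - lo) / step               # position in strip units
            strips[min(int(t), n_strips - 1)].append(p)
            j = int(t + 0.5)                  # nearest internal boundary
            if 1 <= j < n_strips:
                overlaps[j - 1].append(p)
        return strips, overlaps
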
>>>
>>> So, steps to accomplish this:
>>> 1. Write a short script to make lists of the photos in each set to run.
>>> 2. Figure out how to send the lists to the point matching tool.
>>> 2a. Figure out how to make the point matching tool skip matches already
>>> done in a previous set.
>>> 3. Continue from there as usual.
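
For 2a, the simplest thing that could work may be to keep a set of
already-matched pairs and filter each new set against it before handing
it to the matcher (sketch only; how this plugs into the actual matching
tool is the open question):

    from itertools import combinations

    done = set()            # (img_a, img_b) pairs matched in earlier sets

    def pairs_to_match(image_set):
        """Yield only the pairs this set adds over the earlier sets."""
        for pair in combinations(sorted(image_set), 2):
            if pair not in done:
                done.add(pair)
                yield pair
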
>>>
>>> Feedback anyone?
>>>
>>> Thanks,
>>> Alex
> 
> 


