[Gdal-dev] Re: Any hints on increasing performance?

Rick Niles fniles at mitre.org
Wed Apr 14 12:35:40 EDT 2004


Rick Niles wrote:

>> I've written a program to show line-of-sight coverage of various
>> radars and aircraft navigation aids such as DME and VOR.  It should
>> work for comm links as well.  I used the GDAL library for reading
>> in the terrain data (DTED Level 1, USGS DEM, or GTOPO30).  The
>> problem is performance.  Covering all 48 US states takes about an
>> hour, which may seem quick, but I want to make it a web-based
>> application with a faster turnaround.
>> 
>> I'm calling GDALRasterIO for one point at a time.  I know everyone
>> is going to say that's the bottleneck, but I tried reading all the
>> data into arrays up front and it took 100x longer, because I only
>> use a small fraction of the data, and reading it all in was quite
>> time-consuming, not to mention that it used a huge amount of
>> memory.  Also, with the caching that GDAL does, per-point access
>> works quite well.  BTW, I converted the DEM and GTOPO30 data to
>> GeoTIFF format, which helped a whole bunch too.
>  
>

> Can you explain a bit more about which pixels you do need?  Is it
> always a particular subset?  If you only need a subset of the data,
> then you need to organize your data to support the access strategy.
> To help you, we will need to know more about your access strategy.


Sure, it's quite simple actually:

For each grid point to evaluate
  For each site (usually about 1000 sites)
    For each point on the line connecting the site to the grid point
      Get the terrain elevation and check whether the line's effective altitude is below it.
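
In GDAL terms the innermost step is essentially this (a minimal
sketch, assuming a north-up raster with no rotation terms; error
checking omitted):

    #include "gdal.h"

    /* Look up one elevation sample: map lon/lat to a pixel/line index
       with the geotransform, then read a single pixel with
       GDALRasterIO. */
    float get_elevation(GDALDatasetH hDS, double lon, double lat)
    {
        double gt[6];
        float  elev;
        int    nPixel, nLine;

        GDALGetGeoTransform(hDS, gt);   /* origin and pixel size */
        /* assumes gt[2] == gt[4] == 0 (north-up, no rotation) */
        nPixel = (int)((lon - gt[0]) / gt[1]);
        nLine  = (int)((lat - gt[3]) / gt[5]);

        GDALRasterIO(GDALGetRasterBand(hDS, 1), GF_Read,
                     nPixel, nLine, 1, 1,
                     &elev, 1, 1, GDT_Float32, 0, 0);
        return elev;
    }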

Each grid point represents a user location at a given altitude.  The
curvature of the Earth is very important to include when calculating
the effective altitude along the line.  My original goal was to
produce output at the same resolution as the input data (i.e., 3
arc-seconds), but since it's so slow I've had to make 1 arc-minute
the default, and it still takes about an hour to process the whole
U.S.
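
For reference, the curvature correction is the standard earth-bulge
term; something like this sketch, where the 4/3 effective-radius
factor is the usual convention for radio propagation:

    /* Standard earth-bulge correction: at distance d (meters) from
       the site, the surface drops roughly d*d / (2*R) below the
       flat-earth reference plane.  R = 4/3 * 6371 km is the usual
       effective radius for radio propagation. */
    double curvature_drop(double d_meters)
    {
        const double R = (4.0 / 3.0) * 6371000.0;
        return (d_meters * d_meters) / (2.0 * R);
    }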

99% of the time is spent calculating the indexes used to reference
the data, so the cost is largely independent of whether I use the
GDAL cache or read the data in ahead of time, unless I could somehow
simplify the index calculation scheme.
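
One such simplification would be to convert each pair of endpoints to
pixel coordinates once and then walk the line in pixel space, so each
sample costs two additions instead of a full geographic-to-index
transform.  A rough sketch (hypothetical helper, not my current
code):

    #include "gdal.h"

    void walk_profile(GDALRasterBandH hBand,
                      double px0, double ln0,  /* site, pixel coords */
                      double px1, double ln1,  /* grid point         */
                      int nSteps)
    {
        double dpx = (px1 - px0) / nSteps;
        double dln = (ln1 - ln0) / nSteps;
        double px = px0, ln = ln0;
        int i;

        for (i = 0; i <= nSteps; i++, px += dpx, ln += dln)
        {
            float elev;
            GDALRasterIO(hBand, GF_Read, (int)px, (int)ln, 1, 1,
                         &elev, 1, 1, GDT_Float32, 0, 0);
            /* ... compare elev against the line's effective altitude ... */
        }
    }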

Another algorithm used the concept of sectors radiating from each
site, with only 12 or so log-spaced points to evaluate in each
sector.  I didn't like that approach because it didn't use the
coordinates of the actual data, but rather latitudes/longitudes
calculated to fit a polar view from each site.  Also, since it only
checked 12 or so points per sector, a big hill could be missed.
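
By log-spaced I mean sample distances that grow geometrically,
roughly like this illustrative sketch (not that program's actual
code):

    #include <math.h>

    /* n sample distances between d_min and d_max with geometric
       spacing: at long range consecutive samples can be kilometers
       apart, which is how a narrow hill gets missed. */
    void log_spaced(double d_min, double d_max, int n, double *d)
    {
        int k;
        for (k = 0; k < n; k++)
            d[k] = d_min * pow(d_max / d_min, (double)k / (n - 1));
    }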




