[Gdal-dev] Re: Any hints on increasing performance?

Ed McNierney ed at topozone.com
Wed Apr 14 13:46:18 EDT 2004


Rick -

What I don't quite understand is whether you're filtering out points
that can't possibly be visible.  That is, something like:

For each grid point to evaluate
  For each site
    Examine grid point elevation, site elevation, and distance
      Discard the site if the Earth's curvature and grid/site
      elevations show that the point would not be visible even
      if the Earth were flat.

My question is whether you're doing this or have considered it.
Presumably most of the sites can be quickly rejected for any given grid
point because they can't possibly be visible; if the "sites" are
actually high above the Earth this might not help very much.
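As a rough illustration of that rejection test, here is a minimal sketch in
Python (the function names and the use of the plain mean Earth radius are my
own assumptions, not from either program): a site can only possibly be seen
from a grid point if their separation is within the sum of the two horizon
distances on a smooth sphere.

```python
import math

EARTH_R = 6371000.0  # mean Earth radius in meters (assumed value)

def horizon_distance(height_m):
    """Distance (m) to the horizon for an observer height_m above a
    smooth spherical Earth: d = sqrt(2 * R * h)."""
    return math.sqrt(2.0 * EARTH_R * max(height_m, 0.0))

def possibly_visible(dist_m, grid_elev_m, site_elev_m):
    """Cheap pre-filter: a site separated from a grid point by more
    than the sum of their horizon distances cannot be visible, so it
    can be discarded before any terrain profile is examined."""
    return dist_m <= horizon_distance(grid_elev_m) + horizon_distance(site_elev_m)
```

For example, an antenna 100 m above the surface has a horizon of roughly
36 km, so a sea-level grid point much farther away than that can be
discarded without ever touching the elevation data.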

	- Ed

Ed McNierney
President and Chief Mapmaker
TopoZone.com / Maps a la carte, Inc.
73 Princeton Street, Suite 305
North Chelmsford, MA  01863
ed at topozone.com
(978) 251-4242 

-----Original Message-----
From: Rick Niles [mailto:fniles at mitre.org] 
Sent: Wednesday, April 14, 2004 12:36 PM
To: gdal-dev at remotesensing.org
Subject: [Gdal-dev] Re: Any hints on increasing performance?

Rick Niles wrote:

>> I've written a program to show line-of-sight coverage of various
>> radars and aircraft navigation aids such as DME and VOR.  It should
>> work for com links as well.  I used the GDAL library for reading
>> in the terrain data (DTED Level 1, USGS DEM, or GTOPO30).  The
>> problem is performance.  To do all 48 US states it takes about an
>> hour, which may seem quick, but I wanted to make it a web-based
>> application with a faster turn-around.
>>
>> I'm calling GDALRasterIO for one point at a time.  I know everyone
>> is going to say that's the bottleneck, but I tried reading all the
>> data into arrays and it took 100x longer, because I don't use a
>> large chunk of the data and reading it all in was quite time
>> consuming, not to mention it used a huge amount of memory.  Also,
>> with the caching that GDAL does, it works quite well.  BTW, I
>> converted the DEM and GTOPO30 data to GeoTIFF format, which helped
>> a whole bunch too.

> Can you explain a bit more about what pixels you do need?  Is it 
> always a particular subset?  If you only need a subset of the data, 
> then you need to organize your data to support the access strategy.  
> To help you we will need to know more about your access strategy.


Sure, it's quite simple actually:

For each grid point to evaluate
  For each site (usu. about 1000 sites)
    For each point on line connecting the site to the grid point
       Get the elevation and check if we're less than it.

Each grid point represents a user location at a given altitude.  The
curvature of the Earth is very important to include when calculating the
effective altitude along the line.  My original goal was to produce
output on the same scale as the input data (i.e. 3 arc-seconds), but
since it's so slow I've had to make 1 arc-minute the default, and it
still takes about an hour to process the whole U.S.
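The usual way to fold that curvature into the profile check is the
quadratic drop of the surface below the local tangent plane.  A minimal
sketch (the 4/3 effective-Earth-radius factor is a common radio-propagation
convention for refraction, an assumption on my part, not necessarily what
Rick's program uses):

```python
EARTH_R = 6371000.0  # mean Earth radius in meters (assumed value)
K = 4.0 / 3.0        # conventional effective-Earth-radius factor for radio paths

def curvature_drop(dist_m, k=K):
    """Apparent drop (m) of the Earth's surface below the tangent
    plane at ground distance dist_m, via the small-angle
    approximation h = d^2 / (2 * k * R)."""
    return dist_m ** 2 / (2.0 * k * EARTH_R)
```

At 100 km this drop is already several hundred meters, which is why
omitting it gives badly optimistic coverage at long range.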

99% of the time is taken calculating indexes to reference the data.  So
it's largely independent of whether I use the GDAL cache or read in the
data ahead of time, unless I could somehow simplify the index
calculation scheme.
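If the per-point index arithmetic really dominates, one common
simplification is to invert the geotransform once per dataset so that each
coordinate-to-pixel lookup is just two multiply-adds.  A hypothetical
sketch, assuming a GDAL-style six-element north-up geotransform (rotation
terms gt[2] and gt[4] zero); the helper name is mine:

```python
def make_geo_to_pixel(gt):
    """Precompute the inverse of a north-up GDAL geotransform
    (x = gt[0] + col*gt[1], y = gt[3] + row*gt[5]) so each
    lookup avoids a division.  Returns a closure mapping
    (x, y) -> (col, row)."""
    inv_dx = 1.0 / gt[1]
    inv_dy = 1.0 / gt[5]
    x0, y0 = gt[0], gt[3]
    def geo_to_pixel(x, y):
        # int() truncates toward zero; fine for in-bounds coordinates
        return int((x - x0) * inv_dx), int((y - y0) * inv_dy)
    return geo_to_pixel
```

Hoisting the closure outside the inner profile loop keeps the hot path
down to a handful of floating-point operations per sample.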

Another algorithm used the concept of sectors from each site, with only
12 or so log-spaced points in each sector to evaluate, but I didn't like
that because it didn't use the coordinates of the actual data; instead
it used latitude/longitude calculated to fit into this polar view from
each site.
Also, since it only checked 12 or so points, a big hill could be missed.


_______________________________________________
Gdal-dev mailing list
Gdal-dev at remotesensing.org
http://remotesensing.org/mailman/listinfo/gdal-dev



