[gdal-dev] Processing huge DEM dataset

Marcos Dione mdione at grulic.org.ar
Thu Apr 23 02:28:07 PDT 2015


    I have SRTM DEM 1x1 degree, 30 m resolution tiles for the whole of Europe and I'm trying to generate several raster images from 
them (elevation coloring, slopeshade and hillshade), but I'm not sure about the right approach for that amount of data.

    The simplest approach is to stitch the DEMs and then process the result, but that takes ages, especially if I try to use 
uncompressed, tiled GeoTIFFs as output. The stitching can even be done with a virtual file (VRT), which saves space.
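A small sketch of the tile bookkeeping behind the stitching step: enumerating the SRTM tile names covering a bounding box, to feed into something like `gdalbuildvrt europe.vrt $(cat tiles.txt)`. This assumes the usual SRTM naming convention (tile named after its SW corner, e.g. N45E015.hgt); adjust to however your files are actually named.

```python
# Hypothetical helper: list the SRTM 1x1-degree tiles covering a bbox.
# Assumes the conventional SW-corner naming, e.g. N45E015.hgt.

def srtm_tile_name(lat, lon):
    """Name of the 1x1-degree tile whose SW corner is (lat, lon)."""
    ns = 'N' if lat >= 0 else 'S'
    ew = 'E' if lon >= 0 else 'W'
    return f"{ns}{abs(lat):02d}{ew}{abs(lon):03d}.hgt"

def tiles_for_bbox(lat_min, lat_max, lon_min, lon_max):
    """All tile names whose SW corner falls in the (half-open) bbox."""
    return [srtm_tile_name(lat, lon)
            for lat in range(lat_min, lat_max)
            for lon in range(lon_min, lon_max)]

print(tiles_for_bbox(45, 47, 15, 17))
```

The printed list can be written to a file and passed to gdalbuildvrt, so no intermediate stitched GeoTIFF is ever materialized.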

    If I process each tile individually and then build a virtual file on top, I get shading artifacts at the edges of the tiles. This 
shade appears because the tile ends there and the shading algorithm assumes there's a 0 elevation point just beyond it. So, question 
A) is that so?
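A quick way to convince yourself of A) is to evaluate the east-west gradient of Horn's 3x3 kernel (the one gdaldem hillshade uses) at a border pixel twice: once with the real neighbors from the adjacent tile, and once with the missing column treated as 0, which is roughly what happens when a tile is shaded in isolation. A minimal sketch in plain Python:

```python
# Horn's east-west gradient over a 3x3 window of elevations.
# w is a 3x3 list of lists, cell is the pixel size in meters.
def horn_dzdx(w, cell=30.0):
    return ((w[0][2] + 2 * w[1][2] + w[2][2])
            - (w[0][0] + 2 * w[1][0] + w[2][0])) / (8.0 * cell)

# A flat 200 m plateau straddling a tile border:
full = [[200, 200, 200]] * 3   # real neighbors from the adjacent tile
cut  = [[200, 200, 0]] * 3     # adjacent tile missing, treated as 0

print(horn_dzdx(full))   # 0.0: flat terrain, no shade
print(horn_dzdx(cut))    # large negative gradient: spurious shade
```

Flat terrain should produce a zero gradient, but the fabricated 0-elevation column turns the border into an artificial cliff, which is exactly the dark seam seen at tile edges.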

    I think the shade ends up in the output because the algorithm that resolves a pixel uses the first tile that contains it. 
Question B) is that so?

    If so, C) could I avoid this simply by generating another VRT file that lists each tile as having a bbox of only the 1x1 degree 
area instead of the 1x1 degree plus a one-pixel border? If I find the time, I'll try it this afternoon (I just thought of it).
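A sketch of that idea C), assuming the usual VRT layout (SimpleSource elements with SrcRect/DstRect children carrying xOff/yOff/xSize/ySize attributes): shrink every source rectangle by a one-pixel border so the overlapping edges are never read. This is a stdlib-only toy that edits the XML directly; it would need adapting to a real gdalbuildvrt output.

```python
import xml.etree.ElementTree as ET

def trim_sources(vrt_xml, border=1):
    """Shrink every SrcRect/DstRect in a VRT by `border` pixels
    on each side, so overlapping tile edges are excluded."""
    root = ET.fromstring(vrt_xml)
    for rect in root.iter():
        if rect.tag in ('SrcRect', 'DstRect'):
            rect.set('xOff', str(int(rect.get('xOff')) + border))
            rect.set('yOff', str(int(rect.get('yOff')) + border))
            rect.set('xSize', str(int(rect.get('xSize')) - 2 * border))
            rect.set('ySize', str(int(rect.get('ySize')) - 2 * border))
    return ET.tostring(root, encoding='unicode')

# Minimal single-source VRT for illustration:
sample = """<VRTDataset rasterXSize="3601" rasterYSize="3601">
 <VRTRasterBand dataType="Int16" band="1">
  <SimpleSource>
   <SourceFilename>N45E015.hgt</SourceFilename>
   <SrcRect xOff="0" yOff="0" xSize="3601" ySize="3601"/>
   <DstRect xOff="0" yOff="0" xSize="3601" ySize="3601"/>
  </SimpleSource>
 </VRTRasterBand>
</VRTDataset>"""

print(trim_sources(sample))
```

The trimmed VRT then only exposes the interior 3599x3599 pixels of each tile, which is the "bbox of only the 1x1 degree" idea in practice.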

    All of this I can do more or less with the GDAL command line tools, without much programming. Then comes a more programmatic way: 
either use gdaldem or the GDAL API to process each tile, then crop the result back to the 1x1 degree area and save it; then stitch 
them or build a VRT file on top. As you can see, this is what I've been avoiding :)
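The programmatic route above can be sketched without the GDAL bindings at hand: pad each tile with a one-pixel border taken from its neighbor tiles, run the 3x3 kernel over the padded array, then crop back to the original extent. This is a toy with plain lists and a mean filter standing in for the real shading kernel; the actual implementation would read the neighbor pixels through the GDAL API.

```python
def pad_with_neighbors(tile, left_col, right_col, top_row, bottom_row):
    """Surround `tile` (a list of rows) with border pixels taken from
    its neighbor tiles; corners are duplicated from the nearest edge."""
    padded = [[top_row[0]] + top_row + [top_row[-1]]]
    for i, row in enumerate(tile):
        padded.append([left_col[i]] + row + [right_col[i]])
    padded.append([bottom_row[0]] + bottom_row + [bottom_row[-1]])
    return padded

def mean3x3(p):
    """3x3 mean filter as a stand-in for the shading kernel; the
    output has the size of the original, unpadded tile."""
    h, w = len(p) - 2, len(p[0]) - 2
    return [[sum(p[i + di][j + dj]
                 for di in (0, 1, 2) for dj in (0, 1, 2)) / 9.0
             for j in range(w)] for i in range(h)]

# A flat 2x2 tile surrounded by equally flat neighbors:
tile = [[100, 100], [100, 100]]
out = mean3x3(pad_with_neighbors(tile, [100, 100], [100, 100],
                                 [100, 100], [100, 100]))
print(out)   # [[100.0, 100.0], [100.0, 100.0]]: no edge artifact
```

Because every output pixel sees real neighbor data, the per-tile results line up seamlessly when stitched afterwards.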

    Finally, I would also like to generate contour lines from this data. So far I've managed to generate them for 5x5 degree tiles at 
90 m resolution, then import them into PostGIS. When I render them, at the edges of those tiles I see the lines from one tile crossing 
the others, which looks ugly. For instance:

http://grulicueva.homenet.org/~mdione/Elevation/#14/45.0000/15.0000

    I tried using a stitched file for the whole region, but I ran out of memory with gdal_contour. Again, that was with 90 m resolution 
tiles; now I have 30 m, which means 9 times as much data. D) How could I properly process all of that?
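One possible answer to D), sketched below as an assumption rather than a tested recipe: contour in windows small enough to fit in memory, with a small overlap so lines from adjacent windows coincide at the seams and duplicates can be merged afterwards in PostGIS. Each window would be cut from the big VRT (e.g. with gdal_translate -projwin) and fed to gdal_contour separately. The stdlib sketch just computes the window boxes:

```python
# Hypothetical windowing helper: split a large extent into `step`-degree
# boxes, each grown by `overlap` degrees so contour lines from adjacent
# runs coincide at the seams and can be merged later.

def windows(lon_min, lon_max, lat_min, lat_max, step=5, overlap=0.01):
    """Yield (lon0, lat0, lon1, lat1) boxes covering the extent."""
    for lon in range(lon_min, lon_max, step):
        for lat in range(lat_min, lat_max, step):
            yield (lon - overlap, lat - overlap,
                   min(lon + step, lon_max) + overlap,
                   min(lat + step, lat_max) + overlap)

boxes = list(windows(10, 20, 40, 50))
print(len(boxes), boxes[0])   # 4 boxes, each padded by the overlap
```

Memory use is then bounded by the window size regardless of the total extent, and the overlap trades a de-duplication step in PostGIS for clean joins at the seams.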

    Thanks in advance for any ideas,

        -- Marcos.
