[gdal-dev] Quick question about gdal2tiles

Even Rouault even.rouault at spatialys.com
Wed Jan 28 15:27:36 PST 2015


On Thursday, January 29, 2015 at 00:05:32, Jorge Arévalo wrote:
> > Even Rouault <mailto:even.rouault at spatialys.com>
> > January 28, 2015 at 9:28 PM
> > 
> > On Wednesday, January 28, 2015 at 20:17:08, Jorge Arévalo wrote:
> >> Hi,
> >> 
> >> I'm working with a patched version of gdal2tiles, which makes use of
> >> parallelization:
> >> http://gis.stackexchange.com/questions/7743/performance-of-google-map-tile-creation-processes/74446#74446
> >> 
> >> I want to create a complete TMS cache from raster imagery. No
> >> assumptions about the SRS or data type of the input data.
> >> 
> >> I want the tiling process to be as fast as possible (gdal2tiles is a
> >> really heavy process). Do you have any recommendations about the data
> >> or the parameters used?
> >> 
> >> Right now, I'm doing this
> >> 
> >> Step 1: build vrt from input images
> >> 
> >> gdal_vrtmerge.py -o merged.vrt <list of input tif files>
> >> 
> >> Step 2: build tiles from vrt file
> >> 
> >> gdal2tiles.py -r cubic -s epsg:XXXX -z 0-19 -w all merged.vrt tms_dir
> >> 
> >> Even with parallelization, the process still feels really slow. Would
> >> it be faster if, for example, I converted all my input files to
> >> epsg:3857? Or if I scaled them to 8-bit? Or if I used the near
> >> (nearest neighbour) resampling method instead of cubic? (I'm using
> >> cubic because I'm working with continuous data: satellite images. Am
> >> I doing it right?)
> >> 
> > From a quick look at the source, it seems that there's an optimization
> > if the input SRS == output SRS that avoids going through the warped VRT
> > path.
> > 
> > That said, we definitely need one or several maintainers for gdal2tiles.
> > There are quite a few patches floating around in Trac that would need
> > someone to review, test, fix and apply, as well as to write tests (no
> > autotest for gdal2tiles yet), etc...
> 
> Ok. But the application is taking hours to generate a complete tile
> cache (zoom levels 0-19) for a 3 MB tiff file, in epsg:3857, on a 4-core
> machine with 8 GB of RAM. The file is this one:
> 
> https://dl.dropboxusercontent.com/u/6599273/gis_data/katrina-3857.tif
> 
> Taking so much time for a 3 MB file sounds ridiculous. I'm probably doing
> something wrong. This is the command line:
> 
> gdal2tiles.py  -s epsg:3857 -z 0-19 -r cubic katrina-3857.tif tiles_dir
> 
> Do you see something wrong in this approach?

The performance seems to be deeply impacted by "-r cubic", but even without 
it, it still sucks. The reason is the zoom level 19. Your dataset has a 
resolution of 3469 m. Zoom 19 corresponds to a resolution of ~ 0.3 m. So you 
basically try to generate a TMS that is roughly 12000 x 12000 times larger 
than your source dataset...
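The zoom-level arithmetic above can be sketched quickly. This is a minimal illustration using the standard Web Mercator (EPSG:3857) tile-pyramid constants for 256-pixel tiles; it is not code taken from gdal2tiles itself:

```python
import math

# Ground resolution at the equator for a 256-px Web Mercator tile pyramid:
# ~156543.03 m/px at zoom 0, halving at each zoom level.
INITIAL_RESOLUTION = 2 * math.pi * 6378137 / 256

def resolution(zoom):
    """Resolution in metres per pixel at a given zoom level."""
    return INITIAL_RESOLUTION / (2 ** zoom)

def max_zoom_for(source_res_m):
    """Smallest zoom whose resolution is at least as fine as the source."""
    z = 0
    while resolution(z) > source_res_m:
        z += 1
    return z

print(round(resolution(19), 2))  # -> 0.3 (the ~0.3 m figure above)
print(max_zoom_for(3469))        # -> 6 for a ~3469 m/px dataset
```

In other words, for this dataset something like "-z 0-6" would already cover the native resolution; everything beyond that only resamples upward.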

> 
> Anyway, if this is a problem of gdal2tiles and it needs fine-tuning or
> maintenance, we could talk. I don't know if there's any other method to
> generate a complete TMS cache using GDAL.
> 
> >> Any other tips?
> >> 
> >> Many thanks in advance
> >> 
> >> Best regards

-- 
Spatialys - Geospatial professional services
http://www.spatialys.com
