[gdal-dev] Gdalinfo slow with big Rasterlite tables

Even Rouault even.rouault at mines-paris.org
Sun Aug 19 14:36:44 PDT 2012


> Yes, that makes gdalinfo fast. With my biggest layer the time went down
> from 3 minutes to 3 seconds. However, my gdal_translate test fails. It
> used to take three minutes before the zero appeared in the progress bar,
> but after that the translation itself took only a few seconds. After
> updating to GDAL r24803 the program shows the zero percent progress within
> a couple of seconds, but unfortunately nothing happens in any reasonable
> time after that.

Is the reasonable time more than 3 minutes?

> 
> >gdal_translate -of Gtiff -outsize 1% 1% RASTERLITE:test.sqlite,table=t0080 test.tif
> 
> Input file size is 153600, 249600
> 0
> 
> Overviews are OK, so taking a one percent downsample should be fast.
>   Overviews: 76800x124800, 38400x62400, 19200x31200, 9600x15600, 4800x7800,
> 2400x3900, 1200x1950, 600x975, 300x488

I've tested gdal_translate -outsize 1% 1% with a 104740 x 49510 raster and
it runs fast for me. Could you add --debug on to the above gdal_translate
command and report what it outputs?
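
For example (the output file names here are only examples; otherwise it is
your command from above with --debug added):

gdal_translate -of GTiff --debug on -outsize 1% 1% RASTERLITE:test.sqlite,table=t0080 test.tif 2> debug.log

The debug messages go to stderr, hence the redirection; or simply copy/paste
the console output.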

Also, to confirm that the slowdown comes from the changes in the SQLite 
driver and not from those in the Rasterlite driver, you could retry after 
having deleted the statistics:

ogrinfo test.sqlite -sql "DELETE FROM layer_statistics"

and then run gdal_translate --debug on [...] again.
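
In other words, roughly this sequence (the gdal_translate line is the same
command as above; removing the rows from layer_statistics should only drop
the cached statistics, not the raster tiles themselves):

ogrinfo test.sqlite -sql "DELETE FROM layer_statistics"
gdal_translate -of GTiff --debug on -outsize 1% 1% RASTERLITE:test.sqlite,table=t0080 test.tif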

> 
> -Jukka-

