[Mapserver-users] Large Raster files
j.l.h.hartmann at uva.nl
Mon May 17 09:46:55 EDT 2004
IMD Listuser wrote:
> I have several large (250MB) images that I would like to serve using
> mapserver. They are currently Geotiffs, so I would prefer to use them in
> the same format. Nonetheless I would be happy to hear about the best
> strategies for serving imagery.
> In particular, what is the best method for splitting up such an image
> into the smaller tiles that can be indexed using gdaltindex?
I use the following rule of thumb: I partition large raster files into
tiles of 2000*2000 and build an index on them with gdaltindex.
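A minimal sketch of that tiling step, assuming GDAL's command-line tools are installed. The function only *prints* one gdal_translate -srcwin call per 2000x2000 window, so the job list can be reviewed before running; the image size and file names below are placeholders, not my actual data:

```shell
# Print one gdal_translate -srcwin command per tile window.
tile_commands() {
  # $1 = width, $2 = height, $3 = tile size, $4 = input raster
  y=0
  while [ "$y" -lt "$2" ]; do
    x=0
    while [ "$x" -lt "$1" ]; do
      echo "gdal_translate -srcwin $x $y $3 $3 $4 tile_${x}_${y}.tif"
      x=$((x + $3))
    done
    y=$((y + $3))
  done
}

# A 40000*40000 image gives 20*20 = 400 tile jobs:
tile_commands 40000 40000 2000 ams.tif > tile_jobs.txt
# Then: sh tile_jobs.txt && gdaltindex ams_index.shp tile_*.tif
```

Writing the commands to a file first also makes it easy to spread the jobs over several machines.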
Partitioning can be done with GRASS or gdal_translate -srcwin. I also
make downsampled copies of the original raster, resampling it by factors
of two, also with GRASS or with gdalwarp -tr. If the resulting image
itself is larger than 2000*2000, I tile that too, and so on. A
40000*40000 image would result in 20*20=400 tiles; it would be resampled
to 20000*20000 pixels (partitioned into 10*10=100 tiles in its turn),
and so on. The 40000, 20000, 10000, 5000 etc tiled images can be shown
at their appropriate scales with the MINSCALE and MAXSCALE keywords.
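The downsampling cascade can be scripted the same way. Again this only prints the commands; note that gdalwarp -tr takes the target *pixel size* in georeferenced units, so the resolution value doubles each time the pixel count halves. The starting resolution and file names are placeholders:

```shell
# Print one gdalwarp -tr command per pyramid level, halving the pixel
# count until a level fits within a single 2000*2000 tile.
pyramid_commands() {
  # $1 = full width in pixels, $2 = starting pixel size, $3 = input raster
  w=$1
  res=$2
  while [ "$w" -gt 2000 ]; do
    w=$((w / 2))
    res=$((res * 2))
    echo "gdalwarp -tr $res $res $3 level_${w}px.tif"
  done
}

# 40000px halves to 20000, 10000, 5000, 2500, 1250: five overview jobs.
pyramid_commands 40000 1 ams.tif
```

Each level that is still larger than 2000*2000 then goes through the tiling step above and gets its own tile index.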
I'm not sure how "optimal" this procedure is; it works nicely for the
Amsterdam maps, but I'm sure it can be optimized. My impression is that
it generates *much* more efficient web sites than advanced compression
techniques like ECW. The factoring by powers of two makes it very easy
to script the procedure, so large numbers of big rasters can be
processed without much programming. This way, the original eighteen
Amsterdam maps (1.5G) resulted in about 10000 tiles, requiring about
three times the original disk space.
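For reference, each pyramid level becomes its own RASTER layer in the mapfile, pointing at that level's tile index and switched on by scale. A sketch (the layer name, index file and scale bounds are made up, not my real configuration):

```
LAYER
  NAME "ams_20000"
  TYPE RASTER
  STATUS ON
  TILEINDEX "index_20000.shp"
  MINSCALE 10000
  MAXSCALE 50000
END
```

MapServer then picks the layer whose scale range matches the current map scale, so only one resolution's tiles are read per request.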
Perhaps people with really big databases could comment on this?