[gdal-dev] How to create a simplified raster from a huge raster dataset
Andrea Peri
aperi2007 at gmail.com
Mon Apr 30 12:31:02 EDT 2012
Hi Rouault,
thanks for the response.
> Note that if you want to use more than 2 gigabytes of RAM, you need a
> 64-bit build of GDAL.
I'm using GDAL compiled for 64-bit and have 8 GB of RAM, so I can try to use
4 GB as the cache max.
I changed my settings and am trying this configuration:
gdaladdo -r average -ro --config BIGTIFF_OVERVIEW YES --config
GDAL_CACHEMAX 4096 catalog.vrt 4 16 64 256 1024 4096
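Side note: since GDAL configuration options can also be read from environment
variables, I believe the same cache setting could be exported once in the
shell instead of being passed on every command line. A small sketch, assuming
bash:

export GDAL_CACHEMAX=4096
gdaladdo -r average -ro --config BIGTIFF_OVERVIEW YES catalog.vrt 4 16 64 256 1024 4096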
> I'm also skeptical that computing such huge overview levels will
> lead to a visually acceptable result. Sometimes you want to use a
> source of data other than the high resolution data when displaying
> low resolution overviews.
Are you suggesting it would be better to first apply gdalwarp to create a new
set of starting rasters at a lower resolution, and only then compute the
overviews from this low-resolution set of rasters?
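If so, here is a minimal sketch of what I have in mind (lowres.tif and the
-tr 40 40 target resolution are just placeholders for illustration, and I'm
assuming my gdalwarp build supports -r average; otherwise -r cubic should
also work):

gdalwarp -r average -tr 40 40 -co TILED=YES -co BIGTIFF=YES catalog.vrt lowres.tif
gdaladdo -r average -ro lowres.tif 2 4 8 16

That would also keep each individual overview factor small, as you recommend.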
Andrea.
2012/4/30 Even Rouault <even.rouault at mines-paris.org>
> Quoting Andrea Peri <aperi2007 at gmail.com>:
>
> > Hi,
> > I have a huge set of TIFF rasters.
> > Each TIFF is a tiled GeoTIFF (256x256) and has an internal overview.
> >
> > To use this raster dataset at smaller scales (full extent), I need to
> > create a single simplified TIFF raster covering the whole dataset.
> >
> > I don't know the best way to produce this simplified raster,
> > because the original huge set of rasters is about 2 TB. Each raster
> > is about 15000 x 15000 px.
> >
> > So I started by creating a catalog named "cat_fogli.vrt" using gdalbuildvrt.
> > After it was created, I tried to produce an external overview of this
> > catalog using this command:
> >
> > gdaladdo -r average -ro --config BIGTIFF_OVERVIEW YES cat_fogli.vrt 256
> > 512 1024 2048 4096
> >
> > Unfortunately gdaladdo seems to be stuck. It is taking a really long time
> > and is always at 0%.
> >
> > I don't know if this is just slow because of the huge size of the original
> > raster dataset, or if I'm going about it the wrong way.
>
> I wouldn't be surprised if it really takes a lot of time. You might want
> to increase the block cache significantly to hopefully speed things up. You
> can do that by specifying --config GDAL_CACHEMAX some_value_in_megabytes. See
> http://trac.osgeo.org/gdal/wiki/ConfigOptions#GDAL_CACHEMAX for more details.
> Note that if you want to use more than 2 gigabytes of RAM, you need a
> 64-bit build of GDAL.
>
> Another factor which might explain the slowness is the value of 256 as the
> first overview level, which is a really huge value. Computing a 256x256 block
> of that level requires processing 65536x65536 pixels of the base image! So I
> would recommend specifying much lower values before reaching 256, for example
> 4 16 64.
>
> I'm also skeptical that computing such huge overview levels will
> lead to a visually acceptable result. Sometimes you want to use a
> source of data other than the high resolution data when displaying
> low resolution overviews.
>
> (Ah, and using gdaltindex will not help for your use case. gdaltindex
> output is only used by MapServer AFAIK.)
>
> Best regards,
>
> Even
>
--
-----------------
Andrea Peri
. . . . . . . . .
qwerty àèìòù
-----------------