[gdal-dev] Re: Strange things with gdalwarp ...
peifer at gmx.eu
Mon Aug 24 08:25:12 EDT 2009
Adam Nowacki wrote:
> Some rather counterintuitive gdalwarp behavior: the bigger
> dfWarpMemoryLimit (-wm setting) the more cpu time will be wasted on
> warping not existing pixels. Why? Warping begins with the destination
> window size of entire output image size. If this size is larger than
> dfWarpMemoryLimit it is split in half along the longest axis and any
> half that doesn't contain the currently processed source file is
> discarded. With large dfWarpMemoryLimit this subdivision process will
> stop early with still large portions of out of source image pixels.
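To make the quoted behaviour concrete, here is a minimal sketch of the subdivision Adam describes: start from the full destination window, split in half along the longest axis until a chunk fits the memory limit, and drop halves that do not intersect the source. All names are illustrative assumptions; this is not GDAL's actual chunking code, and the limit is expressed in pixels rather than bytes for simplicity.

```python
def chunks(win, src_win, max_pixels):
    """win, src_win: (x, y, width, height); yield sub-windows that fit max_pixels."""
    x, y, w, h = win
    if w * h <= max_pixels:
        # Large limit => recursion stops here, even if most of this window
        # contains no source pixels: the discard step below never runs.
        yield win
        return
    if w >= h:  # split in half along the longest axis
        halves = [(x, y, w // 2, h), (x + w // 2, y, w - w // 2, h)]
    else:
        halves = [(x, y, w, h // 2), (x, y + h // 2, w, h - h // 2)]
    for half in halves:
        if intersects(half, src_win):  # discard halves without source pixels
            yield from chunks(half, src_win, max_pixels)

def intersects(a, b):
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah
```

With a small limit the recursion trims the window down close to the source footprint; with a generous limit the very first test succeeds and the whole, mostly empty, window is processed in one go.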
Given Adam's explanation above, could someone tell me whether my assumptions below are correct? Thanks.
Just to repeat: the input files are ASTER GTiffs in WGS84, 3601x3601
pixels each. The output is a single mosaic of all tiles, a 100 m GTiff
in LAEA projection.
gdalwarp -wm 300 --config GDAL_CACHEMAX 300 --debug on ... reports:
I understand this as follows: the source image of 3601x3601 pixels is
read as one chunk (which, after reprojection and resampling, should come
out at around 800x1100 pixels). However, 12125x10375 pixels are actually
written to the output file, so over 99% of the destination window must be
completely unrelated to my input image.
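The fraction of wasted pixels can be checked with quick arithmetic, using the approximate 800x1100 warped footprint against the 12125x10375 window that is actually written:

```python
useful = 800 * 1100      # approximate warped footprint of one source tile
written = 12125 * 10375  # destination window actually processed
wasted = 1 - useful / written
print(f"{wasted:.1%}")   # roughly 99.3% of the window carries no source data
```

So on the order of 99% of the processed destination window never receives a source pixel, consistent with the estimate above.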
If I counter-intuitively reduce the memory limit to, say, 40 MB and
leave everything else unchanged, gdalwarp's behaviour changes to
This tells me that the destination window is now much smaller, so gdalwarp
does not waste time writing out so many irrelevant pixels, correct?
Perhaps gdalwarp should not only test whether the destination window fits
into memory, but also compute the minimum destination window needed to
warp the input image. This could speed up the mosaicking of small tiles
into a larger output file.
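A sketch of that suggested check, under loud assumptions: `to_output_srs` stands in for a real coordinate transformation (e.g. WGS84 to LAEA) and is not GDAL's transformer API; in practice points along the tile edges, not just the corners, would have to be sampled, because projected edges curve.

```python
import math

def min_dest_window(src_outline, to_output_srs, geotransform):
    """Bounding destination window, in output pixels, of the source outline.

    src_outline: (x, y) points on the source boundary in source coordinates.
    to_output_srs: hypothetical function mapping a point into output coordinates.
    geotransform: north-up GDAL-style (x0, dx, 0, y0, 0, dy) of the output grid.
    """
    xs, ys = zip(*(to_output_srs(x, y) for x, y in src_outline))
    # invert the geotransform to get fractional pixel/line coordinates
    cols = [(x - geotransform[0]) / geotransform[1] for x in xs]
    rows = [(y - geotransform[3]) / geotransform[5] for y in ys]
    x0, y0 = math.floor(min(cols)), math.floor(min(rows))
    x1, y1 = math.ceil(max(cols)), math.ceil(max(rows))
    return x0, y0, x1 - x0, y1 - y0  # x, y, width, height in output pixels
```

Warping could then start the subdivision from this window instead of the full mosaic, so small tiles never touch the distant parts of a large output file.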