[gdal-dev] Bigtiff
Frank Warmerdam
warmerdam at pobox.com
Thu Jul 1 04:08:16 EDT 2010
On Thu, Jul 1, 2010 at 10:00 AM, Helge Morkemo <hmorkemo at gmail.com> wrote:
> Hi
> I'm creating a huge BigTIFF file, 169000x151000 pixels (23.77
> gigapixels, RGB), from .NET using WriteRaster.
>
> In the beginning it seemed to be producing output with good performance, but
> it is getting slower and slower.
> The production time seems to be increasing in steps.
> Is there a caching problem for huge files?
>
> The first percent of the file was produced in 2 minutes. Acceptable.
> From 10% to 11% took 5 minutes.
> From 37% to 38% also took 5 minutes.
> From 38% to 39% took 50 (!) minutes.
> From 48% to 49% took 46 minutes.
> From 49.4% to 50.4% (that's where I am right now) took 9 hours, 29 minutes.
>
> Any hints? Am I doing something wrong (I just create the dataset and start
> writing), or is the caching mechanism not suited for huge files?
Helge,
Is the output file using a block size of one scanline? Are you
calling WriteRaster() on scanline-sized chunks?
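For reference, a scanline-aligned write loop against the C++ API looks
roughly like the sketch below. It is a minimal illustration rather than
your actual code: the filename, dimensions and data fill are
placeholders, and error checking is omitted.

    #include "gdal_priv.h"
    #include "cpl_string.h"
    #include <vector>

    int main()
    {
        GDALAllRegister();

        const int nXSize = 169000;
        const int nYSize = 151000;

        // BIGTIFF=YES since the uncompressed result is well over 4GB.
        char **papszOptions = CSLSetNameValue(nullptr, "BIGTIFF", "YES");

        GDALDriver *poDriver =
            GetGDALDriverManager()->GetDriverByName("GTiff");
        GDALDataset *poDS = poDriver->Create("output.tif", nXSize, nYSize,
                                             3, GDT_Byte, papszOptions);
        CSLDestroy(papszOptions);

        // One scanline per RasterIO() call, so each write covers whole
        // blocks and no read-modify-write of partial blocks is needed.
        std::vector<GByte> abyLine(static_cast<size_t>(nXSize) * 3);
        for (int iLine = 0; iLine < nYSize; ++iLine)
        {
            // ... fill abyLine with pixel-interleaved RGB data ...
            poDS->RasterIO(GF_Write, 0, iLine, nXSize, 1,
                           abyLine.data(), nXSize, 1, GDT_Byte,
                           3, nullptr, 3, nXSize * 3, 1);
        }

        GDALClose(poDS);
        return 0;
    }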
GDAL caches some number of blocks based on the cache size,
which can make the first few block writes seem very fast since
they are not flushed to disk until later. But that would not
explain your performance oddities, since you were well past the
internal cache within the first few percent of completion.
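If you want to rule the block cache out, you can raise its limit and
see whether the step pattern moves. A small sketch; the 512MB figure
is purely illustrative:

    #include "gdal.h"
    #include "cpl_conv.h"

    void ConfigureBlockCache()
    {
        // GDAL_CACHEMAX can also be set as an environment variable;
        // small values are interpreted as megabytes.
        CPLSetConfigOption("GDAL_CACHEMAX", "512");

        // Or set the limit directly through the C API, in bytes.
        GDALSetCacheMax(512 * 1024 * 1024);
    }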
I really don't know why you are seeing this issue. Is there
any sign that the process is growing very large in memory
use, possibly due to some sort of memory leak?
In a C++ program I am able to write files of this size without
such an apparent loss of performance.
Best regards,
--
---------------------------------------+--------------------------------------
I set the clouds in motion - turn up | Frank Warmerdam, warmerdam at pobox.com
light and sound - activate the windows | http://pobox.com/~warmerdam
and watch the world go round - Rush | Geospatial Programmer for Rent