[gdal-dev] Raster size and ReadAsArray()
Alexander Bruy
alexander.bruy at gmail.com
Wed Aug 3 13:45:44 EDT 2011
Hi
2011/8/3 Antonio Valentino <antonio.valentino at tiscali.it>:
> In my experience using too large chunks of memory can cause,
> paradoxically, slowdowns.
> My suggestion is to define a reasonable maximum size for arrays in your
> application/library and switch to the "fallback algorithm for large
> rasters" every time that MAX_SIZE is exceeded, even if using ReadAsArray
> still works.
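The threshold check Antonio describes can be sketched in a few lines. `MAX_SIZE` and the function name are illustrative choices for this example, not anything defined by GDAL:

```python
MAX_SIZE = 256 * 1024 * 1024  # assumed cap (256 MB); tune for your application

def fits_in_one_read(xsize, ysize, bands, bytes_per_pixel, max_size=MAX_SIZE):
    """Return True if a full ReadAsArray() would stay under the cap,
    i.e. the whole-raster read is safe; otherwise use the fallback
    algorithm for large rasters."""
    return xsize * ysize * bands * bytes_per_pixel <= max_size
```

With the 53109 x 29049 single-band Byte raster from this thread, the check returns False and the fallback path would be taken.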
Thanks for the suggestion, I have already been thinking about this.
2011/8/3 Even Rouault <even.rouault at mines-paris.org>:
> It is difficult to predict in all cases. But there are a few things to keep in
> mind. Do you use a 32 bit build of GDAL or a 64 bit one ? If it is a 32 bit,
> then your memory allocations are generally limited to 2 GB by process, and
> much of the time even less because it is difficult to get reliably a continuous
> area of virtual memory of such a size. 53109 x 29049 = 1.5 GB if your data is
> of type Byte and if your dataset has only one band (multiply by the number of
> bands and the size in byte of the band data type). With a 64 bit build of GDAL
> and enough RAM, that should work more reliably.
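Even's arithmetic is easy to reproduce: the in-memory footprint of a full ReadAsArray() is just width x height x bands x bytes per pixel of the band data type (1 for Byte):

```python
# Footprint of the raster discussed in this thread: one Byte band
xsize, ysize = 53109, 29049
bands, bytes_per_pixel = 1, 1
needed = xsize * ysize * bands * bytes_per_pixel
print(needed, "bytes =", round(needed / 1024**3, 2), "GiB")  # ~1.44 GiB
```

For a Float32 band the same raster would need four times as much, which is already past what a 32-bit process can reliably allocate in one contiguous block.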
Personally, I use a 32-bit system and a 32-bit GDAL build, but I want to make my
tool as flexible as possible. That is why I am looking for a way to determine the
maximum raster size that fits within the memory limits.
>> Currently I try to implement it with try-except statement but maybe
>> there is more
>> elegant solution?
>
> Yes, work with smaller buffers...
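One way to sketch the "smaller buffers" approach: tile the raster into row strips and read each strip with the windowed form of `band.ReadAsArray(xoff, yoff, win_xsize, win_ysize)`. The generator below is a hypothetical helper, not part of GDAL's API; `chunk_rows` is an assumed tuning parameter:

```python
def block_windows(xsize, ysize, chunk_rows=1024):
    """Yield (xoff, yoff, win_xsize, win_ysize) row-strip windows that
    together cover an xsize-by-ysize raster; each tuple can be passed
    directly to band.ReadAsArray()."""
    for yoff in range(0, ysize, chunk_rows):
        yield 0, yoff, xsize, min(chunk_rows, ysize - yoff)
```

Each strip of a 53109-wide Byte raster is then only about 52 MB, regardless of how tall the raster is, so the loop works the same on 32-bit and 64-bit builds.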
Thanks again for all your suggestions, guys. I will try to implement this.
Bye
--
Alexander Bruy