[gdal-dev] RE: GDALOpen crash on large GRIB files - workaround
Frank Warmerdam
warmerdam at pobox.com
Fri Dec 11 09:55:12 EST 2009
Sjur Kolberg wrote:
>
> Hi again,
>
> This was obviously not a known problem. What we ended up doing was to
> split the GRIB files into smaller pieces using other software and
> import them bit by bit using GDAL. During this work we also hit memory
> allocation problems in plain C++ code (myarray = new float[alotofcells];
> throws an "Out of memory" exception), so it seems that Windows is
> not too happy about allocating large contiguous memory chunks. This
> amount of memory is, I guess, most often needed by meteorological formats
> like GRIB, NetCDF or HDF, which are not used very extensively on Windows.
Sjur,
While I have found that 32-bit Windows is generally unable to take advantage
of more than about 1GB of RAM for a single process, it did not sound like you
were running into that limit, so I find the situation still somewhat odd.
> Does GDAL, or the drivers of often-large files, offer any possibility to
> restrict the subset of bands before calling GDALOpen() on the file?
Generally speaking, no.
However, the GRIB driver could theoretically be restructured to use
subdatasets, treating individual bands or ranges of bands as standalone
datasets, which would presumably help. That said, there does seem to be a
mismatch between the GDAL data model and that of the essentially
3+-dimensional GRIB files you are working with.
Best regards,
--
---------------------------------------+--------------------------------------
I set the clouds in motion - turn up | Frank Warmerdam, warmerdam at pobox.com
light and sound - activate the windows | http://pobox.com/~warmerdam
and watch the world go round - Rush | Geospatial Programmer for Rent