[gdal-dev] gdal_translate segfault with large netCDF

Josh.Vote at csiro.au
Tue Apr 19 20:21:13 EDT 2011


Thanks for the suggestions -

> I would like to suggest that you do a gdal_translate from a subset of the
> ERS file at the bottom-right corner of the source just to ensure that it
> isn't a problem with reading past the 4GB mark in the ERS file.

I just ran 'gdal_translate -of netCDF -srcwin 50000 40000 100 100 input.ers output.nc' without issue; that window sits at the bottom-right corner of the input ERS dataset (50592 x 41876), so reading past the 4GB mark does not appear to be the problem.
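For anyone repeating the check, a quick sanity test of the subset output could look something like this (output.nc is a throwaway name; the gdalinfo step is just one way to confirm the file opens and reports the expected 100x100 size):

    # Translate a 100x100 window from the far corner of the ERS source
    gdal_translate -of netCDF -srcwin 50000 40000 100 100 input.ers output.nc
    # Confirm the result opens cleanly and has the expected dimensions
    gdalinfo output.nc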

> Josh,
> As Frank said, file a ticket and provide the output of ncdump -h 
> yourfile.nc with the ticket.  I will take a look at it as soon as I can, 
> although I am pretty busy.  Thanks.

I may have misled people with the subject line, sorry about that. The issue is translating a large ER Mapper file into an equally large netCDF file (reading .ers and writing .nc).
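For clarity, the failing case is just the full-extent translate, along the lines of the following (file names are placeholders):

    # Full 50592 x 41876 translation from ER Mapper to netCDF; this is what segfaults
    gdal_translate -of netCDF input.ers output.nc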

http://trac.osgeo.org/gdal/ticket/4047

I've attached the ERS dataset header to the issue for reference; please let me know if you need more info.

> What kind of netCDF file is causing the problem? Is it a netCDF-4 or netCDF-3 file?
> 
> There is a configure option in netCDF-4: --enable-netcdf-4

Will this affect writing a netCDF file? Sorry again if I've misled you: the issue is reading an ER Mapper (.ers) file and writing a netCDF file.
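For what it's worth, one way to check whether the netCDF library underneath GDAL has netCDF-4 enabled is sketched below (this assumes the nc-config utility shipped with netCDF 4.x is on the path):

    # Prints "yes" if the netCDF library was configured with --enable-netcdf-4
    nc-config --has-nc4
    # Shows details of GDAL's netCDF driver
    gdalinfo --format netCDF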

Thanks again everyone for your assistance.
Josh Vote

