<font face="courier new,monospace">Josh,<br>As Frank said, file a ticket and provide the output of ncdump -h yourfile.nc with the ticket. I will take a look at it as soon as I can, although I am pretty busy. Thanks.<br>
<br>kss<br clear="all"></font><br>/**<br> *<br> * Kyle Shannon<br> * <a href="mailto:ksshannon@gmail.com" target="_blank">ksshannon@gmail.com</a><br> *<br> */<br><br><br>
<br><br><div class="gmail_quote">On Tue, Apr 19, 2011 at 09:57, Frank Warmerdam <span dir="ltr"><<a href="mailto:warmerdam@pobox.com">warmerdam@pobox.com</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin: 0pt 0pt 0pt 0.8ex; border-left: 1px solid rgb(204, 204, 204); padding-left: 1ex;">
<div class="im">On 11-04-19 05:01 AM, Josh.Vote@csiro.au wrote:<br>
<blockquote class="gmail_quote" style="margin: 0pt 0pt 0pt 0.8ex; border-left: 1px solid rgb(204, 204, 204); padding-left: 1ex;">
Hi,<br>
<br>
I’m new to GDAL, so please forgive any glaring ignorance :)<br>
<br>
Currently I have an 8 GB ER Mapper (ERS) dataset that I want to convert to a<br>
NetCDF file with gdal_translate, but the following command always results in a<br>
segfault.<br>
<br>
gdal_translate -of netCDF input.ers output.nc<br>
<br>
Translating a small (&lt;4 GB) subset of the dataset, on the other hand, works fine.<br>
<br>
Now I’ve been doing a bit of reading and I know that netCDF 3.6.2 and below<br>
doesn’t support variables greater than 4 GB; however, I’ve been doing the<br>
translations with the Debian libnetcdf6 package (which I believe includes<br>
netCDF 4.1.1, and running ‘ncgen --version’ confirms this). I am operating under<br>
the impression that netCDF 4.1.1 should be able to handle netCDF files of this<br>
size without trouble.<br>
<br>
Now I’ve tested gdal_translate from a variety of builds, and they all produce<br>
the same problem.<br>
</blockquote>
<br></div>
Josh,<br>
<br>
I don't see any immediately obvious problems in the way we call the netcdf<br>
API in netcdfdataset.cpp's CreateCopy method. I would suggest you file a<br>
ticket on the issue, and a developer can try to reproduce the problem.<br>
<br>
I would like to suggest that you do a gdal_translate from a subset of the<br>
ERS file at the bottom right corner of the source just to ensure that it<br>
isn't a problem with reading past the 4GB mark in the ERS file.<br>
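That bottom-right-corner test might look like the sketch below; the raster dimensions and window size here are made-up placeholders, so substitute the "Size is X, Y" values that gdalinfo reports for the real input.ers:<br>

```shell
# Hypothetical raster size -- replace with the values that
# `gdalinfo input.ers` prints for the actual dataset.
XSIZE=40000
YSIZE=30000
WIN=1024   # width/height of the test window, in pixels

# Offsets that place the window in the bottom-right corner, i.e. in
# the part of the source file that sits past the 4 GB mark.
XOFF=$((XSIZE - WIN))
YOFF=$((YSIZE - WIN))

# -srcwin xoff yoff xsize ysize copies only that pixel window.
if command -v gdal_translate >/dev/null; then
    gdal_translate -srcwin "$XOFF" "$YOFF" "$WIN" "$WIN" \
        -of netCDF input.ers corner.nc
fi
```

If that small corner window converts cleanly, reading past the 4 GB offset in the ERS source is probably not the problem, which points the finger at the netCDF writing side instead.<br>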
<br>
I seem to have 3.6.3 of the netcdf library so I can't trivially check<br>
this on my system.<br>
<br>
Best regards,<br>
-- <br>
---------------------------------------+--------------------------------------<br>
I set the clouds in motion - turn up | Frank Warmerdam, <a href="mailto:warmerdam@pobox.com" target="_blank">warmerdam@pobox.com</a><br>
light and sound - activate the windows | <a href="http://pobox.com/%7Ewarmerdam" target="_blank">http://pobox.com/~warmerdam</a><br>
and watch the world go round - Rush | Geospatial Programmer for Rent<br>
<br>
_______________________________________________<br>
gdal-dev mailing list<br>
<a href="mailto:gdal-dev@lists.osgeo.org" target="_blank">gdal-dev@lists.osgeo.org</a><br>
<a href="http://lists.osgeo.org/mailman/listinfo/gdal-dev" target="_blank">http://lists.osgeo.org/mailman/listinfo/gdal-dev</a><br>
</blockquote></div><br>