[gdal-dev] problem with decimal precision (SAGA driver)
Volker Wichmann
Volker.Wichmann at uibk.ac.at
Mon Apr 4 09:11:21 EDT 2011
Hi,
we encountered a problem with decimal precision when converting a SAGA
grid to ESRI ASCII.
I suspect the problem arises from the ASCII-to-double conversion via atof()
and/or from the geotransform adjustment (pixel-as-point to pixel-as-area).
The SAGA header looks like:

    POSITION_XMIN = 12181.8000000000
    POSITION_YMIN = 219184.8800000000
    CELLSIZE = 1.0000000000

The final ESRI ASCII header looks like:

    xllcorner 12181.299999999999
    yllcorner 219184.380000000005

instead of the expected:

    xllcorner 12181.3
    yllcorner 219184.38
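The values above can be reproduced outside of GDAL with a few lines of C++
(my own sketch, assuming IEEE 754 doubles and the half-cell shift applied by
the driver). To me this suggests the extra digits come from the binary
representation of the shifted value plus the fixed-precision output format,
rather than from the parsing step:

    #include <cstdio>
    #include <cstdlib>

    int main()
    {
        /* parse the SAGA header values the same way the driver does */
        double dXmin     = atof("12181.8000000000");
        double dYmin     = atof("219184.8800000000");
        double dCellsize = atof("1.0000000000");

        /* pixel-as-point -> pixel-as-area: shift by half a cell */
        double xll = dXmin - dCellsize / 2;
        double yll = dYmin - dCellsize / 2;

        /* a wide fixed-precision format exposes the representation error */
        printf("xllcorner %.12f\n", xll);   /* 12181.299999999999 here */
        printf("yllcorner %.12f\n", yll);   /* 219184.380000000005 here */
        return 0;
    }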
How is this problem handled in other drivers (using strtod(),
stringstream, ...)?
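If I'm not mistaken, atof() is specified to behave like strtod() with a NULL
end pointer, so for these strings both should return the bit-identical double
and switching the parser alone would not change the output. A quick check
(again just a sketch):

    #include <cstdio>
    #include <cstdlib>

    int main()
    {
        const char *s = "12181.8000000000";
        double a = atof(s);
        double b = strtod(s, NULL);

        /* both calls round to the same nearest double */
        printf("%s\n", (a == b) ? "identical" : "different");
        return 0;
    }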
The relevant lines of code in sagadataset.cpp
(http://trac.osgeo.org/gdal/browser/trunk/gdal/frmts/saga/sagadataset.cpp)
are:

    406:  dXmin = atof(papszTokens[1]);
    408:  dYmin = atof(papszTokens[1]);
    410:  dCellsize = atof(papszTokens[1]);

    601:  padfGeoTransform[0] = poGRB->m_Xmin - poGRB->m_Cellsize / 2;
    602:  padfGeoTransform[3] = poGRB->m_Ymin + (nRasterYSize - 1)
                                * poGRB->m_Cellsize + poGRB->m_Cellsize / 2;
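The half-cell shift in lines 601/602 looks correct to me; neither 12181.3 nor
219184.38 is exactly representable as a double, so the noise only becomes
visible once the value is written with a wide fixed-point format. Just to
illustrate the output side (a sketch, not a claim about what the ESRI ASCII
writer actually does), a %g-style format with fewer significant digits would
round the noise away:

    #include <cstdio>

    int main()
    {
        /* same value line 601 computes for this dataset */
        double xll = 12181.8 - 1.0 / 2;

        printf("xllcorner %.12f\n", xll);   /* 12181.299999999999 here */
        printf("xllcorner %.15g\n", xll);   /* 12181.3 here */
        return 0;
    }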
Thanks,
Volker