[gdal-dev] gdal_translate and ArcGIS

Jason Roberts jason.roberts at duke.edu
Tue Mar 2 14:29:29 EST 2010


I am by no means a gdal_translate expert but I do have experience with
ArcGIS, so I can offer a few comments.

ArcInfo ASCII Grid format does not include a specification of the bit depth,
or even whether the file contains integers or floats. When ArcGIS reads the
file, it infers all of this from its contents. Thus, when you say "convert
to a 16 bit AAIGRID", I'm not sure what you mean. If the AAIGRID file
contains integers that range from -32768 to 32767 (or -32767 to 32767 if a
NODATA value is specified), then ArcGIS will usually treat the AAIGRID as
16-bit integers. For example, if you use the ArcGIS ASCII To Raster tool to
create an ArcInfo Binary Grid from such a file, it will contain 16-bit integers.
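One quick way to see what type GDAL itself infers from an AAIGRID file is to
inspect it with gdalinfo and look at the band's Type field (the filename here
is just a placeholder):

```shell
# Show the data type GDAL infers for an AAIGrid file.
# gdalinfo prints a band line such as:
#   Band 1 Block=... Type=Int16, ColorInterp=Gray
gdalinfo tested.asc | grep "Type="
```

If the values fit the 16-bit range described above, the reported Type should
be an integer type rather than Float32.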

Loading AAIGRID files (.asc files) into ArcMap will typically be very slow
if the files are very large. This is because ArcMap has to parse the entire
file--in effect running the ASCII To Raster tool behind the scenes--in order
to display it.

If you look up the AAIGRID format you will see that AAIGRIDs are
georeferenced using the coordinates of the lower-left corner of the
lower-left cell. I'm not sure about GeoTIFF. As far as I know, GDAL
understands how to properly georeference both GeoTIFF and AAIGRID files. One
problem with AAIGRID is that the format does not include the ability to
specify what spatial reference should be used. Perhaps your match-up problem
results from this.
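One way to work around that limitation: GDAL can, as far as I know, write an
ESRI-style .prj sidecar file next to the .asc when the source has (or is
assigned) a spatial reference. A sketch, assuming your data is in geographic
WGS84 (the EPSG code and filenames are examples only):

```shell
# Assign an SRS while converting; GDAL should emit output.prj
# alongside output.asc, which ArcGIS can use to georeference it.
# EPSG:4326 is an assumed example - use your data's actual SRS.
gdal_translate -a_srs EPSG:4326 -of AAIGrid input.tif output.asc
```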

Modern versions of ArcGIS can read GeoTIFF, and may even use GDAL internally
to read it (others on gdal-dev would have a better idea about that than I do).
Is there a reason you are not using the GeoTIFFs directly with ArcGIS?

Hope that helps,


-----Original Message-----
From: gdal-dev-bounces at lists.osgeo.org
[mailto:gdal-dev-bounces at lists.osgeo.org] On Behalf Of Adam Bausch
Sent: Tuesday, March 02, 2010 8:42 AM
To: gdal-dev at lists.osgeo.org
Subject: [gdal-dev] gdal_translate and ArcGIS

I am attempting to take a series of rasters in 32 bit GTiff and use 
gdal_translate to convert them to a 16 bit AAIGRID; however, I have been 
running into several difficulties.
Although gdal_translate appears to have the functionality to change the 
output data type AND the output format in the same command, the resulting 
output appears to change only the output format.
Doing this in separate commands:

gdal_translate -ot Int16 -of GTiff input.tif output1.tif
gdal_translate -of AAIGrid output1.tif tested.asc

This works, but using the .asc grid in ArcMap is incredibly slow.

If I back up one step and run the ArcGIS 'Copy Raster' tool after 
converting to the Int16 GTiff with gdal_translate, the output is an ESRI 
grid; however, the values do not match up.
I am pretty sure the problem is that the two packages use different 
upper-left corner conventions to set the grid extent, but how can I correct 
for this in GDAL before moving over to ArcGIS? And is there any way to 
easily do this with any of the GDAL utilities on a series of rasters?
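[Editor's note: for the batch question, the two-step conversion above can be
wrapped in a shell loop. This is an untested sketch; filenames and the
int16_ prefix are placeholders:]

```shell
# Convert every GeoTIFF in the current directory to an Int16 AAIGrid,
# mirroring the two gdal_translate commands shown above.
for f in *.tif; do
  gdal_translate -ot Int16 -of GTiff "$f" "int16_$f"
  gdal_translate -of AAIGrid "int16_$f" "${f%.tif}.asc"
done
```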

(I think GDAL takes the outer corner of the upper left pixel, whereas 
ArcGIS takes the center of the upper left pixel?)
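[Editor's note: the two conventions differ by exactly half a cell. An AAIGrid
header may declare its origin as either XLLCORNER or XLLCENTER, related by
xllcenter = xllcorner + cellsize / 2. A small worked example with made-up
values:]

```shell
# Half-cell relation between corner and center origins.
# xllcorner = -180.0 and cellsize = 0.25 are illustrative values.
awk 'BEGIN { xllcorner = -180.0; cellsize = 0.25;
             printf "xllcenter = %.3f\n", xllcorner + cellsize / 2 }'
# -> xllcenter = -179.875
```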

Thank you in advance.

Adam Bausch
