[gdal-dev] poBand->ComputeStatistics() returning different values than ArcMap for .tif file

Even Rouault even.rouault at spatialys.com
Mon Mar 16 13:53:02 PDT 2015


According to Michael Katz - NOAA Affiliate <michael.katz at noaa.gov>:

> I have a 2 GB .tif file. It is 16-bit, and I know the values range from 0 to
> 2047 (i.e., only 11 bits are used). It reports three bands, RGB,
> and when I call poBand->ComputeStatistics() on each band I get:
>
> red band (first): min: 0, max: 1953
> green band (second): min: 0, max: 2047
> blue band (third): min: 0, max: 1991
>
> But when I open the same file in ArcMap, get the Layer Properties, look at
> the Symbology tab, set the stretch type to Minimum-Maximum, uncheck Apply
> Gamma Stretch, and click the Histograms button, it shows the following
> values:
>
> Red: min: 60, max: 1547
> Green: min: 230, max: 2047
> Blue: min: 208, max: 1326
>
> It behaves as if ArcMap were throwing out a certain percentage of
> outlier low and high values when reporting the statistics. But I have
> Minimum-Maximum set as the stretch type, not Percentage Clip, and in any
> event, my understanding is that the chosen stretch type should not affect
> the statistics shown on the Histograms dialog box; those should be the
> raw values in the file.
>
> Do you have any idea why GDAL would be reporting different min/max values
> for the bands? Any ideas on how I can debug it?

Michael,

No idea what ArcGIS does in that area, but GDAL should compute its
statistics based on all pixels. The exception is when the dataset has
overviews and you compute the statistics in approximate mode: GDAL can
then use the overviews and thus report different values than the full
resolution image.
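To rule that out, here is a minimal sketch that forces exact statistics
by passing bApproxOK = FALSE, so overviews are never consulted. The file
name is a placeholder; error handling is kept to a minimum:

#include "gdal_priv.h"
#include <cstdio>

int main()
{
    GDALAllRegister();
    GDALDataset *poDS = static_cast<GDALDataset *>(
        GDALOpen("input.tif", GA_ReadOnly)); /* placeholder file name */
    if (poDS == NULL)
        return 1;

    for (int i = 1; i <= poDS->GetRasterCount(); ++i)
    {
        GDALRasterBand *poBand = poDS->GetRasterBand(i);
        double dfMin, dfMax, dfMean, dfStdDev;
        /* bApproxOK = FALSE: scan every pixel at full resolution,
           never an overview or a sampled subset of the data. */
        if (poBand->ComputeStatistics(FALSE, &dfMin, &dfMax,
                                      &dfMean, &dfStdDev,
                                      NULL, NULL) == CE_None)
            printf("band %d: min=%g max=%g\n", i, dfMin, dfMax);
    }
    GDALClose(poDS);
    return 0;
}

The same comparison works from the command line: gdalinfo -stats forces
exact statistics, while gdalinfo -approx_stats lets GDAL use overviews
or a subset of tiles.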
You could grab all the pixel values yourself with the RasterIO() API and
compute the min/max to check, but I've no reason to believe GDAL stats
are broken.
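A rough sketch of that cross-check, assuming 16-bit bands as in your
file, reading one scanline at a time to keep memory bounded:

#include "gdal_priv.h"
#include <algorithm>
#include <vector>
#include <cstdio>

static void CheckMinMax(GDALRasterBand *poBand)
{
    const int nXSize = poBand->GetXSize();
    const int nYSize = poBand->GetYSize();
    std::vector<GUInt16> aLine(nXSize);
    GUInt16 nMin = 65535, nMax = 0;

    for (int nY = 0; nY < nYSize; ++nY)
    {
        /* Read one full scanline into the buffer; nPixelSpace and
           nLineSpace of 0 mean "use the defaults". */
        if (poBand->RasterIO(GF_Read, 0, nY, nXSize, 1,
                             &aLine[0], nXSize, 1,
                             GDT_UInt16, 0, 0) != CE_None)
            return;
        for (int nX = 0; nX < nXSize; ++nX)
        {
            nMin = std::min(nMin, aLine[nX]);
            nMax = std::max(nMax, aLine[nX]);
        }
    }
    /* Note: unlike ComputeStatistics(), this ignores any nodata value. */
    printf("min=%d max=%d\n", (int)nMin, (int)nMax);
}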

Even

-- 
Spatialys - Geospatial professional services
http://www.spatialys.com

