[gdal-dev] poBand->ComputeStatistics() returning different values than ArcMap for .tif file

Michael Katz - NOAA Affiliate michael.katz at noaa.gov
Mon Mar 16 12:40:50 PDT 2015


I have a 2 GB .tif file. It is 16-bit, and I know the values range from 0 to
2047 (i.e., only 11 of the 16 bits are used). The file has three bands (RGB),
and when I call poBand->ComputeStatistics() on each band I get the following
(a sketch of the call appears after the numbers):

red band (first): min: 0, max: 1953
green band (second): min: 0, max: 2047
blue band (third): min: 0, max: 1991
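
For reference, here is roughly how I am calling it. This is a minimal
sketch rather than my exact code; the file path and variable names are
illustrative:

    #include "gdal_priv.h"
    #include <cstdio>

    int main()
    {
        GDALAllRegister();

        // Open the .tif read-only (path is illustrative).
        GDALDataset *poDS = static_cast<GDALDataset *>(
            GDALOpen("input.tif", GA_ReadOnly));
        if (poDS == nullptr)
            return 1;

        for (int i = 1; i <= poDS->GetRasterCount(); ++i)
        {
            GDALRasterBand *poBand = poDS->GetRasterBand(i);
            double dfMin = 0.0, dfMax = 0.0, dfMean = 0.0, dfStdDev = 0.0;

            // bApproxOK = FALSE asks for exact statistics over every pixel;
            // TRUE would let GDAL estimate from overviews or sampling.
            CPLErr eErr = poBand->ComputeStatistics(
                FALSE, &dfMin, &dfMax, &dfMean, &dfStdDev, nullptr, nullptr);
            if (eErr == CE_None)
                printf("band %d: min %.0f, max %.0f\n", i, dfMin, dfMax);
        }

        GDALClose(poDS);
        return 0;
    }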

But when I open the same file in ArcMap, open the Layer Properties, go to
the Symbology tab, set the stretch type to Minimum-Maximum, uncheck Apply
Gamma Stretch, and click the Histograms button, it shows the following
values:

Red: min: 60, max: 1547
Green: min: 230, max: 2047
Blue: min: 208, max: 1326

It behaves as if ArcMap were discarding some percentage of outlier low and
high values when reporting the statistics. But I have the stretch type set
to Minimum-Maximum, not Percentage Clip, and in any event my understanding
is that the chosen stretch type should not affect the statistics shown in
the Histograms dialog box; those should be the raw values from the file.

Do you have any idea why GDAL would be reporting different min/max values
for the bands? Any ideas on how I can debug it?
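
For what it's worth, the only cross-check I can think of on the GDAL side
is to compare the exact and approximate code paths, to rule out sampling on
my end. A minimal sketch (CheckBand is just an illustrative helper):

    #include "gdal_priv.h"
    #include <cstdio>

    // Compare exact vs. approximate min/max for one band, to see whether
    // GDAL-side sampling could explain a discrepancy.
    static void CheckBand(GDALRasterBand *poBand)
    {
        double adfExact[2] = {0.0, 0.0};
        double adfApprox[2] = {0.0, 0.0};

        poBand->ComputeRasterMinMax(FALSE, adfExact);   // scan every pixel
        poBand->ComputeRasterMinMax(TRUE, adfApprox);   // may sample/use overviews

        printf("exact:  min %.0f, max %.0f\n", adfExact[0], adfExact[1]);
        printf("approx: min %.0f, max %.0f\n", adfApprox[0], adfApprox[1]);
    }

But that only tests my side, not what ArcMap is doing.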