[gdal-dev] poBand->ComputeStatistics() returning different values than ArcMap for .tif file

Michael Katz - NOAA Affiliate michael.katz at noaa.gov
Mon Mar 16 14:52:52 PDT 2015


Trent, yes, this was key info.

When I open the Catalog panel in ArcMap and tell it to compute statistics
for the file, it comes back with the same statistics as GDAL gives. So that
part of the mystery is resolved.

What is confusing is that, for this particular image, whatever ArcMap is
doing to compute those quick statistics (the ones where, for instance, the
blue band ranges from 208 to 1326 instead of the actual 0 to 1991) works
out well for actually displaying the image. That is, if I map each band's
true value range into 0-255 for display, I get a picture that is strongly
tinted green. But if I use the quick ranges, that brings up the red and
blue (because their quick maxima are so much lower than green's 2047) and
the picture ends up looking correct/realistic.
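
For concreteness, the per-band mapping I mean is just a linear stretch
like the following (a rough sketch only; the function name is made up and
the clamping/rounding choices are mine):

    /* Map one sample into a display byte given per-band cutoffs     */
    /* (true min/max, or percent-clip cutoffs).                      */
    static unsigned char StretchToByte(double dfValue,
                                       double dfMin, double dfMax)
    {
        if (dfMax <= dfMin)
            return 0;                            /* degenerate range */
        double d = (dfValue - dfMin) / (dfMax - dfMin);
        if (d < 0.0) d = 0.0;                    /* clamp below min  */
        if (d > 1.0) d = 1.0;                    /* clamp above max  */
        return (unsigned char)(d * 255.0 + 0.5); /* round into 0-255 */
    }

With the true ranges, all three bands are scaled almost identically (their
true maxima are all near 2047); with the quick ranges, red and blue are
stretched much harder, which brightens them relative to green.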

So maybe this is just a coincidence, and in this particular case whatever
ArcMap happens to be doing for its quick statistics turns out to give about
the same effect as a 0.25% percent clip, which also yields a
realistic-looking image.

What I would *like* is a quick way to find reasonable percent-clip
cutoffs. But for now I'll assume this was just a coincidence, and I'll do a
scan of all the data to determine the 0.25% percent-clip cutoffs as quickly
as I can, roughly as sketched below.
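
Something like the following is what I have in mind, using
GDALRasterBand::GetHistogram() with bApproxOK = FALSE so that every pixel
is counted (a sketch only: the 2048 one-unit buckets assume my 11-bit
data, and the int-counts signature is the GDAL 1.x one):

    #include "gdal_priv.h"
    #include "cpl_conv.h"

    /* Estimate low/high percent-clip cutoffs (e.g. dfClip = 0.0025  */
    /* for 0.25%) from a full-resolution histogram of one band.      */
    void ComputeClipCutoffs(GDALRasterBand *poBand, double dfClip,
                            int *pnLow, int *pnHigh)
    {
        const int nBuckets = 2048;   /* one bucket per value, 0-2047 */
        int *panHist = (int *) CPLCalloc(nBuckets, sizeof(int));
        poBand->GetHistogram(-0.5, 2047.5, nBuckets, panHist,
                             FALSE,  /* bIncludeOutOfRange           */
                             FALSE,  /* bApproxOK: force exact scan  */
                             GDALDummyProgress, NULL);

        GIntBig nTotal = 0;
        for (int i = 0; i < nBuckets; i++)
            nTotal += panHist[i];
        const GIntBig nClip = (GIntBig)(nTotal * dfClip);

        /* Walk in from each end until the clip budget is spent. */
        GIntBig nSum = 0;
        int nLow = 0;
        while (nLow < nBuckets - 1 && nSum + panHist[nLow] <= nClip)
            nSum += panHist[nLow++];

        nSum = 0;
        int nHigh = nBuckets - 1;
        while (nHigh > nLow && nSum + panHist[nHigh] <= nClip)
            nSum += panHist[nHigh--];

        *pnLow = nLow;
        *pnHigh = nHigh;
        CPLFree(panHist);
    }

The resulting nLow/nHigh would then go into the linear stretch above in
place of the true min/max.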


On Mon, Mar 16, 2015 at 1:57 PM, thare at usgs.gov <thare at usgs.gov> wrote:

> This might not matter in your case, but in current versions of ArcMap it
> is common for it to only estimate the stats. I now always calculate stats
> per image prior to loading into ArcMap (in Catalog).
>
> Let me know if that helps,
> Trent
>
> ----- Reply message -----
> From: "Michael Katz - NOAA Affiliate" <michael.katz at noaa.gov>
> To: "gdal-dev" <gdal-dev at lists.osgeo.org>
> Subject: [gdal-dev] poBand->ComputeStatistics() returning different values
> than ArcMap for .tif file
> Date: Mon, Mar 16, 2015 2:40 PM
>
> I have a 2GB .tif file. It is 16-bit, and I know the values range from 0
> to 2047 (i.e., only 11 of the 16 bits are used). It reports three bands,
> RGB, and when I use poBand->ComputeStatistics() on each band (see the
> sketch below the listing) I get:
>
> red band (first): min: 0, max: 1953
> green band (second): min: 0, max: 2047
> blue band (third): min: 0, max: 1991
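>
> For reference, I'm calling it roughly like this (a minimal sketch; the
> filename is a placeholder and error checking is omitted):
>
>     #include "gdal_priv.h"
>     #include <stdio.h>
>
>     GDALAllRegister();
>     GDALDataset *poDS =
>         (GDALDataset *) GDALOpen("image.tif", GA_ReadOnly);
>     for (int i = 1; i <= poDS->GetRasterCount(); i++)
>     {
>         GDALRasterBand *poBand = poDS->GetRasterBand(i);
>         double dfMin, dfMax, dfMean, dfStdDev;
>         /* bApproxOK = FALSE: exact statistics over every pixel */
>         poBand->ComputeStatistics(FALSE, &dfMin, &dfMax,
>                                   &dfMean, &dfStdDev, NULL, NULL);
>         printf("band %d: min %.0f max %.0f\n", i, dfMin, dfMax);
>     }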
>
> But when I open the same file in ArcMap, get the Layer Properties, look at
> the Symbology tab, set the stretch type to Minimum-Maximum, uncheck Apply
> Gamma Stretch, and click the Histograms button, it shows the following
> values:
>
> Red: min: 60, max: 1547
> Green: min: 230, max: 2047
> Blue: min: 208, max: 1326
>
> It behaves as if ArcMap were throwing out a certain percentage of outlier
> low and high values when reporting the statistics. But I have
> Minimum-Maximum set as the stretch type, not Percent Clip, and in any
> event, my understanding is that the chosen stretch type should not affect
> the statistics shown in the Histograms dialog box; those should be the
> raw values in the file.
>
> Do you have any idea why GDAL would be reporting different min/max values
> for the bands? Any ideas on how I can debug it?
>
>