<div dir="ltr">Trent, yes, this was key info.<div><br></div><div>When I open the Catalog panel in ArcMap and tell it to compute statistics for the file, it comes back with the same statistics as GDAL gives. So that part of the mystery is resolved.</div><div><br></div><div>What is confusing is that, for this particular image, whatever ArcMap is doing to compute those quick statistics (the ones where, for instance, the blue ranges from 208 to 1326 instead of the actual 0 to 1991) works out well in terms of actually displaying the image. That is, if I map each band's true value range into 0-255 for display, I get a picture that is strongly tinted green. But if I use the quick ranges, that brings up the red and blue (because their quick maxes are so much lower than green's 2047) and the picture ends up looking correct/realistic.</div><div><br></div><div>So maybe this is just a coincidence, and in this particular case what ArcMap happens to be doing for its quick statistics turns out to give about the same effect as a (0.25%) percentage clip, which also gives a realistic-looking image.</div><div><br></div><div>What I would *like* is a quick way to find those reasonable percentage-clip cutoffs. But for now I'll assume this was just a coincidence, and I'll do a scan of all the data to try to determine the (0.25%) percentage-clip cutoffs as quickly as I can.</div><div><br></div></div><div class="gmail_extra"><br><div class="gmail_quote">On Mon, Mar 16, 2015 at 1:57 PM, <a href="mailto:thare@usgs.gov">thare@usgs.gov</a> <span dir="ltr"><<a href="mailto:thare@usgs.gov" target="_blank">thare@usgs.gov</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div style="font-family:'Calibri','sans-serif'">This might not matter in your case, but in current versions of ArcMap it is common to only estimate the stats. I now always calculate stats per image prior to loading into ArcMap (in Catalog). 
<div><br></div><div>Let me know if that helps, </div><div>Trent </div><br><div>----- Reply message -----<br>From: "Michael Katz - NOAA Affiliate" <<a href="mailto:michael.katz@noaa.gov" target="_blank">michael.katz@noaa.gov</a>><br>To: "gdal-dev" <<a href="mailto:gdal-dev@lists.osgeo.org" target="_blank">gdal-dev@lists.osgeo.org</a>><br>Subject: [gdal-dev] poBand->ComputeStatistics() returning different values than ArcMap for .tif file<br>Date: Mon, Mar 16, 2015 2:40 PM</div></div><br><div dir="ltr">I have a 2GB .tif file. It is 16-bit, and I know the values range from 0 to 2047 (i.e., 11 bits are used in the values). It reports three bands, RGB, and when I use poBand->ComputeStatistics() on each band I get:<div><br></div><div>red band (first): min: 0, max: 1953</div><div>green band (second): min: 0, max: 2047</div><div>blue band (third): min: 0, max: 1991</div><div><br></div><div>But when I open the same file in ArcMap, get the Layer Properties, look at the Symbology tab, set the stretch type to Minimum-Maximum, uncheck Apply Gamma Stretch, and click the Histograms button, it shows the following values:</div><div><br></div><div><div>Red: min: 60, max: 1547</div><div>Green: min: 230, max: 2047</div><div>Blue: min: 208, max: 1326</div><div><br></div><div>It is behaving as if (perhaps) ArcMap is throwing out a certain percentage of outlier low and high values when reporting the statistics. But I have Minimum-Maximum set as the stretch type, not Percentage Clip, and in any event, my understanding is that the chosen stretch type should not affect the statistics shown in the Histograms dialog box: those should be the raw values in the file.</div><div><br></div><div>Do you have any idea why GDAL would be reporting different min/max values for the bands? Any ideas on how I can debug it?</div><div><br></div></div></div>
</blockquote></div><br></div>