[gdal-dev] Grib2 Question

Kurt Schwehr schwehr at gmail.com
Tue Nov 7 14:21:27 PST 2017


Yeah, 1 << ## would have been better.  Sigh.

But really, someone needs to work through the logic of this stuff and
determine what a reasonable bound actually is.  Code that is deliberate and
explains why it does what it does is far preferable.  OOM is not
okay in the world I work in, so I tried to pick a large number (basically
at random) that should let almost all cases through until someone can take
more time.

Allowing a full-range number without checking whether the size is
reasonable is a *major* problem for me.  An OOM is fatal in my world (as in
no error message, your process is just dead), so allowing larger values
without checking seems unpleasant.

Perhaps a config option to allow massive files through in the short term?

On Tue, Nov 7, 2017 at 2:15 PM, Roarke Gaskill <roarke.gaskill at weather.com>
wrote:

>
> Wouldn't the max size be limited by the number of bytes read?  So in
> this case 4 bytes.
>
> http://www.nco.ncep.noaa.gov/pmb/docs/grib2/grib2_sect5.shtml
>
> Looking at netCDF's implementation, they treat the value as a 32-bit signed
> int.
>
> https://github.com/Unidata/thredds/blob/5.0.0/grib/src/main/java/ucar/nc2/grib/grib2/Grib2SectionDataRepresentation.java#L69
>
> I am dealing with proprietary grib2 files that do break the current limit
> of 33554432.
>
>
>
> On Tue, Nov 7, 2017 at 4:03 PM, Even Rouault <even.rouault at spatialys.com>
> wrote:
>
>> On Tuesday 7 November 2017 13:51:30 CET Kurt Schwehr wrote:
>>
>> > It's possible to cause massive allocations with a tiny corrupted grib file
>> > causing an out-of-memory. I found that case with the llvm ASAN fuzzer. If
>> > you have a specification that gives a more reasoned maximum or a better
>> > overall check, I'm listening. I definitely think the sanity checking can
>> > be improved. Mostly I just try to survive the g2clib code. It doesn't
>> > come with tests, and digging through GRIB specs to match up to g2clib
>> > source isn't my favorite thing to do.
>> >
>> > https://github.com/OSGeo/gdal/commit/ae92f7fb8e32381124a37588d27b9af695afce20
>>
>>
>> I guess that if Roarke is asking the question, he might have a dataset
>> that breaks this limit?  If so, we might consider reverting that change, or
>> making it more robust (which can be very tricky, I know; perhaps some
>> heuristics based on the file size?), or just using it in fuzzing mode and
>> not in production for now.  And a pointer to such a dataset would be much
>> appreciated.
>>
>> (By the way: 2<<24 is IMHO an unusual way of writing a limit.  I confused
>> it with 2^24 initially.  So 1 << 25 would perhaps be better, or just the
>> decimal form, since it is completely arbitrary and not related to a binary
>> encoding.)
>>
>> Even
>>
>> --
>> Spatialys - Geospatial professional services
>> http://www.spatialys.com
>
>
>
> --
> Roarke Gaskill | Senior Software Engineer
> e: roarke.gaskill at weather.com
> <http://weather.com/apps>
>



--
http://schwehr.org