[Gdal-dev] Re: gdal fails on vrt (virtual raster) file

WolfgangZ wollez at gmx.net
Thu Sep 6 13:25:53 EDT 2007


Frank Warmerdam wrote:
> Even Rouault wrote:
>> Answering my own message: after a few more investigations, we can read
>> in jpeglib.h:
>>
>>   /* Limit on memory allocation for this JPEG object.  (Note that this is
>>    * merely advisory, not a guaranteed maximum; it only affects the space
>>    * used for virtual-array buffers.)  May be changed by outer 
>> application
>>    * after creating the JPEG object.
>>    */
>>   long max_memory_to_use;
>>
>>
>> So, an alternative to defining JPEGMEM (which does seem to do what I
>> thought it should in my previous message) could be to add the following
>> lines:
>>     if (getenv("JPEGMEM") == NULL)
>>        poDS->sDInfo.mem->max_memory_to_use = 500 * 1024 * 1024;
>> just after the decompression object is created by:
>>     jpeg_create_decompress( &(poDS->sDInfo) ) ;
>>
>> This way, we don't need to patch libjpeg, and if the user wants to
>> define their own value, they are still able to do so.
> 
> Even,
> 
> I think this is a good idea, and I'd appreciate your adding it to trunk.
> But I wonder whether jpeg_create_decompress() already does a bunch of
> work that might start allocating memory / creating this spill file.
> 
>> FYI, on my Ubuntu system, I found out that libjpeg is built by default
>> with 1GB for DEFAULT_MAX_MEM.
>> I've tried to define JPEGMEM to a small value but couldn't reproduce 
>> Wolfgang's crash. I think it's because, in jmemansi.c, we can see that 
>> the temporary file is created by the C library "tmpfile()" call. On 
>> Linux, I think this file is created in /tmp : doing a "lsof | grep 
>> gdalinfo" with small values of JPEGMEM on a 10000x10000 progressive 
>> jpeg seems to confirm this intuition. I don't know where the MS C 
>> library tries to create the temporary file. The answer to this 
>> question could show a need for a specific patch in libjpeg for FWTools 
>> to create the temporary file in an appropriate directory.
>>
>> To answer one of Wolfgang's questions, it may be surprising that 
>> gdalinfo is affected by this issue. But in JPGDataset::Open, we call 
>> jpeg_start_decompress that is really slow when I set a small value for 
>> JPEGMEM. Then we read image size, color space and other information. 
>> The question is: do we really need to call jpeg_start_decompress to get
>> these values? Probably, but if we don't, skipping it could speed up gdalinfo.
> 
> I don't know either.  If we don't, that's great; let's fix it.
> 
> Best regards,


Hi,

Finally, I did some further tests on my data:

One of the datasets worked directly on Linux (FWTools 1.3.5), but a 
bigger one fails with a different error (it takes some time to fill 
the swap file, so it takes about 20 seconds before the error appears):
  ./gdal_translate -co COMPRESS=JPEG -co TILED=YES -a_srs EPSG:27582 
/sonstiges/GIS/Frankreich/Frankreich250k/referenz.vrt 
/sonstiges/GIS/Frankreich/Frankreich250k/Frankreich250k.tif
ERROR 4: 
VSIFOpenL(/sonstiges/GIS/Frankreich/Frankreich250k/20/39_20.jpg) failed 
unexpectedly in jpgdataset.cpp

After converting the tiles with "gdal_translate -of JPEG in* out*", gdal 
fails immediately (in less than a second):
GDALOpen failed - 4
VSIFOpenL(/sonstiges/GIS/Frankreich/Frankreich250k/20/39_20a.jpg) failed 
unexpectedly in jpgdataset.cpp

The strange thing is that exactly the same data set works well on 
Windows! Memory usage is around 200 MB.
I suspect that on Linux JPEG sources are generally imported completely 
into RAM, but I'm not a programmer, just a user of GDAL.

(gdalinfo also fails on Linux but works on Windows; the image size is 
48600x46720)

I hope these findings help to improve the (already great) software!

Regards
Wolfgang



