[gdal-dev] Re: running gdal_merge for a lot of files

WolfgangZ wollez at gmx.net
Mon Jun 29 15:32:40 EDT 2009


Frank Warmerdam wrote:
> Pavel Iacovlev wrote:
>> Good day all,
>>
>> I am trying to merge a large data set; the input imagery is 1 TB of
>> uncompressed RGB TIFFs (lots of them). When I run gdal_merge.py on a
>> small data set it runs fine, and with the -v option I can see what's
>> happening. But when I run it on a large data set I get no verbose
>> output (no output at all), and it gets stuck at an 8.9 GB image size.
>> I can only track that the image is getting bigger; there is no verbose
>> output whatsoever, and I can't tell why it stops at 8.9 GB.
>>
>> My command is: gdal_merge.py -v -init 255 -of GTiff -co TILED=YES -co
>> COMPRESS=JPEG -co PHOTOMETRIC=YCBCR -co BIGTIFF=YES -o of.tif of/*.tif
>>
>> The input images are around 160 MB each, so loading them one by one
>> into memory is not a problem.
> 
> Pavel,
> 
> I don't know what is going wrong.  I would suggest adding lots of print
> statements in the gdal_merge.py script to try and establish where it is
> freezing up.
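> 
> For example, something along these lines; untested, and the loop below
> is just a stand-in for the script's actual per-file copy loop, not the
> real gdal_merge.py internals:
> 
>     import sys
> 
>     def log(msg):
>         # write and flush immediately so progress shows up even when
>         # stdout is block-buffered (e.g. redirected to a file)
>         sys.stdout.write(msg + '\n')
>         sys.stdout.flush()
> 
>     names = sys.argv[1:]   # stand-in for the script's input file list
>     for i, name in enumerate(names):
>         log('copying %d/%d: %s' % (i + 1, len(names), name))
>         # ... the existing per-file copy call would go here ...
> 
> Output buffering could also explain seeing no verbose output at all;
> running the script as "python -u gdal_merge.py ..." disables it.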
> 
> Best regards,

Wasn't there once a problem with the number of files being merged? Or 
was that only when listing the files individually on the command line 
(not when using the * wildcard)?
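
If the argument count is the issue, note that the shell expands 
of/*.tif into individual arguments before gdal_merge.py ever sees 
them, so the wildcard runs into the same per-command argument limit 
as a manually typed list. If I remember right, gdal_merge.py passes 
its arguments through gdal.GeneralCmdLineProcessor(), so the general 
--optfile switch should work as a way around that limit, e.g.:

    find of -name '*.tif' > tiff_list.txt
    gdal_merge.py -v -init 255 -of GTiff -co TILED=YES -co COMPRESS=JPEG \
        -co PHOTOMETRIC=YCBCR -co BIGTIFF=YES -o of.tif --optfile tiff_list.txt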

Wolf


