[gdal-dev] running gdal_merge for a lot of files

Frank Warmerdam warmerdam at pobox.com
Fri Jun 26 09:33:33 EDT 2009


Pavel Iacovlev wrote:
> Good day all,
> 
> I am trying to merge a large data set, the input imagery is 1TB RGB
> uncompressed TIFFs (lots of them). When I run gdal_merge.py on a small
> data set it runs OK, and with the -v option I can see what's happening. But
> when I run it on a large data set I get no verbose output (no output at
> all) and it gets stuck at an 8.9 GB image size. I can see that the image
> keeps getting bigger, but there is no verbose output whatsoever, and I
> can't tell why it stops at 8.9 GB.
> 
> My command is: gdal_merge.py -v -init 255 -of GTiff -co TILED=YES -co
> COMPRESS=JPEG -co PHOTOMETRIC=YCBCR -co BIGTIFF=YES -o of.tif of/*.tif
> 
> The input images are around 160 MB each, so loading them one by one into
> memory is not a problem.

Pavel,

I don't know what is going wrong.  I would suggest adding lots of print
statements in the gdal_merge.py script to try to establish where it is
freezing up.
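A minimal sketch of the kind of tracing one might sprinkle into
gdal_merge.py's per-file loop. The helper name (trace) and the exact loop
structure inside gdal_merge.py are assumptions for illustration; the key
point is writing unbuffered to stderr, so the last message before a hang
is not lost in a stdio buffer:

```python
import sys
import time

_t0 = time.time()

def trace(i, total, filename):
    """Hypothetical progress tracer for gdal_merge.py's per-file loop.

    Writes to stderr and flushes immediately, so output survives even if
    the process later hangs (buffered stdout can hide the last messages).
    """
    msg = "[%d/%d] %.1fs  %s\n" % (i, total, time.time() - _t0, filename)
    sys.stderr.write(msg)
    sys.stderr.flush()
    return msg  # returned so the message can also be logged elsewhere

# In gdal_merge.py one would call something like
#     trace(idx, len(names), fi.filename)
# just before each file is copied into the output, to see exactly which
# input file the script is processing when it stops making progress.
```

Comparing the last traced filename against the file list should narrow the
hang down to a specific input (or show that the loop itself has finished
and the freeze is elsewhere, e.g. in writing the output).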

Best regards,
-- 
---------------------------------------+--------------------------------------
I set the clouds in motion - turn up   | Frank Warmerdam, warmerdam at pobox.com
light and sound - activate the windows | http://pobox.com/~warmerdam
and watch the world go round - Rush    | Geospatial Programmer for Rent


