[gdal-dev] Re: running gdal_merge for a lot of files
Christian Müller
christian.mueller at nvoe.at
Mon Jun 29 22:36:25 EDT 2009
I had this problem when running gdal_merge.py on many tiles with the
* wildcard, but using the --optfile option resolved it for me.
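For example, roughly like this (the file and directory names are just
taken from Pavel's command, adjust them to your own layout):

    ls of/*.tif > tiles.txt
    gdal_merge.py -v -init 255 -of GTiff -co TILED=YES -co COMPRESS=JPEG \
        -co PHOTOMETRIC=YCBCR -co BIGTIFF=YES -o of.tif --optfile tiles.txt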
I also had problems constructing big images from many tiles; I
partitioned the tile set and merged the partitions iteratively.
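A rough sketch of that kind of batching (the batch size and the
intermediate file names are only examples):

    ls of/*.tif > tiles.txt
    split -l 200 tiles.txt batch_        # e.g. 200 tiles per partition
    for f in batch_*; do
        gdal_merge.py -o merged_$f.tif --optfile $f
    done
    gdal_merge.py -o final.tif merged_batch_*.tif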
WolfgangZ writes:
> Frank Warmerdam schrieb:
>> Pavel Iacovlev wrote:
>>> Good day all,
>>>
>>> I am trying to merge a large data set; the input imagery is 1 TB of
>>> uncompressed RGB TIFFs (lots of them). When I run gdal_merge.py on a
>>> small data set it runs fine, and with the -v option I can see what's
>>> happening. But when I run it on the large data set I get no verbose
>>> output at all, and it gets stuck at an image size of 8.9 GB. I can
>>> only watch the image getting bigger, with no verbose output
>>> whatsoever, and I can't tell why it stops at 8.9 GB.
>>>
>>> My command is: gdal_merge.py -v -init 255 -of GTiff -co TILED=YES -co
>>> COMPRESS=JPEG -co PHOTOMETRIC=YCBCR -co BIGTIFF=YES -o of.tif of/*.tif
>>>
>>> The input images are around 160 MB each, so loading them one by one
>>> into memory is not a problem.
>>
>> Pavel,
>>
>> I don't know what is going wrong. I would suggest adding lots of print
>> statements in the gdal_merge.py script to try and establish where it is
>> freezing up.
>>
>> Best regards,
>
> Wasn't there once a problem with the number of files being joined? Or
> was that only when listing the files manually on the command line
> (rather than using the * operator)?
>
> Wolf
>
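As for Wolf's question: I suspect the limit I hit with * was the
shell's argument-length limit rather than anything in GDAL itself,
which would also be why --optfile helps. On Linux you can check this
roughly like so (of/*.tif again taken from Pavel's command):

    getconf ARG_MAX          # maximum byte length of the argument list
    echo of/*.tif | wc -c    # size of the expanded file list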