[gdal-dev] Large NITF file performance problems

Even Rouault even.rouault at mines-paris.org
Tue May 13 14:45:48 EDT 2008


Hi,

I don't really understand why there would be issues with large file sizes 
(below 4 GB, of course).

I've tried the following small Python script to generate a 1.2 GB NITF file 
filled with '1' as the pixel value:

#!/usr/bin/env python

import gdal

# 100000 x 12000 pixels x 1 byte/pixel (single Byte band) ~= 1.2 GB on disk
new_ds = gdal.GetDriverByName('NITF').Create('largentf.ntf', 100000, 12000, 1)
new_ds.GetRasterBand(1).Fill(1)
new_ds = None  # close the dataset and flush it to disk


It runs in less than one minute on my slow machine. I can open the resulting 
file with OpenEV and scroll through it quite smoothly. gdalinfo -checksum 
largentf.ntf also runs in about 3 minutes, which seems reasonable.

A gdalinfo on the file shows that it is automatically tiled in 256x256 blocks, 
and by looking at the code I can see that 256x256 tiling is automatically 
activated when either the number of lines or columns of the file is larger 
than 9999.
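
For what it's worth, the NITF driver also exposes BLOCKXSIZE/BLOCKYSIZE 
creation options, so tiling can be requested explicitly rather than relying on 
the automatic >9999 rule. A minimal sketch, assuming a GDAL build whose NITF 
driver honours those options (the file name and dimensions below are just 
examples):

#!/usr/bin/env python
# Sketch: request 256x256 tiling explicitly through creation options,
# even for an image below the 9999-line/column threshold where the
# automatic tiling would not kick in.

import gdal

options = ['BLOCKXSIZE=256', 'BLOCKYSIZE=256']
ds = gdal.GetDriverByName('NITF').Create('tiled.ntf', 8000, 8000, 1,
                                         gdal.GDT_Byte, options)
ds.GetRasterBand(1).Fill(1)
ds = None  # close and flush to disk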

So I don't see any obvious reason why you get poor performance. You could 
probably attach a debugger to see where it idles during gdal_translate?
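
If reproducing it from Python is easier than attaching a debugger, a rough 
sketch of the same conversion with CPL_DEBUG turned on and a terminal progress 
callback might already show where it stalls (the file names below are 
placeholders, not your customer's data):

#!/usr/bin/env python
# Sketch: redo the gdal_translate step from Python with debug output
# enabled, so a hang is easier to pin down.

import gdal

gdal.SetConfigOption('CPL_DEBUG', 'ON')   # same effect as --debug on the CLI

src = gdal.Open('input.tif')
drv = gdal.GetDriverByName('NITF')
# TermProgress_nocb prints a 0..100% progress line, so a stall is visible.
dst = drv.CreateCopy('output.ntf', src, callback=gdal.TermProgress_nocb)
dst = None
src = None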

I'm emailing you the bzip2-compressed version of the 1.2 GB file, which is only 
1KB... so you can test on the same file as me.

As far as your customer is concerned, maybe there is an issue with compression 
(for example, a very large mono-block JPEG image?). A 'gdalinfo' on the files 
could maybe give some hints.
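
Roughly the same information can also be pulled out from Python. A sketch 
(with a placeholder file name) of the two things I would look at first, the 
block size and the compression reported in the IMAGE_STRUCTURE metadata 
domain:

#!/usr/bin/env python
# Sketch: check block size and compression of the first band, which is
# what gdalinfo would report anyway.

import gdal

ds = gdal.Open('customer.ntf')
band = ds.GetRasterBand(1)

# A single block the size of the whole image would explain very slow reads.
print 'Block size:', band.GetBlockSize()
print 'Image size:', ds.RasterXSize, 'x', ds.RasterYSize

# Compression, if any, is reported in the IMAGE_STRUCTURE metadata domain.
print 'IMAGE_STRUCTURE:', ds.GetMetadata('IMAGE_STRUCTURE')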

On Tuesday 13 May 2008 18:21:15, Jason Beverage wrote:
> Hi all,
>
> I've got a customer who is using large NITF files (~1.5 GB) and is seeing
> ridiculously slow load times in our application.  I don't have access to
> his data, so I can't test directly, but it seems like the NITF driver may
> have some issues with large file sizes (> 1GB).
>
> To test on my end, I created a few different GeoTiff files (600 MB, 800 MB,
> and 1.2 GB) and tried to convert them to NITF using gdal_translate.
> Converting the 600 and 800 MB files worked just fine with very reasonable
> speed (a few minutes).  However, when I tried to use gdal_translate on the
> 1.2 GB file, the process hung at 0% seemingly forever and I had to kill it
> after waiting a very long time.  It seems as if there is something magical
> about this 1 GB boundary.
>
> Does anyone have any ideas or suggestions as to what could be causing this
> issue?
>
> Thanks!
>
> Jason
