[gdal-dev] Quick dataset close

Even Rouault even.rouault at mines-paris.org
Wed Sep 22 13:24:42 EDT 2010


Christiaan,

you can try creating your TIFF with the SPARSE_OK=YES creation option. This 
will avoid writing unfilled blocks at dataset closing. However, in the normal 
situation where you don't interrupt the dataset creation process, if you 
don't write all blocks, the resulting file will be sparse, which might not be 
handled properly by some non-GDAL GeoTIFF readers.

If you want to get the benefit of SPARSE_OK=YES in "normal" mode *and* have a 
fast closing in case of user interruption, you can try the following hack in 
frmts/gtiff/geotiff.cpp

Replace

/* -------------------------------------------------------------------- */
/*      Fill in missing blocks with empty data.                         */
/* -------------------------------------------------------------------- */
    if( bFillEmptyTiles )
    {
        FillEmptyTiles();
        bFillEmptyTiles = FALSE;
    }

by

/* -------------------------------------------------------------------- */
/*      Fill in missing blocks with empty data.                         */
/* -------------------------------------------------------------------- */
    if( bFillEmptyTiles && !CSLTestBoolean(CPLGetConfigOption("GTIFF_FAST_CLOSING", "NO")) )
    {
        FillEmptyTiles();
        bFillEmptyTiles = FALSE;
    }

In the case of user interruption, do something like:

CPLSetConfigOption("GTIFF_FAST_CLOSING", "YES");
GDALClose(hDS);
CPLSetConfigOption("GTIFF_FAST_CLOSING", "NO");

The approach suggested by Frank is complementary (it is about avoiding writing 
cached blocks), but I don't think flushing cached blocks accounts for most of 
the time spent closing a 5 GB file; that time is likely spent in 
FillEmptyTiles().

Even

On Wednesday 22 September 2010 16:02:45, Christiaan Janssen wrote:
> Is there a way to quickly close a large dataset using the API? For
> instance, I'm writing to a very large TIFF file when the user cancels;
> the code in turn stops what it's doing, closes the dataset, and deletes
> the file. The problem is that, since the file is around 5 GB, the close
> function flushes out the rest of the file because it believes I'm done
> writing. I would prefer it to just close, leaving the file in an invalid
> state, so I can promptly delete it (or GDAL could close and delete it
> itself if it wanted; it doesn't matter to me). Thanks.
> 
> Christiaan

