[gdal-dev] Possible bug: hard limit to the number of Datasets creatable in a Python process?
Even Rouault
even.rouault at mines-paris.org
Tue Aug 14 12:37:13 PDT 2012
On Tuesday, August 14, 2012 at 21:00:20, Michal Migurski wrote:
> I'm experiencing a reproducible bug.
>
> I create new 258x258 pixel datasets like this:
>
> driver = gdal.GetDriverByName('GTiff')
> handle, filename = mkstemp(dir=tmpdir, prefix='dem-tools-hillup-data-render-', suffix='.tif')
> ds = driver.Create(filename, width, height, 1, gdal.GDT_Float32)
>
> I also open a number of datasets in the same process using gdal.Open() and
> warp them together with gdal.ReprojectImage().
>
> When I have created 1,020 new datasets / opened 2,041 datasets in total, I get
> None back from driver.Create(). This number is stable and appears
> unrelated to the specific datasets I'm creating. The datasets are being
> flushed and falling out of scope, so I believe they are being closed
> correctly when they should be. Do these numbers sound like known, hard
> internal limits to anyone?
I don't think this is a bug in GDAL, but rather an issue in your script. It sounds
like you are hitting the default limit of 1024 files that a Linux process may keep
open simultaneously, so I suspect that your datasets are not being closed properly.
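If you want to confirm which limit your process is running under, a quick check with
the standard library (a minimal sketch, nothing GDAL-specific) would be:

    import resource

    # Per-process limit on open file descriptors; the soft limit is
    # typically 1024 on a stock Linux install.
    soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
    print(soft, hard)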
Calling ds.FlushCache() is not sufficient. In principle, the dataset should be
destroyed when its handle goes out of scope, so I suspect that is not happening in
your script. Assigning None to the dataset variable will force it to be closed.
>
> I'm using the 1.6 Python bindings on Ubuntu 10.04 LTS.
>
> -mike.
>
> ----------------------------------------------------------------
> michal migurski- mike at stamen.com
> 415.558.1610