[gdal-dev] Possible bug: hard limit to the number of Datasets creatable in a Python process?

Michal Migurski mike at stamen.com
Tue Aug 14 13:05:20 PDT 2012


On Aug 14, 2012, at 1:00 PM, Michal Migurski wrote:

> On Aug 14, 2012, at 12:37 PM, Even Rouault wrote:
> 
>> Calling ds.FlushCache() is not sufficient. In principle, if the dataset handle 
>> goes out of scope, it should be destroyed, so I suspect that is not happening 
>> here. Assigning None to the dataset will force its closing.
> 
> 
> Great, thank you Even.
> 
> I have a place to look now. When I watch the output of lsof as the script runs, I can see that the failure occurs near the moment where the number of open file handles approaches 1024.
> 
> I've added `ds = None` to the relevant parts of my script, but I don't think the file handles are actually being closed.

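The ~1024 ceiling observed in lsof matches the common default soft limit on open file descriptors per process on Linux. As a quick sanity check, the limit can be read from Python with the stdlib `resource` module (POSIX only) - a minimal sketch, not part of the original script:

```python
import resource

# RLIMIT_NOFILE is the per-process cap on open file descriptors.
# The soft limit (commonly 1024 by default on Linux) is what a process
# actually hits; the hard limit is the ceiling the soft limit may be
# raised to without extra privileges.
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print("soft fd limit:", soft, "hard fd limit:", hard)
```

If a script fails after creating roughly that many datasets or temp files, a descriptor leak is the natural suspect.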
Ah, found the bug - I was neglecting to close a handle returned from mkstemp. Not a GDAL bug at all, pure Python badness on my part.
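For anyone hitting the same thing: tempfile.mkstemp() returns a raw OS-level file descriptor along with the path, and unlike NamedTemporaryFile it is never closed for you - each call quietly consumes one slot in the fd table until the process hits its limit. A minimal sketch of the fix (the payload and suffix here are placeholders):

```python
import os
import tempfile

# mkstemp() hands back (fd, path); the fd must be closed explicitly.
fd, path = tempfile.mkstemp(suffix=".tif")
try:
    # os.fdopen() wraps the raw descriptor in a file object, and the
    # with-block guarantees it is closed - no leaked descriptor.
    with os.fdopen(fd, "wb") as f:
        f.write(b"...")  # placeholder payload
finally:
    os.remove(path)  # clean up the temp file on disk when done
```

If the file object isn't needed, a bare `os.close(fd)` right after the mkstemp call works just as well.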

-mike.

----------------------------------------------------------------
michal migurski - mike at stamen.com
                  415.558.1610

More information about the gdal-dev mailing list