Hello all,

I've got a Python script that reads and extracts elevation subdatasets from an HDF5 raster. There are hundreds of these, and I'm running into an issue where system memory usage grows steadily until I get a system "memory could not be read" error. All of the subdatasets are 600x600 and result in 1.4MB GeoTIFFs. Extracting 6 of these in a row, memory usage climbs as high as 160MB; I believe I made it through about 20 before the system choked.
How can I get the memory usage under control? I'm not concerned about a minimal footprint, just one that won't crash Python. I've tried various combinations of del statements at the end of the loop (a sketch follows the script below) and they seem to make no difference. I've also tried removing the Numeric.multiply line just to make sure I wasn't creating any circular references. Not sure what else to do; any advice is appreciated.
-Jamie

On WinXP with FWTools version 2.10.

Here's a basic rundown of the script:

###########

## earlier in the script: import gdal, Numeric; hdf_sub is the list of
## HDF5 subdataset names to extract
for i in hdf_sub:
    outfile = i + '.tif'

    indataset = gdal.Open(i)
    inband = indataset.GetRasterBand(1)
<br> format = "GTiff"<br> outdataset.driver.Create(outfile, inband.XSize, inband.YSize, 1, gdal.GDT_Float32)<br> outband = outdataset.GetRasterBand(1)<br><br> ## Set the geotransform & projection ##<br> ......blah...blah<br>
    ## Read data into a Numeric array
    temp_out = inband.ReadAsArray(0, 0, inband.XSize, inband.YSize)

    ## Make all values negative
    ## Tried without this line, made no difference
    temp_out = Numeric.multiply(temp_out, -1)
    ## The data is stored reversed on the y-axis, reverse and write
    outband.WriteArray(temp_out[::-1])

###########
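
For reference, the cleanup I've been trying at the end of each loop iteration looks roughly like this (the exact combinations varied, and I'm not sure this is even the right way to release things in the GDAL bindings):

###########

    ## end of each loop iteration: drop all references so the datasets
    ## can be flushed and closed before the next subdataset is opened
    del temp_out
    outband = None
    inband = None
    outdataset = None
    indataset = None

###########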