[gdal-dev] Shortening execution time of Python script

David Shi davidgshi at yahoo.co.uk
Wed Sep 9 06:54:02 EDT 2009

I have a Python script that automatically downloads zip files containing large datasets from another server and then unzips the files to further process the data.
It has been used as a geoprocessor of ArcGIS Server.
The script works fine when the two datasets are each only a few kilobytes in size, but it stops partway through when the datasets are about 11,000 KBytes.
I suspect the execution time exceeds some limit and ArcGIS Server simply kills the process.
What actions can I try to reduce the execution time?
ArcGIS Server is a 32-bit application, and I was told that the maximum memory it can utilise is 4 GBytes.
I should be grateful if someone can make suggestions/recommendations.
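One common cause of this kind of failure is reading the entire download into memory before writing it out, which both slows the script and risks exhausting a 32-bit process's address space. A minimal sketch of the download-then-unzip step that streams the file in fixed-size chunks is below; the function names, URL, and directory layout are hypothetical, not taken from the original script.

```python
import io
import os
import zipfile


def copy_in_chunks(src, dst, chunk_size=64 * 1024):
    """Copy from one file-like object to another in fixed-size chunks.

    Peak memory stays at roughly chunk_size bytes regardless of the
    total file size, which matters inside a 32-bit process.
    Returns the number of bytes copied.
    """
    total = 0
    while True:
        chunk = src.read(chunk_size)
        if not chunk:
            break
        dst.write(chunk)
        total += len(chunk)
    return total


def download_and_unzip(url, dest_dir, chunk_size=64 * 1024):
    """Hypothetical helper: stream a remote zip to disk, then extract it."""
    import urllib.request

    os.makedirs(dest_dir, exist_ok=True)
    zip_path = os.path.join(dest_dir, "download.zip")
    # Stream the HTTP response straight to a file instead of holding
    # the whole archive in memory.
    with urllib.request.urlopen(url) as resp, open(zip_path, "wb") as out:
        copy_in_chunks(resp, out, chunk_size)
    # ZipFile reads members incrementally, so extraction is also bounded.
    with zipfile.ZipFile(zip_path) as zf:
        zf.extractall(dest_dir)
    return zip_path
```

If the server is killing the process on a wall-clock timeout rather than memory, streaming alone may not be enough; moving the download into a separate background process that the geoprocessor merely launches is another option worth considering.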

