<table cellspacing="0" cellpadding="0" border="0" ><tr><td valign="top" style="font: inherit;"><DIV>I have a Python script that automatically downloads zip files containing large datasets from another server and then unzips the files to further process the data.</DIV>
<DIV> </DIV>
<DIV>It runs as a geoprocessing service on ArcGIS Server.</DIV>
<DIV> </DIV>
<DIV>The script works fine when the two datasets are each only a few kilobytes, but it stops halfway when the datasets are about 11,000 KB.</DIV>
<DIV> </DIV>
<DIV>I suspect the execution time exceeds the server's timeout and ArcGIS Server simply kills the process.</DIV>
<DIV> </DIV>
<DIV>What actions can I try to reduce the execution time?</DIV>
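<DIV>For context, a minimal sketch of the kind of download-and-unzip step I mean is below. The URL, paths, and function name are placeholders, not my actual script; the idea is to stream the download to disk in fixed-size chunks and let zipfile extract members one at a time, so memory use stays small even for large archives.</DIV>

```python
# Hypothetical sketch, not the actual script: stream a zip to a temporary
# file in chunks, then extract it, keeping peak memory use small.
import shutil
import tempfile
import urllib.request
import zipfile


def fetch_and_extract(url, dest_dir, chunk_size=64 * 1024):
    # Stream the response to a temporary file instead of reading the
    # whole archive into memory at once.
    with tempfile.NamedTemporaryFile(suffix=".zip", delete=False) as tmp:
        with urllib.request.urlopen(url) as resp:
            shutil.copyfileobj(resp, tmp, chunk_size)
        tmp_path = tmp.name
    # ZipFile streams each member to disk during extraction.
    with zipfile.ZipFile(tmp_path) as zf:
        zf.extractall(dest_dir)
    return tmp_path
```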
<DIV> </DIV>
<DIV>ArcGIS Server is a 32-bit application, and I was told that the maximum memory it can utilise is 4 GBytes.</DIV>
<DIV> </DIV>
<DIV>I should be grateful if someone can make suggestions/recommendations.</DIV>
<DIV> </DIV>
<DIV>Sincerely,</DIV>
<DIV> </DIV>
<DIV>David</DIV></td></tr></table><br>