[mapserver-users] Mapserver search performance

Varun saraf vsaraf.gmu at gmail.com
Mon Apr 11 03:37:04 EDT 2011


Hello Everyone,

I have built a GIS application using MapServer, Google Maps and
TileCache. The application extracts attribute data (from the DBF file)
for all features (points) that fall within an arbitrary user-drawn
shape and performs some statistical operations on that data. I use
NQUERY mode with the MAPSHAPE parameter to fetch all the data for the
user-drawn shape. MapServer takes about 5-10 seconds for a small shape
(a couple of square miles), but as the shape grows (hundreds of
square miles), the time needed to fetch the data for the
features/points inside it grows steeply (up to 2 hours for some
shapes). Until now we restricted the maximum area a shape could have,
but we need to remove that limit. Is there any way to improve the
performance? Would SHPTREE help here? The features are currently
points only, but we may move to polygons in the future. We use .shp
files for the shapes. Would it be advantageous to move to a database
instead? If so, which database works best?
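For reference, here is what I understand building a quadtree index would look like (a sketch only; this assumes the MapServer command-line utilities shipped with MS4W are on the PATH, and "points.shp" is a placeholder for our actual shapefile):

```shell
# Build a .qix quadtree index alongside the shapefile. MapServer picks
# the .qix file up automatically, so no mapfile change should be needed.
# With no depth argument, shptree chooses a reasonable tree depth itself.
shptree points.shp
```

My understanding is that this only speeds up the spatial filtering step, so I am not sure how much it would help if the bottleneck is reading the DBF attributes.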

One thing I did notice: for any given request to MapServer, however
large the shape, CPU utilization never exceeds 12%. Could we improve
performance by adding RAM, or perhaps by moving to a solid-state
drive? Moving the application to the cloud is also an option. Really,
anything that improves performance would help. Can someone point me in
the right direction as to what the current bottleneck might be?
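Regarding the database question above: if we did move the points into PostGIS, I gather the layer definition would look roughly like this (a sketch only; the connection parameters, table name, geometry column "geom", key column "gid" and SRID are all placeholders, not our real setup):

```text
LAYER
  NAME "points"
  TYPE POINT
  CONNECTIONTYPE POSTGIS
  CONNECTION "host=localhost dbname=gisdb user=gisuser password=secret"
  DATA "geom FROM points_table USING UNIQUE gid USING SRID=4326"
END
```

With a spatial (GiST) index on the geometry column, the database would do the point-in-polygon filtering itself, which is what I am hoping would avoid the full-file scans.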

The current setup is on Windows and uses MS4W with Apache.

Thanks,
Varun
