shapefile optimization for dynamic data
Ben Eisenbraun
bene at KLATSCH.ORG
Mon Mar 27 15:07:17 PST 2006
Hi-
I'm building a MapServer application with dynamic data arriving every second,
and I'm not sure how to organize the shapefiles for the fastest access.
I'm collecting data via a GPS and a sensor that reports a data point once per
second. I'm using MapServer CGI to generate an overlay onto a map via a
JavaScript frontend that auto-refreshes every few seconds. The application has
to run on a low-power embedded hardware device (roughly a Pentium II 266), and
I'm running into performance problems once I've collected a few thousand data
points. The MapServer CGI process tends to consume all the CPU trying to
render the overlays.
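For reference, the layer that draws the overlay looks roughly like this in my
mapfile (the names and styling here are placeholders, not my exact config):

    LAYER
      NAME "gps_points"
      TYPE POINT
      STATUS ON
      DATA "gps_points"    # the shapefile I keep appending to
      CLASS
        STYLE
          COLOR 255 0 0
          SIZE 4
          SYMBOL "circle"
        END
      END
    END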
Up to now, I've been using shpadd/dbfadd to append the data points to a single
shapefile. I've tried using shptree to index the shapefile, but in my testing
it doesn't seem to speed up rendering at all. The largest a shapefile should
ever grow to is about 25,000 data points; I'm not sure whether that's large or
small compared to other people's data.
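The once-per-second update cycle currently looks more or less like this (the
paths, coordinates, and attribute values are placeholders; the shptree step is
the indexing I experimented with):

    # append the new point to the shapefile and a matching record to the .dbf
    shpadd gps_points $LON $LAT
    dbfadd gps_points.dbf "$TIMESTAMP" "$READING"

    # rebuild the .qix quadtree index from scratch so MapServer can use it
    shptree gps_points.shp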
I've thought about breaking that single shapefile up into multiple shapefiles
of a fixed maximum size, but I'm not sure whether that would be a win for this
kind of workload.
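As I understand it, MapServer can read a set of shapefiles through a tile
index built with the tile4ms utility, so a split-up layer might look something
like this (the names here are my guesses, not a tested config):

    LAYER
      NAME "gps_points"
      TYPE POINT
      STATUS ON
      TILEINDEX "point_tiles"   # index shapefile listing the per-chunk files
      TILEITEM "LOCATION"       # attribute that holds each chunk's path
      CLASS
        # same styling as the single-file layer above
        STYLE
          COLOR 255 0 0
          SIZE 4
          SYMBOL "circle"
        END
      END
    END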
Given that I'm constantly updating the source shapefile, what are my options
for optimizing it?
Thanks for any tips or pointers.
-ben
--
simplicity is the most difficult thing to secure in this world; it is the
last limit of experience and the last effort of genius. <george sand>