[Mapserver-users] Map broker?

woodbri at swoodbridge.com
Mon Mar 17 05:28:49 PST 2003


Many of us have many gigabytes of data online under a single 
MapServer instance. The whole US in TIGER/Line or GDT data requires 
about 8-20 GB of shapefiles, and I am working on a project to bring 
19-20 TB of image data online using MapServer.

Performance has little to do with the total amount of data and a lot 
to do with how well your data is indexed and tiled, so that MapServer 
only needs to look at a small fraction of the total data for any 
given page view.
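
For example, if a big layer is split into one shapefile per county,
the layer's TILEINDEX can point at an index shapefile that records
each tile's extent and path. A rough sketch of building such an
index, assuming GDAL's ogrtindex utility is available (the paths and
index name are only placeholders):

#!/usr/bin/env python
# Rough sketch: build a tile index shapefile for MapServer's
# TILEINDEX from a directory of shapefile tiles, using GDAL's
# ogrtindex utility. Paths below are placeholders.
import glob
import subprocess

TILE_DIR = "/data/tiger/roads"             # one shapefile per county (assumed)
INDEX_SHP = "/data/tiger/roads_index.shp"  # what TILEINDEX points at

tiles = sorted(glob.glob(TILE_DIR + "/*.shp"))

# ogrtindex writes one polygon per tile with a LOCATION attribute
# holding the tile's path, which matches MapServer's default TILEITEM.
subprocess.run(["ogrtindex", INDEX_SHP] + tiles, check=True)
print("indexed %d tiles into %s" % (len(tiles), INDEX_SHP))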

Try running shptree on all your shapefiles.
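
For example, a quick way to index everything under a data directory
(the path below is just a placeholder) might look like this:

#!/usr/bin/env python
# Rough sketch: run MapServer's shptree utility on every shapefile
# under a data directory so each gets a .qix spatial index, which
# MapServer picks up automatically. The path is a placeholder and
# shptree must be on the PATH.
import os
import subprocess

DATA_DIR = "/data/shapefiles"   # assumed location of the data

for root, dirs, files in os.walk(DATA_DIR):
    for name in files:
        if name.lower().endswith(".shp"):
            shp = os.path.join(root, name)
            subprocess.run(["shptree", shp], check=True)
            print("indexed", shp)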

-Steve W.

On 17 Mar 2003 at 13:55, Krzysztof Chodak wrote:

> Does anyone have any experience with MapServer and a large amount of
> data? By a large amount of data I mean approximately 300 MB of
> shapefiles. I think that MapServer's current architecture is not
> suitable for this use (when taking into account a couple of hits per
> second).
> 
> I have some knowledge of MapInfo MapXtreme - it has a map object
> broker mechanism that preinitializes map objects and "rents" them
> out on request. Don't you think that it would be great to have such
> a "map-buffering" mechanism? I'm inspecting MapServer's code to find
> some clues as to whether it would be hard to do or not...
> 
> Krzysztof Chodak
> 




