Best way to structure large datasets
Walter Anderson
wandrson at YAHOO.COM
Mon Jan 9 18:07:21 PST 2006
I have a general question about the best way to structure large datasets. I am building an internal web application that will serve GIS data to a small workgroup (10-20 users), but the dataset covers the entire state of Texas. I have a couple of questions for the experts out there.
1. The road centerline data is about 1 GB in size (1:24k-scale centerlines), currently stored as one file per county (254 counties). I want to use PostgreSQL to store all of my data, but I am unsure whether I can use the tiling index method with PostgreSQL, or even whether it is needed.
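(For what it's worth, one common approach is to load all 254 county files into a single PostGIS table with a spatial index, and point a MapServer layer at it directly, with no shapefile tileindex at all; the spatial index plays the role the tileindex plays for shapefiles. A minimal sketch of such a layer follows; the connection parameters, table name "roads", and geometry column "the_geom" are assumptions to be adjusted to the actual schema:

```
LAYER
  NAME "roads"
  TYPE LINE
  CONNECTIONTYPE POSTGIS
  # Hypothetical connection string - replace with real host/db/user
  CONNECTION "host=localhost dbname=gis user=mapserver"
  # "the_geom from roads" assumes the default PostGIS geometry column name
  DATA "the_geom from roads"
  CLASS
    COLOR 100 100 100
  END
END
```

MapServer pushes the current map extent into the PostGIS query, so only the features in view are fetched.)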
2. I would also like to filter the road data so that at the statewide level only the Interstates and US Highways are displayed, and as the user zooms in, progressively more detail is displayed. Should/can this be accomplished with a single (tiled) vector file, or should I preprocess the data to generate subsets suitable for each scale level?
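(One way to sketch this with a single PostGIS table is two MapServer layers over the same data: a major-roads layer visible at all scales, and a detail layer with a MAXSCALE cap so it only draws once the user zooms in past a given scale denominator. The attribute name "road_class" and its values are hypothetical, as is the scale threshold:

```
LAYER
  NAME "major_roads"
  TYPE LINE
  CONNECTIONTYPE POSTGIS
  CONNECTION "host=localhost dbname=gis"
  DATA "the_geom from roads"
  # FILTER on a PostGIS layer is appended to the SQL WHERE clause;
  # 'road_class' is an assumed attribute
  FILTER "road_class IN ('interstate','us_highway')"
  CLASS
    COLOR 0 0 0
  END
END

LAYER
  NAME "minor_roads"
  TYPE LINE
  CONNECTIONTYPE POSTGIS
  CONNECTION "host=localhost dbname=gis"
  DATA "the_geom from roads"
  FILTER "road_class NOT IN ('interstate','us_highway')"
  # Draw only when zoomed in tighter than 1:250,000 (threshold is a guess)
  MAXSCALE 250000
  CLASS
    COLOR 150 150 150
  END
END
```

This avoids preprocessing separate per-scale datasets, at the cost of the database filtering rows on each request.)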
Thanks for any input,
Walter Anderson
More information about the MapServer-users mailing list