[postgis-users] Choosing a database scheme for huge amount of points
Alexander Cohagen
alexmolosh at gmail.com
Wed May 27 09:01:55 PDT 2009
Hi,
I have the following data in the system:
1. Points retrieved from users' GPS devices every 10 seconds (x, y, z =
lon, lat, alt)
2. A timestamp for each point
So the data volume could potentially grow to:
1 day, 1 user = 8 640 coordinates
1 day, 10 000 users = 86 400 000 coordinates
1 year, 10 000 users = 31 536 000 000 coordinates
How can I store such an amount of data in PostGIS most effectively?
Breaking this into several smaller questions:
1. Should I store all points in one table, or in several tables, one per
time period, all inheriting from one main table that holds no data? (See
the sketch after this list.)
2. Should I create a GiST index and run VACUUM ANALYZE?
3. What other performance tips would be helpful? If you can point me to
information about clustering tables or machines with PostGIS, or to best
practices from other people handling this much data, I would appreciate
it.
Anticipating some questions:
1. Yes, I can afford a computing cloud or my own computer cluster if
needed (but I don't know how to cluster PostgreSQL).
2. Yes, there is more data for each coordinate: user_id and timestamp.
3. Yes, there will be plenty of both writing to and querying of this
database. The queries will extract points no more than distance X from a
given point, for a given user_id and timestamp range (a sketch follows
below).