[postgis-users] large postgres/postgis databases

Gregory S. Williamson gsw at globexplorer.com
Mon Aug 11 18:15:46 PDT 2003


Not sure if this counts as large, but we have a table with 39,177,725 rows of data ... I built an index on it in less than an hour on an Intel box (two 1 GHz CPUs, 2 GB of RAM) running PostgreSQL 7.3.3 with the PostGIS extension (Linux OS).

It took somewhat less time to run ANALYZE (presumably because the process doesn't have to look at every row) -- perhaps half an hour (I don't have exact numbers at hand), but I can rerun the processes and see.
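For reference, the index and statistics steps described above would look roughly like the following SQL. The table and column names here are hypothetical, and on the PostgreSQL 7.3-era PostGIS discussed in this thread the GiST operator class had to be named explicitly:

```sql
-- Build a GiST index on the geometry column
-- (table "streets" and column "geom" are illustrative names).
-- PostGIS of this vintage required the operator class to be spelled out:
CREATE INDEX streets_geom_idx ON streets USING GIST (geom GIST_GEOMETRY_OPS);

-- Refresh the planner's statistics so it can choose the new index:
ANALYZE streets;

-- A bounding-box search that can use the index via the && operator:
SELECT id
FROM streets
WHERE geom && 'BOX3D(-122.5 37.5, -122.0 38.0)'::box3d;
```

Note that && compares bounding boxes only, so an exact-geometry predicate would still be applied on top of it; the index serves to cut the candidate set down cheaply.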

We loaded a database that is essentially an amplified version of the TIGER data set (the figure above is the number of rows in the "streets" table). It was built by a script and we didn't keep track of time, but it took less than a day to load all the tables, build the indexes, and run ANALYZE.

We are still studying performance ... I will post more as I find it out.

HTH,

Greg Williamson
DBA
GlobeXplorer LLC

-----Original Message-----
From: Chris Faulkner [mailto:chrisf at oramap.com]
Sent: Monday, August 11, 2003 1:08 PM
To: postgis-users at postgis.refractions.net
Subject: [postgis-users] large postgres/postgis databases


Hello

I have just started to look at PostGIS as a means of managing my vector
data.

I am interested to know what volumes of data people have created and
indexed in their geometry fields. The manual mentions that a table of 1
million rows was indexed on a Solaris machine and that it took an hour.
Are there any limits to the use of the GiST index? Would it work on 10
million rows, for example? If I had such a large table, what sort of spec
of machine would I need to build the index -- does the operation require a
lot of memory? Any examples of use on larger databases would be very
useful.

Thanks all

Chris



_______________________________________________
postgis-users mailing list
postgis-users at postgis.refractions.net
http://postgis.refractions.net/mailman/listinfo/postgis-users