[postgis-users] Tuning PostgreSQL for very large database
René Fournier
m5 at renefournier.com
Sun Nov 6 08:51:37 PST 2011
Just wondering what I can do to squeeze more performance out of my database application. Here's my configuration:
- Mac mini server
- Core i7 quad-core at 2GHz
- 16GB memory
- Dedicated fast SSD (two SSDs in the server)
- Mac OS X 10.7.2 (*not* using OS X Server)
- PostgreSQL 9.0.5
- PostGIS 1.5.3
- Tiger Geocoder 2010 database (from build scripts from http://svn.osgeo.org/postgis/trunk/extras/tiger_geocoder/tiger_2010/)
- Database size: ~90GB
I should say, this box does more than PostgreSQL geocoding/reverse-geocoding, so realistically only about half of the memory can be allotted to PostgreSQL.
Coming from MySQL, I would normally play with my.cnf, using my-huge.cnf as a starting point. But I'm new to PostgreSQL and PostGIS (with a database this large), so I was wondering if anyone had suggestions on tuning parameters (and which files to edit, etc.). Thanks!
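For what it's worth, here is the kind of postgresql.conf sketch I've pieced together from general tuning guides. The values are just my rough guesses for giving PostgreSQL about half of the 16GB, so please treat them as assumptions to be corrected rather than anything tested:

    # postgresql.conf -- rough first guesses, assuming ~8GB of RAM for PostgreSQL
    shared_buffers = 2GB              # commonly suggested as ~25% of the RAM given to Postgres
                                      # (on OS X this also means raising kern.sysv.shmmax via sysctl)
    effective_cache_size = 6GB        # hint to the planner about memory available for caching
    work_mem = 50MB                   # per sort/hash operation per connection, so kept modest
    maintenance_work_mem = 512MB      # helps CREATE INDEX / VACUUM on the big Tiger tables
    checkpoint_segments = 16          # fewer, larger checkpoints during bulk loads
    wal_buffers = 16MB
    random_page_cost = 2.0            # lower than the default 4.0 since the data lives on SSD

Mostly I'm unsure whether these are even the right knobs for a largely read-only geocoding workload, so any corrections are welcome.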
…Rene