[postgis-users] Tuning PostgreSQL for a very large database
Bborie Park
bkpark at ucdavis.edu
Mon Nov 7 10:20:01 PST 2011
Rene,
You're best off asking the PostgreSQL Performance (pgsql-performance)
mailing list.
http://www.postgresql.org/community/lists/
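As a rough starting point before posting there, here is a sketch of the
postgresql.conf parameters most people adjust first. The numbers are only
assumptions for a 16GB machine where roughly half the RAM is set aside for
PostgreSQL; benchmark before adopting any of them:

    # postgresql.conf, illustrative values only
    shared_buffers = 2GB             # ~25% of the RAM reserved for PostgreSQL
    effective_cache_size = 6GB       # planner hint: memory available for caching
    work_mem = 32MB                  # per-sort / per-hash-table memory
    maintenance_work_mem = 512MB     # helps CREATE INDEX and VACUUM on a 90GB database
    checkpoint_segments = 32         # fewer, larger checkpoints (9.0-era setting)
    wal_buffers = 16MB
    random_page_cost = 2.0           # lower than the default 4.0 may suit an SSD

Note that on OS X you may also need to raise the kern.sysv.shmmax and
kern.sysv.shmall kernel settings before PostgreSQL will start with a
shared_buffers value that large.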
If you have any spatial-specific questions (e.g., a spatial query running
slower than expected), then definitely post them here.
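If you do post a slow spatial query, including its EXPLAIN ANALYZE output
(or at least the timing from psql's \timing) makes it much easier to help.
A minimal sketch, assuming the Tiger geocoder schema you describe below is
on the search_path; the address is made up:

    -- Hypothetical example: EXPLAIN ANALYZE reports the total runtime,
    -- though the geocoder internals stay hidden inside the function call.
    EXPLAIN ANALYZE
    SELECT g.rating, pprint_addy(g.addy) AS address,
           ST_X(g.geomout) AS lon, ST_Y(g.geomout) AS lat
    FROM geocode('1 Devonshire Place, Boston, MA 02109') AS g;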
-bborie
On 11/06/2011 10:16 AM, René Fournier wrote:
> Just wondering what I can do to squeeze more performance out of my database application. Here's my configuration:
>
> - Mac mini server
> - Core i7 quad-core at 2GHz
> - 16GB memory
> - Dedicated fast SSD (two SSDs in the server)
> - Mac OS X 10.7.2 (*not* using OS X Server)
> - PostgreSQL 9.0.5
> - PostGIS 1.5.3
> - Tiger Geocoder 2010 database (from build scripts from http://svn.osgeo.org/postgis/trunk/extras/tiger_geocoder/tiger_2010/)
> - Database size: ~90GB
>
> I should say, this box does more than PostgreSQL geocoding/reverse-geocoding, so only about half of the memory can reasonably be allotted to PostgreSQL.
>
> Coming from MySQL, I would normally play with my.cnf, using my-huge.cnf as a starting point. But I'm new to PostgreSQL and PostGIS (with a database this big), so I was wondering if anyone had suggestions on tuning parameters (and which files to edit, etc.). Thanks!
>
> …Rene
>
> _______________________________________________
> postgis-users mailing list
> postgis-users at postgis.refractions.net
> http://postgis.refractions.net/mailman/listinfo/postgis-users
--
Bborie Park
Programmer
Center for Vectorborne Diseases
UC Davis
530-752-8380
bkpark at ucdavis.edu