[postgis-users] Running large PostGIS databases

Paragon Corporation lr at pcorp.us
Sun May 3 19:31:46 PDT 2009


I was wondering if anyone has had similar experiences and has suggestions, or
just an idea of what kind of hardware other people are using for large
databases.  This may be more of a PostgreSQL question.

We have a fairly large PostgreSQL/PostGIS database.

It's about 500GB; its largest table is an inherited hierarchy of 100 tables.
Running PostgreSQL 8.3/PostGIS 1.3.5/GEOS 3.1.
4-disk RAID 10 configuration, 64-bit Red Hat Linux EL 5, I think 4 multicore
Intel Xeon 2.3 GHz processors, 8GB RAM.

And, shall I say, it undergoes massive updates: 10,000 inserts/updates a
minute is not uncommon.
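For context, in case it helps anyone reply: I've been thinking of postgresql.conf settings along these lines for a write-heavy 8.3 box with 8GB RAM. This is just a sketch with guessed numbers, not something I've validated, so corrections are welcome:

```conf
# Rough starting point for PostgreSQL 8.3, 8GB RAM, write-heavy workload.
# All values are guesses and should be benchmarked, not taken as gospel.
shared_buffers = 2GB                   # ~25% of RAM is a common rule of thumb
effective_cache_size = 6GB             # hint for the planner, not an allocation
checkpoint_segments = 32               # spread checkpoints out under heavy writes
checkpoint_completion_target = 0.9     # smooth checkpoint I/O over the interval
wal_buffers = 8MB                      # larger WAL buffer for bursty inserts
maintenance_work_mem = 512MB           # faster VACUUM/index builds
autovacuum = on                        # essential with this update rate
```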
Speed has been surprisingly decent (especially given the meager hardware and
the complexity of some of our queries), but we have had some disk corruption
issues, and I'm not sure whether we are just pushing our limits or whether it
was simply faulty hardware in both failure cases.

The last server we were on was a 32-bit EL 5 box (4 multicore), I think in a
RAID 5 configuration; the RAID on it failed, so we moved to the 64-bit RAID 10.
Right now the RAID seems fine, but fsck reports disk failures that have caused
the server to halt and shut down.

I'm pretty deficient as far as hardware skills go, especially Linux hardware,
so any suggestions people have (appropriate hardware, PostgreSQL settings)
would be great, along with other tricks for troubleshooting and mitigating
these failures.
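In case it's useful to anyone answering, this is roughly what I've been running so far to look for failing disks. It's a sketch: the device names are examples for our 4-disk array, smartctl comes from the smartmontools package, and /proc/mdstat only applies to Linux software RAID (hardware controllers need their vendor's tool instead):

```shell
# Scan the kernel log for I/O or ATA errors (device names are examples)
dmesg | grep -iE 'ata|sd[a-d]|i/o error'

# SMART health summary for each member disk (requires smartmontools)
for dev in /dev/sda /dev/sdb /dev/sdc /dev/sdd; do
    smartctl -H "$dev"
done

# Software RAID status, if using md; hardware RAID needs the vendor's utility
cat /proc/mdstat
```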

Thanks,
Regina


 





More information about the postgis-users mailing list