[postgis-users] Large Databases

Martin Daly Martin.Daly at cadcorp.com
Mon Jun 12 01:58:52 PDT 2006


Graham,

> We've been using Postgres on some relatively small (~100k row)
> databases and it performs pretty well.
> 
> We're now considering loading detailed streetmap data for the
> entire US. I'm not exactly sure how large this dataset will be, but
> I wondered what experiences people have had and if there are any
> performance issues we need to be aware of.

We (Cadcorp: http://www.cadcorp.com) have a ~200 GB database (data and
indices) of complete Great Britain (England, Scotland and Wales)
national coverage of Ordnance Survey MasterMap: 425,285,283
features/records.

We did this with a Big Dumb Upload(tm) into a single table, on a
standard (i.e. with no config tuning at all) PostgreSQL 8.0.4/PostGIS
1.0.4 installation on Windows, and are getting excellent results.  The
machine spec is as follows (see after the spec for a sketch of the
load itself):

Dell PowerEdge 2800 with dual 64-bit Intel(r) Xeon 3.2 GHz processors
with 2 MB L2 cache (HyperThreading disabled)
4 GB RAM
3 x 300 GB 10,000 rpm SCSI disks
Disk 1
	Operating System (Microsoft Windows Server 2003, Standard Edition)
	Software installations
	OS MasterMap(tm) data (~35 GB, gzip-ed GML)
Disk 2
	PostgreSQL data directory
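
For anyone wanting to try the same approach, here is a minimal sketch
of that kind of load.  The table name, columns and file path are
illustrative only (not our actual schema); 27700 is the British
National Grid SRID used by OS MasterMap:

  -- Hypothetical table; names and attribute columns are illustrative.
  CREATE TABLE osmm (fid bigint, descriptivegroup text);
  SELECT AddGeometryColumn('osmm', 'geom', 27700, 'GEOMETRY', 2);

  -- Bulk load with COPY, the fastest route into PostgreSQL.  The GML
  -- is assumed to have been converted externally to a tab-separated
  -- file, with geometries as hex EWKB.
  COPY osmm (fid, descriptivegroup, geom) FROM '/data/osmm.tsv';

  -- Build the spatial index and gather stats only after the load.
  CREATE INDEX osmm_geom_idx ON osmm USING GIST (geom);
  VACUUM ANALYZE osmm;

Loading first and indexing afterwards matters at this scale: one GiST
build over the finished table is far quicker than maintaining the
index row-by-row during the load.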

Regards,
Martin Daly,
Technical Director,
Cadcorp


