[postgis-users] Re: Out of Memory problem for large table by ST_Contains(..)

Webb Sprague webb.sprague at gmail.com
Fri Oct 24 10:48:48 PDT 2008


> I forgot to mention one point: the point and the polygon tables reside in
> different databases and dblink is used for cross db query. Today, we
> imported the small table into the db where the polygon table is and the
> problem is gone. So it sounds like dblink might be the reason for the
> crashes. Any bug associated with dblink?

dblink is what I think of as a marginal package -- it's not in core
Postgres, it has many quirks, and (I believe) it behaves strangely if
you link across different server versions and libraries.  For anything
complex, I would avoid it.
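
For reference, the cross-db version of the query would have looked
something like this (the table, column, and database names here are
made up):

    -- cross-database fetch via dblink; every matching row gets
    -- shipped over the connection, which can go badly with big tables
    SELECT pts.id, pts.geom
    FROM dblink('dbname=pointdb',
                'SELECT id, geom FROM PointTbl')
         AS pts(id integer, geom geometry);

Importing the table once, like you did, sidesteps all of that:

    -- one-time copy of the point table into the polygon database
    pg_dump -t PointTbl pointdb | psql polydb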

> On the other hand, the database is not well tuned so that "select count(*)
> from PolyTbl" takes about 100 seconds, which is too slow for a table with
> only 3 million records. Any input on the performance issue of postgres?

There are LOTS of discussions on why "select count(*) from Table" is
slow in Postgres (it forces a seq scan, because Postgres doesn't keep
the table metadata needed to just output the count).  The usual fix is
to keep a separate table holding the count, updated by a trigger (I
think this is the best approach -- google for the discussions).  Just
because select count(*) is slow doesn't mean a more specific query
will be slow, however.
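
Roughly, the trigger approach looks like this (table and function
names are made up; treat it as an untested sketch):

    -- one-row table holding the current count
    CREATE TABLE polytbl_count (n bigint NOT NULL);
    INSERT INTO polytbl_count SELECT count(*) FROM PolyTbl;

    -- keep the counter in sync on every insert/delete
    CREATE FUNCTION polytbl_count_trig() RETURNS trigger AS $$
    BEGIN
      IF TG_OP = 'INSERT' THEN
        UPDATE polytbl_count SET n = n + 1;
        RETURN NEW;
      ELSE  -- DELETE
        UPDATE polytbl_count SET n = n - 1;
        RETURN OLD;
      END IF;
    END;
    $$ LANGUAGE plpgsql;

    CREATE TRIGGER polytbl_count_trig
      AFTER INSERT OR DELETE ON PolyTbl
      FOR EACH ROW EXECUTE PROCEDURE polytbl_count_trig();

After that, "SELECT n FROM polytbl_count" is instant.  The tradeoff is
that every insert/delete contends on that one counter row, and
TRUNCATE won't fire a row-level trigger, so it's not free.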

You will probably get better answers to these questions on the
Postgresql general mailing list, and lots of those folks are familiar
with PostGIS too.
