[postgis-users] Out of Memory problem for large table by ST_Contains(..)

Shane Butler shane.butler at gmail.com
Tue Oct 21 15:40:23 PDT 2008


Hi John,

I am not sure that this is the exact same problem I was having with
ST_Within(), but I suspect it could be. Hopefully this will be fixed
by some changes that are in the next release. Anyway, for a possible
workaround, see this thread:
http://postgis.refractions.net/pipermail/postgis-users/2008-October/021598.html

Shane


On Wed, Oct 22, 2008 at 9:28 AM, John Zhang <johnzhang06 at gmail.com> wrote:
> Hello list,
>
> I am writing to seek your input on how to handle such an issue:
>
> I have a large table containing over 3 million polygons and a small table
> containing 53k points. My task is to identify whether each point in the
> point table is contained by a polygon in the polygon table. The ST_Contains
> function works well for this purpose: "polyG && ptG AND ST_Contains(polyG,
> ptG)" is used, where polyG and ptG are the geometries of the polygon and
> the point, and it takes about 2 seconds for a given known polygon. However,
> for a given point, running the query through all the polygons fails after
> about 300 seconds with the error "Out of memory for query". I then tested
> with a select count(*) on the polygon table, which also takes about 100
> seconds. It seems something is wrong in the database configuration, but I
> cannot figure out what. Could anyone help with this issue? Any input would
> be much appreciated.
>
> Thanks in advance.
> John
>
>
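
For reference, here is a minimal sketch of the query pattern John describes,
using hypothetical table and column names (polys.polyg for the polygon
geometries, pts.ptg and pts.id for the points). With a GiST index on the
polygon geometry column, the && bounding-box filter can be answered from the
index rather than by scanning all 3 million polygon rows:

    -- Hypothetical schema for illustration only:
    --   polys(id integer, polyg geometry)  -- ~3 million polygons
    --   pts(id integer, ptg geometry)      -- ~53k points

    -- A GiST index lets the && bounding-box operator use an index scan
    -- instead of reading every polygon row.
    CREATE INDEX polys_polyg_gist ON polys USING GIST (polyg);

    -- Same pattern as in the question: && narrows the candidates by
    -- bounding box, ST_Contains does the exact point-in-polygon test.
    SELECT poly.id
    FROM polys AS poly
    JOIN pts AS pt
      ON poly.polyg && pt.ptg
     AND ST_Contains(poly.polyg, pt.ptg)
    WHERE pt.id = 123;  -- one given point

This is the same && / ST_Contains combination used in the original query; the
only addition is the index, and whether that alone resolves the out-of-memory
error will depend on the server's memory settings.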


