[Qgis-developer] Trouble with PostGis

walter.nordmann walter.nordmann at web.de
Mon Nov 3 13:28:10 PST 2014


Hi, 

walter.nordmann wrote
> But: it would be much, much easier if QGIS would do the job ("log all
> queries sent to the server") itself, instead of using external tools to
> fetch that info. Using pgAdmin III, the PostgreSQL server logs,
> pg_stat_activity or even Wireshark, which is meant for capturing network
> traffic, is a little bit strange.

after *3 days of hard work* I can now see the queries QGIS is sending to the
database server.

Nobody told me to

- compile QGIS in debug mode
- set two environment variables

and inspect the resulting log file.

Remember: I'm not a QGIS developer, just a simple user with a big problem.

I'll try to describe the real problem:

I'm doing a spatial query to fetch all country borders out of the
OpenStreetMap table planet_osm_polygon on my local database server. This is
a really big table with about 152 million geometries (polygons and
multipolygons mixed), and I only need about 220 of them.

Filter in QGIS: boundary = 'administrative' and osm_id < 0 and admin_level = '2'
The columns boundary, osm_id and admin_level are all indexed.

That leads to a query in the log which makes sense to me: essentially my
filter wrapped into a SELECT on the table.
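A sketch of what that statement amounts to (assuming the default osm2pgsql
layout, where the geometry column is called "way", and guessing at the
selected columns):

    -- column list is a guess; QGIS also asks for its key column
    SELECT osm_id, way
      FROM planet_osm_polygon
     WHERE boundary = 'administrative'
       AND osm_id < 0
       AND admin_level = '2';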

After that you can wait for hours; there is no additional output in the
log file.


In pgAdmin III I can see FETCH requests, but I don't know whether data is
actually being transferred from the database server to QGIS.
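Those FETCH requests suggest the provider reads the result through a
server-side cursor instead of in one go. Roughly the pattern would look like
this (the cursor name, the BINARY option and the batch size are only
illustrative guesses, not what QGIS literally sends):

    BEGIN;
    -- declare a cursor over the filtered result set
    DECLARE countries BINARY CURSOR FOR
      SELECT osm_id, way
        FROM planet_osm_polygon
       WHERE boundary = 'administrative'
         AND osm_id < 0
         AND admin_level = '2';
    -- repeated until no more rows come back:
    FETCH FORWARD 2000 FROM countries;
    CLOSE countries;
    COMMIT;

One thing that might matter here: for a cursor PostgreSQL tends to pick a
"fast start" plan (see cursor_tuple_fraction), which on a 152-million-row
table can end up much slower overall than the plan psql gets for the plain
statement.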

By the way: a timestamp in the log file would be nice. Maybe in 2.8?

The same query, run without a cursor in psql, finishes in about 40 seconds,
which is OK for me.


This happens in the same way in 2.4 and in the new 2.6.

I think it's really a problem with long-running queries (longer than 20-30
seconds?). Is there some timeout while waiting for data?
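To see whether the server is still doing anything at that point or is just
waiting for QGIS, a quick check against pg_stat_activity (mentioned above)
should do; the column names assume PostgreSQL 9.2 or newer:

    -- list every other connection; pick out the QGIS one by its query text
    SELECT pid, state, query_start, query
      FROM pg_stat_activity
     WHERE pid <> pg_backend_pid();

state = 'active' would mean the backend is still producing rows, while
'idle in transaction' would mean it has finished the last FETCH and is
simply waiting for the client.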

regards
walter

--
View this message in context: http://osgeo-org.1560.x6.nabble.com/Trouble-with-PostGis-plugin-tp5166669p5171013.html
Sent from the Quantum GIS - Developer mailing list archive at Nabble.com.