[Qgis-user] Batch query web database

Alister Hood alister.hood at synergine.com
Thu Jul 14 15:20:54 PDT 2011


> Date: Wed, 13 Jul 2011 00:19:16 -0700
> From: Alex Mandel <tech_dev at wildintellect.com>
> Subject: Re: [Qgis-user] Batch query web database
> Cc: qgis-user <qgis-user at lists.osgeo.org>
> Message-ID: <4E1D46F4.8040907 at wildintellect.com>
> Content-Type: text/plain; charset=ISO-8859-1
> 
> You could write a Python script that hits the webpage with urllib (I
> think that's the name), extracts the data you want from the returned
> page, and saves it as a CSV that can be joined to the table, or, if
> written as a QGIS plugin, writes it directly into the table.
> 
> This is known as screen scraping and is a time-honored way of getting
> around the fact that a website doesn't give you what you need. If
> you're worried about them noticing, you can add a sleep timer so you
> only make x requests per second (assuming you want to do it all in
> batch).
> 
> I have done this before with the Google/Yahoo geocoders, which just
> limit how many requests you can make per day.
> 
> Enjoy,
> Alex
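
The scraping-to-CSV approach Alex describes could be sketched roughly as
below. Everything specific here is a placeholder: the URL template, the
site IDs, and the regex pattern all depend on the actual website, so
treat this as a shape rather than a working scraper.

```python
import csv
import re
import time
import urllib.request

# Placeholder markup pattern: the real one depends on the site's HTML.
VALUE_RE = re.compile(r'<td class="value">([^<]+)</td>')


def extract_value(html):
    """Pull the field of interest out of one returned page."""
    match = VALUE_RE.search(html)
    return match.group(1) if match else ""


def scrape(site_ids, url_template, delay=1.0):
    """Fetch one page per site and return (site_id, value) rows."""
    rows = []
    for site_id in site_ids:
        with urllib.request.urlopen(url_template.format(site_id)) as resp:
            html = resp.read().decode("utf-8", errors="replace")
        rows.append((site_id, extract_value(html)))
        time.sleep(delay)  # the sleep timer: limit the request rate
    return rows


def write_csv(rows, path):
    """Save rows as a CSV that can be joined to the attribute table."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["site_id", "value"])
        writer.writerows(rows)
```

For a real site, an HTML parser would usually be more robust than a
regex, but the overall loop (fetch, extract, sleep, save) stays the same.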

If you don't in fact need to get all the data in a table (i.e. you only
need to be able to look up the information for a particular site when
you need it), you could create an "action" to download and display the
information.  This way you would always be dealing with up-to-date
information, so you wouldn't need to redo the screen scraping
periodically to update the table.
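
For example, an "Open URL" action on the layer could look something like
the line below, where the base URL and the "site_id" field name are just
placeholders for whatever the real web database and attribute table use:

```
http://example.com/sites?id=[% "site_id" %]
```

Clicking a feature with the action tool would then open its current page
in a browser, with no stored copy of the data to keep in sync.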

Alister
