[postgis-users] How to estimate the time taken for a query plan without actually running it?
stephenwoodbridge37 at gmail.com
Mon Aug 3 09:32:31 PDT 2020
No, because the run time depends on which functions you apply to
each row. For example, if you are checking whether a point is in a
polygon, the runtime depends on how many vertices each polygon has.
Really complex polygons, like state or county boundaries, can be very
expensive to evaluate, and may need to be re-evaluated for each point.
There are strategies to speed up queries like this, such as tiling the
polygon by chopping it into a grid: with many smaller and simpler
polygons, the index becomes much more selective, and each polygon is
cheaper to evaluate.
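One way to do that tiling in PostGIS is ST_Subdivide, which splits a
geometry into pieces with at most a given number of vertices. A rough
sketch (the table and column names here are hypothetical, not from
your data):

```sql
-- Hypothetical tables: counties(gid, name, geom), points(id, geom).
-- Split each complex county polygon into pieces of at most 256
-- vertices; ST_Subdivide is a set-returning function, so each county
-- row expands into several smaller rows.
CREATE TABLE counties_subdivided AS
SELECT gid, name, ST_Subdivide(geom, 256) AS geom
FROM counties;

-- Index the small pieces; their bounding boxes are much tighter than
-- the originals, so the index filter discards far more candidates.
CREATE INDEX counties_subdivided_geom_idx
  ON counties_subdivided USING gist (geom);

-- Point-in-polygon now tests against small, simple pieces. DISTINCT
-- guards against a point on a shared piece boundary matching twice.
SELECT DISTINCT p.id, c.name
FROM points p
JOIN counties_subdivided c ON ST_Contains(c.geom, p.geom);
```

The max-vertices value (256 here) is a tuning knob; smaller pieces
mean a more selective index but more rows to join.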
You need to explain what your data looks like, so that people have
some concrete idea of what to suggest to speed it up.
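On the original question: plain EXPLAIN reports the planner's cost in
abstract cost units, not wall-clock time. One rough workaround is to
run EXPLAIN ANALYZE on a small sample, compare actual milliseconds to
estimated cost, and scale that ratio up to the full query's estimated
cost. This is only a crude calibration, not a guarantee; a sketch
with hypothetical table names:

```sql
-- Planner estimate only; returns instantly with cost units and
-- estimated row counts, but no timings.
EXPLAIN
SELECT DISTINCT p.id
FROM points p
JOIN counties c ON ST_Contains(c.geom, p.geom);

-- Actually executes the query, but only on roughly 1% of the pages
-- of the points table, and reports real timings. The ratio of
-- actual ms to estimated cost gives a rough per-cost-unit scaling
-- factor to apply to the full query's EXPLAIN output.
EXPLAIN ANALYZE
SELECT DISTINCT p.id
FROM points p TABLESAMPLE SYSTEM (1)
JOIN counties c ON ST_Contains(c.geom, p.geom);
```
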
On 8/3/2020 10:56 AM, Shaozhong SHI wrote:
> I tried to use "explain" to look into the query plan. It only gives
> me the cost and rows, and does not give an estimate of the time to
> run the query.
> "Explain analyze" gives an estimate of the time needed, but for a
> Big Data query, one has to wait a long time to get an answer.
> Is there a way to get an estimate of the time for running a query,
> without actually running it?
> Looking forward to hearing from you.
> postgis-users mailing list
> postgis-users at lists.osgeo.org