[Gdal-dev] Re: Rendering optimization [Re: Looking for huge vector dataset]

Mateusz Loskot mateusz at loskot.net
Fri Mar 9 01:27:37 EST 2007

Schuyler Erle wrote:
> * On  6-Mar-2007 at 11:55PM PST, Mateusz Loskot said:
>>>> 1. load the whole geometry data into memory, into simple
>>>> structures  (arrays) - this is similar to how Manifold works
>>> This is doomed to lead into problems once you try to open a theme
>>> with 5 million polygons...
>> Yup, in many cases it is.  However, both approaches introduce
>> problems.  In the all-in-memory approach, the obvious assumption is
>> that the user has a *lot* of RAM, 4-8 GB.  But in return, it provides
>> the fastest access to features (accessing RAM is faster than disk or
>> network).
> This is sort of a blue sky idea, but we build overviews of rasters for
> exactly this sort of purpose. I can easily imagine designing some kind
> of OGR-based tool that constructs a directory of progressively
> generalized versions of a master vector dataset according to a
> hierarchy of scales, and then offers an access API that selects the
> generalization for a scale appropriate to the task at hand. I can't
> speculate whether this sort of tool would belong in GDAL/OGR the same
> way that gdaladdo etc. does.


The idea is very interesting, but I'm pretty sure this kind of
strategy is application-specific, so it is better implemented
outside OGR. Just my opinion.
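The overview pyramid Schuyler describes above can be sketched in plain
Python, without OGR. This is only an illustration of the idea, not a
proposed GDAL/OGR API: the function names (`build_overviews`,
`select_overview`), the scale levels, and the rule mapping scale to a
simplification tolerance are all hypothetical. Geometry simplification here
uses the classic Douglas-Peucker algorithm on polylines represented as
lists of (x, y) tuples.

```python
import math

def douglas_peucker(points, tolerance):
    """Classic Douglas-Peucker polyline simplification."""
    if len(points) < 3:
        return list(points)
    (x0, y0), (x1, y1) = points[0], points[-1]
    dx, dy = x1 - x0, y1 - y0
    norm = math.hypot(dx, dy) or 1.0
    # Find the vertex farthest from the chord joining the endpoints.
    max_dist, index = 0.0, 0
    for i in range(1, len(points) - 1):
        px, py = points[i]
        dist = abs(dy * (px - x0) - dx * (py - y0)) / norm
        if dist > max_dist:
            max_dist, index = dist, i
    # Everything within tolerance of the chord: keep endpoints only.
    if max_dist <= tolerance:
        return [points[0], points[-1]]
    left = douglas_peucker(points[:index + 1], tolerance)
    right = douglas_peucker(points[index:], tolerance)
    return left[:-1] + right

def build_overviews(features, scales):
    """Build progressively generalized copies of the features, keyed by
    scale denominator (e.g. 50_000 for 1:50,000) -- the vector analogue
    of gdaladdo's raster overviews."""
    overviews = {}
    for scale in scales:
        # Hypothetical rule of thumb: tolerance grows with the scale
        # denominator (coarser map -> more aggressive simplification).
        tolerance = scale * 1e-5
        overviews[scale] = [douglas_peucker(f, tolerance) for f in features]
    return overviews

def select_overview(overviews, requested_scale):
    """Pick the coarsest overview still at least as detailed as the
    requested scale; fall back to the finest level available."""
    candidates = [s for s in sorted(overviews) if s <= requested_scale]
    level = candidates[-1] if candidates else min(overviews)
    return overviews[level]
```

A renderer would call `build_overviews` once, offline, and then
`select_overview` per redraw, so panning at 1:250,000 never touches
full-resolution geometry. A production version would simplify polygons
topologically (shared boundaries, no self-intersections), which is exactly
why the strategy tends to be application-specific.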

Mateusz Loskot
