[Gdal-dev] Re: Rendering optimization [Re: Looking for huge vector dataset]
Schuyler Erle
schuyler at nocat.net
Thu Mar 8 00:01:34 EST 2007
* On 6-Mar-2007 at 11:55PM PST, Mateusz Loskot said:
> >>
> >> 1. load the whole geometry data into memory, into simple
> >> structures (arrays) - this is similar to how Manifold works
> >
> > This is doomed to lead into problems once you try to open a theme
> > with 5 million polygons...
>
> Yup, in many cases it is. However, both approaches introduce some
> problems. In the all-in-memory approach, the obvious assumption is that
> the user has a *lot* of RAM, say 4-8 GB. But in turn, it provides the
> fastest access to features (accessing RAM is faster than disk or
> network).
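As a rough illustration of what those "simple structures (arrays)" might look like, here is a minimal Python sketch: all vertices packed into one flat coordinate array plus per-feature offsets, so a renderer scans contiguous memory. The class name and layout are assumptions for illustration, not anything taken from Manifold or OGR.

    # Hypothetical flat-array store: x0, y0, x1, y1, ... for all features,
    # with an offset table recording where each feature's points start.
    from array import array

    class FlatGeometryStore:
        def __init__(self):
            self.coords = array("d")   # all coordinates, interleaved x/y
            self.offsets = [0]         # start index (in points) per feature

        def add_feature(self, points):
            for x, y in points:
                self.coords.append(x)
                self.coords.append(y)
            self.offsets.append(len(self.coords) // 2)

        def feature(self, i):
            # Return feature i's coordinates as one contiguous slice.
            start, end = self.offsets[i] * 2, self.offsets[i + 1] * 2
            return self.coords[start:end]

The trade-off is exactly the one described above: everything has to fit in RAM, but feature access is then as fast as it gets.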
This is sort of a blue-sky idea, but we build overviews of rasters for
exactly this purpose. I can easily imagine designing some kind of
OGR-based tool that constructs a directory of progressively generalized
versions of a master vector dataset according to a hierarchy of scales,
and then offers an access API that selects the generalization
appropriate to the scale of the task at hand. I can't say whether this
sort of tool would belong in GDAL/OGR the same way that gdaladdo etc.
do.
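As a very rough sketch of what the build side of such a tool could look like with the OGR Python bindings (the function name, output layout, and tolerance values are all made up for illustration, and Geometry.Simplify() needs a GEOS-enabled GDAL build):

    # Hypothetical "vector overview" builder: write one progressively
    # simplified copy of a layer per generalization tolerance.
    import os
    from osgeo import ogr

    def build_vector_overviews(src_path, out_dir, layer_name, tolerances):
        src = ogr.Open(src_path)
        src_layer = src.GetLayerByName(layer_name)
        driver = ogr.GetDriverByName("ESRI Shapefile")
        os.makedirs(out_dir, exist_ok=True)

        for tol in tolerances:
            dst_path = os.path.join(out_dir, "%s_tol%g.shp" % (layer_name, tol))
            dst = driver.CreateDataSource(dst_path)
            dst_layer = dst.CreateLayer(layer_name,
                                        srs=src_layer.GetSpatialRef(),
                                        geom_type=src_layer.GetGeomType())
            src_layer.ResetReading()
            for feat in src_layer:
                geom = feat.GetGeometryRef()
                if geom is None:
                    continue
                # Tolerance is in the layer's coordinate units.
                simplified = geom.Simplify(tol)
                out = ogr.Feature(dst_layer.GetLayerDefn())
                out.SetGeometry(simplified)
                dst_layer.CreateFeature(out)
                out = None
            dst = None  # close/flush this overview level

The access API would then be the mirror image: given a rendering scale, pick the coarsest overview whose tolerance is still finer than that scale, much as the raster overview machinery does.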
SDE