[GRASS-user] v.clean process killed itself!?

Glynn Clements glynn at gclements.plus.com
Sun Jan 11 07:36:46 EST 2009


Markus Neteler wrote:

> Would it be possible to develop a (rough) formula to estimate
> the memory need? With Thomas Huld we did so for the new
> r.sun and it's quite useful to pick the right computer before
> launching a multiple day job (in case you have a choice of
> course).

The problem here is that it may vary with the data. I still don't
understand the algorithm all that well.

> > Add onto that the additional data used for e.g. the current boundary
> > (which could be most of the map if it's a long, detailed stretch of
> > intricate coastline), new vertices created during snapping, other
> > housekeeping data etc and it could easily exceed RAM.
> 
> Is this along the lines of the suggestion to break long lines?
> 
> http://trac.osgeo.org/grass/browser/grass/trunk/doc/vector/TODO#L242
> 242 	v.in.ogr
> 243 	--------
> 244 	It would be useful to split long boundaries to smaller
> 245 	pieces. Otherwise cleaning process can become very slow because
> 246 	bounding box of long boundaries can overlap large part of the map (for
> 247 	example outline around all areas) and cleaning process is checking
> 248 	intersection with all boundaries falling in the bounding box.
> 
> I wonder how hard that is to implement (since we have the
> v.split algorithm).

This would make matters worse for snapping, as it would increase the
number of vertices.
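The tradeoff in the TODO can be sketched numerically. The following is a
toy Python illustration, not GRASS code: the `bbox`, `boxes_overlap`, and
`split` helpers are hypothetical stand-ins for the bounding-box prefilter
the cleaning process uses. It shows how one long boundary's bounding box
makes it a candidate for intersection tests against nearly every feature,
while splitting it leaves only the genuinely nearby candidate pairs.

```python
# Toy sketch (not GRASS source): a long boundary's bounding box can cover
# most of the map, so the bbox filter flags it against nearly everything;
# small pieces each overlap only nearby features.

def bbox(line):
    xs = [x for x, _ in line]
    ys = [y for _, y in line]
    return (min(xs), min(ys), max(xs), max(ys))

def boxes_overlap(a, b):
    return a[0] <= b[2] and b[0] <= a[2] and a[1] <= b[3] and b[1] <= a[3]

def split(line, n):
    # cut a polyline into n roughly equal runs, sharing end vertices
    step = max(1, (len(line) - 1) // n)
    return [line[i:i + step + 1] for i in range(0, len(line) - 1, step)]

# one long diagonal boundary across a 100x100 map
long_line = [(i, i) for i in range(101)]
# small features scattered along the opposite diagonal
features = [[(i, 100 - i), (i + 1, 99 - i)] for i in range(0, 100, 10)]

# candidate pairs that survive the bounding-box filter
whole = sum(boxes_overlap(bbox(long_line), bbox(f)) for f in features)
pieces = split(long_line, 10)
after_split = sum(boxes_overlap(bbox(p), bbox(f))
                  for p in pieces for f in features)
print(whole, after_split)  # → 10 2
```

The unsplit boundary must be tested against all 10 features; after
splitting into 10 pieces, only 2 piece/feature pairs pass the bbox
filter. The cost, as noted above, is more vertices for snapping to chew
on.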

-- 
Glynn Clements <glynn at gclements.plus.com>
