[GRASS-user] v.clean process killed itself!?

Nikos Alexandris nikos.alexandris at felis.uni-freiburg.de
Sat Jan 10 03:31:47 EST 2009


On Sat, 2009-01-10 at 08:07 +0000, Glynn Clements wrote:
> Markus Neteler wrote:
> > >> > On a 64-bit system with a lot of RAM. A 32-bit system limits each
> > >> > process to a 4GiB address space, some of which is reserved.

> > >> My system is 64-bit, CoreDuo 2.53GHz, and I run Ubuntu Intrepid 64-bit.
> > >> Unfortunately I can't install more RAM.
> > >>
> > >> Will more swap space help?
> > >
> > > I doubt it.
> > 
> > I wonder if we have a memory leak in the vector library.
> > Compare
> >  http://trac.osgeo.org/grass/ticket/14
> > 
> > If it is there but small, normal usage won't trigger it in a significant
> > way. But this huge map would.
> 
> Possibly. But with a map this large, you don't need a leak. The raw
> data will barely fit into memory, and any per-vertex, per-edge etc
> data could easily push it over the limit.
> 
> AFAICT from the output and the code, it's dying in Vect_snap_lines(). 
> 
> Looking into it more, I don't think that it's a leak; I just think
> that it's trying to store an "expanded" (i.e. bloated) version of a
> 2.7GiB map in RAM on a system which only has 4GiB.
> 
> E.g. for each line vertex, it stores a bounding rectangle (actually a
> cube, 6 doubles, 48 bytes). If there are 122 million vertices and only
> ~2 million are centroids, that could be 120 million line segments,
> which would be ~5.4GiB.
> 
> Then there are the vertices themselves, and it's storing a significant
> fraction of those at 2*8+4 = 20 bytes each, which could consume
> anything up to 2.4GiB (the extra 4 bytes per vertex account for the
> difference from the size of the "coor" file).
> 
> Add onto that the additional data used for e.g. the current boundary
> (which could be most of the map if it's a long, detailed stretch of
> intricate coastline), new vertices created during snapping, other
> housekeeping data etc and it could easily exceed RAM.
> 
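
The back-of-envelope figures above can be reproduced with a short Python
sketch. The segment and vertex counts are taken from the message; the
per-record byte layouts (a 3D bounding box as 6 doubles, a vertex as two
doubles plus 4 bytes) are assumptions based on its description, not a
reading of the GRASS vector library source:

```python
# Rough check of the memory estimates quoted above.
GiB = 1024 ** 3

# Per-segment bounding box: 6 doubles (x/y/z min and max) = 48 bytes.
segments = 120_000_000
bbox_bytes = segments * 6 * 8
print(f"bounding boxes: {bbox_bytes / GiB:.1f} GiB")   # ~5.4 GiB

# Per-vertex storage: 2*8 + 4 = 20 bytes each.
vertices = 122_000_000
vertex_bytes = vertices * (2 * 8 + 4)
print(f"vertices:       {vertex_bytes / GiB:.1f} GiB")  # 2.44 GB, i.e. ~2.3 GiB

print(f"total:          {(bbox_bytes + vertex_bytes) / GiB:.1f} GiB")
```

Even before any housekeeping data, the two figures together already exceed
the 4GiB of RAM on the machine in question.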

In other words, one needs a powerful *workhorse* to work with very big
maps. May I add something like the following to the osgeo-wiki [1]:

Very big vector maps (raster too?) (>2GB?) require a maximal amount of
RAM (>=6GB?).

---
[1] http://wiki.osgeo.org/wiki/GIS_workstation_setup_tips
