[GRASS-dev] Re: [GRASS-user] problems using r.proj with large data set

Glynn Clements glynn at gclements.plus.com
Sun Dec 17 20:14:47 EST 2006


Glynn Clements wrote:

> > > Is this a problem with large files that I will just have to work around or
> > > is it something to do with my setup?
> > 
> > Probably the same, very old issue:
> > http://intevation.de/rt/webrt?serial_num=241
> 
> I looked into this a while ago. Unfortunately, you can't use rowio (or
> a home-grown equivalent), as libgis doesn't allow the projection to be
> changed while maps are open. So, you have to read the entire input
> map, close it, change the projection, then write the output map.
> 
> To get around the memory issues, you would first need to copy the
> relevant portion of the input map to a temporary file, then use a
> cache backed by that file.

I've added to CVS a modified version of r.proj named r.proj.seg. This
behaves identically to r.proj, but uses a tile cache rather than
reading the entire map into memory.

Currently, each tile is 64 * 64 cells, and the size of the cache is
fixed at 2 * (nx + ny) tiles (where nx * ny is the size of the source
region in tiles), which I would expect to suffice for any realistic
transformation.

[It will definitely suffice for any affine or "near"-affine
transformation. It would be useful to know what level of distortion
needs to be accommodated in practice.]

Cache replacement is random: when a new tile is loaded, the tile which
it replaces is selected at random. In practice, random replacement
tends to perform almost as well as an "optimal" algorithm, without the
overhead of maintaining usage statistics and without any degenerate
cases (no access pattern can force it to always evict the wrong tile).
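A minimal sketch of random replacement (again with hypothetical names,
not the actual r.proj.seg code): a fixed array of slots is scanned for
a hit, and on a miss the victim slot is chosen with rand():

```c
#include <stdlib.h>

#define CACHE_SLOTS 8
#define TILE_CELLS (64 * 64)

struct slot {
    int tile_id;                /* which tile this slot holds; -1 = empty */
    char data[TILE_CELLS];      /* tile payload (placeholder) */
};

static struct slot cache[CACHE_SLOTS];
static int loads;               /* counts tile (re)loads, for illustration */

static void cache_init(void)
{
    for (int i = 0; i < CACHE_SLOTS; i++)
        cache[i].tile_id = -1;
    loads = 0;
}

static void load_tile(struct slot *s, int tile_id)
{
    /* A real implementation would read the tile from the temporary
       file here; this sketch only records the id. */
    s->tile_id = tile_id;
    loads++;
}

static char *get_tile(int tile_id)
{
    for (int i = 0; i < CACHE_SLOTS; i++)
        if (cache[i].tile_id == tile_id)
            return cache[i].data;               /* hit */

    /* Miss: evict a slot chosen at random -- no LRU bookkeeping. */
    struct slot *victim = &cache[rand() % CACHE_SLOTS];
    load_tile(victim, tile_id);
    return victim->data;
}
```

Note that lookup cost is a linear scan here; with a few hundred slots
the real module would want a tile-id index, but the replacement policy
itself needs no per-access bookkeeping at all.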

Suggestions for improvements and/or performance statistics for
"real-world" usage are welcome.

-- 
Glynn Clements <glynn at gclements.plus.com>



