[GRASS-user] r.contour fails to create contours on large dataset
Hamish
hamish_b at yahoo.com
Sun Mar 10 03:09:22 PDT 2013
Eric wrote:
>>> The command used to run r.contour:
>>> grass.run_command("r.contour", input=lp_subtract,
>>> output=LP_contour, minlevel=0, maxlevel=0, step=10)
Hamish:
>> minlevel and maxlevel are the same?
Eric:
> Yes, only the zero contours are needed. They represent the
> macro topography, which is later reinterpolated and subtracted
> from the original DTM.
> It works when testing in a smaller region, so I don't think
> that is the issue...
Hi,
[6.4.3svn]
I just tested with a 40887x54343 region (2221922241 cells > 2^31)
on 64-bit Linux and it worked ok for me. It wanted 19 GB of RAM;
maybe yours bailed out if it didn't have enough? (There was 16 GB
on the machine I tried it on; with disk swapping it took a little
while, but was "only" ~3x slower.)
The --verbose flag with r.contour might help give some clues, as
might running r.univar first to confirm the data are in range.
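A minimal sketch of that pre-flight check, using the map names from
Eric's script (lp_subtract / LP_contour are his names, not standard
ones); this assumes a running GRASS session:

```shell
# Confirm the zero level actually falls inside the data range
# (r.univar prints min/max among its statistics)
r.univar map=lp_subtract

# Re-run the contouring with verbose messages for clues on failure
r.contour --verbose input=lp_subtract output=LP_contour \
    minlevel=0 maxlevel=0 step=10
```

If r.univar shows min > 0 or max < 0, there is simply no zero
contour to extract.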
Note that if a module making a vector map fails, the part of the
vector map already written stays on disk. (This differs from
raster maps, where the new map is only copied into the mapset
when the module is mostly done and the map is closed.) So you
have to watch vector-creation modules in case they fail;
'v.info -t' can help show what data made it into them.
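For example, after a suspected partial failure you could check the
topology counts of the half-written map (again assuming a GRASS
session and Eric's map name):

```shell
# Print topology totals (nodes, lines, boundaries, ...) for the
# possibly-partial output vector map
v.info -t map=LP_contour
```

Zero or implausibly low line counts would confirm the module died
partway through.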
The commands I used in the Spearfish demo dataset location were
(artificial elevation from the r.surf.volcano addon module):
g.region -d
g.region res=3.5 -pa
r.surf.volcano out=lfs_volcano meth=gaussian kurtosis=1
r.mapcalc "lfs_volcano.fcell = float(lfs_volcano)"
g.remove rast=lfs_volcano
time r.contour in=lfs_volcano.fcell out=lfs_volcano_100m_hires step=100 --v
I notice some counters in the source code are "int" and not
"off_t", maybe that matters?
Hamish