[GRASSLIST:7192] RE: Getting off the ground with v.surf.rst

Andrew Danner adanner at cs.duke.edu
Thu Jun 16 12:16:45 EDT 2005


 Are you running on Windows or Unix? I have a similar Linux setup (2.x
GHz/1GB RAM). I don't think there are any 64-bit optimizations. 

 For 300k points, it shouldn't take more than 10 minutes. Perhaps bump
up your segmax a bit, to 20 or so. I find keeping the npmin/segmax
ratio around 4-6 gives good results, but I'm no expert on the quality
of the output DEM. 

> v.surf.rst input=grid_0793 layer=0 zcolumn=flt1 dmax=24.998895
> dmin=4.999779 elev=elev_0973 zmult=1.0 tension=40. smooth=0.1 segmax=10
> npmin=100

> Warning: ignoring 610 points -- too dense
> (Why are we dropping 610 points? Could these all be in one place?)

If two points are within a distance dmin of each other, the second point
is ignored. In your case, dmin is half the grid cell size, so you have
multiple points within a single raster cell. There is no sense in
interpolating at such a fine resolution when you can only assign one
value to each cell, so it is ok to drop a few points. The number of
dropped points will go up if you increase the cell size or dmin. Your
dmin seems to be set well given the cell size. 
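To make the thinning rule concrete, here is a rough sketch of the dmin
filter in Python (my own illustration of the idea, not the actual
v.surf.rst source, which uses spatial indexing rather than this naive
O(n^2) scan):

```python
import math

def thin_points(points, dmin):
    """Drop any point that lies within dmin of an already-kept point.

    points: list of (x, y, z) tuples.
    Returns the subset of points that survive the dmin check.
    """
    kept = []
    for x, y, z in points:
        # Keep the point only if it is at least dmin away from
        # every point we have already accepted.
        if all(math.hypot(x - kx, y - ky) >= dmin for kx, ky, _ in kept):
            kept.append((x, y, z))
    return kept

# Two points closer together than dmin: the second one is ignored.
pts = [(0.0, 0.0, 10.0), (1.0, 0.0, 11.0), (20.0, 0.0, 12.0)]
print(len(thin_points(pts, dmin=5.0)))  # 2 kept, 1 dropped
```

With dmin set to half the cell size, this is exactly the "too dense"
situation from the warning: several input points collapse into one
raster cell, and only the first survives.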

I have a very experimental version that can scale to very large point
sets and seems to work well on LIDAR data sets. It uses the same
interpolation as s.surf.rst, but the quad tree segmentation is
different. It was the only way I could process the entire Neuse data
set, 500 million points, at 20ft resolution. My method uses a lot of
disk space, though; I needed over 400GB for the entire Neuse. If you
are interested in testing that method, contact me offline for the
code. The interpolation itself is still slow, however, so if it is
taking this long with just a small number of points, I suspect my code
won't help much. 
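For anyone unfamiliar with the term, the general idea behind quad tree
segmentation is to recursively split a tile once it holds more than
segmax points, so interpolation can run on small local segments. This
is a generic sketch of that idea, not my experimental code:

```python
def quadtree_split(points, bbox, segmax):
    """Recursively split a rectangular region until every leaf holds
    at most segmax points.

    points: list of (x, y) tuples; bbox: (xmin, ymin, xmax, ymax).
    Returns a list of (bbox, points) leaves. Generic illustration only.
    """
    xmin, ymin, xmax, ymax = bbox
    if len(points) <= segmax:
        return [(bbox, points)]
    xmid, ymid = (xmin + xmax) / 2.0, (ymin + ymax) / 2.0
    quads = {
        (xmin, ymin, xmid, ymid): [],  # lower left
        (xmid, ymin, xmax, ymid): [],  # lower right
        (xmin, ymid, xmid, ymax): [],  # upper left
        (xmid, ymid, xmax, ymax): [],  # upper right
    }
    for x, y in points:
        for (qx0, qy0, qx1, qy1), bucket in quads.items():
            if qx0 <= x < qx1 and qy0 <= y < qy1:
                bucket.append((x, y))
                break
    leaves = []
    for qbbox, qpoints in quads.items():
        leaves.extend(quadtree_split(qpoints, qbbox, segmax))
    return leaves

# 3 points with segmax=2 forces one split into 4 quadrants.
leaves = quadtree_split([(1, 1), (2, 2), (8, 8)], (0, 0, 10, 10), segmax=2)
print(len(leaves))  # 4 leaves, each with at most 2 points
```

The real trick in scaling this to hundreds of millions of points is
keeping the segments on disk rather than in memory, which is where the
400GB comes from.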


More information about the grass-user mailing list