[GRASS-user] High resolution dem

Hamish hamish_b at yahoo.com
Sat Feb 27 21:15:45 EST 2010


[long, but hopefully interesting & check out the link to the new surface
trials wiki page in the middle]


Frank Broniewski wrote:
> I am trying to create a high resolution dem from contour lines. Until
> now all my tries were not successful. At first I tried r.surf.contour,
> but since my interpolation region is not rectangular and the contours
> are not evenly distributed (rough terrain), the result was
> unfortunately not usable

really? how did it go wrong? For me it has always worked beautifully,
and I'm working in mountains where the terrain is about as rough as you
can get, on an approx. 16000x16000 cell region, without problems.

> ( but it took around 7 days to compute, that
> alone was already impressive ;) )

yes, the search algorithm is very inefficient for areas with a lot of
flat land, where it has to search for a long time to find the nearest
contour line. (so in that way it actually likes rough terrain)


> My contour map is a combination of a national contour line map (5m vert.
> resolution) and contours from SRTM with 20m vert. resolution.

are you sure that the two are in agreement? do the contour lines cross
or does the hole in the srtm avoid that?

> I created a "hole" in the srtm contours for the national contour map
> and patched both together to avoid large gaps with no height values
> (mostly for r.surf.contour)

um, why not use srtm directly instead of doing srtm -> contours -> dem?
srtm's native form is raster, not contour lines.

... and then patch in the hi-res r.surf.* result in the middle.
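
e.g. something like this (map names made up, untested):

  # set the target 5 m region first
  g.region rast=srtm res=5 -p
  # resample SRTM from its coarser native raster up to the 5 m grid
  r.resamp.interp input=srtm output=srtm_5m method=bilinear
  # hi-res surface from the national contours goes in front,
  # SRTM fills in everywhere outside it
  r.patch input=dem_contours_5m,srtm_5m output=dem_combined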


> My region is 17.000 x 13.000 cells wide (5m horiz. resolution). So my

that is getting big, but shouldn't be too bad, except perhaps for
v.surf.rst (and r.fillnulls, which uses it).

hmmm... I wonder if it would speed up r.fillnulls & save lots of memory
if we added no-table and no-topology flags (-b and -z) to the r.to.vect
step in it?


> current approach is to use small regions (2000 x 2000) to calculate
> small subsets of the dem. Because of the algo used by v.surf.rst to
> create the dem the neighboring tiles do have different height values
> calculated at the borders. So it was not possible to just create the
> tiles and patch them together.

I suspect this is making it more complicated than it has to be. Perhaps
try running v.generalize before v.surf.rst?

> My next approach used an overlapping of 20 cells for each tile and a
> moving window average to calculate the mean of the overlapping tiles.
> The result was quite good, but the moving window approach resulted in
> null values where one tile ended and the other started
..
> (similar to the slope and aspect maps, where there is a 1 cell null
> border around the map in comparison to the input dem).

these appear because the calculation takes into account all the cells
around each cell, and at the edges some of those neighbours are unknown
(out of region is equivalent to NULL).
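
fwiw, one way around that is to interpolate each tile with some extra
margin and crop back afterwards; a rough sketch (untested, assuming the
contour heights sit in an 'elev' column):

  # grow the current tile region by 100 m (20 cells at 5 m) per side
  # so the edge cells have real neighbours during interpolation
  g.region n=n+100 s=s-100 e=e+100 w=w-100
  v.surf.rst input=contours zcolumn=elev elev=dem_tile
  # shrink back and copy, which crops the margin off again
  g.region n=n-100 s=s+100 e=e-100 w=w+100
  r.mapcalc "dem_tile_crop = dem_tile"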

> Unfortunately I was not able to remove the null values satisfactorily.
> r.fillnulls fails because of the large region, and r.resamp.rst doesn't
> do the job very well. The stripes are still visible, though filled
> with values.

the usual way would be to use one of the r.surf.* or r.resamp.* modules,
then use r.patch with the old map in the foreground and the interpolated
map in the background, filling in any gaps.
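
i.e. roughly (made-up map names):

  # build a rough surface from whatever non-NULL cells are there
  r.surf.idw input=dem_striped output=dem_idw npoints=12
  # original values stay in the foreground; the IDW surface only
  # fills the NULL stripes
  r.patch input=dem_striped,dem_idw output=dem_filled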

> When calculating a derivative from the dem, like aspect, the errors
> from filling null values are quite obvious.

I take it the vertical jumps between tile edges are rather large?

> So to make my long text short: Is there a technique to combine two or
> more raster dem with (or without overlapping) with good
> transition/intersection (don't know the correct word) between two
> tiles?

usually it would be a matter of making the overlaps big enough, and
then using r.series or something to patch them together taking the
average. I guess it would be better to bias away from the map edges
somehow? Not sure how to do that beyond cropping a set number of cells
with an inverse r.grow.
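
e.g. with generous overlaps, something like (hypothetical tile names):

  # NULL cells in the individual tiles are ignored, so each output
  # cell becomes the average of whichever tiles cover it
  r.series input=tile_01,tile_02,tile_03,tile_04 output=dem_merged \
      method=average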

but again 13k x 17k cells is not that huge, so you shouldn't need to
tile (except for r.fillnulls/v.surf.rst). maybe r.surf.idw(2) or
r.surf.nnbathy from the wiki addons helps here?


> If necessary I can illustrate my efforts by creating a web page or
> similar ...

I am curious to know what the problem with r.surf.contour was. Usually
it does a great job as long as you take a moment to work around the
integer and NULL/0 issues.

I decided to do some tests using different methods on the 1m contour
lines from the NC sample dataset. Results are here:

  http://grass.osgeo.org/wiki/Contour_lines_to_DEM


John Tate wrote:
> and then patched (r.patch) them all together. The patching should
> average out any differences.

r.patch doesn't average; it uses the cell value from the first-listed
raster map that contains a non-NULL cell there. Later-listed rasters
fill in the gaps until there are no more gaps or no more maps. r.series
will merge overlapping maps using an average though.

> I then cropped out each 1km tile (1000x1000).
> This was done so that the 1km tiles could be combined for specific
> areas by different people (e.g. only a 4kmsq area for academic 'a' or a
> 6kmsq area 2km away for academic 'b').

why not just use g.region for that?
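
i.e. each person just sets a region over the one full map and works
inside it (coordinates invented for the example):

  # a 4 km x 4 km window, aligned to the full 5 m DEM
  g.region n=250000 s=246000 w=610000 e=614000 align=dem_full -p
  # any raster module run now only reads/writes inside that window
  r.slope.aspect elevation=dem_full slope=slope_a aspect=aspect_a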


Pablo Carreira:
> I had good results interpolating 5m contours with v.surf.rst, but I
> spent some time adjusting the parameters to get a smooth surface.

v.surf.rst does not like contours very much. The quadtree method
concentrates on areas with a high density of points, and with contour
lines you have high density along each isoline but large gaps in
between them. see this wiki page for tips on using v.generalize to even
things out a bit (especially Helena's linked page on it):

  http://grass.osgeo.org/wiki/RST_Spline_Surfaces#Working_with_contour_lines
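
the general idea is something like this (method and threshold here are
only illustrative, see the wiki for Helena's actual settings; assuming
the heights sit in an 'elev' column):

  # thin the over-dense runs of vertices along each contour so the
  # quadtree segments see a more even point distribution
  v.generalize input=contours output=contours_gen method=douglas \
      threshold=2.5
  v.surf.rst input=contours_gen zcolumn=elev elev=dem_rst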


Nick Cahill:
> Just to refer to a previous question - when I have had to make
> relatively high resolution DEMs (much smaller than yours, only 5000 x
> 3000 cells), I found it most effective to use Arc/Info to create a TIN
> from contour lines and points, then rasterize that and import the
> raster into GRASS, rather than having GRASS interpolate from contours.

both v.surf.rst with generalized contours and r.surf.contour should
create much nicer surfaces than a rasterized TIN, both visually and
hydrologically. Sure, it'll take longer, but there is a lot more
information in those vertices than a simple linear interpolation
between them and their nearest neighbors can expose. why not fill in
the gaps between the points with curves instead of a flat surface?

Use whatever works for you, but do examine the results in the above
"Contour_lines_to_DEM" wiki page.


May I suggest trying r.in.xyz -> r.surf.nnbathy? (or nnbathy directly)
Since you are mentioned in the source code thank-yous I'm guessing you
are already familiar with it. :)   The module is available from GRASS
wiki addons.
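
i.e. roughly (it's an addon, so double-check the parameter names
against its help page; file name made up):

  g.region res=5 -p
  # bin the xyz points onto the 5 m grid, averaging duplicates per cell
  r.in.xyz input=points.txt output=elev_pts fs=, method=mean
  # natural neighbour interpolation of the non-NULL cells
  r.surf.nnbathy input=elev_pts output=dem_nn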

or for massive point cloud data perhaps try
  -> r.in.xyz
  -> r.mapcalc *= 100000
  -> r.surf.contour
  -> r.mapcalc /= 100000 ?
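
i.e. something like this to dodge the integer-height limitation
(untested, names made up):

  r.in.xyz input=points.txt output=elev_pts fs=, method=mean
  # scale up and round so no vertical precision is lost in the
  # integer map
  r.mapcalc "elev_scaled = round(elev_pts * 100000)"
  r.surf.contour input=elev_scaled output=dem_scaled
  # scale back down to real heights (floating point)
  r.mapcalc "dem = dem_scaled / 100000.0"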

or v.surf.bspline (newly fixed by Markus M)


> I was never able to get the parameters right in *.surf.rst, and
> processing times were very long.

True & true. "auto-tune" has been a wish for a while. it probably would
take a while to run though. maybe the tips on the RST Spline Surfaces
wiki page, along with the ones on the module's main help page, can be
useful? The results can be very nice once you get it tuned right.

> Arc/Info does the job very quickly and effectively, and doesn't end up
> with overshoots and depressions, which were a problem for me.

The Achilles heel of splines is that they do not deal well with hard
corners (base of cliffs, etc.). You get a ringing effect:
  http://en.wikipedia.org/wiki/Ringing_artifacts

you can play with the tension to minimize it, or avoid using splines
in areas with sharp changes in topography (or oversample there).
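
e.g. start from the defaults and nudge tension upwards where the
overshoots appear (values below are only a starting point):

  # higher tension pulls the surface tighter to the data points and
  # damps the ringing; smooth= trades that off against noise
  v.surf.rst input=contours_gen zcolumn=elev elev=dem_rst \
      tension=60 smooth=0.5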

> I wish this were an option in GRASS. I would also like to be able to
> work with other vector-based surface models in GRASS.

Helena has posted some examples recently, maybe she can say a few words
about it.

but really, r.in.xyz (or v.to.rast) + r.surf.nnbathy should do the
trick.

nnbathy's linear ("alg=l") algorithm creates a rasterized TIN if you
really want one. (and it's lightning fast)
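
i.e. (assuming the addon passes alg= straight through to nnbathy):

  # linear interpolation within the Delaunay triangles, i.e. a
  # rasterized TIN
  r.surf.nnbathy input=elev_pts output=dem_tin alg=l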



... and the winner is ... (drum roll please) ... r.surf.contour.

At least IMO.
there are some "thatching" artifacts in it, but look closely at the main
river valley running up the center. it is the only method that creates
it correctly. also it's the only one which is able to recreate the road
running along the upper left ridge in the sample dataset.



Jed Frechette wrote:  (2 weeks ago)
> I'm curious why you say TINs are used at the "cost of resolution"? What
> if I have a surface sampled by a dense irregularly spaced point data
> set, e.g. a lidar point cloud. If I create a TIN using a Delaunay
> triangulation of those points the resulting model will exactly pass
> through all of the original data points with linear interpolations
> between.

and be limited to a simple linear approximation in between points.

> As a result it resolves all of the detail in the original data.

well, fundamentally a TIN *is* the original data, and no more.
the onus is on what reads/samples/rasterizes the TIN to do the
linear interpolation at run-time.

> In contrast, if I use a raster map to model the surface it will
> inevitably lead to some degree of aggregation as a function of the
> interpolation method I choose.

as long as the aggregation method is statistically sound, that sort
of filtering is often very useful and can help reduce the dataset to
a manageable size. (is it really useful to preserve 50 returns per
sq m?)
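
e.g. per-cell statistics straight from the point cloud (file name and
separator made up):

  g.region res=5 -p
  # one mean elevation per 5 m cell instead of ~50 raw returns
  r.in.xyz input=lidar.txt output=lidar_mean fs=, method=mean
  # plus a per-cell return count to see where the coverage is thin
  r.in.xyz input=lidar.txt output=lidar_n fs=, method=n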

> The same is true if I want to incorporate linear features such as
> breaklines into the model.

this is something I am interested in hearing more about. AFAIK there
isn't much in GRASS which can do that. But we can change that.

> Of course, the question of which type of model is more useful for a
> given application is entirely separate.

Absolutely. I'd like to hear what the folks doing hydrological
modelling etc. think & which artifacts they care about. What aspects
are important for LIDAR?  (yeah, it's just a tool so that's an open
question..)
Especially in light of the new MFD r.watershed improvements and the new
r.stream.* addon modules.


In theory, for an identical grid of points representing a surface, using
triangles instead of grid boxes will match the "true" natural surface
much better than a q-bert style collection of boxes (I'm not sure it is
best to think about rasters as full-cell blocks, vs. just representing
where the model surface intersects the cell centers, however).  But in
practice a raster surface is going to be vastly more computationally
efficient, both in terms of processing time and data storage. So you can
get a lot more done with them (at the cost of initial interpolation time)
due to the much lower overhead of dealing with them; and by using
matrices you open the door to lots of neat mathematical tools.


The main usefulness of TINs AFAICS is for setting up triangular-mesh
model domains where it is important to have spatial resolution vary by
orders of magnitude across the region. (to place computational focus on
areas of highest interest)  Whether a quad-tree mesh will be more
efficient there or not probably depends on the nature of the model.


regards,
Hamish

