[GRASSLIST:3498] Re: Geophysical/Potential field modules for GRASS?

Benjamin Ducke benducke at compuserve.de
Wed May 26 14:45:15 EDT 2004


Thanks for all the hints, Michael!

Although I have been using GRASS for several years now,
I am still not aware of all of its capabilities.
I am especially grateful that you pointed out the possibility
of using GRASS's rubber-sheeting capabilities for georeferencing
the data. I had completely forgotten about that functionality...
We should make sure we make as much use as possible
of what is already in place.
I completely agree about the importance of volumetric representations
of data, especially for archaeological stratigraphic reasoning.
Unfortunately, I have never dealt much with this form of data
representation and have no experience with it.
Do you have any specific thoughts on how to integrate volumetric
measurements into a geophysics infrastructure?

If everyone agrees, I would like to get started on summarizing
the items discussed so far and drafting a first white paper.
I have a heavy workload right now, but I am fairly confident
that I can finish it over the weekend and mail it around
at the start of next week.
Good enough?

Cheers,

Benjamin
 

On Tue, 25 May 2004 16:10:10 -0700
Michael Barton <michael.barton at asu.edu> wrote:

> Benjamin,
> 
> Certainly a module or set of modules to make this easier would be
> nice, and I certainly don't want to discourage you from writing one.
> I just wanted to point out that almost all of this is already built
> into GRASS. You can make a slick application by driving existing
> commands from a shell script rather than writing them again from the
> ground up in C++. With Tcl/Tk, for example, scripts can be very
> sophisticated; the display manager itself is a script. An
> alternative is to write an external program geared toward
> geophysical survey using grasslib, which lets you call GRASS
> commands from your application.
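> 
> For example, a minimal sketch of such a wrapper script (map and
> file names are invented here, and module options may differ
> between GRASS versions):
> 
>   #!/bin/sh
>   # import one raw grid, smooth it, and display the result
>   # (assumes grid01.asc carries a GRASS ASCII header)
>   r.in.ascii input=grid01.asc output=grid01
>   r.neighbors input=grid01 output=grid01.smooth method=average size=3
>   d.rast map=grid01.smooth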
> 
> Last fall I actually did exactly what you describe on one of my
> sites in Spain, for cesium magnetometry data. All of it was done in
> GRASS except calculating the xy coordinates of the data points (I
> did this in Excel). Of course, I did each step one at a time. You
> could speed this up by automating the data flow from one module to
> the next. Creating xy coordinates for a stream of data points along
> a transect would take new coding (the part I did in Excel). If you
> incorporated a metagrid reference in this translation module so
> that it assigns x and y correctly, the points for each survey grid
> would automatically be referenced into your mesh.
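> 
> For the coordinate step, even a small awk filter would do. Here
> is an untested sketch; the 0.1 m sample interval, 1 m transect
> spacing, 200 samples per transect, and (0,0) origin are all
> invented numbers:
> 
>   awk -v x0=0 -v y0=0 -v dx=1.0 -v dy=0.1 -v n=200 '
>   {
>       line = int((NR - 1) / n)          # which transect we are on
>       pos  = (NR - 1) % n               # position along the transect
>       if (line % 2) pos = n - 1 - pos   # reverse odd lines (zig-zag)
>       print x0 + line * dx, y0 + pos * dy, $1
>   }' raw.dat > grid01.xyz
> 
> The resulting x/y/value list could then be imported with
> s.in.ascii (exact import options depend on the GRASS version).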
> 
> With regularly spaced data points you could use fast v.surf.idw
> interpolation (or r.bilinear) to create the grid. However, the
> distance between data points is often much smaller than the
> distance between transects. In that case, it might be necessary to
> use some of the more sophisticated options of v.surf.rst. If
> another interpolation routine works better for you (e.g., some form
> of kriging), you would need to create a new module (this would be
> nice for a variety of things). r.patch will put the maps together.
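> 
> Sketched in shell (option names again vary between versions):
> 
>   # interpolate each grid's points, then patch the surfaces
>   v.surf.idw input=grid01 output=grid01.surf
>   v.surf.idw input=grid02 output=grid02.surf
>   r.patch input=grid01.surf,grid02.surf output=mesh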
> 
> You can subset using g.region (specify extents) and/or masks (each
> grid being a potential mask) for analysis. Since GRASS and most GIS
> programs always create new maps from raster analysis routines,
> rollback is never a problem, though accumulating files is. g.mlist
> will give you batch lists of your analysis files if you name them
> in a consistent fashion, so your application can track what has
> been done and display the results of different filtering sessions.
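> 
> For instance (the mesh.<session>.<filter> naming scheme is just
> an invented convention, and options differ between versions):
> 
>   g.region n=1020 s=1000 e=520 w=500 res=0.1   # zoom to one grid
>   g.copy rast=grid03.surf,MASK    # a raster named MASK limits analysis
>   r.neighbors input=mesh output=mesh.s1.lowpass method=average size=5
>   g.remove rast=MASK              # lift the mask again
>   g.mlist type=rast pattern="mesh.s1.*"   # one session's results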
> 
> You can create a set of reference points (ground control points)
> and save them in the format used by i.rectify. Again, you could
> script or program this to make it easier. For a rectilinear grid,
> you only need 3-4 points for a first-order transformation. If the
> grids are patched, this will georeference and rectify the entire
> set to whatever coordinate system you want. Alternatively, you can
> do this outside GRASS with gdalwarp if you have GDAL on your
> system.
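> 
> With GDAL, that step could look like this (the pixel/world
> coordinates and the EPSG code are placeholders, and it assumes
> the patched mesh has been exported to mesh.tif first):
> 
>   # attach three ground control points, then warp with a
>   # first-order transformation
>   gdal_translate -a_srs EPSG:32630 \
>       -gcp 0 0 354200.0 4459800.0 \
>       -gcp 600 0 354260.0 4459800.0 \
>       -gcp 0 600 354200.0 4459860.0 \
>       mesh.tif mesh_gcp.tif
>   gdalwarp -order 1 mesh_gcp.tif mesh_utm.tif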
> 
> Clearly, if you do this a lot, it would be very handy to automate 
> and/or enhance these various routines in a systematic fashion. However, 
> you don't have to start from scratch, but can use tools already built 
> into GRASS to give you more bang for your buck so to speak.
> 
> One thing not mentioned is the use of true 3D volumetric modeling.
> This is an area that definitely COULD use new modules programmed.
> It seems highly appropriate for archaeological and geological
> applications, where we actually deal with volumes of sediment (or
> rock) rather than surfaces. GPR and coring data are naturals for
> this, but GRASS lacks much in the way of analysis or query ability
> for G3D data--only a map calculator, though that gives someone who
> wanted to work with it something to start from.
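> 
> Even the map calculator alone allows simple volume queries; a toy
> example (map names invented, expression syntax as in r.mapcalc):
> 
>   # flag all voxels above an arbitrary resistivity threshold
>   r3.mapcalc "anomaly = if(resist > 150.0, 1, null())"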
> 
> These are just some thoughts. I'd love to see more ways to use this for 
> archaeology and will be very interested in where you go with this. I 
> want to encourage you to work with this and hope you will keep me in 
> the loop. Thanks.
> 
> 
> Michael
> ______________________________
> Michael Barton, Professor & Curator
> School of Human Origins, Cultures, & Societies
> Arizona State University
> Tempe, AZ  85287-2402
> USA
> 
> voice: 480-965-6262; fax: 480-965-7671
> www: http://www.public.asu.edu/~cmbarton
> On May 25, 2004, at 2:34 PM, Benjamin Ducke wrote:
> 
> > Hmmm, I think these are somewhat different ideas about
> > the functional level of working with geophysical data in GRASS.
> > My idea (and I think also Craig's) was to actually have a high-level
> > infrastructure that makes working with a lot of measurements and
> > filter setups a breeze.
> >
> > By (quoting myself)
> >>>   mesh of adjacent, regular measurement grids with dimension,
> >>>   orientation and origin in reference to the working location
> >
> > I was referring to this:
> > When I take gradiometer measurements in the field, I first
> > superimpose a mesh of adjacent, regular, equal-sized grids on the
> > site. Each of these grids might be, say, 20x20 m. I complete a
> > series of measurements by walking zig-zag or parallel lines in
> > this grid. The grid gets saved as an ASCII list of raw
> > measurement data (or transferred directly to GRASS via a serial
> > device driver).
> > Now, after I get home from the field, I have these things on my
> > agenda (sketched as a shell pipeline below the list):
> >
> > - convert each grid's ASCII data to a raster map
> >   if I took a measurement every 10 cm, each grid would
> >   result in a 200 x 200 cell raster map
> > - assemble the entire mesh from all the grids
> >   and georeference it to my other site data
> > - apply filters to all or a subset of the grids;
> >   try out different filter combinations, and roll back
> >   to the original data if I messed up
> > - create color ramp(s)
> > - output the final image
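> >
> > As a rough shell sketch of that pipeline (untested; map names
> > are invented and the serial-device import is left out):
> >
> >   for g in grid01 grid02 grid03; do
> >       r.in.ascii input=$g.asc output=$g   # needs GRASS ASCII headers
> >       r.neighbors input=$g output=$g.lp method=average size=3
> >   done
> >   r.patch input=grid01.lp,grid02.lp,grid03.lp output=mesh
> >   r.colors map=mesh color=rainbow
> >   r.out.png input=mesh output=mesh.png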
> >
> > Now, if there were a place to store each grid's position
> > in a mesh meta structure, plus some additional information
> > (one possible format is sketched after the diagram below),
> > I could easily do the following things:
> >
> > - read raw data into one of the grids and turn it
> >   into a raster map at, say, position 3 (instead
> >   of having to create a raster, then editing its
> >   header to move it to the precise position I want)
> >
> > - run a number of filters on all or a selection
> >   of grids (say, a low-pass filter over grids 1, 2, and 6)
> >
> > - keep several sets of processing lists with different
> >   filters and options and switch between them
> >   to quickly compare results.
> >
> > - move the whole bunch of grids around and rotate
> >   them into the right position, by specifying
> >   coordinates of the MESH instead of every single
> >   grid -- and keep the overall structure intact
> >
> > +-+-+-+
> > |1|2|3|
> > +-+-+-+
> > |5|6|7|
> > +-+-+-+
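> >
> > One way to hold that meta structure would be a small plain-text
> > descriptor per mesh -- a purely hypothetical format:
> >
> >   # mesh.conf: where the mesh sits, and which grid goes where
> >   origin   354200 4459800   # world coordinates of the mesh corner
> >   rotation 12.5             # degrees; rotates the whole mesh
> >   gridsize 20 20            # each grid is 20x20 m
> >   grid grid01 row=0 col=0
> >   grid grid02 row=0 col=1
> >   grid grid03 row=0 col=2
> >
> > A managing module could derive every raster's header from these
> > mesh coordinates, so moving or rotating the mesh means editing
> > one file instead of every single grid.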
> >
> > Say -- wouldn't that save a lot of work and be
> > so much more fun than batch scripting?
> >
> >
> > Cheers,
> >
> > Benjamin
> 



