[GRASSLIST:3563] Re: Geophysical framework: WHITE PAPER

Benjamin Ducke benducke at compuserve.de
Tue Jun 1 12:07:03 EDT 2004


Nice summary. I wasn't aware there were so many ways/needs
to correct data ...
As regards implementation:
Corrections are just filters from a technical point of view,
so they could fit into the framework the same way filters do.
From a conceptual point of view, however, they seem to be
different, because:

	(a) data does not always make sense w/o correction
	(b) corrections should always be applied before any filters.

So some differentiation in the workflow seems to be appropriate.
I suggest implementing corrections just like filters, i.e. with
the same set of filter arguments + anything additional that is
needed.
However, in the filter list, corrections should be marked with
a special XML-tag, so that they can be identified.
In the GUI for setting up filters, we could then have two list
views: one for corrections to apply on top, and one for
regular filters below.
For calculation of the result raster, the r.composite module
(or whatever it is going to be called in the future) would
then be able to make sure that corrections always run prior
to filtering.
This should make things transparent for the user.
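Something like this, purely as a sketch (the element and attribute
names below are made up for illustration, not a proposed schema):

	<filterchain>
		<filter name="drift" type="correction">
			<!-- same argument set as a regular filter -->
		</filter>
		<filter name="lowpass" type="filter">
			<!-- regular filter arguments -->
		</filter>
	</filterchain>

A type attribute (or a dedicated correction tag) would give both the
GUI and the compositing module enough to enforce the ordering.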

What do you think?

Benjamin


On Tue, 1 Jun 2004 09:28:41 -0600
Craig Funk <funkmeister at lynxseismicdata.com> wrote:

> Thanks Michael and Benjamin. I have several comments/questions:
> 
> Acquisition/Raw data:
> <quote>
> If the data are a set of vector points recorded in an orthogonal grid, 
> they can be georeferenced using ***v.transform*** and a simple first 
> order transformation. This requires only that the real-world 
> coordinates (e.g. UTM's) be known for 3-4 of the original data points 
> (i.e., ground control points or GCP's). The corners of the original 
> survey area are convenient places to do this, but any well-spaced set 
> of points will do in theory.
> </quote>
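> 
> As a concrete sketch of that step (v.transform option names from
> memory, so check the manual page; the points file pairs local grid
> coordinates with their UTM equivalents, and the numbers below are
> placeholders):
> 
>     # contents of gcp.txt -- one GCP per line:
>     #   grid_x  grid_y  utm_east   utm_north
>     #      0.0     0.0  645100.0   5571200.0
>     #      0.0   400.0  645080.0   5571600.0
>     #    300.0     0.0  645400.0   5571210.0
>     v.transform input=survey output=survey_utm pointsfile=gcp.txt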
> 
> In my experience, the data is rarely uniform. For mineral exploration, 
> data is often collected in less than hospitable terrain, which means 
> that certain points on the proposed acquisition grid cannot be accessed 
> in the real world. For the surveys I have been involved in where the 
> instrument does not have a GPS, a coordinate was obtained using a 
> technique appropriate for the accuracy desired, e.g. a GPS instrument or 
> laser theodolites. Even in this case, though, the data could still 
> be imported as vectors as you described, after some manipulation:
> 
> x-coordinate, y-coordinate, other data (ID, values, dates, etc)
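> 
> For example (v.in.ascii option names from memory, so check the manual
> page; the column layout matches the line above, with a gravity reading
> as the data value):
> 
>     # stations.csv:  x,y,id,grav
>     v.in.ascii input=stations.csv output=grav_raw fs="," x=1 y=2 \
>         columns="x double precision, y double precision, \
>                  id int, grav double precision"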
> 
> Regarding corrections that need to be performed on the data (also often 
> referred to as reductions):
> 
> 1) Drift correction. Some instruments like gravimeters or magnetometers 
> tend to drift over time and this correction needs to be applied to raw 
> data. The drift can result from numerous factors such as moon/celestial 
> tides, magnetic storms, atmospheric conditions, or mechanical/electrical 
> drift in the instrument. The drift correction could be performed on the 
> raw data prior to import into GRASS or on the vector data after import.
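> 
> A minimal sketch of the pre-import variant, assuming a linear drift
> between two base-station ties and a raw file laid out as
> x,y,reading,time_hours (the rate, tie time, and layout are
> placeholders):
> 
>     # subtract rate * (t - t0) from each reading (mGal, decimal hours)
>     awk -F, -v rate=0.012 -v t0=8.25 \
>         '{ printf "%s,%s,%.3f\n", $1, $2, $3 - rate*($4 - t0) }' \
>         raw.csv > drift_corrected.csv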
> 
> 2) Latitude correction. This is a grav data correction accounting for the 
> non-spherical shape of the earth. It is a function of latitude and 
> elevation. I suppose this could probably be performed using r.mapcalc. 
> One would need to be able to obtain a latitude and elevation for a 
> given coordinate pair in order to compute the correction.
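> 
> A sketch of that r.mapcalc step, using the 1967 International Gravity
> Formula for the normal gravity to subtract (result in mGal; "lat"
> stands for a raster of latitude in degrees, which in a lat/long
> location could simply be y()):
> 
>     r.mapcalc "g_normal = 978031.846 * (1.0 \
>         + 0.005278895 * sin(lat)^2 \
>         + 0.000023462 * sin(lat)^4)"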
> 
> 3) Eötvös correction. Corrects for the Coriolis force (due to the 
> rotating earth) and a moving gravity instrument (at sea or in the air). 
> The correction depends on the velocity, latitude, and heading of the 
> object that is measuring gravity. Again, this could probably be computed in r.mapcalc 
> as long as one can get a latitude for a given location.
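> 
> A sketch, again as r.mapcalc (the standard formula with velocity in
> knots, result in mGal; vel, lat, and head are hypothetical rasters of
> speed, latitude, and heading in degrees):
> 
>     r.mapcalc "eotvos = 7.503 * vel * cos(lat) * sin(head) \
>         + 0.004154 * vel^2"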
> 
> 4) Reduction to the pole. Correction needed for mag data so that 
> anomalies are transformed into what would be observed if the magnetic 
> north pole were at the geographic north pole. Mag north drifts, so this 
> correction is needed to provide data consistency. This correction is 
> pretty straightforward in the frequency domain but is not very 
> reliable at low latitudes. To do this correction, a combination of 
> i.fft and r.mapcalc could be used.
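> 
> A skeleton of that pipeline (module option names from memory, so check
> the i.fft/i.ifft manual pages; the r.mapcalc step that applies the
> pole-reduction operator is only indicated, since it is complex
> arithmetic on the real/imaginary pair):
> 
>     i.fft input_image=mag real_image=mag.re imaginary_image=mag.im
>     # multiply the spectrum by the RTP operator, roughly
>     #   1 / (sin(I) + i*cos(I)*cos(D - theta))^2
>     # with I,D the field inclination/declination and theta the
>     # wavenumber direction, via r.mapcalc on mag.re/mag.im
>     i.ifft real_image=rtp.re imaginary_image=rtp.im output_image=mag_rtp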
> 
> 5) Free air and Bouguer correction. Corrects for the reduction in 
> gravity as one moves away from the center of the earth, and also 
> corrects for the gravitational attraction of the underlying mass with 
> respect to some datum. The correction is simple and could be done in r.mapcalc.
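> 
> For example (grav_obs and dem are assumed rasters of observed gravity
> in mGal and elevation in metres; 2.67 g/cm^3 is the conventional
> reduction density):
> 
>     # free-air: +0.3086 mGal/m; Bouguer slab: -0.04193 * rho mGal/m
>     r.mapcalc "grav_ba = grav_obs + 0.3086 * dem \
>         - 0.04193 * 2.67 * dem"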
> 
> 6) Terrain corrections. There is no industry consensus on the best 
> approach to take for these grav corrections. It is logical to use a DEM 
> as input into this correction but there are many algorithms that could 
> be used. For a fairly accurate grav survey the elevations need to have 
> a measurement error of 3 cm or less. I am still looking into the best 
> approach - anyone have any recommendations? Regardless, a GRASS module 
> would likely have to be written for this one although simpler 
> approaches could probably be implemented in r.mapcalc.
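> 
> As a crude illustration of the simple end of that spectrum (this only
> maps where local relief is large enough to matter; it is not a proper
> terrain correction):
> 
>     r.neighbors input=dem output=dem_mean method=average size=25
>     r.mapcalc "relief = abs(dem - dem_mean)"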
> 
> 7) Trend surface removal. This applies to grav and mag data. Usually 
> involves fitting a 1st or 2nd order surface to the data and then 
> differencing that surface with the data set to get the residual data. 
> There are many ways to do this. I recently came across a paper from 
> Paolo Zatelli and Andrea Antonello titled "New GRASS modules for 
> multiresolution analysis with wavelets". Multiresolution analysis would 
> be a great way to do this reduction, although a more traditional 
> approach would be to use least squares to fit a 1st or 2nd order 
> surface.
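> 
> Once the least-squares coefficients are in hand (fitted outside GRASS;
> the numbers below are placeholders), the residual is a single
> r.mapcalc expression using its built-in x() and y() coordinate
> functions:
> 
>     # remove a first-order trend z = a + b*x + c*y
>     r.mapcalc "grav_resid = grav_ba - (12.3 \
>         + 0.0004 * x() - 0.0001 * y())"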
> 
> To summarize: All the standard corrections (except maybe terrain 
> correction) could be applied using existing GRASS modules. These 
> corrections could be implemented either in shell scripts or in a new 
> GRASS module. My inclination is to write a GRASS module to do this. Any 
> comments?
> 
> 
> Craig
> 
> 



