[GRASSLIST:3596] Re: Geophysical framework: WHITE PAPER
Benjamin Ducke
benducke at compuserve.de
Fri Jun 4 14:49:14 EDT 2004
OK, I think we are getting somewhere.
We have a lot of information floating around
in a lot of emails now.
Maybe I should start to write up a first revision
of the white paper and send it around again
next week.
I will try to incorporate everything that has
been brought up so far.
Cheers,
Benjamin
On Thu, 3 Jun 2004 22:01:42 -0700
Michael Barton <michael.barton at asu.edu> wrote:
> Craig and Benjamin
>
> Your thoughts are much appreciated. This project is really beginning to
> take shape. I have a couple more suggestions from the GRASS
> perspective.
>
> It is easiest to include them below.
>
> >
> > Thanks Michael and Benjamin, I have several comments/questions:
> >
> > Acquisition/Raw data:
> > <quote>
> > If the data are a set of vector points recorded in an orthogonal grid,
> > they can be georeferenced using ***v.transform*** and a simple first
> > order transformation. This requires only that the real-world
> > coordinates (e.g. UTM's) be known for 3-4 of the original data points
> > (i.e., ground control points or GCP's). The corners of the original
> > survey area are convenient places to do this, but any well-spaced set
> > of points will do in theory.
> > </quote>
> >
> > In my experience, the data is rarely uniform. For mineral exploration,
> > data is often collected in less than hospitable terrain, which means
> > that certain points on the proposed acquisition grid cannot be
> > accessed in the real world. For the surveys I have been involved in
> > where the instrument does not have a GPS, a coordinate was obtained
> > using a technique appropriate for the accuracy desired, e.g., a GPS
> > instrument or a laser theodolite. Even in this case though, the
> > data could still be imported as vectors as you described after some
> > manipulation:
> >
> > x-coordinate, y-coordinate, other data (ID, values, dates, etc)
>
> Thinking it over, there are at least 3 possible kinds of data sets:
>
> 1. Data with georeferenced xy coordinates, such as those derived from
> instruments with integrated GPS (or post-processed to include GPS data).
>
> 2. Data with xy coordinates that are not georeferenced, such as those
> from an arbitrary grid laid out in the field
>
> 3. Data lacking xy coordinates that need to have them assigned.
>
> If the data flow involves importing points into vector format, then
> optionally assigning coordinates and optionally georeferencing as
> needed, it would accommodate all three.
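To make the georeferencing step concrete: the kind of first-order (affine) transformation that v.transform applies can be sketched in pure Python. Three ground control points determine the six parameters exactly; the grid and UTM coordinates below are invented for illustration.

```python
# First-order transform: x' = a*x + b*y + c,  y' = d*x + e*y + f.
# Three GCPs give six equations for the six unknowns.

def solve3(m, v):
    """Solve a 3x3 linear system by Gauss-Jordan elimination."""
    a = [row[:] + [rhs] for row, rhs in zip(m, v)]
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(a[r][i]))
        a[i], a[p] = a[p], a[i]
        for r in range(3):
            if r != i:
                f = a[r][i] / a[i][i]
                a[r] = [x - f * y for x, y in zip(a[r], a[i])]
    return [a[i][3] / a[i][i] for i in range(3)]

def affine_from_gcps(gcps):
    """gcps: list of ((grid_x, grid_y), (world_x, world_y)), length 3."""
    m = [[gx, gy, 1.0] for (gx, gy), _ in gcps]
    a, b, c = solve3(m, [wx for _, (wx, _) in gcps])
    d, e, f = solve3(m, [wy for _, (_, wy) in gcps])
    return lambda x, y: (a * x + b * y + c, d * x + e * y + f)

# Example: a local 100 m grid tied to (hypothetical) UTM coordinates
# at three of its corners.
gcps = [((0, 0), (500000.0, 4100000.0)),
        ((100, 0), (500100.0, 4100000.0)),
        ((0, 100), (500000.0, 4100100.0))]
to_utm = affine_from_gcps(gcps)
```

With a fourth or more GCPs one would fit the parameters by least squares instead of solving exactly, which is what makes "well-spaced" points matter in practice.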
>
>
>
> >
> > Regarding corrections that need to be performed on the data (also
> > often referred to as reductions):
> >
> > 1) Drift correction.
> > 2) Latitude correction
> > 3) Eötvös correction.
> > 4) Reduction to the pole.
> > 5) Free-air and Bouguer corrections.
> > 6) Terrain corrections.
>
> These all seem conceptually to be errors in measurement that need
> correction. If so, they should be corrected in the original
> measurements (i.e., vector points) prior to surface model generation.
> Georeferencing does not affect these, but would be necessary at least
> to apply terrain correction (i.e., because DEMs are usually
> georeferenced).
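As an illustration of correcting the original measurements: drift correction is commonly done by re-reading a base station at the start and end of the survey and removing the apparent change in proportion to elapsed time. A minimal sketch, with invented times and gravity values:

```python
# Linear drift correction: the instrument's drift rate is estimated from
# two base-station readings, and each survey reading is adjusted by the
# drift accumulated since the first base reading.

def drift_correct(readings, base_start, base_end):
    """readings: list of (time, value); base_*: (time, value) at the base."""
    t0, v0 = base_start
    t1, v1 = base_end
    rate = (v1 - v0) / (t1 - t0)          # drift per unit time
    return [(t, v - rate * (t - t0)) for t, v in readings]

corrected = drift_correct([(1.0, 981000.10), (2.0, 981000.25)],
                          base_start=(0.0, 981000.00),
                          base_end=(4.0, 981000.40))
```

The other reductions (latitude, Eötvös, free-air/Bouguer) are similarly per-point arithmetic on the vector attributes, which supports doing them before surface model generation.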
>
>
> > 7) Trend surface removal. This applies to grav and mag data. Usually
> > involves fitting a 1st or 2nd order surface to the data and then
> > differencing that surface with the data set to get the residual data.
> > There are many ways to do this. I recently came across a paper from
> > Paolo Zatelli and Andrea Antonello titled "New GRASS modules for
> > multiresolution analysis with wavelets". Multiresolution analysis
> > would be a great way to do this reduction, although a more traditional
> > approach would be to use least squares to fit a 1st or 2nd order
> > surface.
>
> I **think** that this is better done on the surface model than the
> original points because it is a type of signal processing (see below).
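The traditional least-squares approach to trend removal can be sketched in a few lines: fit a first-order (planar) surface via the normal equations, then subtract it to leave the residual field. The sample points below are invented.

```python
# Fit z = a*x + b*y + c by least squares and return the residuals.

def solve3(m, v):
    """Solve a 3x3 linear system by Gauss-Jordan elimination."""
    a = [row[:] + [rhs] for row, rhs in zip(m, v)]
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(a[r][i]))
        a[i], a[p] = a[p], a[i]
        for r in range(3):
            if r != i:
                f = a[r][i] / a[i][i]
                a[r] = [x - f * y for x, y in zip(a[r], a[i])]
    return [a[i][3] / a[i][i] for i in range(3)]

def trend_residuals(points):
    """points: list of (x, y, z). Returns the residuals after plane removal."""
    n = len(points)
    sxx = sum(x * x for x, _, _ in points)
    sxy = sum(x * y for x, y, _ in points)
    syy = sum(y * y for _, y, _ in points)
    sx = sum(x for x, _, _ in points)
    sy = sum(y for _, y, _ in points)
    sxz = sum(x * z for x, _, z in points)
    syz = sum(y * z for _, y, z in points)
    sz = sum(z for _, _, z in points)
    # Normal equations (A^T A) p = A^T z for the design matrix rows [x, y, 1].
    a, b, c = solve3([[sxx, sxy, sx], [sxy, syy, sy], [sx, sy, n]],
                     [sxz, syz, sz])
    return [z - (a * x + b * y + c) for x, y, z in points]

# Points lying exactly on z = 2x + 3y + 1 leave (near-)zero residuals.
res = trend_residuals([(0, 0, 1), (1, 0, 3), (0, 1, 4), (1, 1, 6)])
```

A second-order surface just extends the design matrix with x^2, xy, and y^2 terms (a 6x6 system); the structure is the same whether it is run on points or on raster cells.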
>
> >
> > To summarize: All the standard corrections (except maybe terrain
> > correction) could be applied using existing GRASS modules. These
> > corrections could be implemented either in shell scripts or in a new
> > GRASS module. My inclination is to write a GRASS module to do this.
> > Any comments?
>
> There is no map calculator for vector points. However, they can be
> altered using SQL update queries within GRASS. The calculations and
> data manipulations necessary to do the corrections might be better
> implemented in a GRASS module than a shell script.
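A standalone sketch of the SQL-update approach, using Python's sqlite3 in place of the GRASS database backend; the table layout and the correction term here are hypothetical, chosen only to show the shape of such an update.

```python
import sqlite3

# Vector point attributes live in a database table, so a correction can be
# expressed as an UPDATE over the value column.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE survey (cat INTEGER, y REAL, grav REAL)")
con.executemany("INSERT INTO survey VALUES (?, ?, ?)",
                [(1, 4100000.0, 981000.5), (2, 4100100.0, 981000.7)])

# Apply a simple per-point correction (a constant here, for illustration;
# a real latitude correction would be a function of the y column).
con.execute("UPDATE survey SET grav = grav - 0.2")
values = [row[0] for row in con.execute("SELECT grav FROM survey ORDER BY cat")]
```

A dedicated GRASS module could generate and run such statements for each reduction, which keeps the arithmetic in one tested place rather than scattered across shell scripts.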
>
> > NOTE: The last message I sent got garbled up,
> > so please disregard and read this one instead:
>
> Thanks for resending this. I thought it was my email.
>
> >
> > (a) can very easily be converted to (b) by adding coordinates, so it
> > would indeed be easy to have a unified data model.
>
> I agree. That's why I came to the same conclusion about using vector
> points after some thought.
>
> >
> > For data type (a), rasterisation can be done without interpolation,
> > and the filters should be run on this un-interpolated data.
> > Interpolation will only be necessary in the final step, if the axes
> > and/or resolution of the working location differ from those of the
> > data. So -- just one interpolation in this case.
> >
> > For type (b), interpolation will have to be done right after import,
> > because ** I think ** filters will work better on data which is still
> > properly aligned to the region's X/Y axes. I have a hard time, e.g.,
> > imagining a destripe filter working well on stripes running
> > diagonally, which might happen if the filter is run on a rectified
> > image (correct me if this is wrong!). In this case, we would indeed
> > need two interpolations for the final result.
>
> This is somewhat complicated. With respect to interpolation,
> conceptually all of the rasterization should be by interpolation. All
> geophysical measurements are point samples of a continuous field.
> Hence, you need to model the continuous field from the samples. While
> there are a number of ways to do this, interpolation is the most
> common.
>
> If a set of points is orthogonally distributed and equidistant from
> each other, you can certainly directly convert them to raster grids
> (***v.to.rast***). However, this assumes that the field in the space
> between points is constant up to the edge of a grid square (half-way
> between points), where it abruptly changes to the value of the next
> point. In some cases, where orthogonal, equidistant sample points are
> so closely spaced that inter-point variation is irrelevant,
> interpolation is unnecessary. However, I suspect that this is the
> exception rather than the rule. Direct rasterization should probably be
> an option, but interpolation should probably be the norm.
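The contrast between the two models of the space between points can be shown with bilinear interpolation within one grid cell; the corner values are invented. Direct rasterization would report one corner's value anywhere in the cell, while interpolation blends the four neighbours.

```python
# Bilinear interpolation within a unit cell defined by four point samples.
# fx, fy in [0, 1] locate the query position between the corners.

def bilinear(q00, q10, q01, q11, fx, fy):
    top = q00 + (q10 - q00) * fx
    bot = q01 + (q11 - q01) * fx
    return top + (bot - top) * fy

# Midway between four samples, direct rasterization yields one of the
# corner values (e.g. 10.0); bilinear interpolation yields their blend.
mid = bilinear(10.0, 20.0, 30.0, 40.0, 0.5, 0.5)
```

GRASS's spline and IDW interpolators model the field more smoothly than this, but the principle, estimating inter-point variation rather than assuming it away, is the same.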
>
> The second issue is what order to interpolate, filter, and
> georeference. In my experience with satellite imagery, georeferencing a
> raster image causes it to lose visual clarity a bit--that is, signals
> lose definition. Hence, signals enhanced by filtering will lose some of
> their visual impact (and possibly quantitative definition and
> separation) if a raster is georeferenced after enhancement. This may or
> may not be a problem, depending on what one is doing with the raster.
> This does not happen, however, if the raster surface model is created
> after the vector points are georeferenced. That is, georeferencing a
> raster will change the values of a raster grid, but georeferencing a
> set of vector points will only change the xy coordinates, not the z
> value.
>
> While most filters should work equally well on georeferenced and
> non-georeferenced data sets, destriping may be another problem. I
> thought that an FFT filter destriped by focusing on signal frequency
> rather than orientation, but I have very little experience with this
> and may well be wrong here. I suggest some experiments with filtering
> and georeferencing to see what order produces the best results.
>
> The other reason to georeference early is that georeferenced point sets
> (e.g., from multiple sample grids) can easily be combined visually
> simply by displaying them together, and combined into a single
> analytical file using v.patch. The same is true of georeferenced
> rasters: they can be mosaicked using r.patch. There is no need for the
> 'r.composite' module mentioned.
>
> One potentially useful enhancement of r.mfilter is the ability to run
> the filter across multiple rasters in a single pass. Currently, a
> filter only alters a single raster. This could be enhanced so that
> r.mfilter could be run across a set of rasters specified in the input.
> Also note that r.mfilter allows the specification of multiple filters
> to be run across a raster.
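The suggested enhancement, one filter pass applied to a whole set of rasters, can be sketched with small in-memory grids; a 3x3 mean filter stands in for an r.mfilter filter file, and edge cells are left unchanged for simplicity.

```python
# Apply the same neighbourhood filter to every raster in an input set.

def mean3x3(grid):
    """3x3 mean filter; border cells are copied through unchanged."""
    rows, cols = len(grid), len(grid[0])
    out = [row[:] for row in grid]
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            out[r][c] = sum(grid[r + dr][c + dc]
                            for dr in (-1, 0, 1)
                            for dc in (-1, 0, 1)) / 9.0
    return out

def filter_many(grids, flt=mean3x3):
    return [flt(g) for g in grids]

grids = [[[1.0] * 3 for _ in range(3)],
         [[2.0] * 3 for _ in range(3)]]
smoothed = filter_many(grids)
```

In a module, the outer loop would simply iterate over the raster names given in the input option, reusing one parsed filter specification for all of them.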
>
> >
> > I would like to keep the grid layout information intact, because I deem
> > it very handy.
>
> This is a good idea. I'd suggest keeping it in a simple text file with
> structured variables much like a GMT or ESRI header file. Thus, each
> data set (i.e., from a grid sample) would be stored in a text file
> named <filename>.grd (or something like that) and a corresponding grid
> definition file would be stored in <filename>.gdf. A *.gdf file might
> have a series of variables like
>
> TITLE
> DATE
> ROWS
> COLUMNS
> ULX
> ULY
> ULXCOORD
> ULYCOORD
> LLX
> LLY
> LLXCOORD
> LLYCOORD
> ...
> and so on.
>
> Such a file could be easily parsed by a module that would read it and
> determine if it was type 1,2, or 3 above and process it accordingly. It
> would be kept associated with the data file by having the same name but
> a different extension.
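A minimal parser for the proposed *.gdf grid definition file, one "KEY value" pair per line as sketched above. The classification rule (which keys imply which of the three data types) is hypothetical, chosen to show how the dispatch could work.

```python
# Parse a *.gdf header into a dict and guess the data type (1-3) from
# which coordinate keys are present. Key names follow the draft above.

def parse_gdf(text):
    header = {}
    for line in text.splitlines():
        parts = line.split(None, 1)
        if len(parts) == 2:
            header[parts[0].upper()] = parts[1].strip()
    return header

def classify(header):
    if "ULXCOORD" in header and "ULYCOORD" in header:
        return 1            # georeferenced coordinates available
    if "ULX" in header and "ULY" in header:
        return 2            # local grid coordinates only
    return 3                # coordinates must be assigned

sample = "TITLE demo survey\nROWS 40\nCOLUMNS 60\nULX 0\nULY 0"
header = parse_gdf(sample)
```

Keeping the pairing by shared basename (<filename>.grd / <filename>.gdf) means the import module needs no extra arguments beyond the data file itself.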
>
> I hope this is helpful.
>
> Michael Barton
>
> ____________________
> C. Michael Barton, Professor
> School of Human Origins, Cultures, & Societies
> PO Box 872402
> Arizona State University
> Tempe, AZ 85287-2402
> USA
>
> Phone: 480-965-6262
> Fax: 480-965-7671
> www: <www.public.asu.edu/~cmbarton>
> On Jun 1, 2004, at 10:01 PM, Multiple recipients of list wrote:
>