[PROJ] Static/Dynamic Webmapping Problem version 2.0

Greg Troxel gdt at lexort.com
Wed Jul 17 11:26:38 PDT 2019


Nick Mein <nick_mein at trimble.com> writes:

>> This would be a really good place to insert the velocity of typical
>> stations in your country in the latest WGS84 realization.  Or perhaps
>> the 90th percentile of velocity.   Are you talking 1 cm/year, or is it
>
> All of Australia is moving at approximately 7cm/year relative to the ITRF.

Thanks - that's certainly a lot faster than my ground is moving.  Trying
to ground this in acceptable accuracies and fitness for use of the data,
it feels like 5 years of fuzz (35 cm at 7 cm/year) is not a big deal for
anything called "web mapping", as opposed to "GNSS vectors used for
geodetic surveying".

> The example that Cameron is using is GDA94 -> GDA2020. The shift is around
> 1.8m.
> http://www.ga.gov.au/scientific-topics/positioning-navigation/geodesy/datums-projections/gda2020

I see - a shift between two datums that are each nominally fixed to the
plate.

So I think dealing with this is first a basic issue of having datum
metadata carried alongside coordinates, and everybody has to get that
right if either:

  they are not using essentially WGS84, or
  they care about 2m

Then some subset have to carry epoch of data as well; presumably that's
people who aspire to cm-level accuracy and people with data in a frame
with a significant velocity relative to the plate containing the object
(where significant perhaps means an error of velocity*10y makes them
unhappy).
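To put a number on that velocity*10y criterion, here's a trivial check
using the 7 cm/year Australian figure quoted above (the threshold of
"unhappy" is of course up to the user):

```python
# Back-of-envelope: how far does a point drift between epochs, and does
# that exceed the accuracy a user cares about?
# Assumption: 7 cm/year, the Australia-relative-to-ITRF figure above.

def drift_cm(velocity_cm_per_yr, years):
    """Accumulated positional drift over a span of years."""
    return velocity_cm_per_yr * years

# 10 years at 7 cm/year: 70 cm, clearly fatal for cm-level aspirations,
# and noticeable even for sub-meter mapping.
print(drift_cm(7.0, 10))
```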

(In the US, we've largely forgotten NAD27, but the shift in coordinates
from NAD27 to NAD83 is on the order of 40m, at least in New England.
That was not reasonably ignorable even back in 1990.)

OSM doesn't handle this; when importing data from state GIS agencies, we
transform from NAD83 to WGS84 (not sure how well) and then it's just
stored.  But if we are sub-meter, we're pretty happy.

> The situation will be similar (for different reasons) when the USA moves to
> NSRS 2022. The horizontal shift will be around 1.5m.
> https://www.ngs.noaa.gov/datums/newdatums/WhatToExpect.shtml

Thanks for the link; I've downloaded the 3 documents for reading.

>> It seems obvious as new WGS84 realizations happen over time that the
>> database/projection should be interpreted as the recent one, for
>> people that care about meter-level accuracy.
>
> Interesting comment

Perhaps a bit strong on my end, but I was thinking about the
Openstreetmap database, which has always had the notion of "coordinates
of objects are in WGS84".  That seemed like a good choice in 2004, and
my impression is that successive realizations of WGS84 are ever closer
to the successive ITRFxx realizations.  Also that the differences in
successive realizations are reasonably small compared to the accuracy
achievable with non-differential GPS.  So were I to try to put accurate
(cm-level) coordinates in the OSM database for a point, I'd use the most
recent WGS84 realization, expecting it to be more or less equal to a
recent ITRF.  Looking this up, it seems WGS84(G1762) is more or less
ITRF2008 at epoch 2005.0.

Taking a coordinate from the db, I'd treat it as being in the latest
WGS84, but with unknown accuracy, perhaps from somebody eyeballing the
map and drawing things, and additionally from fuzz between the older
realizations and the current one.

OSM has not yet grappled with the concept of mapped objects having
velocities relative to WGS84(Gwwww)/ITRFyyyy.

> To me, "high precision" means cm level, or better, which is why I've
> been saying that there is no such thing as a precise WGS84 coordinate
> (unless perhaps you are working for the US military).

I don't follow the reference to being part of the military, but perhaps
there is non-public data involved.

The US NGS seems to publish coordinates for CORS in IGS08 epoch 2005.0
and NAD83(2011) epoch 2010.0.  As far as I can tell, WGS84(G1762),
IGS08, and ITRF2008 are essentially the same.

> Although the basic requirements are the same - any spatial dataset
> should include meta-data that (correctly) identifies the reference
> frame, the epoch, and the precision of the data.

Sure.  But that becomes per-object data in a dataset of mixed origin
(which is where my head is because of OSM), and I suspect one then runs
into missing software support.

Trying to get back on point for proj, then, I guess the question is
which operations that people need to do are missing or awkward.  One
thing I wonder about, based on your metadata comments and my reaction,
is the notion of

  T=createtransform(src_datum, src_epoch, dst_datum, dst_epoch)
  apply_transform(T, vector_of_points[lat, lon, ellipsoidal_height])

versus the notion that the point object (rather than the transform) has
an epoch, to end up with something that feels like

  T=createtransform(src_datum, dst_datum, dst_epoch)
  apply_transform(T, vector_of_points[lat, lon, ellipsoidal_height, epoch])

Probably that's too messy to contemplate for now, without a clear enough
need.
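Just to make the contrast concrete, here is a toy sketch (emphatically
not PROJ's actual API; all names are made up) of the two shapes: epoch
as a property of the transform, versus epoch carried on each point, with
the transform falling back to per-point epochs when its own source epoch
is absent:

```python
# Toy model of the two API shapes above.  The "datum shift plus
# plate-motion propagation" step is deliberately a stand-in: this only
# shows where the epoch would come from in each design.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Point:
    lat: float
    lon: float
    h: float                        # ellipsoidal height
    epoch: Optional[float] = None   # decimal year; used in shape 2

@dataclass
class Transform:
    src_datum: str
    dst_datum: str
    src_epoch: Optional[float]      # None => take epoch from each point
    dst_epoch: float

def apply_transform(t: Transform, points: List[Point]) -> List[Point]:
    out = []
    for p in points:
        epoch = t.src_epoch if t.src_epoch is not None else p.epoch
        if epoch is None:
            raise ValueError("no source epoch on transform or point")
        # Real code would shift datums and propagate plate motion from
        # `epoch` to t.dst_epoch here; we only record the result epoch.
        out.append(Point(p.lat, p.lon, p.h, epoch=t.dst_epoch))
    return out
```

The second shape pushes the bookkeeping into the data, which matches
the mixed-origin-dataset problem above, but it also means every point
in a dataset must carry (or inherit) an epoch before any transform can
run.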
