[PROJ] [EXTERNAL] [gdal-dev] Static/Dynamic datum problems

Kristian Evers kreve at sdfe.dk
Tue Jun 25 07:11:02 PDT 2019


> Unless you do a point motion operation, right?

As long as you work entirely in dynamic coordinates, it is okay to change
the timestamps as part of a transformation. When you transform coordinates
from a dynamic/temporal frame to a static frame (e.g. ITRF2014 -> ETRS89),
you need to keep the timestamps of the ITRF2014 coordinates if you want
to retain the ability to transform your data back to ITRF2014. Changing the
timestamp to 1989, or to the realization epoch of the local ETRS89 realization,
would make it impossible to return to the original coordinates, unless that
information is kept track of elsewhere. But where would you do that? In the
worst case a vector dataset has a different observation time for every
coordinate.
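
As a rough sketch of that round trip (EPSG:4937 is assumed here for the ETRS89
geographic 3D CRS; EPSG:9000 is ITRF2014, as in the cs2cs examples further
down), the point is simply that the observation epoch stays attached to the
coordinate so that cs2cs -I can take it back again:

	$ echo "55.0 12.0 0.0 2019.5" | cs2cs -f "%.9f" EPSG:9000 EPSG:4937
	$ echo "<ETRS89 result> 2019.5" | cs2cs -I -f "%.9f" EPSG:9000 EPSG:4937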

> Which makes it obvious that, even when/if we have point motion operations 
> available to do directly CRS A @ epoch 1 -> CRS A @ epoch 2, we don't have a 
> clean way currently in PROJ of specifying the target coordinate epoch.

I agree with that. We could simply adopt the same convention as used by
geodesists: 

	cs2cs  <source> <destination>@<destination epoch>

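For example, a direct change of epoch within ITRF2014 could then be expressed
as (hypothetical syntax, not something cs2cs accepts today; EPSG:9000 used for
ITRF2014 as in the examples below):

	cs2cs EPSG:9000 EPSG:9000@2025.0
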
I agree that it makes sense to change the coordinate timestamp when
transforming from one dynamic reference frame to another. But there's a
caveat:

> Except that 2018.000 in the output should be read as 2025 given the hack...

How do you make the inverse transformation if you change the coordinate
time from 2018.0 to 2025.0? For the particular PJ object that made that
transformation you will not be able to reverse it: since you specified
+t_epoch=2025.0, your inverse operation will now be a null transformation
moving the coordinate from 2025.0 to 2025.0. This is why I opted not to
change the timestamps when I implemented the temporal Helmert.
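
Roughly, since the temporal Helmert scales its rate terms by the offset from
+t_epoch, the problem would look like this if the forward step also rewrote
the coordinate timestamp:

	forward:  t = 2018.0, t_epoch = 2025.0  ->  dt = 2018.0 - 2025.0 = -7.0 years
	          of motion applied, output timestamp rewritten to 2025.0
	inverse:  t = 2025.0, t_epoch = 2025.0  ->  dt = 0.0, the rate terms vanish
	          and the operation degenerates into a null transformation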

It is not set in stone that it must always be possible to create the inverse of
an operation, but it definitely makes life simpler. It is a decision of principle
whether we want to break the possibility of creating the inverse of a temporal
Helmert operation.

> I guess that's why people keep creating new static CRS. That's so much more 
> convenient given the inertia on the file format side (how do you fix 
> shapefiles to include a coordinate epoch... A new sidecar file myshape.epoch 
> :-) ?)

It is an approach that is absolutely worth considering for geodetic authorities.
It is many times simpler to work with, and you still keep the good stuff from
the dynamic frame.


/Kristian

-----Original Message-----
From: Even Rouault <even.rouault at spatialys.com> 
Sent: 25 June 2019 14:41
To: proj at lists.osgeo.org
Cc: Kristian Evers <kreve at sdfe.dk>; Nick Mein <nick_mein at trimble.com>
Subject: Re: [PROJ] [EXTERNAL] [gdal-dev] Static/Dynamic datum problems

>  The timestamp of a
> coordinate has to be the observation time of the coordinate and the
> timestamp must not be changed during transformations (otherwise you can’t
> do the reverse transformation).

Unless you do a point motion operation, right?

Quoting http://docs.opengeospatial.org/as/18-005r4/18-005r4.html#68 :
"""The coordinates of a data set may be changed to any other epoch. Plate 
motion or other crustal deformation models are often used for this when 
estimated coordinate velocities are not available. Such models facilitate for 
example the change of coordinates from being referenced to ITRF2008 at epoch 
2017.53 to being referenced to ITRF2008 at epoch 2005.0."""

Which would be the case if you want to create, for example, a global raster
dataset: you want all tiles to be referenced to the same coordinate epoch.
But this would be the final step in the coordinate transformation process.
In a vector dataset, you may accept different features being at different
coordinate epochs.
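
In terms of coordinate velocities, such an epoch change is simply (a sketch,
with v taken from a plate motion or deformation model when per-point
velocities are not available):

	X(t_target) = X(t_obs) + v * (t_target - t_obs)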

To be noted that the EPSG dataset is currently rather sparse in the area of
point motion operations. There's just a single one available, for the Canadian
NAD83(CSRS)v6 case, using a velocity grid. So currently, if you have a
coordinate referenced to an ITRF CRS, you can't easily transform it to another
epoch in this CRS. Or in the Australian case, you'd have to go back to GDA2020
using the ITRF2014<-->GDA2020 time-dependent Helmert transformation... :

From ITRF2014@2018.00 to GDA2020
$ echo "-30 120 0 2018.00" | cs2cs -f "%.9f" EPSG:9000 EPSG:7844
-29.999998944 120.000000758 -0.000339830 2018.00

From GDA2020 to ITRF2014@2025.00
$ echo "-29.999998944 120.000000758 -0.000339830 2025.00" | \
		 cs2cs -I -f "%.9f" EPSG:9000 EPSG:7844
-29.999996305 120.000002652 -0.001189402 2025.00

Which makes it obvious that, even when/if we have point motion operations 
available to do directly CRS A @ epoch 1 -> CRS A @ epoch 2, we don't have a 
clean way currently in PROJ of specifying the target coordinate epoch.

Actually I could do it in just one step by hacking the ITRF2014->GDA2020 
pipeline and using t_epoch=2025.00 to make it an ITRF2014 @ input_epoch -> 
ITRF2014 @ 2025.00 transformation valid for Australia (probably only within a 
few years around epoch 2020.00, which also brings to light the lack, in the 
EPSG dataset, of metadata indicating the time range over which a coordinate 
operation is valid at the published accuracy...):

$ echo "-30 120 0 2018.00" | cct -d 9 +proj=pipeline \
    +step +proj=axisswap +order=2,1 \
    +step +proj=unitconvert +xy_in=deg +xy_out=rad \
    +step +proj=cart +ellps=GRS80 \
    +step +proj=helmert +x=0 +y=0 +z=0 +rx=0 +ry=0 +rz=0 +s=0 \
      +dx=0 +dy=0 +dz=0 +drx=0.00150379 +dry=0.00118346 +drz=0.00120716 +ds=0 \
      +t_epoch=2025 +convention=coordinate_frame \
    +step +inv +proj=cart +ellps=GRS80 \
    +step +proj=unitconvert +xy_in=rad +xy_out=deg \
    +step +proj=axisswap +order=2,1
-29.999996305  120.000002652  -0.001189396     2018.0000

Except that 2018.000 in the output should be read as 2025 given the hack...


> The problem is that it is impossible in practice
> since there is no standard that describes how to do this. The OGC Simple
> Features standard which most file formats are based on simply doesn’t
> include time as a dimension.

You probably want the time to be included as general metadata of the 
geometry rather than as a per-vertex value (hopefully a single geometry is 
referenced to a single epoch...), which would be along the lines of the 
CoordinateMetadata class mentioned by Martin.
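
Something along these lines, loosely following the coordinate metadata
construct of ISO 19111 / WKT2:2019 (a sketch only; the exact keywords and
nesting may differ):

	COORDINATEMETADATA[
	    GEOGCRS["ITRF2014", ...],
	    EPOCH[2018.0]]
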
Yes, there is the issue of standards, but then it must percolate down to file 
and geospatial database formats. GeoPackage and PostGIS, for example, are 
extensions of the WKB encoding of Simple Features, so if the latter is updated, 
the former might be able to upgrade as well. For web services, GML / WFS should 
be upgraded too, both on the response side (you need the coordinate epoch) 
and on the request side (you probably also need to be able to specify that you 
want geometries at a given coordinate epoch).
The issue is the same with raster formats: how do you encode the coordinate 
epoch in a GeoTIFF
(I've just created https://github.com/opengeospatial/geotiff/issues/78), etc.

I guess that's why people keep creating new static CRS. That's so much more 
convenient given the inertia on the file format side (how do you fix 
shapefiles to include a coordinate epoch... A new sidecar file myshape.epoch 
:-) ?)

Even

-- 
Spatialys - Geospatial professional services
http://www.spatialys.com

