[PROJ] Combining custom transformations with EPSG
Even Rouault
even.rouault at spatialys.com
Tue Nov 3 10:33:43 PST 2020
Hi Kristian,
I'm also happy to see that we are able to encode custom advanced
transformations in the formalism of the database.
Side notes:
- the ETRF92_2000_TO_ETRF92_1994 transformation would in an ideal world be
encoded as a point motion operation (similarly to EPSG:7957 "Canada velocity
grid v6", which we don't support), if we had a cleaner way of doing changes of
coordinate epoch than what we do currently, that is if createOperations()
could take an input and an output coordinate epoch. But that would bring new
interesting problems regarding at which step you do the coordinate epoch
change. I remember there's an EPSG guidance note that examines different
scenarios for that, but I'm not sure it addresses all the cases we could hit.
Hmm, actually, looking closer at the way you define this transformation, it
couldn't be a point motion operation, since your target CRS code != your
source CRS code: you are saying that ETRF92 at 1994.704 == ETRS89 in Denmark.
- ETRF2000_TO_NKG_ETRF00, using a fixed central epoch, would need to be
another kind of operation. It would be a bit like the time-specific Helmert
transformation that is used in a few EPSG transformations (but which we
currently don't map to a PROJ transformation).
> I haven't managed to integrate this completely with the rest of the
> database. As an example, let's say I want to transform between ITRF2008
> and ETRS89(DK) instead:
> My question is, what have I missed? Should I add more records to the
> database? Or is my expectation of PROJ's abilities completely off the
> mark? I suspect the problem might be that what I want would require two
> hub-datums, which I assume could be too big of a combinatorial task for
> PROJ to solve in a reasonable time and hence is not happening.
You guessed right. Figuring out that ITRF2008 to ETRS89 needs to go through
ITRF2014 and then NKG_ETRF00 as intermediate steps is out of reach of PROJ's
inference capabilities. It indeed "just" manages one hop, mostly for
performance reasons, and because the SQL queries would be quite horrific
(have a look at AuthorityFactory::createFromCRSCodesWithIntermediates() and
AuthorityFactory::createBetweenGeodeticCRSWithDatumBasedIntermediates() for
the "simple" version with just one intermediate). And even with just one hop,
in some cases that can lead to hundreds of results being returned.
So my suggestion would be that you create concatenated operations for
ITRFxxxx -> NKG_ETRF00
chaining ITRFxxxx -> ITRF2014 and ITRF2014 -> NKG_ETRF00
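For illustration, such a concatenated operation is essentially one record in
the concatenated_operation table plus one record per step in
concatenated_operation_step. A hypothetical sketch, with made-up codes under
an 'NKG' authority and placeholders where the exact codes depend on your
setup (depending on the database layout version you may also need matching
area-of-use / usage records, so best to model it on records you already
have):

    -- Hypothetical sketch only: the 'NKG' codes are invented for the example.
    INSERT INTO concatenated_operation
        (auth_name, code, name, description,
         source_crs_auth_name, source_crs_code,
         target_crs_auth_name, target_crs_code,
         accuracy, deprecated)
    VALUES
        ('NKG', 'ITRF2008_TO_NKG_ETRF00', 'ITRF2008 to NKG_ETRF00', NULL,
         'EPSG', '<code of the ITRF2008 CRS you use>',
         'NKG', '<code of your NKG_ETRF00 CRS>',
         NULL, 0);

    INSERT INTO concatenated_operation_step
        (operation_auth_name, operation_code, step_number,
         step_auth_name, step_code)
    VALUES
        ('NKG', 'ITRF2008_TO_NKG_ETRF00', 1,
         'EPSG', '<code of the ITRF2008 to ITRF2014 transformation>'),
        ('NKG', 'ITRF2008_TO_NKG_ETRF00', 2,
         'NKG', '<code of your ITRF2014 to NKG_ETRF00 operation>');

Note that the second step is itself your ITRF2014 -> NKG_ETRF00 concatenated
operation, which is where the nesting issue below comes from.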
Strictly speaking, concatenated operations of concatenated operations aren't
supported by ISO 19111. I believe that PROJ could accept that, as it relaxes
that constraint, but it would be best to make a slight code change to
"flatten" the hierarchy when building the object from the database (otherwise
the WKT produced would be non-conformant, for example), so that in the end
ITRFxxxx -> NKG_ETRF00 is a concatenated operation of single transformations.
The beginning of ConcatenatedOperation::createComputeMetadata() has logic to
do that flattening. It should probably be moved into a dedicated static
method
std::vector<CoordinateOperationNNPtr>
ConcatenatedOperation::flattenOperations(
    const std::vector<CoordinateOperationNNPtr> &)
that would be called towards the end of
AuthorityFactory::createCoordinateOperation(), before it calls
operation::ConcatenatedOperation::create().
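Roughly, an untested sketch of that helper (exact placement and signature
details open, of course):

    // Untested sketch: extract the flattening currently done at the start of
    // createComputeMetadata() into a reusable static helper.
    std::vector<CoordinateOperationNNPtr>
    ConcatenatedOperation::flattenOperations(
        const std::vector<CoordinateOperationNNPtr> &operationsIn) {
        std::vector<CoordinateOperationNNPtr> flattenedOps;
        for (const auto &subOp : operationsIn) {
            const auto *subOpConcat =
                dynamic_cast<const ConcatenatedOperation *>(subOp.get());
            if (subOpConcat) {
                // Recurse so that nested concatenated operations are fully
                // expanded into their single transformations.
                const auto nested =
                    flattenOperations(subOpConcat->operations());
                flattenedOps.insert(flattenedOps.end(), nested.begin(),
                                    nested.end());
            } else {
                flattenedOps.emplace_back(subOp);
            }
        }
        return flattenedOps;
    }

createComputeMetadata() could then reuse it, and the object built from the
database would end up as a single-level concatenation.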
One idea I had in the past to make PROJ's inference a bit more powerful,
without going to full 2- or 3-hop database lookups, would be to define
"families" of datums, that is sets of datums that are known to be related by
transformations, which would guide PROJ in inferring more transformation
paths. For example the various NAD83(xxxx) datums (especially if NADCON5 is
integrated, which has transformations only between 'consecutive' NAD83(xxxx)
variants, but not between arbitrary pairs of them), or ITRFxxxx, could be
such families. Writing that, it looks a bit like the DatumEnsemble concept,
except that we wouldn't constrain those datums to be considered "equivalent"
for lower-accuracy purposes. Anyway, this is just an idea without any
implementation reality right now.
Even
--
Spatialys - Geospatial professional services
http://www.spatialys.com