[PROJ] Is the geoid model from your country in PROJ?
Martin Desruisseaux
martin.desruisseaux at geomatys.com
Fri Apr 12 10:27:44 PDT 2024
Hello all
On 2024-04-12 at 16:31, Howard Butler via PROJ wrote:
> Even's COG-based GTG [1] approach for PROJ supports both an
> incremental network and a local cache distribution model at the same
> time. I've found it frustrating the OGC CRS Standards Working Group
> didn't value a similar approach when it developed the GGXF standard.
> Not only is GGXF content going to require yet another transformation
> for use as an incremental network, its complexity in comparison to GTG
> is going to be challenging for grid producers.
The grid format and the network for distributing grids are two separate
things, and GGXF addresses only the former. Hosting the files at OGC, EPSG
or somewhere else does not depend on whether those files are in GTG or
GGXF format, and the incremental model used by PROJ (or by other software;
caching is not a new concept) is an implementation strategy independent
of the standard.
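To make the point concrete, here is a minimal sketch in Python (using the
pyproj bindings for PROJ; the choice of pyproj is mine) showing that
incremental network access is a client-side switch with no reference to
the format of the grids being served:

    import pyproj.network
    from pyproj import Transformer

    # Fetch grid chunks on demand from the network; downloaded ranges are
    # cached locally, regardless of the format of the grids being served.
    pyproj.network.set_network_enabled(True)

    # A transformation that needs a geoid grid (here WGS 84 ellipsoidal
    # height to EGM96 height) downloads only the byte ranges it uses.
    transformer = Transformer.from_crs("EPSG:4979", "EPSG:4326+5773")
    lat, lon, h = transformer.transform(45.0, 6.0, 100.0)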
The complexity argument is disputable. The OGC working group ran a
survey before starting the GGXF work, and a majority of the data producers
who responded were more familiar with netCDF than with GeoTIFF. In
scientific communities, netCDF and HDF are widely used. That does not mean
GeoTIFF is unused, but it is not as dominant there as it is in GDAL
communities.
Furthermore, the GeoTIFF format *is* complex. It has two or three layers of
embedded encoding (GeoKeys inside TIFF tags, then a dictionary of object
names packed into a single GeoKey, then embedded XML documents for
everything not associated with a GeoKey). TIFF is a rigid format,
difficult to extend beyond the 2D imagery for which it was designed, and
the GeoTIFF extension (GeoKeys) is itself difficult to extend. For this
reason, much geospatial data (including GTG) distributed as GeoTIFF files
has to resort to XML or JSON auxiliary files. In comparison, netCDF can be
seen as a kind of binary JSON optimized for multi-dimensional data cubes.
It is as flexible as XML or JSON for storing whatever metadata are
desired. It has been argued that this flexibility is an interoperability
problem because of different interpretations made by different software,
but the same can be said of XML, JSON, WKT or even TIFF: before TIFF
version 6, a joke was that "TIFF" was the acronym of "Thousands of
Incompatible File Formats". No matter the format, the only way to address
this problem is with a specification as unambiguous as possible, which may
take many versions.
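To make the "binary JSON" comparison concrete, here is a small sketch
(Python with the netCDF4 library; all names are illustrative, not from any
standard) of how netCDF lets a producer attach arbitrary structured
metadata directly in the file, with groups playing the role of nested JSON
objects:

    from netCDF4 import Dataset
    import numpy as np

    with Dataset("example.nc", "w") as ds:
        ds.title = "illustrative displacement grid"   # free-form global attribute
        ds.remark = "any producer-defined metadata fits here"
        grid = ds.createGroup("grid_1")               # nested, like a JSON object
        grid.createDimension("lat", 3)
        grid.createDimension("lon", 4)
        var = grid.createVariable("offset", "f4", ("lat", "lon"))
        var.units = "metre"                           # per-variable attributes
        var[:] = np.zeros((3, 4), dtype="f4")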
Another GeoTIFF problem is that it is very poor in metadata. GGXF needs to
store three CRSs (source, target and interpolation). The GeoKeys can store
only one CRS, so the other two have to be stored somewhere else. The
same problem applies to other metadata such as accuracy and
interpolation method. By contrast, GGXF is self-contained: all metadata
are cleanly stored in a single file, with no auxiliary files and no
"TIFF-tags / GeoKeys / yet-another-encoding-packed-in-a-GeoKey /
embedded-XML-document" mix inside the main file.
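As a hedged illustration of that self-containment, the three CRSs and the
other metadata can all live as attributes of the same netCDF file. The
attribute names below follow my reading of the GGXF encoding and should be
treated as illustrative rather than authoritative:

    from netCDF4 import Dataset

    with Dataset("geoid.ggxf", "w") as ds:
        ds.sourceCrsWkt = 'GEOGCRS["NAD83", ...]'         # WKT strings elided
        ds.targetCrsWkt = 'VERTCRS["NAVD88", ...]'
        ds.interpolationCrsWkt = 'GEOGCRS["NAD83", ...]'
        ds.operationAccuracy = 0.05                       # metres, also in-file
        ds.interpolationMethod = "bilinear"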
Another GGXF goal was to be at least as good as NADCON and NTv2, in order
to be a potential replacement for them. The Cloud Optimized GeoTIFF
(COG) convention cannot represent a pyramid of grids with different
resolutions in different regions as NTv2 does (all pyramid levels in a
COG file cover the same region). It would of course be possible to
resolve this problem with more metadata, but the GeoTIFF metadata are
already messy and doing so would make the situation worse.
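The nesting itself is easy to picture in netCDF, where grids with
different extents and resolutions are simply child groups of their parent.
A sketch with illustrative names and extents:

    from netCDF4 import Dataset

    with Dataset("nested.nc", "w") as ds:
        parent = ds.createGroup("country_grid")     # coarse grid, full extent
        parent.extent = [40.0, 50.0, 0.0, 10.0]     # illustrative bounding box
        parent.createDimension("row", 100)
        parent.createDimension("col", 100)
        parent.createVariable("shift", "f4", ("row", "col"))

        city = parent.createGroup("city_grid")      # finer grid over a
        city.extent = [45.0, 46.0, 5.0, 6.0]        # small sub-region only
        city.createDimension("row", 200)            # denser sampling despite
        city.createDimension("col", 200)            # the much smaller area
        city.createVariable("shift", "f4", ("row", "col"))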
GGXF is the result of two years (if my memory serves me right) of biweekly
meetings between people from EPSG, ESRI, geodetic experts from
data-producing agencies, and more. It is backed by an abstract
specification laying the theoretical foundations before addressing the
actual encoding in a separate document. The encoding document itself is
designed to allow binary encodings other than netCDF in the future, if
there is demand. ESRI is already implementing GGXF support, and Apache SIS
will hopefully follow this year. I understand that not adopting GTG may
seem frustrating, but the same level of expertise does not exist for both
formats.
Anyway, GGXF is currently promoted as an exchange format. Software can
consume it directly, but does not have to. Data producers would publish
their data in GGXF, and PROJ (for example) could then convert those files
to GTG for publication on its network. Apache SIS, for its part, would
consume the GGXF files directly. Having the original data published in
GGXF rather than GTG is advantageous at least for archival purposes,
because of the richer metadata.
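Such a conversion could look like the sketch below, using GDAL's Python
bindings. The file names, the variable name and the assumption that GDAL's
netCDF driver can open the grid are all mine; a real GTG file also carries
PROJ-specific metadata not shown here:

    from osgeo import gdal

    # Open one grid variable through the netCDF subdataset syntax, then
    # write it as a Cloud Optimized GeoTIFF with GDAL's COG driver.
    src = gdal.Open('NETCDF:"geoid.ggxf":geoid_height')
    gdal.Translate("geoid.tif", src,
                   format="COG",
                   creationOptions=["COMPRESS=DEFLATE"])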
Martin