<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8">
</head>
<body>
<div class="moz-cite-prefix">
<p>I will not insist on NetCDF, but I just want to bring some
clarifications:<br>
</p>
<p>On 25/11/2019 at 23:44, Even Rouault wrote:
</p>
</div>
<blockquote type="cite" cite="mid:9014949.cze0rWdGFn@even-i700">
<p class="moz-quote-pre" wrap="">- HDF5 is indeed much more
powerful than GeoTIFF. We would have to restrict even more
severely a profile than I proposed to do with GeoTIFF to avoid
having to deal with crazy formulations. Actually with my hat of
GDAL developer on, this very flexibility of HDF5 is a serious
problem because data producers tend to follow their own personal
inspiration of how to structure the data, and in particular
interoperability of georeferencing encodings is close to null.</p>
</blockquote>
<p>HDF5 is only a binary container. How data are organized inside
that container is defined outside HDF5; in our case, it is
defined by the CF-Conventions. This is the same separation as XML
versus its schemas: we can see HDF5 as a kind of binary XML + arrays,
and the CF-Conventions as an XML schema for NetCDF/HDF. The
CF-Conventions address georeferencing encoding in various ways;
one way standardized by the CF-Conventions is to use WKT [1].</p>
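<p>To make that concrete, here is a minimal sketch using the netCDF4
Python library of how the CF-Conventions attach a WKT string to a
data variable through a grid_mapping variable (the file name, the
variable names and the truncated WKT string are only illustrative):</p>
<pre>from netCDF4 import Dataset

# format="NETCDF4" means the file is an HDF5 container underneath.
ds = Dataset("example.nc", "w", format="NETCDF4")
ds.createDimension("y", 100)
ds.createDimension("x", 100)

# CF grid_mapping variable: a scalar variable that only carries CRS metadata.
crs = ds.createVariable("crs", "i4")
crs.grid_mapping_name = "latitude_longitude"
crs.crs_wkt = 'GEOGCRS["WGS 84", ...]'  # the full WKT string in practice

# The data variable points to the CRS definition by name.
elevation = ds.createVariable("elevation", "f4", ("y", "x"))
elevation.grid_mapping = "crs"
ds.close()
</pre>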
<p><br>
</p>
<blockquote type="cite" cite="mid:9014949.cze0rWdGFn@even-i700">
<p class="moz-quote-pre" wrap="">the cloud-friendliness of HDF5 is
unknown to me. On the contrary, I know that COG is a technology
used heavily.
</p>
</blockquote>
<p>I have not benchmarked HDF5 myself, but I attended a talk at an
Apache Conference 2 or 3 years ago where such benchmarking in the
cloud had been done for various formats (I do not remember whether
TIFF was among them, however). HDF5 performance in the cloud was
reported to be good, except for arrays of character strings (a
problem of data chunk sizes not matching the size of the data
blocks transferred over the network).<br>
</p>
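<p>For what it is worth, the chunk layout is something the data
producer controls at write time. A minimal sketch with the h5py
Python library (the file name, dataset name and sizes are only
illustrative assumptions), where each chunk is 4 MiB of float32,
i.e. large enough to amortize one network request:</p>
<pre>import h5py
import numpy as np

with h5py.File("example.h5", "w") as f:
    dset = f.create_dataset(
        "temperature",
        shape=(8192, 8192),
        dtype="f4",
        chunks=(1024, 1024),   # 1024 * 1024 * 4 bytes = 4 MiB per chunk
        compression="gzip",
    )
    # Write one chunk worth of data.
    dset[:1024, :1024] = np.random.rand(1024, 1024).astype("f4")
</pre>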
<p>Note that HDF5 is in active development (I see 9 commits
yesterday, 82 commits this month), has been governed by a Technical
Advisory Board since 2018 [2], and has a massive community-built
software ecosystem according to their web site [3]. HDF was initiated
by the National Center for Supercomputing Applications (NCSA) and
is also used heavily (e.g. at NASA). It is the de facto standard in
the scientific and research community.</p>
<p>But as said before, I do not insist if the preference is still
for GeoTIFF in PROJ; I just wanted to reply to some of the arguments
advanced in this thread.<br>
</p>
<p> Regards<br>
</p>
<p> Martin</p>
<pre>[1] <a class="moz-txt-link-freetext" href="http://cfconventions.org/Data/cf-conventions/cf-conventions-1.7/cf-conventions.html#use-of-the-crs-well-known-text-format">http://cfconventions.org/Data/cf-conventions/cf-conventions-1.7/cf-conventions.html#use-of-the-crs-well-known-text-format</a>
[2] <a class="moz-txt-link-freetext" href="https://confluence.hdfgroup.org/display/HDF5TAB">https://confluence.hdfgroup.org/display/HDF5TAB</a>
[3] <a class="moz-txt-link-freetext" href="https://www.hdfgroup.org/community/">https://www.hdfgroup.org/community/</a>
</pre>
</body>
</html>