[gdal-dev] Re: HDF-EOS vs. GDAL: order of dimensions

Ivan Shmakov ivan at theory.asu.ru
Wed Dec 19 12:17:34 EST 2007


>>>>> Andrey Kiselev <dron at ak4719.spb.edu> writes:

[...]

 >> I understand the frustration of dealing with HDF. The format is great,
 >> the hdf library is good and the GDAL HDF4 driver is also good, but
 >> there is a lack of documentation on the files themselves, so what
 >> usually happens is that we programmers read the documentation on the
 >> web and hard-code it.

 > I strongly disagree here. I am absolutely sure that the HDF
 > format is bad. The format is only good for manual processing, but
 > completely unsuitable for automatic processing without user
 > interaction. The dimension order that we are discussing in this
 > thread is just one example of many problems with automatic processing
 > of HDF.

	HDF is hardly the best format, though I believe it's fairly
	acceptable as a format for generic multidimensional data.

	Another issue is the lack of tools to process generic HDF files.
	I've never seen a tool that allows an HDF file to be viewed
	as a sort of ``dataset archive'' (which it essentially is),
	allowing certain datasets to be added to, removed from, or
	moved between HDF files.  (Not to mention obtaining subsets,
	merging, changing the order of the dimensions, etc.)

 > And HDF-EOS adds even more trouble. In HDF-EOS, data fields,
 > dimensions, geolocations and their relations with each other are
 > marked with (sic!) human-readable names describing the physical
 > nature of that field or dimension. It is totally unsuitable for
 > generalization and automation.

	Agreed completely.  Though, in my opinion, it's not the fault of
	the HDF format per se.

 >> The question is: Can we cover all the issues about HDF4 and
 >> hard-code them?

 > We can hard-code issues in the driver (and we did), but it makes the
 > driver hard to maintain and refactor, and it is an endless source of
 > bugs. One workaround can break something in another workaround, and
 > we can't catch it, because it is impossible to have a test suite with
 > all kinds of HDF products and check for regressions every time.

	Besides, what would you recommend to the users should a brand-new
	HDF-EOS product emerge?  I'm certain that ``updating from the
	SVN trunk'' is hardly an option, at least for some users.
	Giving the user more control over how the data is to be
	interpreted is the solution.

[...]
