[OSGeo-Discuss] Re: [OSGeo-Standards] TMS and WMTS

creed at opengeospatial.org
Fri Apr 9 13:48:12 EDT 2010


Quite the discussion and much appreciated. I have yet to read all of the
emails.

Before I respond to some of the comments, I would like to say that I
really want to explore how the OSGeo community and the OGC community
can leverage the expertise and positive aspects of both organizations to
create a much stronger community collaboration that benefits all
stakeholders and the broader community. That will be the topic of another
posting.

And apologies for the length of this posting, but here are a couple of
comments in response to some of Allan's valid observations (and the
comments he was responding to).

1. Quite the contrary. I think there are only a handful of people who
really understand each spec document. . . etc.

This really depends on the standard being discussed and moved through the
OGC process. The statement is perhaps true for some OGC standards but not
for others. The breadth of discussion really depends on how narrow the
domain focus is for a specific standard. Allan's statement could also be
applied to work in the W3C, OASIS, OMA, the IETF, and other standards
organizations. I know because I work on standards in all of those venues.

2. On SOAP. Close, but not quite correct. The actual motion approved at
the June 2006 Meetings was: Going forward, all future revisions of
existing and all new OWS (including OLS) interface standards:

1.	SHOULD include an optional SOAP (messaging) binding and
2.	SHOULD express that binding in WSDL.

Exceptions may only be granted through appeal to the OAB. The team working
on a given standard can request a variance with justification. This is
what the OGC Location Services group has done.

3. Complexity. This is a major topic of discussion in the OGC Architecture
Board and other OGC forums. Allan is correct, there is no concerted effort
to make things complex. However, when there are a myriad of requirements
from multiple domains for functionality in a given interface, apparent
complexity can be the perceived result. Take describing a sensor. It is
very easy to define an encoding standard that states that a sensor has a
lat/long location, measures temperature, and streams that temperature in
degrees centigrade. But wait, how about a measure of uncertainty, time of
last maintenance, error tolerances, time of observation, and so forth?
Suddenly what started as simple is not so simple any more. And this gets
even more interesting when the sensor is part of an emergency alerting
system where lives are at stake.
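
To make that concrete, here is a rough Python sketch (my own illustration,
not any actual OGC encoding; all field names are invented) of how the
"simple" record grows once each stakeholder adds a requirement:

    from dataclasses import dataclass
    from datetime import datetime
    from typing import Optional

    # The "simple" version: what most people think a temperature
    # sensor record needs.
    @dataclass
    class SimpleReading:
        lat: float            # latitude in degrees
        lon: float            # longitude in degrees
        temperature_c: float  # degrees centigrade

    # The version that emerges once every domain adds its fields.
    # Each addition is reasonable on its own; together they are why
    # the encoding document is no longer ten pages long.
    @dataclass
    class FullReading(SimpleReading):
        observed_at: Optional[datetime] = None       # time of observation
        uncertainty_c: Optional[float] = None        # measurement uncertainty
        error_tolerance_c: Optional[float] = None    # allowed error band
        last_maintenance: Optional[datetime] = None  # time of last maintenance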

That said, the OGC Members have approved and are moving toward
implementing a policy called "core-extension" for OGC standards. The
concept is very similar to a UNIX kernel and the extensions/commands built
around it. The idea is that the core is as lightweight, as simple, and as
easy to implement as possible. The first OGC standard that has fully moved
to this model is Web Coverage Service, which is out for public comment
right now. Others are in progress.
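
As a loose sketch of the core-extension idea (again my own illustration,
not taken from the WCS candidate standard; the class and operation names
are invented), the core defines the minimal contract every implementation
must satisfy, and extensions are optional interfaces layered on top:

    from abc import ABC, abstractmethod

    # Core: the minimal, easy-to-implement contract.
    class CoverageServiceCore(ABC):
        @abstractmethod
        def get_capabilities(self) -> dict: ...

        @abstractmethod
        def get_coverage(self, coverage_id: str) -> bytes: ...

    # Extensions: optional capabilities a server may also implement
    # and advertise. A simple server can conform to the core alone.
    class ScalingExtension(ABC):
        @abstractmethod
        def get_coverage_scaled(self, coverage_id: str, factor: float) -> bytes: ...

    class CRSExtension(ABC):
        @abstractmethod
        def get_coverage_in_crs(self, coverage_id: str, crs: str) -> bytes: ...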

We welcome and encourage comments from the OSGeo community!

4. Contrast that with the FOSS approach of debate on a mailing list . . .
Well, no OGC standard goes through the process without considerable debate
(which, as someone else mentioned, can be frustrating precisely because of
the knowledgeable input). Under the current OGC process, any standards
development work requires a team effort. For example, three or more members
are required to start work on any standard. Consider the new standards
working group in the OGC for PUCK (a sensor API for ocean observing
systems). Six charter OGC Members (representing government, commercial,
research, and university interests) proposed the work, and an additional
dozen OGC members are participating. They have regular conference calls
and email debate. They must brief the entire TC on their work. The OGC
Architecture Board will review and provide guidance. The document will be
released for a 30-day public comment period followed by more discussion
and debate. The PUCK team will then need to brief the OGC Technical
Committee at an F2F meeting prior to approval for an electronic vote on
adoption.

Which gets back to a point Allan made about the number of OGC folks who
understand a standard. Only fifteen or twenty individuals in the OGC will
really understand PUCK. But in standards work, there has to be a level of
trust - trust that the individuals working on a given standard know what
they are doing and are creating a strong, useful standard. This trust is
identical to working on an open source project - you trust your fellow
collaborators. If there is no trust, then in OGC work the standard may not
progress to a vote.

Enough for now.

Regards

Carl




>
> On Apr 7, 2010, at 1:10 PM, Seven (aka Arnulf) wrote:
>
>>
>> Hey,
>> sorry to spam Discuss with nerdy smalltalk. We might want to move back
>> to the standards list for follow ups.
>>
>> Brian Russo wrote:
>>> I have a simpler/better idea - have the OGC stop creating
>>> unnecessarily complex standards hundreds of pages long that hardly
>>> anyone implements. This will save time/money, and benefit users,
>>> proprietary and open source developers alike.
>>
>> Yes, making standards readable and usable is a great idea. It takes time
>> and brains to implement a good standard. I lack both, so I am not a good
>> choice. How about you? If you are good at this, why don't you join the
>> OGC process and help do it better? But watch out, the OGC has a high
>> frustration potential because there are always knowledgeable folks around
>> who pick apart what you just put together. Which is why some standards
>> actually work pretty well.
>
> Quite the contrary. I think there are only a handful of people who really
> understand each spec document. Then when it's time to vote at the TC level
> where on the order of 50+ members have a vote, you're in the realm of
> people who have only the vaguest understanding of the technology coupled
> with a quite narrow view of what their own organization's interests are. A
> memorable vote I observed was one where a proposal to require every spec
> to have a SOAP implementation was put forward, debated, and passed in a TC
> closing plenary with virtually no one understanding the implications.
>
> Contrast that with the FOSS approach of debate on a mailing list or IRC or
> face-to-face at a venue like FOSS4G where a far higher percentage of the
> people engaged in a discussion are actually implementing or using the spec
> being debated.
>
>>
>>> Sometimes I think it's a concerted effort to make sure the 'open'
>>> standards are as complex as possible so few people have the resources
>>
>> Thank you for the laugh but you do not believe this yourself. So why say
>> it? This is exactly the tone I regret in this discussion. OGC is neither
>> a conspiracy nor are they all brain dead. I might be both, conceded, but
>> this is beside the topic.
>
> It's not due to a concerted effort. But the evidence shows that the
> results are generally complex.
>
>>
>>> to implement them (except proprietary vendors and academics with tons
>>> of time) and the rest of us all stick with proprietary standards
>>
>> I guess that you will be of the same opinion as me that an open standard
>> that all can use for free is better than a costly and potentially
>> patent-infected proprietary standard that can be changed at the whim of its
>> singular (proprietary vendor) owner. At least I can see a difference.
>>
>>> (because we have the software - the lazy solution), or simple open
>>> ones like GeoRSS-Simple (because a normal person with a normal
>>> schedule can actually understand it).
>>
>> That is another one that should go under the hood of OSGeo, if it is to
>> become of any relevance for example to INSPIRE. If they want to use it,
>> it has to become an ISO standard, else you can't stuff it in a law.
>
> Wait, you can't use a spec that's not an ISO standard? I doubt it. Who
> cares whether you can mandate it or not. If a spec isn't going to get used
> unless it's mandated, maybe it's not such a compelling spec?
>
>> Stupid, ain't it?
>>
>>> WMS has been a pretty good success even though I'm sure that'll get
>>> some snickering from the peanut gallery due to its age - it is still a
>>
>> Ah - but a standard is good if it lasts, isn't it? Imagine having HTTP
>> change every half year. Wouldn't that be fun? There sure would be more
>> work to do for us. HTTP 1.1 is 176 pages[1].
>>
>>> common method especially for some older software we support - but when
>>> I look at the list of OGC standards
>>> (http://www.opengeospatial.org/standards), for the most part I see
>>> well-intentioned but effectively irrelevant standards. What happened?
>>
>> Quite a few standards never took off because they are crap. As simple as
>> that. It is somewhat similar to SourceForge. By just looking at 350,000
>> Open Source projects one might be awed. But how many really work?
>>
>>> Sarcasm? Maybe, but WMS 1.3.0 comes in at 84 pages, and is
>>> well-written/concise. Looking at just the GML description gives me a
>>> headache - http://www.opengeospatial.org/standards/gml. It proudly
>>> proclaims it's an ISO standard as well. So what? If it's barely used
>>
>> You are right, GML sucks bad. But it is not true that it is not used.
>> The German Cadastral and Surveying Authorities of the Länder adopted GML
>> as core for the new cadastral base map format. They could only do so
>> because it is an ISO de-jure standard and can only thus become part of a
>> law. Technically it is a huge PITA. Just like cadastral base maps are.
>> But it also makes sure that folks now don't fall off the plate when they
>> cross their city, county, or state boundaries. This is
>> an achievement that neither proprietary vendors nor foss hackers managed
>> in the whole cadastral IT history. The use case is just a bit different
>> than locating the next pizza palace. The cadastre maps ownership - the
>> basis of our whole economy (be it broken or not, this is what we live
>> in, on and off).
>>
>>> it's barely a standard. Maybe my corner of the world is just strange
>>> and elsewhere it's a candyland of people happily plucking geodata from
>>> OGC-standardized data services while riding unicorns, but I don't
>>> think so. I think we're pretty typical.
>>>
>>> OGCification of standards like KML is even more hilarious since
>>> Google Earth is well below ESRI on my list of 'opendata-compliant
>>> software'. Sure lots of people 'support' KML but overwhelmingly they
>>> support some simplified subset of the ~250 page standards document.
>>
>> KML is more interesting from the governance perspective. And I am pretty
>> happy that KML is not owned by Google any more but by the OGC because
>> the OGC is a non-profit organization dedicated to making the world
>> interoperate. When KML came out everybody was full of praise for the
>> pragmatic way of doing things. Now that it is in the OGC it sucks again?
>> Funny.
>
>>
>> Google now has to ask a diverse bunch of spatial experts, geo
>> professionals and neo geographers if their changes to KML are worth
>> pursuing. And OGC makes sure that even you have a say in the public
>> comment period. Not bad, huh? But it gets even cooler. You could be one
>> of those diverse spatial experts, geo professionals and neo geographers
>> and join the process right from the start! If you think it sucks, then
>> you can say so right away. Why wait until many people have invested lots
>> of time and written large incomprehensible documents? You are wasting
>> other people's time by complaining *afterwards*.
>
> Joining the OGC process is time-consuming and represents an opportunity
> cost. The time required to understand and leverage the process to
> advantage is not inconsequential.
>
> 	Allan
>
>
>
>
>



