This issue of XML Daily Newslink is sponsored by:
Microsoft Corporation http://www.microsoft.com
- Updated IETF Internet Draft: Using OAuth for Recursive Delegation
- RFC for OGC Sensor Planning Service Implementation Standard Version 2.0
- SANY: An Open Service Architecture for Sensor Networks
- Report from W3C Advisory Committee: Update on HTML 5 Document License
- Nuxeo EP ECM Platform and Nuxeo DM 5.3.1 Support Default CMIS Connector
- Three Layer Model for XML with Schematron: Analytical/Practical/Pragmatic
- Vendors Clash Over Lack of Standards at VoiceCon
Updated IETF Internet Draft: Using OAuth for Recursive Delegation
Bart Vrancken and Zachary Zeltsan (eds), IETF Internet Draft
Members of the IETF Open Authentication Protocol (OAuth) Working Group have released a revision of the specification Using OAuth for Recursive Delegation, updating the previous Internet Draft published on September 05, 2009. The document describes a use case for delegating authorization by a Resource Owner to another user via a Client using the OAuth protocol. OAuth allows Clients to access server resources on behalf of another party (such as a different Client or an end-user). This document describes the use of OAuth for delegating one Client's authorization to another Client, a scenario also known as four-legged authorization.
Details: "The need for documenting a use case for the OAuth multi-layered authorization was discussed on the OAuth mailing list and at the bar BoF meeting at IETF 75. This Internet Draft describes such a use case... The OAuth protocol provides a method for servers to allow third-party access to protected resources without forcing their end-users to reveal their authentication credentials. This method can be employed to support organizing and sharing information among the end-users.
For example, a Web user (Resource Owner) can grant data access to a pre-defined set of users. This can be done with the use of a special OAuth Client — content manager — which serves as a proxy between the end-users and the Web servers that host the resources related to the project. The content manager allows a user (the owner of the resources) to specify a set of the resources related to a project (e.g., by tagging) and a set of the users and their access rights with respect to the resources. The content manager may also enable searching of the related materials... OAuth uses a set of token credentials to represent the authorization granted to the Client by the Resource Owner. Typically, token credentials are issued by the Server at the Resource Owner's request, after authenticating the Resource Owner's identity using its Server credentials (usually a username and password pair)...
The specification The OAuth Protocol: Web Delegation defines a method for provisioning the token credentials with the use of HTTP redirection and the Resource Owner's user agent. This document describes how the method can be expanded to allow a Client to share a resource with another Client after obtaining the Resource Owner's authorization to do so..."
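The delegation scenario described above rides on ordinary signed OAuth requests. The sketch below shows the OAuth 1.0a HMAC-SHA1 signing step a Client would perform when asking the Server to extend its grant; the signing procedure itself follows the OAuth specification, but the endpoint URL and the `oauth_delegatee` parameter are hypothetical illustrations, not taken from the draft:

```python
import base64
import hashlib
import hmac
import urllib.parse

def pct(s: str) -> str:
    # OAuth percent-encoding: escape everything except unreserved characters
    return urllib.parse.quote(s, safe="~")

def oauth_hmac_sha1_signature(method, url, params, consumer_secret, token_secret=""):
    """Compute an OAuth 1.0a HMAC-SHA1 signature over a request."""
    normalized = "&".join(f"{pct(k)}={pct(v)}" for k, v in sorted(params.items()))
    base_string = "&".join([method.upper(), pct(url), pct(normalized)])
    key = f"{pct(consumer_secret)}&{pct(token_secret)}".encode()
    digest = hmac.new(key, base_string.encode(), hashlib.sha1).digest()
    return base64.b64encode(digest).decode()

# Hypothetical request by which Client A asks to delegate its grant to
# Client B; the 'oauth_delegatee' parameter is illustrative only.
params = {
    "oauth_consumer_key": "client-a",
    "oauth_token": "token-from-resource-owner",
    "oauth_delegatee": "client-b",
    "oauth_signature_method": "HMAC-SHA1",
    "oauth_timestamp": "1270000000",
    "oauth_nonce": "abc123",
}
sig = oauth_hmac_sha1_signature("POST", "https://server.example/delegate",
                                params, "consumer-secret", "token-secret")
```

The signature is sent along with the other `oauth_` parameters; the Server verifies it against the Client's credentials before issuing token credentials for the delegatee.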
RFC for OGC Sensor Planning Service Implementation Standard Version 2.0
Ingo Simonis and Johannes Echterhoff (eds), OGC Implementation Specification RFC
Members of the Open Geospatial Consortium have published a draft version of the OGC Sensor Planning Service Implementation Standard Version 2.0 with request for comment through April 14, 2010. Contributing organizations include International Geospatial Services Institute GmbH (iGSI), Spot Image S.A., and SeiCorp, Inc. The OpenGIS Abstract Specification does not require changes to accommodate the technical contents of this document. Normative Annex B supplies the XML Schema Documents. Informative Annex F provides example XML documents: the editors use a simplified web cam to illustrate the various requests and responses, where the first request shows the full SOAP envelope to illustrate its usage. Normative Annex D (GML dictionary of SPS event codes) contains the XML-based GML dictionary with the definition of codes defined in the EventCode type. The proposed OGC SPS 2.0 standard and information on submitting comments on this document are available online, including Word/PDF format prose specification (119 pages) and ZIP archive with thirteen schema files.
This draft OpenGIS Implementation Standard (OGC 09-000) specifies interfaces for tasking a sensor. The Sensor Planning Service is part of the OGC Sensor Web Enablement document suite. The standard is designed to support queries that have the following purposes: to determine the feasibility of a sensor tasking request; to submit such a request; to inquire about the status of such a request; to update or cancel such a request; and to request information about other OGC Web services that provide access to the data collected by the requested task... Model elements are specified in platform-neutral fashion first, using tables that serve as data dictionaries for the UML model. Platform-specific encodings of these model elements are provided in separate clauses or documents. The XML Schema encoding has been automatically generated using the rules defined in clause 24 of OGC 09-001. The document specifies platform-specific encodings appropriate for a SOAP/WSDL operation binding. However, the model as well as its XML Schema encoding (and other data) can be used by other bindings as well, like REST(ful) or POX (Plain Old XML) over HTTP (using XML or KVP encoding).
Details: "The Sensor Planning Service (SPS) is intended to provide a standard interface to collection assets (i.e., sensors, and other information gathering assets) and to the support systems that surround them. Not only will different kinds of assets with differing capabilities be supported, but also different kinds of request processing systems... The SPS is designed to be flexible enough to handle such a wide variety of configurations. It is applicable to all use cases in which one or more sensors or sensor systems can or need to be parameterized in order to influence the measurement process and therefore the information gathered by these assets or systems. This specification provides an abstract overview of the SPS before describing the information model for operation requests and responses in a platform-neutral manner and subsequently applying this model to a specific binding (SOAP in this case)...
The interfaces provide functionality to: (1) retrieve metadata about the service—useful for determining service capabilities; (2) describe the parameterization options available for a sensor; (3) check if the service is capable of performing a planned task; (4) reserve the resources needed to perform a planned task for a certain amount of time—useful for handling combined tasking of multiple sensors; (5) instruct the service to execute a task for a sensor; (6) retrieve the status of a task; (7) update a task—for example to change certain parameters; (8) retrieve information about access to the data collected by a sensor—also on a per-task basis; (9) cancel a task—useful to free resources so that they can be put to better use by others. The document leverages functionality defined by other standards, which enables [i] provisioning and management of sensor descriptions, and [ii] publication of and subscription for information on events recognized by the service—for example to automatically notify clients of new information on their task...
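The SOAP binding described above wraps each SPS request in a SOAP envelope, as Annex F's web-cam examples illustrate. A rough sketch of building such a request with the standard library follows; the SOAP 1.2 namespace is real, but the SPS namespace URI and the exact element names here are illustrative assumptions, not copied from the OGC schemas:

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://www.w3.org/2003/05/soap-envelope"
# Hypothetical SPS 2.0 namespace, for illustration only
SPS_NS = "http://www.opengis.net/sps/2.0"

def build_get_status_request(task_id: str) -> str:
    """Wrap a (simplified) SPS GetStatus request in a SOAP 1.2 envelope."""
    ET.register_namespace("soap", SOAP_NS)
    ET.register_namespace("sps", SPS_NS)
    envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
    request = ET.SubElement(body, f"{{{SPS_NS}}}GetStatus",
                            {"service": "SPS", "version": "2.0.0"})
    ET.SubElement(request, f"{{{SPS_NS}}}task").text = task_id
    return ET.tostring(envelope, encoding="unicode")

xml_request = build_get_status_request("urn:example:task:webcam-01")
```

The same request body could equally be sent as POX over plain HTTP POST, or expressed as KVP parameters in a GET URL, since the information model is binding-neutral.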
See also: the complete ZIP package for the RFC
SANY: An Open Service Architecture for Sensor Networks
M. Klopfer and I. Simonis (eds), Sensors Anywhere (SANY) Consortium
"SANY (Sensors Anywhere) embraces trends and approaches identified by ORCHESTRA, many of which have by now developed into reality. As a major Integrated Project in the Sixth Framework Programme of the European Commission, SANY extends the work of ORCHESTRA into the domain of sensor networks and standards-based sensor web enablement. The Open Geospatial Consortium has recommended the publication SANY: An Open Service Architecture for Sensor Networks as an excellent introduction to OGC's Sensor Web Enablement (SWE) standards, which enable developers to make all types of sensors, transducers and sensor data repositories discoverable, accessible and useable via the Web.
Book excerpts: "The Sensor Web started as a conceptual design study several years ago. Today, though far from being complete, it is instantiated. Hundreds of sensors and other components already contribute to the Sensor Web and the number is continuously growing... Sensors and sensor networks are connected and accessible via the World Wide Web. Access to sensor information and observations will be achieved through standardized Web service interfaces. Sensors are self-describing to both humans and software alike, using standard (non-proprietary) encodings. Thus, these sensors and ultimately their data will be discoverable. Much like search engines are capable of finding content in web pages across the globe, the Sensor Web provides components to search for specific sensors and sensor data—of the past, present and future.
Through standardized Web service interfaces, sensors, simulations, and models will be capable of being configured and tasked dynamically. Software will be able to geolocate and process observations from newly discovered sensors without a priori knowledge of the sensor system that generated the observations. New and higher-level information will be generated on-the-fly based upon the vast source of sensor data now available. All this information will be distributed and alerts be raised when events of interest are detected, enabling the initiation of responsive action, even automatically. Sensors will be able to act on their own (i.e., autonomous), even in concert, based upon the rich offer of information about their environment...
As all components of the Sensor Web (such as sensors, access interfaces, data stores etc.) are operated and maintained by different organizations, it is a set of common agreements that bootstraps the Sensor Web. Standardisation organizations coordinate the process of finding common ground and mutual agreements among experts and sensor operators from various domains. The goal is to develop a framework of standards generic enough to support a wide field of applications while remaining specific enough to ensure interoperability among all participating components. The Sensor Web builds on the World Wide Web and uses a wide variety of standards recommended by the W3C, such as XML, XML Schema or SOAP for data encodings and interface specifications. Using the Web as its foundation layer, the Sensor Web makes use of Web technologies and supports the integration of communication infrastructures taking place on lower levels of the communication stack, often using standards developed by the IEEE or the Internet Engineering Task Force (IETF)..."
See also: the OGC announcement
Report from W3C Advisory Committee: Update on HTML 5 Document License
Ian Jacobs, W3C Blog
"Today at the W3C Advisory Committee meeting, we discussed the document license for HTML 5. We discussed use cases from the HTML Working Group that call for a more open license than the current W3C Document License. The result of discussion among the Membership is that there is strong support for: (1) a license that allows the reuse of excerpts in software, software documentation, test suites, and other scenarios; (2) a license (or licenses) that are familiar to the open source community; (3) processes that encourage innovation and experimentation about Web technology, so that work can be easily brought to W3C for standardization; (4) making the HTML Working Group a forum that is conducive to participation by the community at large; (5) ensuring that the HTML 5 specification remains valuable to the entire Web community — see the update from Philippe Le Hégaret on HTML presented to the Membership.
In short, there is strong support in the Membership (but not unanimity) for all of the use cases cited by the HTML Working Group except forking the specification. Several W3C Members do feel strongly that the document license should allow forking, however. People at the meeting agreed that, in any case, copyright is not likely to prevent fragmentation... people do not expect copyright to be instrumental to the successful deployment of HTML 5. Rather, quality and market relevance will determine whether the W3C specification is successful. Innovation and experimentation are valued at W3C. Jeff Jaffe, W3C's new CEO, has already blogged about the fact that W3C should encourage participation from more developers as they are significant drivers of innovation..."
See also: the author's presentation slides
Nuxeo EP ECM Platform and Nuxeo DM 5.3.1 Support Default CMIS Connector
Stefane Fermigier, Software Announcement
Software developers at Nuxeo have announced the release of ECM platform software, Nuxeo EP, and the collaborative document management application built with the platform, Nuxeo DM. This is the first release to include CMIS support directly; earlier support was provided by an add-on. Nuxeo DM now comes with the CMIS connector by default.
This new release mainly brings improvements and bug fixes to the software. We have improved existing components and services, but very few new APIs have been added. This new version is fully backward compatible, so upgrading is painless and requires no data migration or code changes. Major improvements, in addition to the default CMIS connector: (1) OpenSocial support has been improved and upgraded with OpenSocial 0.9 and OAuth support, together with more compatibility testing; (2) The default user interface is more configurable thanks to generalized usage of Layouts and Nuxeo Studio; (3) The storage engine (VCS) has been improved and optimized for higher data volumes.
Nuxeo has been working on the specification effort for the CMIS (Content Management Interoperability Services) standard, on a Java library under the auspices of the Apache Software Foundation (Project Chemistry), and on an implementation of CMIS on top of a Nuxeo content application.
See also: CMIS references
Three Layer Model for XML with Schematron: Analytical/Practical/Pragmatic
Rick Jelliffe, O'Reilly Technical
Scenario: a large XML implementation, with people worried that they had to do their data modeling using XSD schemas when they thought it might be better to use, err, a modeling language... "Where does Schematron fit in? So I made up this little diagram... The big difference between the model in this diagram and conventional ways of thinking about schemas is the role of the non-Schematron schema languages: they are limited to providing pragmatic information, and are certainly not used as data models. I often see that people want to use the schema in an analytical position: it does not have to be there, and may not be a good fit there...
First, in the analytical layer we create a glossary which lists and defines all the objects the system has. We may use a UML diagram for this, connected to Use Cases through a Traceability diagram. Then the business requirements list and define all the relationships between the different objects.
Next, we have the practical layer, where we have XML instances that implement the objects, and a Schematron schema that implements the business requirements rules.
Finally, we have, if needed, the pragmatic layer. This takes care of any stray issues that relate to how an XML document is transmitted or stored or displayed. For example, we might want to store some of the data in a DBMS, so we would like to constrain the field lengths of certain information items. These lengths have nothing to do with any specific business requirement, and might only impact some systems. They are merely constraints necessary to fit in with some particular extrinsic technology: XML file serialization, DOM object creation, relational data mapping, and so on. The grammar-based schema languages such as RELAX NG and XSD fit in here."
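A pragmatic-layer rule of exactly the kind just described, a field-length constraint imposed by a database column, might look like the Schematron fragment embedded below (the element names are hypothetical). Since Python's standard library has no Schematron processor, the small checker simply emulates that single assert so the example is self-contained; a real pipeline would run the schema through an actual Schematron implementation, such as the XSLT reference implementation:

```python
import xml.etree.ElementTree as ET

# The constraint expressed as a Schematron rule (illustrative names):
SCHEMATRON_RULE = """\
<sch:schema xmlns:sch="http://purl.oclc.org/dsdl/schematron">
  <sch:pattern>
    <sch:rule context="order/customer-id">
      <sch:assert test="string-length(.) &lt;= 20">
        customer-id must fit the 20-character database column
      </sch:assert>
    </sch:rule>
  </sch:pattern>
</sch:schema>
"""

def over_length_customer_ids(doc_xml: str, max_len: int = 20) -> list:
    """Emulate the single assert above: return offending customer-id values."""
    root = ET.fromstring(doc_xml)
    return [el.text for el in root.iter("customer-id")
            if el.text is not None and len(el.text) > max_len]

ok = over_length_customer_ids(
    "<orders><order><customer-id>ACME-001</customer-id></order></orders>")
bad = over_length_customer_ids(
    "<orders><order><customer-id>"
    "THIS-IDENTIFIER-IS-MUCH-TOO-LONG-FOR-THE-COLUMN"
    "</customer-id></order></orders>")
```

Note how the rule lives apart from any grammar: the document can still validate against its RELAX NG or XSD schema while this pragmatic constraint is switched on or off per deployment target.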
See also: the XML Three Layer Model diagram
Vendors Clash Over Lack of Standards at VoiceCon
Mike Fratto, InformationWeek
"The lack of interoperability among Unified Communications systems has sparked discussion at the conference's session on next-gen architectures. The salient message that came out of Next Generation Communications Architectures general session at VoiceCon 2010 on Monday came from Jack Jachner, vice president of cloud communications for Alcatel-Lucent, who said: 'You should choose your communications system from a communications company.' Jachner wants that communications company to be Alcatel-Lucent, but that sentiment can be applied to nearly every other vendor represented on the panel, because there is little interoperability between communications systems.
Phil Edholm, vice president for strategy and innovation at Avaya, was a bit bleaker, saying that unified communications today is like two cans and a string. He sees the value of UC developing in 3-5 years, with providers offering advanced services like conference moderation, intelligent device detection, and better, easier collaboration between end-users. Essentially, this means adding services and options that make on-line collaboration as easy as picking up a phone.
UC (Unified Communications) is very much in its infancy, where a vendor or an integrator like Cisco, HP, IBM or Microsoft has to certify different vendor products like VoIP phones, video servers and displays, and UC software to ensure that the products can interoperate, and figure out its own best practices for deploying the systems live..."
XML Daily Newslink: http://xml.coverpages.org/newsletter.html
Newsletter Archive: http://xml.coverpages.org/newsletterArchive.html
Newsletter subscribe: firstname.lastname@example.org
Newsletter unsubscribe: email@example.com
Newsletter help: firstname.lastname@example.org
Cover Pages: http://xml.coverpages.org/