XML Daily Newslink. Monday, 23 November 2009

A Cover Pages Publication http://xml.coverpages.org/
Provided by OASIS and Sponsor Members
Edited by Robin Cover


This issue of XML Daily Newslink is sponsored by:
Microsoft Corporation http://www.microsoft.com



Time Short to Agree on Smart-Grid Standards
Martin LaMonica, CNET News.com

"The first crack at vital smart-grid technical standards are due next year and some companies are already gumming up the works by pushing their own networking technology, according to the government official shepherding the process.

The need to hammer out interoperability standards is urgent and the task is extremely complex, according to George Arnold, the national coordinator for smart-grid interoperability at NIST... smart-grid standards need to be agreed on quickly, with the next phase of a multiyear process due to begin next year. Technical interoperability through standards is supposed to safeguard various players, including consumers and utilities, against technical obsolescence and wasted investment. About $8.1 billion of federal, state, and industry money will be spent on upgrading the electricity grid in the next three years...

NIST is looking to Internet standards as a model for how the process should operate. Last week, at an event called Grid-Interop, a governing panel was created specifically to focus on interoperability: 'Over time this organization (called the Smart Grid Interoperability Panel) is going to become something like the Internet architecture board. It's not being set up to develop standards. It's really being set up to develop the overall architecture and select which standards should be used'..."

See also: the NIST Smart Grid Interoperability Panel announcement


ZigBee Alliance and DLMS Collaborate on Metering Data Compatibility
Staff, DDJ News

"The ZigBee Alliance, a global ecosystem of companies creating wireless solutions for use in energy management, commercial, and consumer applications, announced a liaison agreement with the Device Language Message Specification (DLMS) User Association to collaborate on metering data compatibility. The two groups will define a method to transport IEC standard DLMS/COSEM messages carrying metering data through ZigBee Smart Energy networks. The result will expand ZigBee Smart Energy to support complex metering applications, and will provide utilities and energy service providers with a standard for a variety of Smart Grid energy management and efficiency programs and services...

ZigBee Smart Energy improves energy efficiency by allowing consumers to choose interoperable products from different manufacturers, giving them the tools to manage their energy consumption more precisely using automation and near real-time information. It also helps utility companies implement new advanced metering and demand response programs to drive greater energy management and efficiency, while responding to changing government requirements.

The DLMS User Association maintains a liaison with IEC TC 13 WG 14, the standardization body responsible for establishing standards for electricity meter data exchange. The User Association provides registration and maintenance services for the IEC 62056 DLMS/COSEM standards suite and performs pre-standardization work. In addition, the DLMS UA operates the conformance testing scheme. The DLMS/COSEM standard suite (IEC 62056 / EN 13757-1) is the most widely accepted international standard for utility meter data exchange..."

See also: ZigBee Smart Energy


Managed Energy, Collaborative Energy, and Autonomous Load
Toby Considine, AutomatedBuildings.com

"There are two fundamentally different approaches to Energy Interoperation: managed energy and collaborative energy. Under managed energy, the energy provider directly manages the devices and systems in the end nodes using direct control signals. Utilities designed early smart grid deployments to communicate with the smallest, cheapest systems, ones that can fit easily into appliances and home thermostats. Social equity concerns, i.e., mandates that low income consumers have access to the benefits of smart energy, dictated that these devices could not materially affect the price appliances. Consumers want reliable systems; it is hard to convince them to pay more for systems that can be turned off by someone else...

Utilities often refer to this approach as the Residential option. Occasionally they refer to it as the ZigBee approach, because that trade association's technology is the primary one used to install these low-end systems. Others may call it the OpenHAN (Home Area Network) approach, although the information and interactions are indistinguishable from those of ZigBee. Managed energy is also used for some small commercial buildings. Collaborative energy relies on clear price signals and market interactions to engage the occupants of the end nodes in active participation in energy. Today, early forms of collaborative energy are in operation, the result of custom engineering and proprietary signals. In a few places, these buildings receive signals from OpenADR (automated demand response). OpenADR is a hybrid signal, halfway between managed and collaborative energy...

The key standards committees of collaborative energy are making loose, light, and market-centric interfaces. EMIX (Energy Market Information Exchange) communicates price and product descriptions—and prices and products vary over time. Energy Interoperation, the successor to OpenADR, defines the interactions between the grid and end nodes, and includes distributed energy resources (DER) (site-based storage and generation) as well as demand response. WS-Calendar will define the time and interval aspects of the above standards. We plan to incorporate WS-Calendar into oBIX as well..."

Update on WS-Calendar: "OASIS Web Services Calendar (WS-Calendar) TC to Create Common Scheduling Standard."

See also: the OASIS Energy Market Information Exchange (eMIX) TC


Update on Government Adoption of OpenID and Certification Initiative
Don Thibeau, OpenID List Posting

An OpenID list posting by Don Thibeau (OpenID Foundation Executive Director) reports on recent developments with the OpenID certification initiative: "Since March 2009, the OpenID and the Information Card Foundations have collaborated on responding to US government identity standards adoption and certification requirements... As a result of following the government's Identity Scheme Adoption Process (ISAP) and Trust Framework Provider Adoption Process (TFPAP), the OpenID and Information Card profiles have been completed under the ISAP process...

Two weeks ago at the OpenID Summit and again at the Internet Identity Workshop (IIW), we asked the community at large to help design our approach, challenge our assumptions and focus our vision. Immediately after IIW, the Boards of Directors of the OpenID Foundation and the Information Card Foundation agreed to form a joint steering committee (JSC) to refine strategic goals, investigate operational alternatives, and guide deployment planning for what we have called the Open Identity Framework or OIF...

On behalf of the JSC, a request for information was sent today to Kantara, OASIS, Protiviti, InCommon, Global Inventures, and FuGen, with an information copy to VeriSign. This request for information has three objectives: to solicit informed collaboration, to identify a short list of potential partners, and to continue to evolve our thinking... The JSC selection criteria are likely to focus on cost efficiencies, execution synergies, and compatible business models...

Our next public exposure of these concepts is at the National Institutes of Health (NIH) forum on 'Identity and Trust: Enabling Collaboration in a Connected World' on December 10, 2009. The purpose of this forum is to educate the NIH and government communities about federal-wide efforts to use identity management to enable collaboration in new ways. We plan to make the case that open identity standards such as OpenID and Information Card will allow users, both within the government and in academia and the research community, to use a single set of credentials to access a variety of electronic resources at NIH and beyond..."

See also: Nico Popp on trust assurance


OpenID v.Next Goals
Mike Jones, Blog 'Musings on Digital Identity'

"The OpenID v.Next session at IIW run by David Recordon and Dick Hardt reached some important conclusions about the future of OpenID. The motivation for the v.Next discussion was the sense that we've learned enough since the OpenID 2.0 specification was finalized that it's time to revise the specification to incorporate what we've learned. This session attempted to reach a consensus on the priorities for the next version of OpenID, with a large number of the important players participating. I haven't seen the decisions made published elsewhere, so I'm recording them here.

David organized the session around a stated goal of producing an evolved OpenID specification within the next six months. The consensus goals reached were as follows [features listing]. The numbers represent the number of participants who said that they would work on that feature in the next six months. There was also an explicit consensus in the room that OpenID v.Next would not be backwards compatible with OpenID 2.0.

Features targeted in OpenID v.Next: (1) Integrating the UX extension (in which the user interacts with the OP in a pop-up window) into the core specification: 12; (2) Evolving the discovery specification for OpenID, including adding OpenIDs using e-mail address syntax: 10; (3) Integrating attributes (claims) into the core specification: 9; (4) Integrating the OAuth Hybrid specification into the core specification: 8; (5) Supporting an optional active client (identity selector) and non-browser applications: 8; (6) Improving security, including investigating enabling use at levels of assurance above NIST level 1: 8; (7) Better support for mobile devices: 8; (8) Addressing the problem of long URLs (where browsers limit URL length to 2048 or sometimes 256 characters): 6..."

See also: the OpenID.net web site


Link Relations for Simple Version Navigation
Al Brown, Geoffrey Clemm, Julian F. Reschke (eds), IETF Internet Draft

Final feedback is now requested by the editors of Link Relations for Simple Version Navigation, presented in version -03 of the IETF Internet Draft: "At this point we would like to ask the community for final feedback; we are planning to request publication in two weeks from now, on December 04, 2009." Recent changes compared to the previous draft: the editors changed the terminology so that it can be used for CMIS/AtomPub, WebDAV, and JCR (Java Content Repository), and added an informative appendix that shows how the link relation could be used inside the HTTP 'Link' header in the context of WebDAV.

This specification "defines five link relations that may be used on a resource that exists in a system that supports versioning to navigate among the different resources available, such as past versions. (1) The 'version-history' link relation, when included on a versioned resource, is a link pointing to a resource containing the version history for this resource. (2) The 'latest-version' link relation, when included on a versioned resource, points to a resource containing the latest (e.g., current) version. The latest version is defined by the system. For linear versioning systems, this is probably the latest version by timestamp. For systems that support branching, there will be multiple latest versions, one for each branch in the version history. (3) The 'working-copy' link relation, when included on a versioned resource, is a link pointing to a working copy for this resource, where some systems may allow multiple of these link relations.

(4) The 'predecessor-version' link relation, when included on a versioned resource, links to a resource containing the predecessor version in the version history; some systems may allow multiple of these link relations in the case of multiple branches merging. (5) The 'successor-version' link relation, when included on a versioned resource, points to a resource containing the successor version in the version history."
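As a rough illustration (not taken from the draft), a client that receives these relations in an HTTP 'Link' header on a versioned resource might navigate the version history along the following lines. The URLs and the minimal header parsing are hypothetical assumptions for the sketch.

    // Sketch only: extract the link relations named above from an HTTP
    // 'Link' header. URLs and the parsing approach are illustrative.
    function linksByRel(linkHeader: string): Map<string, string> {
      const rels = new Map<string, string>();
      // Each entry looks like: <http://example.org/v/3>; rel="latest-version"
      for (const part of linkHeader.split(",")) {
        const m = part.match(/<([^>]+)>\s*;\s*rel="([^"]+)"/);
        if (m) rels.set(m[2], m[1]);
      }
      return rels;
    }

    // Hypothetical 'Link' header returned for a versioned resource:
    const header =
      '<http://example.org/history/42>; rel="version-history", ' +
      '<http://example.org/v/42.3>; rel="latest-version", ' +
      '<http://example.org/v/42.2>; rel="predecessor-version"';

    const links = linksByRel(header);
    console.log(links.get("latest-version"));      // http://example.org/v/42.3
    console.log(links.get("predecessor-version")); // http://example.org/v/42.2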

For background, see Atom Link Relations registry (also in XML and plain text formats), and compare Link Relations for Addressing.

See also: CMIS references


W3C Issues Last Call Review for XMLHttpRequest Specification
Anne van Kesteren (ed), W3C Technical Report

Members of the W3C Web Applications Working Group have published a Last Call Working Draft for the XMLHttpRequest specification. Public comments are invited through December 16, 2009. The 'XMLHttpRequest' specification defines an API that provides scripted client functionality for transferring data between a client and a server.

"The name of the object is XMLHttpRequest for compatibility with the Web, though each component of this name is potentially misleading. First, the object supports any text based format, including XML. Second, it can be used to make requests over both HTTP and HTTPS (some implementations support protocols in addition to HTTP and HTTPS, but that functionality is not covered by this specification). Finally, it supports 'requests' in a broad sense of the term as it pertains to HTTP; namely all activity involved with HTTP requests or responses for the defined HTTP methods...

See also: the XMLHttpRequest specification


How Fuzzy Should a Date Be? Extended Date Time Format (EDTF)
Rick Jelliffe, O'Reilly Technical

"Bruce D'Arcus' has provided a pointer on a U.S. Library of Congress (LOC) initiative for a better date format: Extended Date Time Format (EDTF). W3C XML Schemas 1.0 datatypes does use a subset of ISO8601, in fact multiple subsets: 'xs:date', 'xs:dateTime', 'xs:gDay', and so on. They are not derived from one another because their value spaces are different... The XSD 1.1 CR specification has a very useful section discussing date/time issues: it quite nicely analyzes things in terms of a seven-property model. However, the analytical model is not reflected in the declarative capabilities of XSD datatypes, and I think this is something that could be looked at.

It is all part of the issue of compound datatypes, which is what draft ISO DSDL Part 5 Datatype Library Language (DTLL) is supposed to address. The flaw with XSD Datatypes is not so much the type system (facets and type derivation by restriction seem innocuous) but the type generation system... What the EDTF people seem to be wanting is something even more than DTLL may offer: they seem to want some measure of fuzziness or wildcarding, for example 'some year between 2000 and 2099' [20??], year and month questionable [2004-06?], or year and month approximate [2004-06~]..."

From the EDTF web site: "There is no standard date/time format that meets the needs of various well-known XML metadata schemas, for example MODS, METS, PREMIS, etc. For several years there have been various discussions about developing a reasonably comprehensive date/time definition for the bibliographic community, and submitting it either for standardization or some other mode of formalization, such as a W3C Note, a NISO Profile, and/or an amendment to ISO 8601... Many dates are coded in database records without hyphens (conformant with ISO 8601). When extracting a date from a database record to insert into an XML record, some implementors feel it is an unnecessary burden to have to insert hyphens. Times are often encoded without colons. Year/month (without the day) needs to be represented, as do date ranges..."
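Purely to illustrate the fuzziness markers quoted above ('20??', '2004-06?', '2004-06~'), here is a small sketch; the classification function and its labels are illustrative assumptions, not taken from the EDTF drafts.

    // Sketch only: recognize the EDTF-style markers mentioned in the article.
    // The labels and logic are illustrative, not taken from the EDTF drafts.
    function describeFuzzyDate(value: string): string {
      if (value.endsWith("~")) return "approximate, e.g. '2004-06~'";
      if (/\?\?/.test(value)) return "unspecified digits, e.g. '20??'";
      if (value.endsWith("?")) return "questionable, e.g. '2004-06?'";
      return "plain ISO 8601-style date";
    }

    console.log(describeFuzzyDate("20??"));     // unspecified digits
    console.log(describeFuzzyDate("2004-06?")); // questionable
    console.log(describeFuzzyDate("2004-06~")); // approximate
    console.log(describeFuzzyDate("2004-06"));  // plain ISO 8601-style date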

See also: the Library of Congress EDTF web site


Using POST to Add Members to WebDAV Collections
Julian F. Reschke (ed), IETF Internet Draft

An updated Internet Draft has been published for Using POST to Add Members to Web Distributed Authoring and Versioning (WebDAV) Collections. Open issue: whether the specification should add support for creating collections as well (CalDAV and/or CardDAV clients creating calendars or address books might prefer to leave specification of the resource name to the server).

"The Hypertext Transfer Protocol (HTTP) Extensions for the Web Distributed Authoring and Versioning (WebDAV) do not define the behavior for the 'POST' method when applied to collections, as the base specification (HTTP) leaves implementers lots of freedom for the semantics of 'POST'. This has led to a situation where many WebDAV servers do not implement POST for collections at all, although it is well suited to be used for the purpose of adding new members to a collection, where the server remains in control of the newly assigned URL. As a matter of fact, the Atom Publishing Protocol (AtomPub) uses POST exactly for that purpose. On the other hand, WebDAV-based protocols such as the Calendar Extensions to WebDAV (CalDAV) frequently require clients to pick a unique URL, although the server could easily perform that task...

This specification defines a discovery mechanism through which servers can advertise support for POST requests with the aforementioned 'add collection member' semantics. [Currently] the specification deliberately only addresses the use case of creating new non-collection resources; it was not a goal to supply the same functionality for creating collection resources (MKCOL), or for other operations that require the client to specify a new URL (LOCK, MOVE, or COPY)..."
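The basic exchange is easy to picture in client code. The sketch below is only an assumption-laden illustration of the semantics described above (the collection URL, payload, and media type are hypothetical): the client POSTs a new member to a collection, and the server chooses the member's URL and reports it back in the Location header of a 201 Created response.

    // Sketch of the 'add member via POST' semantics described above.
    // The collection URL, payload, and media type are hypothetical.
    async function addMember(collectionUrl: string, body: string): Promise<string | null> {
      const response = await fetch(collectionUrl, {
        method: "POST",
        headers: { "Content-Type": "text/calendar" },
        body,
      });
      // On success the server assigns the new member's URL itself and
      // returns it in the Location header of a 201 Created response.
      return response.status === 201 ? response.headers.get("Location") : null;
    }

    // Hypothetical usage against a CalDAV-style calendar collection:
    // addMember("https://example.org/calendars/work/", icalendarText)
    //   .then((url) => console.log("created at", url));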

See also: Greenbytes WebDAV News


Microsoft Begins Paving Path for IT and Cloud Integration
John Fontana, InfoWorld

"Microsoft has launched its first serious effort to build IT into its cloud plans by introducing technologies that help connect existing corporate networks and cloud services to make them look like a single infrastructure... Microsoft's goal is to supply tools, middleware, and services so users can run applications that span corporate and cloud networks, especially those built with Microsoft's Azure cloud operating system...

Project Sydney creates a sort of virtual network that ties together pieces of an application or processes running in various places so they all look like one logical system. Microsoft showed a demo of Sydney as part of an internal auction application that incorporates a database running on-premises at Microsoft and a front end hosted in the cloud, where the capacity is supplied to handle the churn of auction bids before depositing the final result in the internal database. Another key piece introduced is the AppFabric, an application server layer that spans the cloud and internal servers so developers have a single, consistent environment for .Net applications. The AppFabric combines hosting and caching technologies formerly code-named Dublin and Velocity. Beta 1 for Windows Server 2008 R2 was released last week and a beta for Azure will come in 2010...

Microsoft said the platform would span operating systems (Windows and Azure), relational databases (SQL Server and SQL Azure), application services (AppFabric), programming models (.Net), and applications (including both internal and cloud versions of Exchange, SharePoint, and Dynamics)..."

See also: Windows Azure


Sponsors

XML Daily Newslink and Cover Pages sponsored by:

IBM Corporation          http://www.ibm.com
Microsoft Corporation    http://www.microsoft.com
Oracle Corporation       http://www.oracle.com
Primeton                 http://www.primeton.com
Sun Microsystems, Inc.   http://sun.com

XML Daily Newslink: http://xml.coverpages.org/newsletter.html
Newsletter Archive: http://xml.coverpages.org/newsletterArchive.html
Newsletter subscribe: newsletter-subscribe@xml.coverpages.org
Newsletter unsubscribe: newsletter-unsubscribe@xml.coverpages.org
Newsletter help: newsletter-help@xml.coverpages.org
Cover Pages: http://xml.coverpages.org/


