A Cover Pages Publication http://xml.coverpages.org/
Provided by OASIS and Sponsor Members
Edited by Robin Cover
This issue of XML Daily Newslink is sponsored by:
Microsoft Corporation http://www.microsoft.com
Headlines
- W3C First Public Working Draft: Web Services Fragment (WS-Fragment)
- FDA Releases Draft Guidance on SPL Standard for Content of Labeling
- Online Proceedings: Workshop on Semantic Web in Provenance Management
- Standards in Industry: The MPEG Open Access Application Format
- Windows Management Framework: PowerShell 2.0, WinRM 2.0, and BITS 4.0
- Salmon Protocol to Be Built on Top of Atom + Extensions
- Why IP is the Right Choice for Smart Grid
W3C First Public Working Draft: Web Services Fragment (WS-Fragment)
Doug Davis, Ashok Malhotra, Katy Warr, Wu Chou (eds); W3C Technical Report
A First Public Working Draft has been published for the Web Services Fragment (WS-Fragment) specification. The document was produced by members of the Web Services Resource Access Working Group (WG), which is part of the W3C Web Services Activity.
"This specification extends the WS-Transfer specification and defines a mechanism that allows clients to retrieve and manipulate subsets (parts or fragments) of a WS-Transfer enabled resource without needing to include the entire XML representation in a message exchange. The specification defines a fragment transfer mechanism, an extension framework for defining expression languages, and a set of expression languages. The fragment transfer mechanism is defined as an extension to WS-Transfer. This involves defining a WS-Transfer Dialect and corresponding XML elements that go into the SOAP Body of the 'Get' and 'Put' WS-Transfer operations. This fragment transfer mechanism is designed so that it can be used with any number of expression languages to indentify a subset of the resource the operation is to operate on. While other specifications can define other expression languages, it is recommended that those languages reuse the fragment transfer framework that this specification defines...
WS-Transfer defines the expected behavior of a resource when a modification would result in an invalid state or when the client lacks the authority to perform the operation. This specification only extends, and does not modify, the base WS-Transfer behavior..."
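As a rough illustration of the mechanism described above, the sketch below (Python, xml.etree.ElementTree) builds the shape of a WS-Transfer Get whose SOAP Body carries a fragment expression. The wsf namespace URI, the Fragment/Expression element names, the Dialect URI, and the XPath expression are illustrative assumptions, not normative syntax from the Working Draft.

    # Minimal sketch (not normative): a WS-Transfer Get whose SOAP Body selects a
    # fragment of the resource. Namespace URIs and element names are assumptions.
    import xml.etree.ElementTree as ET

    SOAP = "http://www.w3.org/2003/05/soap-envelope"
    WSF = "http://example.org/ws-fragment"  # placeholder; use the URI from the draft

    envelope = ET.Element(f"{{{SOAP}}}Envelope")
    ET.SubElement(envelope, f"{{{SOAP}}}Header")  # WS-Addressing headers (wsa:Action = Transfer Get) go here
    body = ET.SubElement(envelope, f"{{{SOAP}}}Body")

    # The fragment wrapper names a Dialect (an expression language) and carries
    # the expression that identifies the subset of the resource to retrieve.
    fragment = ET.SubElement(body, f"{{{WSF}}}Fragment",
                             {"Dialect": "http://example.org/dialects/xpath"})  # placeholder dialect URI
    ET.SubElement(fragment, f"{{{WSF}}}Expression").text = "/inventory/item[3]/price"

    print(ET.tostring(envelope, encoding="unicode"))

The same Body shape, with a replacement value alongside the expression, would apply to a fragment-scoped Put.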
See also: the Web Services Resource Access Working Group
FDA Releases Draft Guidance on SPL Standard for Content of Labeling
Staff, U.S. Food and Drug Administration Guidance Document
On October 27, 2009, the U.S. Department of Health and Human Services Food and Drug Administration (FDA), via the Center for Drug Evaluation and Research (CDER) and the Center for Biologics Evaluation and Research (CBER), published Revision 1 of "Guidance for Industry: SPL Standard for Content of Labeling, Technical Q&A." The document was prepared by the Office of Critical Path Programs in the Office of the Commissioner at the Food and Drug Administration.
"This guidance is intended to assist applicants who submit content of labeling to FDA as part of a marketing application using the Structured Product Labeling Standard (SPL) in Extensible Markup Language (XML). The guidance also provides information to FDA staff who review and manage product information using electronic systems. This is Revision 1 of a guidance of the same name that was issued in December 2005. The guidance has been revised to reflect changes in the technology since 2005 and to harmonize the submission of SPL in the Center for Biologics Evaluation and Research (CBER) and the Center for Drug Evaluation and Research (CDER). We anticipate that additional guidance will be provided as new questions arise about the use of SPL in different contexts..."
The Structured Product Labeling Standard is an HL7 Version 3 standard for electronic submissions in XML format. "Often known as 'product labels', 'package inserts', or 'prescribing information', these documents contain the authorized published information that accompanies any medicine licensed by a medicines licensing authority. The SPL specification is a document markup standard that specifies the structure and semantics of these documents and is generally analogous to the HL7 Clinical Document Architecture (CDA) although there are some fundamental differences between the two. This standard may be of interest to regulatory authorities or any organization that is required by law to submit a product information document because it is responsible for the creation or marketing of a product, or any other person or organization compelled by other motives to submit information about products, whether originally created or not..."
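For orientation, the sketch below emits a bare-bones skeleton of an SPL-style document in the HL7 v3 namespace using Python's ElementTree. It is not a conformant SPL instance: the identifier, product name, and section content are placeholders, and the required coded elements are defined by the HL7 SPL specification and the FDA guidance rather than by this sketch.

    # Illustrative skeleton only; not a conformant SPL instance.
    import xml.etree.ElementTree as ET

    HL7 = "urn:hl7-org:v3"  # HL7 v3 namespace used by SPL
    ET.register_namespace("", HL7)

    doc = ET.Element(f"{{{HL7}}}document")
    ET.SubElement(doc, f"{{{HL7}}}id", {"root": "00000000-0000-0000-0000-000000000000"})  # placeholder GUID
    ET.SubElement(doc, f"{{{HL7}}}title").text = "EXAMPLEDRUG (exampledrug) tablet"  # hypothetical product
    body = ET.SubElement(ET.SubElement(doc, f"{{{HL7}}}component"), f"{{{HL7}}}structuredBody")

    # Each piece of labeling content (indications, dosage, warnings, ...) becomes a section.
    section = ET.SubElement(ET.SubElement(body, f"{{{HL7}}}component"), f"{{{HL7}}}section")
    ET.SubElement(section, f"{{{HL7}}}title").text = "INDICATIONS AND USAGE"
    ET.SubElement(section, f"{{{HL7}}}text").text = "..."

    print(ET.tostring(doc, encoding="unicode"))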
See also: SPL references
Online Proceedings: Workshop on Semantic Web in Provenance Management
Paolo Missier, SWPM Workshop Announcement
On behalf of the organizers of the First International Workshop on the role of Semantic Web in Provenance Management (SWPM-2009), Paolo Missier announced the availability of online presentations and proceedings from the workshop, held October 25, 2009.
"The primary objective of this workshop was to explore the role of Semantic Web and its standards in addressing some of the critical challenges facing provenance management, namely: (1) Efficiently capturing and propagating provenance information as data is processed, fragmented and recombined across multiple applications and domains on a Web scale; (2) A common representation model for provenance, underpinned by a formal theory for use by both agents and humans; (3) Interoperability of provenance information generated in distributed environments such as the Web and myGrid; (4) Tools leveraging the Semantic Web for visualization of provenance information.
The term 'provenance' describes the lineage, or origins, of a data entity. Provenance metadata is required to correctly interpret the results of a process execution, to validate data processing tools, to verify the quality of data, and to associate measures of trust with the data. The growing eScience infrastructure is enabling scientists to generate scientific data on an industrial scale. Similarly, the Web 2.0 paradigm is enabling Web users to create focused applications that combine data from multiple sources, popularly referred to as 'mashups', on an extremely large scale. The importance of managing various forms of apparently ancillary metadata, in addition to the primary data products of eScience, Web, and business applications, is increasingly being recognized as critical for the correct interpretation of the data..."
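To make 'provenance metadata' concrete, the small sketch below records a lineage graph with the Python rdflib library. The example.org vocabulary terms (wasGeneratedBy, wasDerivedFrom, usedTool) are illustrative placeholders, not terms from any particular provenance model discussed at the workshop.

    # A tiny lineage graph: a result file generated by a workflow run and derived
    # from a source dataset. Vocabulary terms are illustrative placeholders.
    from rdflib import Graph, Namespace, Literal

    EX = Namespace("http://example.org/prov/")
    g = Graph()
    g.bind("ex", EX)

    g.add((EX.result_csv, EX.wasGeneratedBy, EX.workflow_run_42))
    g.add((EX.result_csv, EX.wasDerivedFrom, EX.source_dataset))
    g.add((EX.workflow_run_42, EX.usedTool, Literal("normalizer v1.2")))

    # Serializing as Turtle keeps the lineage readable by both agents and humans.
    print(g.serialize(format="turtle"))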
Related: W3C recently chartered new work in the Provenance Incubator Group to provide a state-of-the-art understanding and develop a roadmap in the area of provenance for Semantic Web technologies, development, and possible standardization.
See also: the W3C Provenance Incubator Group
Standards in Industry: The MPEG Open Access Application Format
F. Schreiner, K. Diepold, M. Abo El-Fotouh, T. Kim; IEEE MultiMedia
For many years, the creation and dissemination of freely distributable digital content has been growing remarkably. Ideas emerging from the Open Source Initiative and elsewhere have helped create the concept of open content, which is analogous to open source software. Open content is not limited to a specific content type. It can be any creative work that allows for the free distribution and modification of the work. The significance of open content can be observed in the success of social networks and content-sharing sites such as Wikimedia Commons...
The MPEG Open Access Application Format is a new standard from MPEG that standardizes the packaging of digital content and associated rights information into an application format to facilitate open access and interoperable exchange. The standard builds on MPEG-21 and MPEG-7 components for file format, package description, metadata, and rights information. This article describes the motivation and composition of the MPEG Open Access Application Format...
The MPEG-21 OAC profile is based on the requirements of the Open Access Application Format. The profile consists of a limited set of elements taken from the MPEG-21 REL and several extension elements that support publication of content with the Open Access Application Format. The profile includes support for adaptation, copying, and notifications about copyright and commercial use. The MPEG group created a separate profile for this set of rights expressions to ease the integration of the profile in other applications. One other application using the OAC profile is the MPEG-A standard Media Streaming Application Format. The main goal of the OAC profile [ISO/IEC 21000-5:2004/Amd.3:2008 "Rights Expression Language, Amendment 3: MPEG-21 REL Profiles — The OAC Profile"] is to support open licenses such as the ones from Creative Commons...
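The kinds of choices the OAC profile covers can be pictured with a simple data structure; the sketch below is a conceptual stand-in only, since the profile itself is expressed in MPEG-21 REL XML rather than in ad-hoc Python.

    # Conceptual stand-in for an OAC-style rights declaration; the actual profile
    # is expressed in MPEG-21 REL XML, not in this illustrative Python structure.
    from dataclasses import dataclass

    @dataclass
    class OpenAccessRights:
        license_uri: str        # e.g. a Creative Commons license the grant maps to
        allow_copying: bool
        allow_adaptation: bool
        commercial_use: bool
        attribution_notice: str

    rights = OpenAccessRights(
        license_uri="https://creativecommons.org/licenses/by-nc/3.0/",  # illustrative choice
        allow_copying=True,
        allow_adaptation=True,
        commercial_use=False,
        attribution_notice="Credit the original author when redistributing.",
    )
    print(rights)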
The integrity and authenticity of distributed files is critical. Consumers like to know that the content they receive is really the same as what the author published. The Open Access Application Format includes cryptographic signatures conformant to the World Wide Web Consortium XML Signature Syntax and Processing standard to ensure content authenticity and integrity. While the use of signatures is optional, an author might decide to sign the content and metadata and allow the consumer to verify the signature..."
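The sign-then-verify flow is easy to sketch. The example below uses the Python cryptography package over already-prepared bytes; a real implementation would produce a ds:Signature structure with XML canonicalization (C14N) as required by the W3C XML Signature standard, which this sketch does not attempt.

    # Sketch of the author-signs / consumer-verifies flow. Real XML Signature adds
    # canonicalization and the ds:Signature structure; this only shows the idea.
    from cryptography.hazmat.primitives.asymmetric import rsa, padding
    from cryptography.hazmat.primitives import hashes
    from cryptography.exceptions import InvalidSignature

    author_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    public_key = author_key.public_key()

    package_bytes = b"<package>content and metadata</package>"  # placeholder payload

    # Author side: sign the packaged content and metadata.
    signature = author_key.sign(package_bytes, padding.PKCS1v15(), hashes.SHA256())

    # Consumer side: verification fails if either the content or the signature changed.
    try:
        public_key.verify(signature, package_bytes, padding.PKCS1v15(), hashes.SHA256())
        print("package is authentic and unmodified")
    except InvalidSignature:
        print("package was tampered with")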
Windows Management Framework: PowerShell 2.0, WinRM 2.0, and BITS 4.0
Greg Duncan, Blog
Microsoft announced the release of the Windows Management Framework, which includes Windows PowerShell 2.0, WinRM 2.0, and BITS 4.0. WinRM 2.0 is the Microsoft implementation of WS-Management Protocol, a standard Simple Object Access Protocol (SOAP)-based, firewall-friendly protocol that allows hardware and operating systems from different vendors to interoperate. The WS-Management Protocol specification provides a common way for systems to access and exchange management information across an IT infrastructure...
BITS is a service that transfers files between a client and a server. BITS provides a simple way to reliably and politely transfer files over HTTP or HTTPS. File downloads and file uploads are supported. By default, BITS transfers files in the background, unlike other protocols that transfer files in the foreground. Background transfers use only idle network bandwidth in order to preserve the user's interactive experience with other network applications..."
Wikipedia: "WS-Management is an open standard defining a SOAP-based protocol for the management of servers, devices, applications and more. The specification is based on DMTF open standards and Internet standards for Web Services. WS-Management was originally developed, as many standards are, by a coalition of vendors. The coalition was started with AMD, Dell, Intel, Microsoft, Sun Microsystems and expanded to a total of 13 before being submitted to the DMTF in 2005. The DMTF has standarized a 1.0 version and is currently working on a v1.1. WS-Management provides a common way for systems to access and exchange management information across the IT infrastructure. The specification is quite rich, supporting much more than get/set of simple variables, and in that it is closer to WBEM or Netconf than to SNMP. A mapping of the DMTF-originated Common Information Model into WS-Management was also defined..."
See also: references for WS-Management
Salmon Protocol to Be Built on Top of Atom + Extensions
John Panzer, Posting to 'atom-syntax' List
"Salmon, a proposed open protocol for pushing commentary upstream, is using Atom as its default payload format. It's most likely going to use the following Atom extensions: (1) Atom Threading Extensions, per RFC 4685; (2) Atom Cross-Posting Extensions; (3) The Atom "deleted-entry" Element [tombstones]; (4) PubSubHubbub Core 0.2. Additionally, it will be compatible with and/or serve as transport for Atom Activity Extensions.
From the Salmon Protocol summary: "Conversations are becoming distributed and fragmented on the Web. Content is increasingly syndicated and re-aggregated beyond its original context. Technologies such as RSS, Atom, and PubSubHubbub allow for a real time flow of updates to readers, but this leads to a fragmentation of conversations. The comments, ratings, and annotations increasingly happen at the aggregator and are invisible to the original source. The Salmon Protocol is an open, simple, standards-based solution that lets aggregators and sources unify the conversations. It focuses initially on public conversations around public content...
Salmon is in fact based on and compatible with AtomPub. Salmon greatly enhances interoperability and usability by specifying a distributed identity mechanism for identifying the author and intermediary involved, providing a discovery mechanism, and specifying how things should be linked together. By not requiring pre-registration or federation but still allowing for verifiable identification, it provides a usable, level playing field for all parties involved..."
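Concretely, a salmon (a comment pushed back upstream to the original source) travels as an ordinary Atom entry that points at the entry it replies to via the RFC 4685 threading extension. The sketch below builds such an entry in Python; the identifiers are hypothetical, and Salmon's discovery and author-verification pieces are omitted.

    # A reply entry of the kind Salmon would push upstream: plain Atom plus the
    # RFC 4685 thr:in-reply-to link back to the original entry. IDs are hypothetical.
    import xml.etree.ElementTree as ET

    ATOM = "http://www.w3.org/2005/Atom"
    THR = "http://purl.org/syndication/thread/1.0"  # RFC 4685 threading extension

    ET.register_namespace("", ATOM)
    ET.register_namespace("thr", THR)

    entry = ET.Element(f"{{{ATOM}}}entry")
    ET.SubElement(entry, f"{{{ATOM}}}id").text = "tag:aggregator.example.org,2009:comment-1"
    ET.SubElement(entry, f"{{{ATOM}}}title").text = "Re: original post"
    author = ET.SubElement(entry, f"{{{ATOM}}}author")
    ET.SubElement(author, f"{{{ATOM}}}name").text = "commenter@aggregator.example.org"
    ET.SubElement(entry, f"{{{ATOM}}}content").text = "A comment swimming back upstream to the source."
    # thr:in-reply-to ties the reply to the entry it comments on.
    ET.SubElement(entry, f"{{{THR}}}in-reply-to", {"ref": "tag:source.example.org,2009:post-1"})

    print(ET.tostring(entry, encoding="unicode"))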
See also: the Salmon Protocol summary
Why IP is the Right Choice for Smart Grid
Carolyn Duffy Marsan and George Arnold, Network World
"The biggest network upgrade in the United States today is Smart Grid, a multibillion dollar modernization of the electricity grid that involves supporting real-time, two-way digital communications between electric utilities and their increasingly energy-conscious customers. Worth an estimated $20 billion this year, the Smart Grid market has attracted the attention of every major networking vendor, including Cisco, IBM, Microsoft and Google. These vendors are pushing for Smart Grid to adopt common network standards rather than special-purpose protocols. Network World interviewed George Arnold, National Coordinator for Smart Grid Interoperability with the National Institute of Standards and Technology, about where Internet standards and architecture will fit into the Smart Grid strategy...
Arnold: "The Smart Grid is a much more complex infrastructure than the NGN. The development of standards for the full vision of the Smart Grid is going to be a multi-year effort. But in some sense it will be never-ending because the requirements will evolve and the technology will evolve. Where we are today is that the deployment of some aspects of Smart Grid is racing way ahead of the standards. Much of the investment is going into the metering infrastructure. While there are standards, they are pretty loose. There's a tremendous sense of urgency to get them tightened up...
IP and the Internet standards will be a protocol of choice in the Smart Grid. There may be specialized applications where it's not the right fit, and so we're falling short of saying IP has to be used everywhere. Where you have SCADA systems, where you have to have response times in milliseconds, where the requirement is not routing data around a network but the requirement is real-time control of a critical asset, those are cases where a specialized protocol historically has been used and may still have a role... Another benefit of IP is future proofing the Smart Grid. That's very important. The Smart Grid is in some sense like the Internet in that we can hardly imagine what applications will be enabled through the integration of IT into the grid. So the ability to evolve and have an open architecture in which the industry can find better ways to help customers manage energy is very critical..."
See also: the NIST Smart Grid Interoperability Standards
Sponsors
XML Daily Newslink and Cover Pages sponsored by:
IBM Corporation | http://www.ibm.com |
Microsoft Corporation | http://www.microsoft.com |
Oracle Corporation | http://www.oracle.com |
Primeton | http://www.primeton.com |
Sun Microsystems, Inc. | http://sun.com |
XML Daily Newslink: http://xml.coverpages.org/newsletter.html
Newsletter Archive: http://xml.coverpages.org/newsletterArchive.html
Newsletter subscribe: newsletter-subscribe@xml.coverpages.org
Newsletter unsubscribe: newsletter-unsubscribe@xml.coverpages.org
Newsletter help: newsletter-help@xml.coverpages.org
Cover Pages: http://xml.coverpages.org/