The OASIS Cover Pages: The Online Resource for Markup Language Technologies
Last modified: August 18, 2009
XML Daily Newslink. Tuesday, 18 August 2009

A Cover Pages Publication
Provided by OASIS and Sponsor Members
Edited by Robin Cover

This issue of XML Daily Newslink is sponsored by:
Oracle Corporation

Public Review Drafts from the OASIS SCA-Assembly Technical Committee
Staff, OASIS Announcement

Members of the OASIS Service Component Architecture / Assembly (SCA-Assembly) TC have released two Committee Draft specifications for public review through October 17, 2009. The "Test Assertions for the SCA Assembly Model Version 1.1 Specification" defines the test assertions for the SCA Assembly specification. The test assertions represent the testable items relating to the normative statements made in the SCA Assembly specification, and provide a bridge between those normative statements and the conformance test cases, which are designed to check that an SCA runtime conforms to the requirements of the specification.

The "Test Cases for the SCA Assembly Model Version 1.1 Specification" defines test cases related to the test assertions described in the SCA Assembly Test Assertions document. The SCA Assembly test cases follow a standard structure. They are divided into two main parts: (1) the Test Client, which drives the test and checks that the results are as expected; (2) the Test Application, which forms the bulk of the test case and consists of composites, WSDL files, XSDs, and code artifacts such as Java classes, organized into a series of SCA contributions. The basic idea is that the Test Application runs on the SCA runtime under test, while the Test Client runs as a standalone application, invoking the Test Application through one or more service interfaces.

The test client is designed as a standalone application. The version built here is a Java application which uses the JUnit test framework, although in principle, the client could be built using another implementation technology. The test client is structured to contain configuration information about the testcase, which consists of: (1) metadata identifying the Test Application in terms of the SCA Contributions that are used and the Composites that must be deployed and run; (2) data indicating which service operation(s) must be invoked with input data and expected output data, including exceptions for expected failure cases...

See also: the OASIS announcement

Incubator Group Report: Emergency Information Interoperability Framework
Chamindra de Silva and Renato Iannella (eds), W3C Final Report

W3C announced that the Emergency Information Interoperability Framework Incubator Group has published its final report. The group also published "Emergency Information Interoperability Frameworks," which describes some critical requirements for an interoperability information framework for emergency management, and provides candidate components of an ontology that can support interoperability for some common use cases.

"The approach discussed outlines how one can achieve information interoperability across the stakeholder functions within the area of emergency management. The group recommends that W3C initiate an Interest Group to continue the work of the Incubator Group and expand the outreach to standards development through partnerships with professional communities and interoperability workshops. This publication is part of the Incubator Activity, a forum where W3C Members can innovate and experiment...."

From the Report Executive Summary: "The EIIF XG has demonstrated in its period of activity the large scope of the standardisation effort required in the emergency management / humanitarian response domain. To evaluate feasibility, the group worked on and piloted concepts on the specific use case of "Who is doing What Where," which is a common information coordination pattern in this domain. The main results of this effort, as committed to in the goals of the EIIF XG, are the emergency management information standards review, in draft form, and a framework document surrounding the use case (ontology, scenario definition, standards gaps), in final form. However the group also delivered many valuable by-products, including a review of emergency management systems, and references to popular glossaries/control vocabularies and prevalent regulations in the domain..."

See also: XML and Emergency Management

Page Models and Geometries of ODF, IDML, and XFL
Rick Jelliffe, O'Reilly Technical

"Every document system for printing provides mechanisms for specifying page sizes, margins, page templates, and so on. Any graphics-based system also has a coordinate system, which specifies the origin and units used for points, lines, and frames relative to the page... This blog entry looks at the page models and geometries of two current XML-in-ZIP publishing formats for text documents: ODF and IDML. The ODF style should be pretty familiar, but the IDML style might be a little surprising...

When thinking about page models, there are four main design choices, relating to whether the major text areas need to be explicitly declared (or whether there is an idea of a default text area inside the margins of the page) and whether text blocks can be connected automatically to each other, so that overflow text goes into another box, and an overflow can even trigger the generation of a new page...
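The linked-frame design choice can be made concrete with a small sketch. This is a deliberately minimal model, not any format's actual layout algorithm: text is a list of lines, each frame has a line capacity, and when the declared chain of frames overflows, a fresh frame (standing in for an auto-generated page) is created:

```python
def flow_text(lines, frame_capacities, page_capacity=2):
    """Distribute `lines` across a chain of linked text frames.

    Each capacity is the number of lines a frame holds. When all the
    explicitly declared frames are full, remaining text triggers the
    generation of new frames of `page_capacity` lines each, a toy
    stand-in for overflow generating a new page.
    """
    frames, remaining = [], list(lines)
    capacities = list(frame_capacities)
    while remaining:
        cap = capacities.pop(0) if capacities else page_capacity
        frames.append(remaining[:cap])
        remaining = remaining[cap:]
    return frames

# Five lines into frames of capacity 2 and 1: the last two lines
# overflow into an automatically generated frame.
print(flow_text(["l1", "l2", "l3", "l4", "l5"], [2, 1]))
```

Formats that lack this chaining must instead declare every text area up front, which is exactly the distinction between the page models being compared.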

It has struck me that often people coming to XML from the HTML or database side rather than the publishing or graphics side have not had much exposure to some of the issues of page models, geometries, units, and so on. Calculation issues are obviously important for graphics, but publishing and printing are not immune: bleed-through problems, for example..."

NIST Publishes Cryptographic Key Management Workshop Summary
Elaine Barker, Dennis Branstad, Santosh Chokhani, Miles Smid; NIST Draft Report

NIST announced that the Draft NIST Interagency Report 7609, "Cryptographic Key Management Workshop Summary" (NIST IR 7609), from the June 8-9, 2009 event is available for public comment. The Cryptographic Key Management (CKM) workshop was initiated by the NIST Computer Security Division to identify and develop technologies that would allow organizations to leap ahead of normal development lifecycles to vastly improve the security of future sensitive and valuable computer applications. The workshop was the first step in developing a CKM framework.

The Cryptographic Key Management Workshop Summary document provides the highlights of the presentations, organized by both topic and by presenter. The intended audience of the document includes individuals and organizations seeking to better understand cryptographic key management, with an emphasis on those planning to design, procure, or use a secure CKM system. A CKM Framework is under development that will describe, in a logical structure, the components of a CKM system and the characteristics that make them useful in describing, designing, and operating a CKM system...

Key management has been identified as a major component of national cybersecurity initiatives that address the protection of information processing applications. Numerous problems have been identified in current key management methodologies, including the lack of guidance, inadequate scalability of the methods used to distribute keys, and user dissatisfaction because of the unfriendliness of these methods. The workshop sought to identify the inadequacies of current key management methodologies and to plan for a transition to more useful and appropriate key management methods...

See also: Cryptographic Key Management references

AMEE: Embed Environmental Intelligence In Your Applications
James Smith, IBM developerWorks

"Today, there is a great deal of interest in energy, and its less-desirable environmental shadow, carbon dioxide. To create a more sustainable world, individuals, companies, and governments are focusing attention on energy and how we use it. The route to understanding our usage of energy (and therefore carbon) is to measure and analyze it, to understand the results, and then act on that information. AMEE is a neutral aggregation platform for all forms of energy and activity data, and associated carbon models.

The AMEE Web-based API allows users to store and retrieve many forms of consumption data over long periods, while simultaneously applying recognised carbon calculation models to determine the environmental consequences of that consumption. It has been used by developers to deliver energy tracking and management applications ranging from the UK Government's Act On CO2 calculator to CNN and Google applications, and to power carbon management applications such as Carbonetworks and Misys OpenCarbonWorld...

Interaction with the AMEE platform is accomplished through a RESTful HTTP API, which offers a choice of XML, JSON, or Atom data formats. SOAP and other APIs will be added in the future. The API consists of two parts: profiles and data. Profiles are where your energy data is stored. By storing this data over time, you can store a history of energy use for your clients, your business, or yourself. The data store provides a huge array of standard models, including the GHG Protocol, SAP building assessment, carbon emissions factors for 150 countries and regions, and related methodologies..."
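A client addressing the two-part API might look like the sketch below. The host and paths here are illustrative assumptions, not AMEE's documented endpoint layout (consult the AMEE site for the real URLs and authentication); the sketch only shows the RESTful shape of the interaction, with the XML/JSON/Atom choice expressed through content negotiation. The requests are built but never sent:

```python
import urllib.request

# Hypothetical base URL; AMEE's actual hosts and paths may differ.
BASE = "https://stage.amee.com"

def build_request(path, fmt="application/json"):
    """Prepare (but do not send) a GET against a REST endpoint,
    selecting the response format via the Accept header."""
    req = urllib.request.Request(BASE + path)
    req.add_header("Accept", fmt)
    return req

# Profiles: where your own energy/consumption data is stored over time.
profile_req = build_request("/profiles")
# Data: the standard models (emissions factors, methodologies, etc.).
data_req = build_request("/data", "application/xml")

print(profile_req.full_url, profile_req.get_header("Accept"))
```

Because format selection is a header rather than a different endpoint, the same resource paths serve XML, JSON, or Atom clients unchanged.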

See also: the AMEE web site

Extensible Messaging and Presence Protocol (XMPP): Instant Messaging and Presence
Peter Saint-Andre (ed), IETF Internet Draft

Members of the IETF Extensible Messaging and Presence Protocol (XMPP) Working Group have released updated Internet Draft specifications for Extensible Messaging and Presence Protocol (XMPP): Instant Messaging and Presence (xmpp-3921bis) and Extensible Messaging and Presence Protocol (XMPP): Core (xmpp-3920bis).

XMPP is a technology for the near-real-time exchange of messages and presence notifications, where data is exchanged over Extensible Markup Language (XML) streams. This IETF Working Group was chartered to provide errata, clarifications, and suggestions for improvement to the core XMPP specifications in RFCs 3920 and 3921.

"XMPP is typically used to exchange messages, share presence information, and engage in structured request-response interactions. The core features of XMPP defined in XMPP Core provide the building blocks for many types of near-real-time applications, which can be layered on top of the core by sending application-specific data qualified by particular XML namespaces. The xmpp-3921bis document defines XMPP extensions that provide the basic functionality expected of an instant messaging (IM) and presence application as defined in the published requirements. As a result of extensive implementation and deployment experience with XMPP since 2004, as well as more formal interoperability testing carried out under the auspices of the XMPP Standards Foundation (XSF), this document reflects consensus from the XMPP developer community regarding XMPP's basic instant messaging and presence features..."
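The "application-specific data qualified by particular XML namespaces" layered on the core can be seen in the basic IM case: a message stanza in the 'jabber:client' namespace. The sketch below builds one with Python's standard XML library; the addresses and body text are illustrative:

```python
import xml.etree.ElementTree as ET

# Build a minimal XMPP instant-message stanza. The 'jabber:client'
# namespace is the one RFC 3921 defines for client streams; the
# addresses here are made-up examples.
ET.register_namespace("", "jabber:client")
msg = ET.Element("{jabber:client}message",
                 {"to": "romeo@example.net", "type": "chat"})
body = ET.SubElement(msg, "{jabber:client}body")
body.text = "Art thou not Romeo, and a Montague?"

print(ET.tostring(msg, encoding="unicode"))
```

Over a live stream the stanza would be sent as a child of the stream's root element rather than as a standalone document, but the namespace-qualified payload structure is the same.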

See also: Extensible Messaging and Presence Protocol (XMPP) Core


XML Daily Newslink and Cover Pages sponsored by:

IBM Corporation
Microsoft Corporation
Oracle Corporation
Sun Microsystems, Inc.


