XML Daily Newslink. Thursday, 07 January 2010

A Cover Pages Publication http://xml.coverpages.org/
Provided by OASIS and Sponsor Members
Edited by Robin Cover


This issue of XML Daily Newslink is sponsored by:
Microsoft Corporation http://www.microsoft.com



Updated IETF Draft: Portable Symmetric Key Container (PSKC)
Philip Hoyer, Mingliang Pei, Salah Machani (eds), IETF Internet Draft

Members of the IETF KEYPROV Working Group have released a revised version of the Standards Track Internet Draft "Portable Symmetric Key Container (PSKC)." Changes from -04 include updated and corrected examples, an updated old URI reference, and editorial notes from the WG Chair (Hannes) adopted and corrected; no changes were made to the XML schema. The XML namespace URI for Version 1.0 of PSKC is xmlns:pskc="urn:ietf:params:xml:ns:keyprov:pskc". Appendix A presents use cases, Appendix B the requirements, and Section 11 the XML schema for PSKC.

"With increasing use of symmetric key based systems, such as encryption of data at rest, or systems used for strong authentication, such as those based on one-time-password (OTP) and challenge response (CR) mechanisms, there is a need for vendor interoperability and a standard format for importing and exporting (provisioning) symmetric keys. For instance, traditionally, vendors of authentication servers and service providers have used proprietary formats for importing and exporting these keys into their systems, thus making it hard to use tokens from two different vendors. This document defines a standardized XML-based key container, called Portable Symmetric Key Container (PSKC), for transporting symmetric keys and key related meta data. The document also specifies the information elements that are required when the symmetric key is utilized for specific purposes, such as the initial counter in the MAC-Based One Time Password (HOTP) algorithm. It also requests the creation of an IANA registry for algorithm profiles where algorithms, their meta-data and PSKC transmission profile can be recorded for centralised standardised reference.

This IETF Working Group was chartered "to define protocols and data formats necessary for provisioning of symmetric cryptographic keys and associated attributes. The group is considering use cases related to use of Shared Symmetric Key Tokens. Other use cases may be considered for the purpose of avoiding unnecessary restrictions in the design and ensuring the potential for future extensibility..." The charter continues: "Current developments in deployment of Shared Symmetric Key (SSK) tokens have highlighted the need for a standard protocol for provisioning symmetric keys. The need for provisioning protocols in PKI architectures has been recognized for some time..."

"Although the existence and architecture of these protocols provides a feasibility proof for the KEYPROV work, assumptions built into these protocols mean that it is not possible to apply them to symmetric key architectures without substantial modification. In particular, the ability to provision symmetric keys and associated attributes dynamically to already issued devices such as cell phones and USB drives is highly desirable. The working group will develop the necessary protocols and data formats required to support provisioning and management of symmetric key authentication tokens, both proprietary and standards based..."

See also: Additional Portable Symmetric Key Container (PSKC) Algorithm Profiles


UN/CEFACT Survey of Standards for eBusiness, Government, and Trade
Tim McGrath, Posting to OASIS UBL TC Discussion List

As Co-Chair of the OASIS Universal Business Language (UBL) Technical Committee, Tim McGrath has posted a renewed request for participation in a UN/CEFACT survey of standards for eBusiness, Government, and Trade.

"There are still two weeks left if you have not completed the survey, so please take a few moments now and help save the world! For several years the United Nations Centre for Trade Facilitation and Electronic Business (CEFACT) has been developing a set of electronic standards for international trade data. Progress has been slower than had been hoped for, and CEFACT is currently reassessing stakeholder needs and priorities to ensure that its work is addressing the most urgent requirements. The UN/CEFACT project teams responsible are consulting those already involved in CEFACT's work as well as a selection of stakeholders that are not, in order to submit a report to CEFACT by the end of February 2010. They would be grateful if you would complete this online survey by 23 January 2010 at the latest.

There is a long-standing collaboration to ensure that UBL version 2 converges with effective future standards from UN/CEFACT. This survey is an important means to promote these requirements and ensure the convergence moves in the right direction, a situation we would all benefit from. Therefore I encourage you all to have your say now..."

From the survey text: "CEFACT has formed three project teams to report quickly on: (1) The current needs of stakeholders for open international standards for electronic international trade data messages; and which standards stakeholders would like CEFACT to deliver as priorities; 'stakeholders' answering questions are likely to be those who are users or implementers of message standards. (2) An assessment of the knowledge that stakeholders have in the CEFACT Core Component Library (CCL), which contains the data elements needed to complete messages, and the usability or otherwise of the CCL; 'stakeholders' are likely to be organisations involved in developing standards for their own industry/region and implementers of message standards. (3) The technical specifications that underpin message assembly and assure messages can be interpreted by different systems; 'stakeholders' are likely to be the organisations involved in developing technology infrastructure standards, technology users and implementers of message standards..."

See also: the UN/CEFACT survey link


Secure Sockets Layer (SSL) Protocol Flaw: Why Didn't We Spot That?
Stephen Farrell, IEEE Internet Computing

The Secure Sockets Layer (SSL) protocol and its standards-track successor, the Transport Layer Security (TLS) protocol, were developed more than a decade ago and have generally withstood scrutiny in that the protocols themselves haven't been found to have security flaws. Until now. In August 2009, Marsh Ray and Steve Dispensa discovered a design flaw in the TLS protocol that affects all versions of the protocol up to and including the current version.

Whereas the vulnerability itself is serious, it need not affect many deployments once administrators apply suitable patches to disable renegotiation, leaving TLS sufficiently secure in most cases because exploiting the vulnerability requires the attacker to be an active man-in-the-middle, redirecting traffic between victims (for example, a browser and a Web server)... The vulnerability is an interesting attack in itself, but perhaps more interesting is the question, why didn't we see this earlier? In this article, I explore this question but, unfortunately, can't answer it. Hopefully, simply asking the question might prompt developers to re-examine assumptions they've forgotten they've even made.

It might partly be due to a split between those who develop and use security protocols (such as participants in the IETF) and those who analyze security protocols. There are generally few analyses of security protocols presented to IETF participants because its focus is generally on either producing new protocols or fixing known problems in existing ones, as in this case. Although several analyses of TLS have been published in the literature, they mainly seem to focus (as we would expect) on the security of key establishment and how applications subsequently use those keys... There's also the fact that it's often hard for protocol developers to fully understand the assumptions built into the security proofs presented in the literature—for example, typical IETF participants might not properly understand the conclusion that "the TLS protocol framework securely realizes secure communication sessions," and typical application developers depending on TLS to secure their applications are probably even less well-placed to understand such conclusions.

Implicit in what I've just described is the fact that today's uses of the TLS protocol don't actually use renegotiation for the purposes for which it was initially intended (rekeying or wrapping sequence numbers). Renegotiation to handle rekeying or sequence numbers is quite reasonably something that a TLS implementation could handle transparently. However, because renegotiation ended up being used particularly for transitioning between authentication states that are highly meaningful for applications using TLS, it's now clear that such renegotiation shouldn't be transparent to applications when used like this... Arguably, protocol developers should pay closer attention to features like this that end up being used for purposes for which they weren't originally intended..."

See also: the IETF Transport Layer Security Working Group status pages


Open Geospatial Consortium Seeks Input on Next Version of CityGML
Staff, OGC Announcement

"The Open Geospatial Consortium, Inc. (OGC) is seeking broad input on enhancements and changes for a revision of the OGC City Geography Markup Language (CityGML) Encoding Standard. CityGML is an open information model and XML-based encoding for the representation, storage, and exchange of virtual 3D city models. CityGML is implemented as an application schema of the OGC Geography Markup Language Version 3 (GML3) Encoding Standard, an international standard for spatial data exchange and encoding approved by the OGC and ISO.

In contrast to other 3D vector formats, CityGML is based on a rich, general purpose semantic model as well as reference system, geometry and graphics content which support sophisticated analysis tasks. Applications include urban and landscape planning; architectural design; tourist and leisure activities; 3D cadastres; environmental simulations; mobile telecommunications; disaster management; homeland security; vehicle, pedestrian, and indoor navigation; training simulators; and mobile robotics.

The current version 1.0 of CityGML was adopted as an official OGC Standard in August 2008 and has come into wide use since then. The OGC Technical Committee seeks input from the wider community in the form of change requests, proposed additions, and suggestions for the future development of CityGML. These should be submitted by February 26, 2010.

The next version of CityGML will be a minor revision of the standard to version 1.1. The anticipated release date will not be before autumn 2010. The main objective of this revision is to keep backwards compatibility with the current version 1.0 of CityGML. Change and feature requests resulting in a loss of backwards compatibility are welcome but will be postponed to a major revision of CityGML to version 2.0..."

["The OpenGIS Geography Markup Language Encoding Standard (GML) The Geography Markup Language (GML) is an XML grammar for expressing geographical features. GML serves as a modeling language for geographic systems as well as an open interchange format for geographic transactions on the Internet. As with most XML based grammars, there are two parts to the grammar — the schema that describes the document and the instance document that contains the actual data. A GML document is described using a GML Schema. This allows users and developers to describe generic geographic data sets that contain points, lines and polygons. However, the developers of GML envision communities working to define community-specific application schemas that are specialized extensions of GML. Using application schemas, users can refer to roads, highways, and bridges instead of points, lines and polygons. If everyone in a community agrees to use the same schemas they can exchange data easily and be sure that a road is still a road when they view it..."]

See also: Geography Markup Language (GML)


Internet Predictions: Future Imperfect
Vinton Cerf, IEEE Internet Computing

"As the second decade of the 21st century dawns, predictions of global Internet digital transmissions reach as high as 667 exabytes (1018 bytes) per year by 2013. Based on this prediction, traffic levels might easily exceed many zettabytes (1021 bytes, or 1,000 exabytes) by the end of the decade. Setting aside the challenge of somehow transporting all that traffic and wondering about the sources and sinks of it all, we might also focus on the nature of the information being transferred, how it's encoded, whether it's stored for future use, and whether it will always be possible to interpret as intended...

If storage technology continues to increase in density and decrease in cost per Mbyte, we might anticipate consumer storage costs dropping by at least a factor of 100 in the next 10 years, suggesting petabyte disk drives costing between $100 and $1,000... As larger-scale systems are contemplated, operational costs, including housing, electricity, operators, and the like, contribute increasing percentages to the annual cost of maintaining large-scale storage systems. The point of these observations is simply that it will be both possible and likely that the amount of digital content stored by 2020 will be extremely large, integrating over government, enterprise, and consumer storage systems. The question this article addresses is whether we'll be able to persistently and reliably retrieve and interpret the vast quantities of digital material stored away in various places.
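
As a back-of-the-envelope check on that projection (assuming, purely for illustration, a 2010 consumer price of roughly $100 per terabyte of disk):

\[
  \frac{\$100}{\text{TB}} \times 1000\,\frac{\text{TB}}{\text{PB}} = \$100{,}000 \text{ per PB today}
  \qquad\Rightarrow\qquad
  \frac{\$100{,}000}{100} = \$1{,}000 \text{ per PB after a factor-of-100 drop,}
\]

which lands at the upper end of the $100 to $1,000 range quoted above.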

Standards in general play a major role in helping reduce the number of distinct formats that might require support, but even these standards evolve with time, and transformations from older to newer ones might not always be feasible or easily implemented. The World Wide Web application on the Internet uses HTML to describe Web page layouts. The W3C is just reaching closure on its HTML5 specification. Browsers have had to adapt to interpreting older and newer formats. XML is a data description language. High-level language text embedded in Web pages adds to the mix of conventions that need to be supported. Anyone exploring this space will find hundreds if not thousands of formats in use..."

See also: the Guest Editors' Introduction


CES 2010: Ford Promises a Smarter Digital Dashboard
Larry Barrett, Datamation

"During his keynote address, Ford CEO Alan Mulally sounded more like the chief of a software company than an automaker, telling attendees at the Consumer Electronics Show that Ford will continue to drive innovation through embedded applications running on its Microsoft-powered Sync software...

During his address, Mulally noted the increased acceptance and popularity of Ford's Sync system has not only resulted in a more informed and engaged driver, but actually helped differentiate Ford from its competitors, domestic and foreign alike... He said 32 percent of respondents to a recent Ford study said Sync was either a "critical" or "important" factor in determining whether or not to buy a Ford model. Eighty-one percent of heavy users reported they were satisfied with the system, and 77 percent said they would recommend the system and, most important, the car to their friends and family.

Ford will roll out a new development program dubbed American Journey 2.0 for university students to take all this data collected from the onboard computing system and combine it with other data in the cloud to create new "relevant" features and applications... For example, Mulally said, if everyone on a particular road begins to turn on their fog lamps and windshield wipers, that data could be shared through a social networking site like Twitter to let people know it's raining on a particular stretch of a freeway. Mulally also said Ford will unveil a new software developer's kit this year for strategic partners that will eventually be shared with the developer community to dream up the next generation of auto-specific Web 2.0 applications..."


Sponsors

XML Daily Newslink and Cover Pages sponsored by:

IBM Corporation           http://www.ibm.com
Microsoft Corporation     http://www.microsoft.com
Oracle Corporation        http://www.oracle.com
Primeton                  http://www.primeton.com
Sun Microsystems, Inc.    http://sun.com

XML Daily Newslink: http://xml.coverpages.org/newsletter.html
Newsletter Archive: http://xml.coverpages.org/newsletterArchive.html
Newsletter subscribe: newsletter-subscribe@xml.coverpages.org
Newsletter unsubscribe: newsletter-unsubscribe@xml.coverpages.org
Newsletter help: newsletter-help@xml.coverpages.org
Cover Pages: http://xml.coverpages.org/



Document URI: http://xml.coverpages.org/newsletter/news2010-01-07.html
Robin Cover, Editor: robin@oasis-open.org