The OASIS Cover Pages: The Online Resource for Markup Language Technologies
Last modified: February 26, 2008
XML Daily Newslink. Tuesday, 26 February 2008

A Cover Pages Publication http://xml.coverpages.org/
Provided by OASIS and Sponsor Members
Edited by Robin Cover


This issue of XML Daily Newslink is sponsored by:
BEA Systems, Inc. http://www.bea.com



IPTC Announces NewsML-G2 and EventsML-G2 as G2-Standards
Staff, International Press Telecommunications Council Announcement

Misha Wolf (Reuters) posted an IPTC announcement about the launch of NewsML-G2 and EventsML-G2 as the first parts of a new framework of XML-based news exchange formats from the International Press Telecommunications Council (IPTC). NewsML-G2 defines a string-derived datatype called QCode (Qualified Code), which takes the form "CodingSchemeAlias:Code". The CodingSchemeAlias maps to an IRI representing the CodingScheme, and the IRI obtained by appending the Code to that IRI represents the Code. The Code can contain (and start with) most characters; the main exception is white space. A Code can also be entirely numeric. QCodes are used as attribute values, and such attributes accept QCodes only, so there is no conflict with IRIs/URIs. The next steps include the creation of an OWL representation of the NewsML-G2 Schema and Semantics, the translation into SKOS of NewsML-G2 KnowledgeItems, and the updating of the IPTC's GRDDL transform to reflect the released version of NewsML-G2. According to the announcement: "NewsML-G2 allows the bundling of multiple news items—articles, photos, videos or whatever—and a detailed description of their content and how the items relate to each other. Whether populating a web site with complex news packages or building bundles of news items for resale or archiving, NewsML-G2 provides an easy way to package and exchange news... The G2-Standards also fit into the Semantic Web initiatives of the World Wide Web Consortium, enriching content so that computers can more easily search the huge universe of news. The goal is to better help news agencies manage and distribute their massive libraries of current and archived news content, and to help customer search engines find content quickly and accurately. G2-Standards can be easily combined with IPTC's groundbreaking NewsCodes, which provide a rich suite of standard terms for describing news, to give news agencies amazing flexibility in how news can be bundled for downstream users. 
With widely available digital news archives now dating back to 1850 or earlier, news agencies, librarians and archivists have a special interest in the rapid searching and retrieval of news, which NewsCodes can accelerate to help drive revenue growth." IPTC is a consortium of the world's major news agencies, news publishers and news industry vendors. It develops and maintains technical standards for improved news exchange that are used by virtually every major news organization in the world.
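The QCode-to-IRI mapping described above can be sketched in a few lines. This is a minimal illustration, not IPTC code: the alias catalog below is a hypothetical example of scheme-alias entries, and expansion is simply lookup plus concatenation.

```python
# Minimal sketch of QCode expansion (hypothetical alias catalog).
# A QCode "CodingSchemeAlias:Code" resolves to an IRI by looking up the
# alias and appending the Code to the CodingScheme's IRI.

SCHEME_ALIASES = {
    # hypothetical catalog entries for illustration
    "subj": "http://cv.iptc.org/newscodes/subjectcode/",
    "role": "http://cv.iptc.org/newscodes/role/",
}

def expand_qcode(qcode: str, catalog: dict = SCHEME_ALIASES) -> str:
    """Split a QCode at the first colon and concatenate scheme IRI + Code."""
    alias, sep, code = qcode.partition(":")
    if not sep or not code:
        raise ValueError(f"not a QCode: {qcode!r}")
    if alias not in catalog:
        raise ValueError(f"unknown coding-scheme alias: {alias!r}")
    return catalog[alias] + code

print(expand_qcode("subj:04000000"))
# → http://cv.iptc.org/newscodes/subjectcode/04000000
```

Note that because the attribute is declared to accept QCodes only, the resolver never has to guess whether a value is already an IRI, which is the point of the "no conflict with IRIs/URIs" remark above.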


SOA Spending Up Despite Unclear Benefits
Galen Gruman, InfoWorld

The number of companies investing in service-oriented architecture (SOA) has doubled over the past year in every part of the world, with a typical annual spend of nearly $1.4 million, according to a new research report from the analyst firm AMR Research that surveyed 405 companies in the U.S., Germany, and China. Now the bad news: "Hundreds of millions of dollars will be invested pursuing these markets in 2008, much of it wasted," said AMR analyst Ian Finley. The AMR survey found that most companies don't really know why they are investing in SOA, which Finley said makes long-term commitment iffy. Often, there are multiple reasons cited within any organization, letting SOA appear as a buzzword justification for unrelated individual priorities. "People more easily rally around a thing rather than five things... that lack of a rallying purpose for SOA calls its momentum into question." Finley is concerned that SOA may not get picked up much beyond the early adopters—mainly financial services, telecommunications, and government organizations that are more often than not predisposed to the value of architecture and thus more willing to pursue SOA for less-quantifiable benefits -- unless a coherent set of benefits is made clear. Another danger seen from the SOA survey is that the main benefit that the vendors sell around SOA (code reuse) is not the real benefit that early SOA adopters have gotten. Often the code from project A is irrelevant to project B, he noted. That focus on reuse can cause organizations to dismiss SOA's benefits because they're looking at the wrong metric.


W3C Offices Program Celebrates Ten Years of International Outreach
Staff, W3C Announcement

W3C announced that representatives from W3C Offices—regional branches that promote W3C and interact with participants in local languages—now celebrate ten years of the Offices program. Offices currently represent seventeen (17) regions around the globe, helping to organize meetings, recruit Members, translate materials, and find creative ways to encourage international participation in W3C work. Offices staff gathered for a face-to-face meeting in Sophia Antipolis, France to review ten years of experience and to forge improvements to the program. On this occasion, W3C thanks the Offices staff past and present for all of their work. W3C Offices are located in Australia, Brazil, Benelux, China, Finland, Germany & Austria, Greece, Hungary, India, Israel, Italy, Korea, Morocco, Southern Africa, Spain, Sweden, and the United Kingdom and Ireland.

See also: the Offices


New Book: Understanding Windows CardSpace
Kim Cameron, Book Review

"There is a really wonderful new book out on digital identity and Information Cards called "Understanding Windows CardSpace". Written by Vittorio Bertocci, Garrett Serack and Caleb Baker, all of whom were part of the original CardSpace project, the book is deeply grounded in the theory and technology that came out of it... The presentation begins with a problem statement: 'The Advent of Profitable Digital Crime'. There is a systematic introduction to the full panoply of attack vectors we need to withstand, and the book convincingly explains why we need an in-depth solution, not another band-aid leading to some new vulnerability. For those unskilled in the art, there is an introduction to relevant cryptographic concepts, and an explanation of how both certificates and HTTPS work. These will be helpful to many who would otherwise find parts of the book out of reach. Next comes an intelligent discussion of the Laws of Identity, the multi-centered world and the identity metasystem. The book is laid out to include clever sidebars and commentaries, and becomes progressively more McLuhanesque. On to SOAP and Web Services protocols—even an introduction to SAML and WS-Trust, always with plenty of diagrams and explanations of the threats. Then we are introduced to the concept of an identity selector and the model of user-centric interaction. Part two deals specifically with CardSpace, starting with walk-throughs, and leading to implementation. This includes 'Guidance for a Relying Party', an in-depth look at the features of CardSpace, and a discussion of using CardSpace in the browser. The authors move on to Using CardSpace for Federation, and explore how CardSpace works with the Windows Communication Foundation. Even here, we're brought back to the issues involved in relying on an Identity Provider, and a discussion of potential business models for various metasystem actors..."

See also: the book details


Beta Release: ID-WSF 2.0 Web Services Client Library (ClientLib)
Staff, openLiberty.org Announcement

Asa Hardcastle, OpenLiberty Technical Lead, has announced the beta release of the ID-WSF 2.0 ClientLib. openLiberty.org was established to provide easy access to tools and information to jump start the development of more secure and privacy-respecting identity-based applications based on Liberty Alliance standards. The first project at openLiberty.org is the ID-WSF WSC Client Library ("ClientLib"), which will help you more easily build and deploy a wide range of new relying party (identity-consuming) applications. The ClientLib uses OpenSAML's Java XML Tooling, SOAP, and SAML2 libraries. As announced: "As of February 25th 2008 the ClientLib is officially released as BETA code. Over the next few months we'll be writing more code and doing some interoperability testing. The ClientLib includes support for ID-WSF Authentication Service (PLAIN and CRAM-MD5), Discovery Service, a non-standard Profile Service, and Directory Access Protocol Service (ID-DAP). Both signed and unsigned messaging is supported. The Data Services Template (DST 2.1) reference implementation is mostly complete. People Service is partially complete." From Asa's blog entry: "This release marks excellent progress, but there is still a lot of work to do. The beta is not bug free nor is it thoroughly tested. It is ready for other people to sink their teeth into and give feedback, make requests, or write some code. For development purposes we are currently testing against two ID-WSF WSPs and have access to a third (HP Select Federation) which we hope to have working with the library before the Version 1 release planned later this year."

See also: Asa Hardcastle's blog


Why Liberty's Identity Governance Framework is So Important
Felix Gaehtgens, Blog

In late 2006, several companies got together and created the Identity Governance Framework (IGF), an initiative of the Liberty Alliance. The purpose of the IGF is to provide an open architecture that addresses governance of identity-related information. This architecture is meant to bridge the gap between regulatory requirements and the lower-level protocols and architecture. How can the inherent risks associated with the creation, copying, maintenance and use of identity data be mitigated? Who has access to what data for which purpose, and under what conditions? Ideally, policies on data usage are created by both sources (attribute authorities) and consumers (applications) of identity data. These policies can then be used for the implementation and auditing of governance. In other words: if you know what the rules are, express them in a policy, and make sure your policy is watertight when the next audit comes. This is exactly what the IGF attempts to create: a standardised mechanism for the expression and implementation of these policies. The IGF is working on several standards and components to make this happen. One of them is CARML (Client Attribute Request Markup Language). It defines application identity requirements, in other words what type of identity information an application needs, and what that application will do with that information. On the other side of the spectrum there is AAPML ('Attribute Authority Policy Markup Language'), which describes the constraints on the use of the provided identity information—under what conditions specific pieces of identity data are made available to applications, and how this data may be used, and possibly modified. For example: what part of a user's data can be modified by the user directly at a self-service portal? Or: under which conditions may a marketing application use a user's data, and what type of explicit consent needs to be given by the user? 
AAPML is proposed as a profile of XACML, so that AAPML policies can be consumed directly by a policy enforcement point (PEP) to enforce access over the requests for identity data... CARML and AAPML bridge a very important gap that is not addressed anywhere else: not how to request and receive attributes, but how to express the need and purpose of identity data, and on the other side the allowed use and conditions for its consumption. IGF's framework conceptually fits seamlessly into architectures harnessing today's frameworks and picks up where CardSpace, Higgins, Bandit and WS-Trust leave off.
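The division of labour between the two languages can be illustrated with a toy decision function. To be clear, this is not real CARML or AAPML syntax—both are XML-based—but a hypothetical sketch of the idea: the application declares which attributes it needs and for what purpose (CARML's role), the attribute authority declares release conditions (AAPML's role), and a decision point matches the two.

```python
# Illustrative sketch only: the CARML/AAPML idea, not their real syntax.
# The consuming application declares needed attributes and purposes; the
# attribute authority declares release rules; a decision point matches them.

app_request = {  # CARML-like declaration (hypothetical structure)
    "application": "marketing-portal",
    "attributes": {"email": "contact-user", "birthdate": "age-verification"},
}

authority_policy = {  # AAPML-like release rules (hypothetical structure)
    "email": {"allowed_purposes": {"contact-user"}, "requires_consent": True},
    "birthdate": {"allowed_purposes": {"age-verification"}, "requires_consent": False},
}

def decide(attribute, purpose, consent_given, policy=authority_policy):
    """Return True if the authority's policy permits releasing the attribute."""
    rule = policy.get(attribute)
    if rule is None or purpose not in rule["allowed_purposes"]:
        return False                       # attribute or purpose not covered
    return consent_given or not rule["requires_consent"]

for attr, purpose in app_request["attributes"].items():
    print(attr, decide(attr, purpose, consent_given=True))
```

In the real framework this matching is what an XACML policy enforcement point would do when AAPML policies are expressed as an XACML profile, as described above.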

See also: the IGF introduction


PRESTO: A WWW Information Architecture for Legislation and Public Information Systems
Rick Jelliffe, O'Reilly Articles

PRESTO (P - Public, Permanent URLs; REST - Representational State Transfer; O - Object-oriented) is not something new: its basic ideas are presupposed in a lot of people's thinking about the web, and many people have given names to various parts. The elevator pitch for PRESTO is this: "All documents, views and metadata at all significant levels of granularity and composition should be available in the best formats practical from their own permanent hierarchical URIs." I would see PRESTO as the kind of methodology that a government could adopt as a whole-of-government approach, in particular for public documents, and of these in particular for legislation and regulations. The problem is not 'What is the optimal format for our documents?' The question is 'How can we link to the important grains of information in a robust, technology-neutral way that only needs today's COTS tools?' The format wars, in this area, are asking exactly the wrong question: they focus us on the details of format A rather than format B, when we need to be able to name and link to information regardless of its format: supra-notational data addressing. If you want to build a large information system for these kinds of documents, and you want to be truly vendor neutral (which is not the same thing as saying that preferences and delivery capabilities will not still play their part), and you want to encourage incremental, decentralized, ad hoc and planned developments—in particular mash-ups—then you need permanent URLs (to prevent link rot), you need REST (for scale, among other things), and you need object orientation: in the sense of bundling the methods for an object with the object itself, rather than having separate verb-based web services which implement a functional programming approach. OO here also includes introspection, so that when you have a resource you can query it to find the various operations available. 
A rule of thumb for a document system that conformed to this PRESTO approach would be that none of the URLs use '#' (which indicates that you are groping for information inside a system-dependent level of granularity rather than being system-neutral) or '?' (which indicates that you are not treating every object you can think about as a resource in its own right that may itself have metadata and children.)
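That rule of thumb is mechanical enough to check automatically. The sketch below (with invented example URLs, not real government URIs) tests whether a URL is a plain hierarchical path with no '#' fragment and no '?' query, i.e. whether every addressable grain is a first-class resource.

```python
# Rough check of the PRESTO rule of thumb: a conformant document URL is a
# plain hierarchical path, with no '#' fragment (system-dependent
# granularity) and no '?' query (the grain is not a resource of its own).
from urllib.parse import urlsplit

def presto_conformant(url: str) -> bool:
    parts = urlsplit(url)
    return parts.fragment == "" and parts.query == "" and parts.path != ""

# Invented examples: the section is its own resource in the first URL,
# but hidden behind a query string and fragment in the second.
print(presto_conformant("http://example.gov/acts/1999/tax/section/12"))  # True
print(presto_conformant("http://example.gov/acts?id=1999-tax#s12"))      # False
```

The first URL lets the section carry its own metadata and children; the second makes it reachable only through a system-dependent lookup, which is exactly what the rule of thumb warns against.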

See also: the PRESTO description


Liberty Alliance Announces Health Identity Management SIG
Staff, Liberty Alliance Announcement

Liberty Alliance, the global identity consortium working to build a more trusted internet for consumers, governments and businesses worldwide, has announced the launch of a global public forum formed to develop an interoperable, secure and privacy-respecting information exchange system for the healthcare sector. The Liberty Alliance Health Identity Management Special Interest Group (HIM SIG) is leveraging the Liberty Alliance model of addressing the technology, business and privacy aspects of digital identity management to meet the unique identity management and regulatory challenges facing the international healthcare industry today. The Health Identity Management SIG offers members an opportunity to join with other Liberty Alliance Members (regardless of membership level) to recommend standards to enable an internationally interoperable health care identity management and information exchange system. This may include standard directory (LDAP) models, health care roles, implementation guides, and similar recommendations. The SIG will review existing standards, and recommend new standards for an interoperable health care identity management system using Security Assertion Markup Language (SAML) and Liberty Specifications. Co-chaired by John Fraser, CEO, MEDNET USA and Pete Palmer, Security and Cryptography Architect, Wells Fargo, the HIM SIG currently includes over 30 members from around the world representing the education, government, healthcare and technology sectors. Members are working to address how the healthcare industry will deliver secure identity management solutions that meet global regulatory mandates and ensure patient privacy. The public group is working closely with the Liberty Identity Assurance Expert Group to ensure requirements for standardized and certified identity assurance levels in the healthcare sector meet criteria established in the policy-based Liberty Identity Assurance Framework.

See also: Liberty Alliance references


Holder-of-Key Web Browser SSO Profile
Nathan Klingenstein (ed), OASIS SSTC Contribution

"As part of my work for the National Institute of Informatics and the UPKI initiative, I've been working on a modified Web Browser SSO profile for SAML 2.0 that uses holder-of-key confirmation for the client rather than bearer authentication. The keys for this confirmation are supplied through TLS using client certificates. This results in a more secure sign-on process and, particularly, a more secure resulting session at the SP. There is no need for the SP to do PKI validation or know anything about the client certificate itself. The specification supplies an alternative to "Profiles for the OASIS Security Assertion Markup Language (SAML) V2.0." Excerpt: "The profile allows for transport and validation of holder-of-key assertions by standard HTTP user agents with no modification of client software and maximum compatibility with existing deployments. Most of the flows are as in standard Web Browser SSO, but an X.509 certificate presented by the user agent supplies a valid keypair through client TLS authentication for HTTP transactions. Cryptographic data resulting from TLS authentication is used for holder-of-key validation of a SAML assertion. This strengthens the assurance of the resulting authentication context and protects against credential theft, giving the service provider fresh authentication and attribute information without requiring it to perform successful validation of the certificate... A principal uses an HTTP user agent to either access a web-based resource at a service provider or access an identity provider such that the service provider and desired resource are understood or implicit. In either case, the user agent needs to acquire a SAML assertion from the identity provider. The user agent makes a request to the identity provider using client TLS authentication. The X.509 certificate supplied in this transaction is used primarily to supply a public key that is associated with the principal. 
The identity provider authenticates the principal by way of this TLS authentication or any other method of its choice. The identity provider then produces a response containing at least an assertion with holder-of-key subject confirmation and an authentication statement for the user agent to transport to the service provider. This assertion is presented by the user agent to the service provider over client TLS authentication to prove possession of the private key matching the holder-of-key confirmation in the assertion. The service provider should rely on no information from the certificate beyond the key; instead, it consumes the assertion to create a security context. The TLS key may then be used to persist the security context rather than a cookie or other application-layer session. To implement this scenario, a profile of the SAML Authentication Request protocol is used in conjunction with the HTTP Redirect, HTTP POST and HTTP Artifact bindings. It is assumed that the user is using an HTTP user agent capable of presenting client certificates during TLS session establishment, such as a standard web browser..."
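The key property of the profile above—the SP compares the key presented over client TLS with the key named in the assertion's holder-of-key confirmation, and relies on nothing else in the certificate—can be reduced to a single comparison. The sketch below is a hypothetical illustration with stand-in byte strings, not real key material or a real SAML implementation.

```python
# Sketch of the holder-of-key check described in the profile: the service
# provider matches the public key from client TLS against the key in the
# assertion's SubjectConfirmation, ignoring the rest of the certificate.
# Key values here are stand-ins, not real key material.
import hmac

def holder_of_key_ok(tls_client_pubkey: bytes,
                     assertion_confirmation_key: bytes) -> bool:
    """Confirm possession: the TLS-presented key must equal the assertion's key.

    compare_digest gives a constant-time comparison, a common habit when
    comparing security-relevant values.
    """
    return hmac.compare_digest(tls_client_pubkey, assertion_confirmation_key)

key = b"-----stand-in public key bytes-----"
print(holder_of_key_ok(key, key))           # True: create the security context
print(holder_of_key_ok(key, b"other key"))  # False: reject the assertion
```

Note how this matches the excerpt: no certificate-chain validation happens here at all; trust comes from the identity provider's signed assertion, and the TLS key only proves that the presenter is the confirmed subject.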


Web Services: RPC, REST, and Messaging
Paul Done, Blog

How to choose a model for interoperable communication in the enterprise? For the implementation of Web Services in the enterprise environment, I've seen many different technologies used. Recently, in my spare moments, I've reflected on this and have come to the conclusion that all these technologies tend to fit one of three models (or hybrids of these models). I would summarise these three models as: (1) Remote Procedure Calls (RPC). A client-server based remotable pattern where a subset of an existing system's local functions is exposed pretty much 'as-is' over the wire to client programs. (2) Resource-oriented Create-Read-Update-Delete (CRUD). A client-server based resource-oriented pattern where the server-side provides a representation of a set of resources (often hierarchical) and exposes Create, Read, Update and Delete capabilities for these resources to client programs. (3) Messaging (e.g., as commonly seen with Message Oriented Middleware and B2B). Messages or documents are passed asynchronously between peer systems in either direction (though not always both). Sometimes it's hard to distinguish between these models and where the boundaries lie. In fact, I don't think there are boundaries, only grey areas, and all three models lie in the same spectrum. In the Web Services world, we may typically implement these three models using one of the following three approaches: (1') Remote Procedure Calls: SOAP using a synchronous RPC programming approach and, typically, generated 'skeletons/stubs' and some sort of Object-to-XML marshalling technology. (2') Resource-oriented Create-Read-Update-Delete: REST or 'RESTful Web Services' or ROA, re-using World-Wide-Web based approaches and standards like HTTP and URIs. (3') Messaging: SOAP using an asynchronous Message/Document passing approach where invariably the documents are defined by schemas and, often, the use of message-level (rather than transport-level) security elements is required... 
When faced with the REST zealot or the WS-* zealot, we probably need to bear this spectrum in mind. For the Web Services paradigm, there is no 'one-size-fits-all' answer, and the specific requirements of a given situation should dictate which position in this spectrum best lends itself to satisfying them. Also, the overlap between the models may be greater [than shown in the diagram]. For example, some would argue that REST can happily and more appropriately be used to fulfil what would otherwise be RPC-oriented problems, in addition to solving resource-oriented CRUD style problems.
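The three models above can be contrasted with a toy example. This is not a real web-service stack, just a hypothetical in-process sketch of the same operation—recording an order—expressed in each style:

```python
# Toy illustration of the three models, applied to one operation.
from queue import Queue

# (1) RPC style: the caller invokes a named remote procedure and waits
#     for its result (this function stands in for a generated stub).
def submit_order(customer: str, item: str) -> str:
    return f"order accepted: {customer}/{item}"

# (2) Resource-oriented CRUD style: the caller creates a representation
#     of a resource at a hierarchical, URI-like address (cf. HTTP PUT).
orders: dict = {}                                    # stand-in resource store
def create(path: str, representation: dict) -> None:
    orders[path] = representation

# (3) Messaging style: the caller drops a self-describing document on a
#     queue and carries on; a peer consumes it asynchronously.
order_queue: Queue = Queue()
def send(document: dict) -> None:
    order_queue.put(document)

print(submit_order("acme", "widget"))                        # RPC result
create("/orders/1001", {"customer": "acme", "item": "widget"})  # CRUD
send({"type": "order", "customer": "acme", "item": "widget"})   # messaging
```

Even in this toy form the grey areas show: the CRUD call is arguably an RPC with a constrained verb set, and the message document could carry the same payload as either—which is the "same spectrum" point made above.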


Sponsors

XML Daily Newslink and Cover Pages are sponsored by:

BEA Systems, Inc. http://www.bea.com
EDS http://www.eds.com
IBM Corporation http://www.ibm.com
Primeton http://www.primeton.com
SAP AG http://www.sap.com
Sun Microsystems, Inc. http://sun.com

XML Daily Newslink: http://xml.coverpages.org/newsletter.html
Newsletter Archive: http://xml.coverpages.org/newsletterArchive.html
Newsletter subscribe: newsletter-subscribe@xml.coverpages.org
Newsletter unsubscribe: newsletter-unsubscribe@xml.coverpages.org
Newsletter help: newsletter-help@xml.coverpages.org
Cover Pages: http://xml.coverpages.org/




Document URI: http://xml.coverpages.org/newsletter/news2008-02-26.html
Robin Cover, Editor: robin@oasis-open.org