XML Daily Newslink. Thursday, 21 January 2010

A Cover Pages Publication http://xml.coverpages.org/
Provided by OASIS and Sponsor Members
Edited by Robin Cover


This issue of XML Daily Newslink is sponsored by:
Microsoft Corporation http://www.microsoft.com



W3C First Public Working Draft for Contacts API Specification
Richard Tibbett (ed), W3C Technical Report

Members of the W3C Device APIs and Policy Working Group have published a First Public Working Draft for The Contacts API specification. It defines an API that provides access to a user's unified address book.

The API has been designed to meet the requirements and use cases specified in the draft. Use cases: (1) Upload a set of contact details to a user's social network; (2) Download a set of contact details from a user's social network; (3) A user would like to keep their work address book and personal address book separate; (4) A user maintains a single unified address book but would like to maintain groups of contacts within that address book; (5) Use a web interface to manage contact details on both the user's device and the web; (6) A user would like to export contacts from one address book store and import them to another address book store; (7) A user would like to be notified when friends have a birthday coming up; (8) A user would like his/her contacts to update their own contact details via a mediating Web Application and sync any changes to their current address book.

Details: "The Contacts API defines a high-level interface to provide access to the user's unified contact information, such as names, addresses and other contact information. The API itself is agnostic of any underlying address book sources and data formats... The Contacts interface exposes a database collecting contacts information, such that they may be created, found, read, updated, and deleted. Multiple address books, taken from different sources, can be represented within this unified address book interface...

The programmatic styles of the Contacts API and Geolocation API are very similar, and because they both have the same implied user experience within the same implied User Agent, the general security and privacy considerations of both APIs should remain common. The ability to align the security and privacy considerations of the Geolocation API with DAP APIs is important for the potential future benefit of making any security and privacy mechanisms developed within the DAP WG applicable to the Geolocation API at some point in its own ongoing development... A conforming implementation of this specification must provide a mechanism that protects the user's privacy, and this mechanism should ensure that no contact information is creatable, retrievable, updatable or removable without the user's express permission...

See also: Google's implementation of Portable Contacts


Labels for Common Location-Based Services
Andrea Forte and Henning Schulzrinne (eds), IETF Internet Draft

Members of the IETF Emergency Context Resolution with Internet Technologies (ECRIT) Working Group have released a new draft for the Standards Track specification Labels for Common Location-Based Services, updating the previous draft of March 23, 2009. The document defines "a registry for describing the types of services available at a specific location. The registry is expected to be referenced by other protocols that need a common set of service terms as protocol constants. In particular, we define location-based service as either a point at a specific geographic location (e.g., bus stop) or a service covering a specific region (e.g., pizza delivery)..."

Details: "Many mobile devices are now equipped to determine the user's geographic location, either through GPS, cell-tower mapping or a network-based triangulation mechanism. Once location information is available, it is natural to want to look up near-by places that provide a specific service, sometimes called points-of-interest (POI). Examples of such services include restaurants, stores, hospitals, automatic teller machines and museums.

To allow such systems to operate across large geographic areas and for multiple languages, it is useful to define a common set of terms, so that the same service is labeled with the same token regardless of who created a particular location service. The number of different labels is potentially very large, but only a relatively small subset of common services is of particular interest to mobile users, such as travelers and commuters. This document focuses on labels commonly found on maps or in navigation devices.

This document creates a registry of service labels and an initial set of values. The registry is protocol-agnostic and should work for all protocols that can handle alphanumeric strings, including "LoST: A Location-to-Service Translation Protocol." LoST is an XML-based protocol for mapping service identifiers and geodetic or civic location information to service contact URIs. In particular, it can be used to determine the location-appropriate Public Safety Answering Point (PSAP) for emergency services..."
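
To make the intended use concrete, the following minimal TypeScript sketch builds a LoST findService request (request shape as in RFC 5222) carrying a service label. The "urn:service:" prefix and the "restaurant" token are assumptions for illustration, not registered values; the draft's registry defines the actual label set:

    // Build a LoST <findService> request asking which services with a
    // given label serve a geodetic point.
    function buildFindService(lat: number, lon: number, label: string): string {
      return `<?xml version="1.0" encoding="UTF-8"?>
    <findService xmlns="urn:ietf:params:xml:ns:lost1"
                 xmlns:gml="http://www.opengis.net/gml"
                 recursive="true" serviceBoundary="value">
      <location id="loc1" profile="geodetic-2d">
        <gml:Point srsName="urn:ogc:def:crs:EPSG::4326">
          <gml:pos>${lat} ${lon}</gml:pos>
        </gml:Point>
      </location>
      <service>urn:service:${label}</service>
    </findService>`;
    }

    // e.g. find restaurants near a point (label assumed for illustration)
    console.log(buildFindService(40.7128, -74.006, "restaurant"));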

See also: XML and Emergency Management


Earth Observation Application Profile for OGC Catalogue Services
Staff, OGC Announcement

The Open Geospatial Consortium (OGC) has announced the adoption and availability of the OGC Earth Observation (EO) Application Profile for the OGC Catalogue Services (CSW) Specification, Version 2.0.2. The EO-CSW standard will benefit a wide range of stakeholders involved in the provision and use of data generated by satellite-borne and aerial radar, optical and atmospheric sensors.

The EO-CSW standard describes a set of interfaces, bindings and encodings that can be implemented in catalog servers that data providers will use to publish collections of descriptive information (metadata) about Earth Observation data and services. Developers can also implement this standard as part of Web clients that enable data users and their applications to search and exploit these collections of Earth Observation data and services efficiently.

This specification is part of a set that describes services for managing Earth Observation (EO) data products. The services include collection-level and product-level catalogues, online ordering for existing and future products, online access, etc. These services are put into context in an overall document, 'Best Practices for EO Products'. The services proposed are intended to support the identification of EO data products from previously identified data collections; in other words, the search and presentation of metadata from catalogues of EO data products.

The intent of the profile is to describe a cost-effective interface that can be supported by many data providers (satellite operators, data distributors...), most of whom have existing (and relatively complex) facilities for the management of these data. The strategy is to reuse as far as possible the SOAP binding defined in the ISO Application Profile, except the schemas defining the information model. To achieve a cost-effective interface, some choices will be limited by textual comments. EO data product collections are usually structured to describe data products derived from a single sensor onboard a satellite or series of satellites. Products from different classes of sensors usually require specific product metadata. The following classes of products have been identified so far: radar, optical, atmospheric. The proposed approach is to identify a common set of elements grouped in a common (HMA) schema and to extend this common schema to add the sensor-specific metadata.
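
As an illustrative sketch of the client side, the following TypeScript issues a generic CSW 2.0.2 KVP GetRecords search. Note that the EO profile itself specifies a SOAP binding and EO-specific schemas on top of CSW, and the endpoint and constraint here are hypothetical:

    // Minimal catalogue search using the generic CSW 2.0.2 KVP
    // GetRecords binding; endpoint and constraint are made up.
    const endpoint = "https://catalogue.example.org/csw"; // hypothetical

    const params = new URLSearchParams({
      service: "CSW",
      version: "2.0.2",
      request: "GetRecords",
      typeNames: "csw:Record",
      resultType: "results",
      elementSetName: "summary",
      constraintLanguage: "CQL_TEXT",
      constraint_language_version: "1.1.0",
      constraint: "AnyText LIKE '%optical%'", // e.g. products from optical sensors
    });

    fetch(`${endpoint}?${params}`)
      .then((res) => res.text())
      .then((xml) => console.log(xml)); // a csw:GetRecordsResponse document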


U.S. Federal Trade Commission Hosts Roundtable on Consumer Privacy
Staff, FTC Announcement

"The U.S. Federal Trade Commission has released the agenda for its second roundtable on consumer privacy issues scheduled for January 28, 2010. The second roundtable, hosted by the Berkeley Center for Law and Technology, will take place at the University of California, Berkeley, School of Law Booth Auditorium. The roundtable is the second of three public events designed to explore the privacy challenges that are posed by technology and business practices that collect and use consumer data. This second meeting continues the public dialogue by focusing on how technology affects consumer privacy, including its potential to weaken and/or strengthen privacy protections. The roundtable will also explore privacy implications of several evolving technologies, including social networking and other platform services, cloud computing, and mobile computing.

Commissioner Pamela Jones Harbour and Bureau of Consumer Protection Director David Vladeck will kick off the event, and members of industry, consumer groups, academia and government will participate in the five panels. Daniel J. Weitzner, Associate Administrator for Policy, National Telecommunications and Information Administration of the Department of Commerce, will provide special remarks.

The Commission also released more information on its third and final privacy roundtable, to be held on March 17, 2010. The event will take place at the FTC Conference Center in Washington, DC. The third roundtable will address such issues as how best to protect health data and other sensitive consumer information, and identity management and accountability approaches to privacy. It will also look back at some of the themes raised throughout the series of roundtable events.

The Federal Trade Commission works for the consumer to prevent fraudulent, deceptive, and unfair business practices and to provide information to help spot, stop, and avoid them. The FTC enters Internet, telemarketing, identity theft, and other fraud-related complaints into Consumer Sentinel, a secure, online database available to more than 1,700 civil and criminal law enforcement agencies in the U.S. and abroad..."

See also: the meeting agenda


Call for Participation: W3C Workshop on the Next Steps for RDF
Staff, W3C Announcement

W3C is organizing a Workshop on the Next Steps for RDF, to be held around June 2010, as described in the Call for Participation. The deadline for position papers is 29-March-2010. Each participant in the workshop must be associated with a position paper. W3C membership is not required to participate in the Workshop.

The goal of the workshop is to gather feedback from the Web community on whether, and if so in which direction, RDF should evolve. One of the main issues the Workshop should help decide is whether it is timely for W3C to start a new RDF Working Group to define and standardize a next version of RDF.

While a new version of RDF may include changes in terms of features, semantics, and serialization syntax(es), backward compatibility is of paramount importance. Indeed, RDF has been widely deployed in tools and applications, and the last few years have seen a significant uptake of Semantic Web technologies and the publication of billions of triples stemming from public databases (see, e.g., the Linked Open Data community). It would therefore be detrimental to this evolution if RDF were seen as unstable and if the validity of current applications were jeopardized by a future evolution. As a consequence, with any changes to RDF, backward compatibility requirements should be formalized..."

Background: "The Resource Description Framework (RDF), including the general concepts, its semantics, and an XML Serialization (RDF/XML), have been published in 2004. Since then, RDF has become the core architectural block of the Semantic Web, with a significant deployment in terms of tools and applications. As a result of the R&D activities and the publication of newer standards like SPARQL, OWL, POWDER, or SKOS, but also due to the large scale deployment and applications, a number of issues regarding RDF came to the fore. Some of those are related to features that are not present in the current version of RDF but which became necessary in practice (e.g., the concept of Named Graphs). Others result from the difficulties caused by the design decisions taken in the course of defining the 2004 version of RDF (e.g., restrictions whereby literals cannot appear as subjects). Definition of newer standards have also revealed difficulties when applying the semantics of RDF (e.g., the exact semantics of blank nodes for RIF and OWL, or the missing connection between URI-s and the RDF resources named by those URI-s for POWDER). New serializations formats (e.g., Turtle) have gained a significant support by the community, while the complications in RDF/XML syntax have created some difficulties in practice as well as in the acceptance of RDF by a larger Web community. Finally, at present there is no standard programming API to manage RDF data; the need may arise to define such a standard either in a general, programming language independent way or for some of the important languages (Javascript/ECMAscript, Java, Python, etc)..."

See also: the Workshop Call for Participation


Principles for Standardized REST Authentication
George Reese, O'Reilly Technical

"Working with the programming APIs for cloud providers and SaaS vendors has taught me two things: (i) There are very few truly RESTful programming APIs. (ii) Everyone feels the need to write a custom authentication protocol. I've programmed against more web services interfaces than I can remember. In the last month alone, I've written to web services APIs for Aria, AWS, enStratus, GoGrid, the Rackspace Cloud, VMOps, Xero, and Zendesk. Each one requires a different authentication mechanism. Two of them (Aria and AWS) defy all logic and require different authentication mechanisms for different parts of their respective APIs. Let's end this here and now...

Here's a set of standards that I think should be in place for any REST authentication scheme. Here's the summary: (1) All REST API calls must take place over HTTPS with a certificate signed by a trusted CA. All clients must validate the certificate before interacting with the server. (2) All REST API calls should occur through dedicated API keys consisting of an identifying component and a shared, private secret. Systems must allow a given customer to have multiple active API keys and de-activate individual keys easily. (3) All REST queries must be authenticated by signing the query parameters sorted in lower-case, alphabetical order using the private credential as the signing token. Signing should occur before URL encoding the query string...
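
A minimal TypeScript (Node.js) sketch of principle (3) follows, assuming HMAC-SHA256 as the signing primitive, since the post does not name an algorithm, and using hypothetical parameter names:

    import { createHmac } from "node:crypto";

    // Sign a query per principle (3): sort parameters in lower-case
    // alphabetical order, sign the canonical string with the private
    // secret, and URL-encode only afterwards. HMAC-SHA256 is an
    // assumption; the post does not mandate a signing algorithm.
    function signQuery(params: Record<string, string>, secret: string): string {
      const canonical = Object.keys(params)
        .sort((a, b) => (a.toLowerCase() < b.toLowerCase() ? -1 : 1))
        .map((k) => `${k}=${params[k]}`) // signed BEFORE URL encoding
        .join("&");
      return createHmac("sha256", secret).update(canonical).digest("base64");
    }

    // Hypothetical request: the key id identifies the caller; the
    // shared secret itself is never transmitted.
    const params = { Action: "listServers", apiKeyId: "AKID123", timestamp: "1264032000" };
    const signature = signQuery(params, "s3cr3t-shared-secret");
    const query = new URLSearchParams({ ...params, signature }).toString(); // encoded last
    console.log(query); // sent over HTTPS per principle (1)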

"This is a battle I know I am going to lose. After all, people still can't settle on being truly RESTful (just look at the AWS EC2 monstrosity of an API). Authentication is almost certainly a secondary consideration. If you are reading this post and just don't want to listen to my suggestions, I plead with you to follow someone else's example and not roll your own authentication scheme..." In his blog post 'RESTful API Authentication Schemes', Dilip Krishnan provides a summary and encourages readers to weigh in on the recommendations.

See also: Dilip Krishnan's blog post


W3C Invites Implementations of W3C XSD Component Designators
Mary Holstege and Asir S. Vedamuthu (eds), W3C Technical Report

Members of the W3C XML Schema Working Group now invite implementation of the Candidate Recommendation specification "W3C XML Schema Definition Language (XSD): Component Designators." The Candidate Recommendation review period for this document extends until 1-March-2010. Comments on this document should be made in W3C's public installation of Bugzilla, specifying 'XML Schema' as the product.

A test suite is under development that identifies the set of canonical schema component paths that should be generated for particular test schemas, and that relates certain non-canonical component paths to the corresponding canonical schema component paths. The W3C XML Schema Working Group has agreed on the following specific CR exit criteria: (1) A test suite is available which provides cases for each axis and component type, both for the XML Schema 1.0 component model and the XML Schema 1.1 component model. (2) Generation or interpretation of canonical schema component paths has been implemented successfully by at least two independent implementations. (3) Generation or interpretation of each axis and component for non-canonical schema component paths has been implemented successfully by at least two independent implementations. (4) The Working Group has responded formally to all issues raised against this document during the Candidate Recommendation period.

"XML Schema: Component Designators" defines a scheme for identifying XML Schema components as specified by 'XML Schema Part 1: Structures' and 'XML Schema Part 2: Datatypes'. Part 1 of the W3C XML Schema Definition Language (XSD) recommendation defines these schema components, where Section 2.2 lays out the inventory of schema components into three classes: (a) Primary components: simple and complex type definitions, attribute declarations, and element declarations (b) Secondary components: attribute and model group definitions, identity-constraint definitions, and notation declarations (c) "Helper" components: annotations, model groups, particles, wildcards, and attribute uses In addition there is a master schema component, the schema component representing the schema as a whole..."

See also: the W3C XML Schema Working Group


Windows Domain to Amazon EC2 Single Sign-On Access Solutions
Abel Avram, InfoQueue

David Chappell, Principal of Chappell & Associates, has written a white paper proposing several solutions for Single Sign-On (SSO) access to applications deployed on Amazon EC2 from a Windows domain. InfoQ explored these solutions to understand the benefits and tradeoffs each one presents.

The paper is: Connecting to the Cloud: Providing Single Sign-On to Amazon EC2 Applications from an On-Premises Windows Domain. Excerpt: "Users hate having multiple passwords. Help desks hate multiple passwords too, since users forget them. Even IT operations people hate them, because managing and synchronizing multiple passwords is expensive and problematic. Providing single sign-on (SSO) lets users log in just once, then access many applications without needing to enter more passwords. It can also make organizations more secure by reducing the number of passwords that must be maintained. And for vendors of Software as a Service (SaaS), SSO can make their applications more attractive by letting users access them with less effort...

With the emergence of cloud platforms, new SSO challenges have appeared. For example, Amazon Web Services (AWS) provides the Amazon Elastic Compute Cloud (Amazon EC2). This technology lets a customer create Amazon Machine Images (AMIs) containing an operating system, applications, and more. The customer can then launch instances of those AMIs (virtual machines) to run applications on the Amazon cloud. Similarly, Microsoft provides Windows Azure, which lets customers run Windows applications on Microsoft's cloud. When an application running on a cloud platform needs to be accessed by a user in an on-premises Windows domain, giving that user single sign-on makes sense. Fortunately, there are several ways to do this..."

"SSO is an important feature to have when the number of on-premises and Internet accounts created by users grow to large numbers, making the task of administering them increasingly difficult. This will likely result in more requests to software vendors for SSO support/solutions since these make the users' lives simpler and reduce administration costs..."

See also: the white paper


Sponsors

XML Daily Newslink and Cover Pages sponsored by:

IBM Corporation http://www.ibm.com
Microsoft Corporation http://www.microsoft.com
Oracle Corporation http://www.oracle.com
Primeton http://www.primeton.com
Sun Microsystems, Inc. http://sun.com

XML Daily Newslink: http://xml.coverpages.org/newsletter.html
Newsletter Archive: http://xml.coverpages.org/newsletterArchive.html
Newsletter subscribe: newsletter-subscribe@xml.coverpages.org
Newsletter unsubscribe: newsletter-unsubscribe@xml.coverpages.org
Newsletter help: newsletter-help@xml.coverpages.org
Cover Pages: http://xml.coverpages.org/


