The OASIS Cover Pages: The Online Resource for Markup Language Technologies
Last modified: October 07, 2008
XML Daily Newslink. Tuesday, 07 October 2008

A Cover Pages Publication
Provided by OASIS and Sponsor Members
Edited by Robin Cover

This issue of XML Daily Newslink is sponsored by:
Sun Microsystems, Inc.

Visual Orchestration Tool for Service-Based Applications
John K. Waters, Application Development Trends

Will the concept of a visual orchestration system usher in a new wave of services-based applications? That's what Active Endpoints hopes. The Waltham, Mass.-based maker of ActiveVOS has been sounding this message with what approaches messianic fervor for the past eight months. With the recent release of the 6.0 version of the product, industry analysts are listening. ActiveVOS 6.0 is an all-in-one orchestration and business process management system designed to allow Java developers, business analysts and enterprise architects to automate business processes. What makes the system a standout is its emphasis on collaboration across IT and business boundaries. Dennis Byron pointed out that the company was an early mover in the commercialization of open source development and is known for its contributions to the Business Process Execution Language (BPEL) standard. It's also known for developing standards-based products that provide business process management (BPM) functionality... ActiveVOS 6.0 offers what the company calls a "stark contrast to SOA stack products," which require the user to assemble "a complicated puzzle from piece parts before any applications can be built," said company CEO Mark Taber... Version 6.0 comes with a nice feature for Java jocks: it's designed to allow developers to reuse plain old Java objects (POJOs) as native Web services. According to the company, "processes can be thoroughly tested and simulated in ActiveVOS 6.0 even when there are no actual services available during the testing phase." There's also an emphasis in this version on flexible deployment options that allow processes to be versioned, and policies to be specified dictating what the system should do when a service is unavailable in production... The company also claims that ActiveVOS 6.0 is compliant with all major open standards.
Along with BPEL, the company lists several standards with which its product is compliant, including: Business Process Modeling Notation (BPMN), BPEL4People, and WS-Human Task specifications. ActiveVOS 6.0 is available now. A free, supported 30-day trial version is available for download.

See also: ActiveVOS 6.0 standards support

FEMA Says 'Yes' to CAP 1.1
Randy J. Stine, Radio World Online

As part of an announcement of intent in July to integrate Common Alerting Protocol (CAP) 1.1 as the standard for the Integrated Public Alert and Warning System, FEMA reiterated that all participants in the next generation of the Emergency Alert System would need to be in compliance with CAP 1.1 within 180 days of CAP's adoption. That adoption, which hasn't happened yet but is expected during the first quarter of 2009, has some observers concerned about potential EAS equipment shortages. With CAP, warning messages can be disseminated simultaneously over interoperable warning systems developed by state and local emergency managers. In addition to audio, multimedia such as video, digital photos and text could be used. Some states already use CAP for emergency warning. Some within the emergency management arena, while pleased with FEMA's decision, question whether equipment manufacturers will be able to withstand a crushing demand for new decoder boxes and meet the 180-day compliance mandate if every EAS participant is indeed required to have a CAP-capable decoder. FEMA says that arriving at standards and protocols that work for everyone is a complex process that includes partners across government and the private sector... The exact wording of FEMA's July 2008 announcement has Art Botterell, one of the architects of CAP 1.1, skeptical about the agency's intent. Botterell also speculates that any decisions made by FEMA could be "up for grabs" again when a new administration takes over in Washington, with possible leadership changes at DHS and FEMA at the beginning of the year. FEMA's July announcement did not address the funding and training that likely will be needed for emergency managers to properly originate CAP messages for next-gen EAS...
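A CAP message is itself a small XML document. The following is a minimal sketch of a CAP 1.1 alert; the element names and namespace follow the OASIS CAP 1.1 schema, but all of the identifier, sender, and event values are invented for illustration:

```xml
<alert xmlns="urn:oasis:names:tc:emergency:cap:1.1">
  <identifier>KSTO1055887203</identifier>
  <sender>KSTO@NWS.NOAA.GOV</sender>
  <sent>2008-10-07T14:30:00-05:00</sent>
  <status>Actual</status>
  <msgType>Alert</msgType>
  <scope>Public</scope>
  <info>
    <category>Met</category>
    <event>Severe Thunderstorm Warning</event>
    <urgency>Immediate</urgency>
    <severity>Severe</severity>
    <certainty>Observed</certainty>
    <headline>Severe thunderstorm warning until 4:00 PM CDT</headline>
    <area>
      <areaDesc>Extreme north central Texas</areaDesc>
    </area>
  </info>
</alert>
```

Because the format is plain XML, the same alert can feed broadcast EAS decoders, web feeds, and other warning channels simultaneously, which is the interoperability point made above.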

See also: the EAS-CAP Profile Recommendation

Mozilla Geode: Always Know Where You Are
Staff, Mozilla Labs Announcement

"You've arrived in a new city, a new continent, a new coffee shop. You don't really know where you are, and are looking for a good place to eat. You pull out your laptop, fire up Firefox, and go to your favorite review site. It automatically deduces your location, serves up some delicious suggestions a couple of blocks away, and plots directions there. For this to be possible, your browser needs to know where you are. To do this, future versions of Firefox plan to support the new W3C Geolocation Specification, which adds the native ability for Web sites to request, and for you to optionally grant access to, your location. Geode is an experimental add-on to explore geolocation in Firefox 3 ahead of the implementation of geolocation in a future product release. Geode provides an early implementation of the W3C Geolocation specification so that developers can begin experimenting with location-aware experiences using Firefox 3 today, and users can tell us what they think of the experience it provides. It includes a single experimental geolocation service provider so that any computer with WiFi can get accurate positioning data. The potential here is for more than just restaurant lookups. For example, imagine an RSS reader that knows the difference between home and work and automatically changes its behavior appropriately. Or a news site whose local section is, in fact, actually local. Or Web site authentication that only allows you to log in from certain physical locations, like your house... With Geode, when a web site requests your location, a notification bar will ask how much information you want to give that site: your exact location, your neighborhood, your city, or nothing at all... We're using Skyhook's Loki technology to map the WiFi signals in your area to your location. 
Unlike normal GPS-based methods, which can take upwards of 45 seconds to get a lock, Geode works both inside and outside with an accuracy of 10 to 20 meters, normally within a second. Please note that in this early implementation, both location and IP information are sent to the current provider, Skyhook, every time a website is granted access to your location. Skyhook's privacy policy is that they do not store or use any personally identifying information, and they promise to keep data only in anonymized aggregate. The ultimate plan for Firefox is that service providers and geolocation methods will be pluggable and user selectable, to provide users with as many choices and privacy options as possible. As an experiment, Geode is also the beginning of a conversation about location-based privacy and integrating services that share personal data into Web browsers..."
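What Geode exposes to page scripts is the draft W3C API's one-shot position request. A minimal sketch: `getCurrentPosition` and the `coords` fields follow the draft specification, while the `requestPosition` wrapper, and passing the geolocation object in as a parameter (rather than reading `navigator.geolocation` directly), are illustrative choices to keep the snippet self-contained:

```javascript
// Request the user's position once. `geo` stands in for
// navigator.geolocation (injected here so the sketch is testable
// outside a browser).
function requestPosition(geo, onFound, onError) {
  geo.getCurrentPosition(
    function (position) {
      // Per the draft spec, the fix lives on position.coords.
      onFound({
        latitude: position.coords.latitude,
        longitude: position.coords.longitude,
        accuracy: position.coords.accuracy // meters
      });
    },
    onError,
    { enableHighAccuracy: false, timeout: 45000 } // PositionOptions
  );
}

// In a Geode-enabled Firefox this would be invoked as:
//   requestPosition(navigator.geolocation, showOnMap, reportError);
```

The callback style matters: the user sees Geode's notification bar (exact location, neighborhood, city, or nothing) before the success callback ever fires, so a site only receives as much precision as the user granted.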

See also: the Channel Wire blog

W3C Geolocation API Specification
Andrei Popescu (ed), W3C Editor's Draft Specification

"The Geolocation API Specification defines an API that provides scripted access to geographical location information associated with the hosting device. Implementors should be aware that the specification is not stable; vendors interested in implementing this specification before it eventually reaches the Candidate Recommendation stage should join the mailing list and take part in the discussions... The Geolocation API defines a high-level interface to location information associated with the hosting device, such as latitude and longitude. The API itself is agnostic of the underlying location information sources. Common sources of location information include Global Positioning System (GPS) and location inferred from network signals such as IP address, RFID, WiFi and Bluetooth MAC addresses, and GSM/CDMA cell IDs. The API is designed to enable both "one-shot" position requests and repeated position updates, as well as the ability to query the last-known position. Location information is represented by latitude and longitude coordinates and optionally by reverse geocoded address information. The Geolocation API in this specification builds upon earlier work in the industry... Example use cases: (1) Finding points of interest in the user's area; (2) Annotating content with location information; (3) Automatic form-filling; (4) Showing the user's position on a map; (5) Turn-by-turn route navigation; (6) Alerting when points of interest are in the user's vicinity; (7) Up-to-date local information; (8) Location-tagged status updates in social networking applications. 
Example Requirements: must provide location data in terms of a pair of latitude and longitude coordinates; provide information about the accuracy of the retrieved location data; support "one-shot" position updates; allow an application to register to receive repeated position updates; allow an application to cheaply query the last known position; provide a way for the application to receive updates about errors that may have occurred while obtaining a location fix; allow an application to specify a desired accuracy level of the location information; be agnostic to the underlying sources of location information; allow an application to request address information as part of the location data... This specification is limited to providing a scripting API for retrieving location information associated with a hosting device. The location information is provided in terms of coordinates that apply to a geographic coordinate system. The scope of this specification does not include providing a markup language of any kind and does not include defining new URI schemes for building URIs that identify geographic locations.
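The "repeated position updates" requirement above maps onto the draft API's `watchPosition`/`clearWatch` pair. A sketch of that pattern; the two method names are from the draft specification, while the `trackUntil` helper and the injected geolocation object are illustrative:

```javascript
// Subscribe to repeated position updates and unsubscribe after
// `maxFixes` fixes have been delivered. `geo` stands in for
// navigator.geolocation.
function trackUntil(geo, maxFixes, onFix) {
  let count = 0;
  const watchId = geo.watchPosition(function (position) {
    count += 1;
    onFix(position.coords, count); // e.g. redraw a map marker
    if (count >= maxFixes) {
      geo.clearWatch(watchId); // stop receiving updates
    }
  });
  return watchId;
}
```

A turn-by-turn navigation page (use case 5 above) would keep the watch open for the whole trip and only call `clearWatch` on arrival; the counter here simply gives the sketch a definite stopping point.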

See also: the public discussion list 'public-geolocation'

NIEM Ventures Forth
Joab Jackson, Government Computer News

A partnership between the Justice and Homeland Security departments, the NIEM [U.S. National Information Exchange Model] initiative is a program to develop information exchange standards for government agencies, be they state, local or federal. Managed by the Integrated Justice Information Systems (IJIS) Institute, NIEM is based on the Global Justice XML Data Model, a highly successful data model for sharing law enforcement information across local, state and federal agencies. Rather than working on another major update, the NIEM design team decided to tweak NIEM 2.0 with a number of new features, said Paul Wormeli, head of IJIS. One feature is version independence for separate domains. NIEM hosts several domains, each with its own standardized naming conventions: one for intelligence work, one for law enforcement, one for emergency management. And each domain is managed by a different working group. Soon, each domain group will be able to update its vocabulary without waiting for a full release of NIEM. NIEM 2.1 will also include new vocabulary sets. Wormeli: "We're branching out into other disciplines; it will be the first version to offer a vocabulary for juvenile justice concerns, which uses slightly different terms than adult cases. Biometrics is also a new entry. Although some basic biometrics terms, as defined by the National Institute of Standards and Technology, were entered in Version 2.0, Version 2.1 will expand the schema for widespread use. Biometrics crosses a number of domains, including [the U.S. Visitor and Immigrant Status Indicator Technology] program as well as law enforcement applications." Since its introduction in 2005, NIEM has grown into perhaps the largest XML-based information-sharing vocabulary across government. DHS and more than 18,000 law enforcement agencies use NIEM for more than forty programs. New York City just adopted it for all exchanges of tax and health and human services information among agencies.
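Concretely, a NIEM exchange reuses elements from the shared core vocabulary alongside domain-specific ones. A hedged sketch of what such a fragment looks like; the `nc:` (niem-core) element names follow NIEM 2.0 naming conventions, but the namespace binding and values are illustrative only:

```xml
<nc:Person xmlns:nc="http://niem.gov/niem/niem-core/2.0">
  <nc:PersonName>
    <nc:PersonGivenName>Jane</nc:PersonGivenName>
    <nc:PersonSurName>Doe</nc:PersonSurName>
  </nc:PersonName>
</nc:Person>
```

Because every participating domain builds on the same core, a `Person` exchanged by a juvenile justice system and one exchanged by a biometrics application share the same base structure, which is what makes the cross-domain sharing described above workable.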

See also: on NIEM Naming and Design Rules

The Portable Contacts API: Killing the Password Anti-Pattern Once and For All
Dare Obasanjo, Blog

A common practice among social networking sites is to ask users to import their contacts from one of the big email service providers (e.g. Yahoo! Mail, Hotmail or Gmail) as part of the sign-up process. This is often seen as a way to bootstrap the user's social network on the site by telling the user who in their email address book is also a user of the site they have just joined... Google has the Contacts Data API, Yahoo! has their Address Book API and Microsoft has the Windows Live Contacts API. Each of these provides a user-centric authorization model in which, instead of a user giving their email address and password to random sites, they log in at their email provider's site and then delegate authority to access their address book to the social network site. The only problem that remains is that each site that provides an address book or social graph API is reinventing the wheel, both with regard to the delegated auth model it implements and the actual API for retrieving a user's contacts. This means that social networking sites that want to implement a contact import feature have to support a different API and delegated authorization model for each service they want to talk to, even though each API and delegated auth model effectively does the same thing. Just as OAuth has slowly been increasing in buzz as the standard that will sweep away the various proprietary delegated auth models that we have on the Web today, there has been a similar effort underway by a similar set of dedicated individuals intent on standardizing contacts and social graph APIs on the Web. The primary output from this effort is the Portable Contacts API. I've been reading the latest draft specification of the Portable Contacts API, and below are some of the highlights as well as some thoughts on them [summary/critique]... Except for the subpar work with regard to defining an XML serialization for the contacts schema, this seems like a decent specification. 
If anything, I'm concerned by the growing number of interdependent specs that seem poised to have a significant impact on the Web and yet are being defined outside of formal standards bodies in closed processes funded by big companies. For example, about half of the references in the Portable Contacts API specs are to IETF RFCs while the other half are to specs primarily authored by Google and Yahoo! employees outside of any standards body (OpenSocial, OAuth, OpenSearch, XRDS-Simple, etc)...
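To make the "one API instead of three" point concrete, here is a sketch of consuming a Portable-Contacts-style JSON response. The general shape (an `entry` array of contacts with OpenSocial-derived field names such as `displayName`) follows the draft specification, but the sample data and the helper function are invented for illustration:

```javascript
// Extract display names from a Portable-Contacts-style JSON response.
function displayNames(responseJson) {
  const response = JSON.parse(responseJson);
  return (response.entry || []).map(function (e) {
    return e.displayName;
  });
}

// Invented sample response in the draft's general shape.
const sample = JSON.stringify({
  startIndex: 0,
  itemsPerPage: 2,
  totalResults: 2,
  entry: [
    { id: "123", displayName: "Ada Lovelace" },
    { id: "456", displayName: "Alan Turing" }
  ]
});
```

A site implementing contact import would run this same parsing code against any provider that speaks Portable Contacts, rather than writing one client per proprietary address book API.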

See also: the Portable Contacts web site

An Extensible Markup Language (XML) Configuration Access Protocol (XCAP) Diff Event Package
Jari Urpalainen (ed), IETF Internet Draft

The SIP Events framework (IETF RFC 3265) describes subscription and notification conventions for the SIP protocol defined in RFC 3261. The Extensible Markup Language (XML) Configuration Access Protocol (XCAP) defined in RFC 4825 allows a client to read, write and modify XML-formatted application usage data stored on an XCAP server. XCAP is a set of conventions for mapping XML documents and document components into HTTP URIs, rules for how the modification of one resource affects another, data validation constraints, and authorization policies associated with access to those resources. While the XCAP protocol allows several authorized users or devices to modify the same XML document, XCAP does not provide an effective synchronization mechanism (beyond polling) to keep resources equivalent between a server and a client. This memo defines an "xcap-diff" event package that, together with the SIP event notification framework and the XCAP-Diff format, allows a user to subscribe to changes in an XML document, and to receive notifications whenever a change in an XML document takes place. There are three basic features that this event package enables with the XCAP-Diff change notification format. Firstly, it can list the URI references of XCAP documents from a collection located on an XCAP server. This is important when a subscriber is doing an initial synchronization or a comparison of existing server resources to locally cached XCAP documents, for example. Version-history comparisons of documents are based on the strong entity tag (ETag) values of XCAP documents, which are also indicated in the XCAP-Diff format. Secondly, this event package can signal whenever a change is happening in those resources. The changes can be reported in three ways. The simplest model is that only document creations, updates and removals are indicated. 
The actual contents of those documents are not included in the notification, and the subscriber uses the HTTP (RFC 2616) protocol to retrieve document contents. The two more complex modes allow document changes to be indicated straightaway with the XML-Patch-Ops (RFC 5261) semantics inside the XCAP-Diff format. A client can then apply a conditional patch to locally cached documents based on the strong ETag values of the documents. The most complex model produces the smallest documents, but unlike the other, typically more verbose mode, it doesn't necessarily convey the full HTTP version-history information. Lastly, XML element or attribute contents (XCAP components) can be received "in-band", that is, directly within the XCAP-Diff notification format. For example, an XCAP element's content can be requested and indicated without the need for a separate HTTP GET request. If the requested node exists, or is later created or modified, the notification body indicates its content. Similarly, removals of subscribed XCAP components are reported, for example after a successful HTTP DELETE request.
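In the simplest ("document-level") reporting mode described above, a notification body just lists the changed documents with their entity tags. A hedged sketch of such an XCAP-Diff body; the element and attribute names follow the XCAP-Diff format draft, while the XCAP root URI, document selectors, and ETag values are invented:

```xml
<xcap-diff xmlns="urn:ietf:params:xml:ns:xcap-diff"
           xcap-root="http://xcap.example.com/xcap-root/">
  <!-- An updated document: both the previous and new ETags are
       reported, so the subscriber can tell whether its cached
       copy is stale before issuing an HTTP GET. -->
  <document previous-etag="7ahggs"
            new-etag="30jfks"
            sel="resource-lists/users/sip:joe@example.com/index"/>
  <!-- A removed document is reported without a new ETag. -->
  <document previous-etag="9u3nd"
            sel="rls-services/users/sip:joe@example.com/index"/>
</xcap-diff>
```

The two richer modes would carry patch operations or element contents inside the `document` elements instead of bare ETag pairs.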

See also: the IETF Session Initiation Protocol (SIP) Working Group

Real Web 2.0: Mastering the Creative Commons
Uche Ogbuji, IBM developerWorks

If the essence of Web 2.0 is in making it easier to share and re-use information, technology is only part of the conversation. Throughout the history of the Internet, lawyers have proven all too effective at taking away much of what we gain through invention and collaboration. For Web 2.0 to flourish, its community must be diligent about taking matters into its own hands and expressing clearly the rules for sharing specific content, images, video, audio, and other media. If it's easy for a person or for programs to determine the license established by copyright of such resources, it opens things up for creativity, innovation, and collaboration to take center stage. Creative Commons (CC) is an organization of lawyers, technical experts, and managers, with a very broad community, whose goal is to "use private rights to create public goods", by allowing creators to express degrees of licensing between the knee-jerk "all rights reserved" and public domain (in other words, "no rights reserved"). Creative Commons provides the legal framework and text of licenses that allow you to say that "some rights are reserved", and allows this to be clearly discovered by others, so that they can determine whether their use is compatible with your reservations. The lawyers are involved when these reusable licenses are crafted and updated, with support and feedback from the community, with the idea that afterwards, the sharing can proceed on the Web with much less legal interference... Web 2.0 is not just about what you produce, but how it can be combined with what's produced by others. CC also makes it easy to find works that match licensing criteria. Suppose you are producing a mashup for commercial purposes, and you want to use images that are okay for such use as one of the sources. 
You can combine your usual methods for keeping tabs on likely resources, such as following specialized Web feeds, with automated validation of the license, reading the format according to the file type. CC also has several partners among media publishing sites and search engines. You can reach these by clicking "Search CC Licensed Work", which presents search forms for Google, Yahoo, Flickr, and more through iframes. In this article, you learn how to express CC licenses for your work, how to use public services for finding work from others that you can use, and how to identify such work yourself.
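The machine-discoverable license expression mentioned above is typically just a `rel="license"` link in the page markup. A minimal sketch (the license URL is the real Creative Commons Attribution 3.0 deed; the surrounding markup is illustrative):

```html
<p>
  This work is licensed under a
  <a rel="license" href="http://creativecommons.org/licenses/by/3.0/">
    Creative Commons Attribution 3.0 License</a>.
</p>
```

Because the `rel="license"` relation and the license URI are standardized, both human readers and crawlers (such as the CC-aware modes of Google and Yahoo search) can determine the reuse terms without parsing free-form legal text.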

See also: Creative Commons references

Open Source Workflow Solution: Nova Bonita
Gavin Terrill, InfoQ

Bonita is an open source workflow solution for handling long-running, user-oriented processes, providing out-of-the-box workflow and BPM functionality to handle your business processes. Bonita is compliant with the XPDL workflow standard and is downloadable under the LGPL license. "After 2 years of development, the Bonita project team has announced the release of Bonita 4.0 (also known as Nova Bonita). Bonita 4.0 is based on the PVM technology, and can be deployed as a lightweight BPM product that runs on the Standard (JSE) and Enterprise (JEE) Java platforms. Nova Bonita provides an integrated graphical environment for BPM development and execution, and comprises three modules: (1) Nova Bonita runtime: the Nova Bonita process engine. Processes can be deployed, executed and monitored through a rich API providing BPM services. (2) Nova Bonita console: a Web 2.0 graphical interface fostering the user experience during the BPM deployment, execution and monitoring phases. (3) Nova Bonita designer: a BPM development environment allowing graphical definition of processes, as well as BPM connectors for integration with existing systems. Miguel Valdes: "... we decided to build the Process Virtual Machine technology two years ago. Bonita 4.0 can now be embedded (as a BPM library) in any existing application or be deployed remotely as a traditional BPM server. In that sense, Bonita 4.0 comes with an Eclipse plug-in that allows easy development of BPM processes and related Java connectors. This plug-in can easily be added to a developer's Eclipse environment to speed up the development of BPM applications. Bonita 4.0 also meets Web 2.0 with a powerful BPM console that improves the user experience, so it's not only targeted at developers and technical architects... XPDL has been the standard supported by Bonita from its early days. 
The standard has been evolving over the last few years to cover missing features such as process-to-process communication and event support, and in particular it has become the grammar used to map the BPMN notation. Note that 7 of the top 10 commercial BPM vendors support XPDL... we decided to create the Process Virtual Machine (PVM) technology. Among other key features, the PVM allows multiple standards to be supported. We have already added support for XPDL with Bonita 4.0, we will soon release the BPEL 2.0 extension in Orchestra 4.0 (also developed by the Bonita team), and JBoss is currently adding support for jPDL as well..."

See also: Standards for Business Process Modeling

Building BACnet: 2008 BACnet International Conference
Andy McMillan

"This year's annual BACnet International conference, developed and produced in conjunction with Engineered Systems magazine, was all about connecting the dots in building automation. We looked at 'connecting the dots' wirelessly, at the device level, and at the application level with web services. We even talked about connecting the dots at the sustainability level with discussions of how BACnet can contribute to green initiatives, demand response programs and LEED certification. Celebrating past effort and current success is necessary but not sufficient in our rapidly changing world. We also need to accelerate development in the BACnet community to address new requirements and accommodate new technologies. It was clear at the conference that in several areas products and applications are pushing the envelope of the current specification. Web service applications, wireless devices and multi-device object models are among the areas where application requirements are driving suppliers to go beyond the limits of the current standard... The need for building- and energy-related information sharing will continue to increase as sustainability becomes more and more an enterprise-wide initiative. Several case studies in the web services session demonstrated the simplicity and development speed that make web services the preferred platform for BAS and energy management application information-sharing. Thanks to a lot of hard work by a few people a couple of years ago, the BACnet standard provides a specification for web services. The specification was designed around the fairly simple information exchanges that were anticipated at the time. As generally happens, though, the application requirements have continued to expand. Users are now asking for large-scale exchange of building and energy data both within their organizations and among the players in their own energy ecosystems..." 
[Note: At BACnet 2008, announcement was made for the incorporation of five addenda approved since the publication of BACnet 2004. Addendum 'T' provides a new definition for an XML syntax which can be used to represent building data in a consistent, flexible and extensible manner. According to BACnet Chair Dave Robin, "With this new IT-friendly way of representing building data, we are opening up a whole range of possible new ways to share data. XML can be used for exchanging files between systems, integrating buildings with energy utilities, and expanding enterprise integration with richer Web services..."]

See also: BACnet references

NetApp Faces Sun Lawsuit Loss
Chris Mellor, The Register

A judicial ruling in the NetApp-Sun IP lawsuit has effectively invalidated another NetApp patent. The US Patent Office also appears to be rejecting NetApp's key patents in the lawsuit. NetApp's position looks like it's crumbling. The dispute began with NetApp claiming that Sun's free distribution of its ZFS technology infringes NetApp's six patents for its WAFL (Write Anywhere File Layout) technology. WAFL is a core component of NetApp's SAN and filer products... NetApp sued Sun last year and Sun counter-sued, saying NetApp infringed 22 of its patents. The Sun strategy is to invalidate the six NetApp patents by identifying prior use of the technologies involved (prior art), meaning that NetApp did not itself invent the technology. There have been two strands to this. One has been to ask the US Patent and Trademark Office (PTO) to re-examine the six NetApp patents and test their validity. In June it granted re-examinations on five of the six and invalidated one of them, leaving one down and five to go. This begins to look bleak for NetApp. If these PTO decisions result in three more NetApp patents going down the tubes, that leaves just one NetApp patent in play, and not a core one at that. It is NetApp policy not to comment on the Sun lawsuit. Sun's win (if it is a win) and the PTO decisions potentially turn NetApp's WAFL IP into, well, IP waffle...

See also: on Sun's ZFS Flash initiative


XML Daily Newslink and Cover Pages sponsored by:

IBM Corporation
Oracle Corporation
Sun Microsystems, Inc.
