The OASIS Cover Pages: The Online Resource for Markup Language Technologies
Last modified: July 08, 2009
XML Daily Newslink. Wednesday, 08 July 2009

A Cover Pages Publication http://xml.coverpages.org/
Provided by OASIS and Sponsor Members
Edited by Robin Cover


This issue of XML Daily Newslink is sponsored by:
Oracle Corporation http://www.oracle.com



OASIS Announces Public Review for Test Assertions Guidelines Version 1.0
Stephen D. Green and Dmitry Kostovarov (eds), OASIS Committee Draft

Members of the OASIS Test Assertions Guidelines (TAG) Technical Committee, chaired by Jacques Durand and Patrick Curran, have released a Committee Draft of the "Test Assertions Guidelines Version 1.0" specification for public review through September 7, 2009.

This OASIS TC was chartered to develop guidelines for the creation and usage of test assertions by any group involved in designing a specification or standard for which software implementations are expected to be developed. Test Assertions (TAs) associated with a specification or standard (a target specification) have recognized benefits: (1) improving the quality of a specification during its design, and (2) reducing the lead time necessary to create a test suite for the target specification.

The document for review "provides guidelines and best practices for writing test assertions along with mandatory and optional components of a test assertion model. Its purpose is to help the reader understand what test assertions are, their benefits, and most importantly, how they are created... Test assertions may help provide a tighter specification: Any ambiguities, contradictions and statements which require excessive resources for testing can be noted as they become apparent during test assertion creation. If there is still an opportunity to correct or improve the specification, these notes can be the basis of comments to the specification authors. Test assertions provide a starting point for writing a conformance test suite or an interoperability test suite for a specification that can be used during implementation. They simplify the distribution of the test development effort between different organizations while maintaining consistent test quality. By tying test output to specification statements, test assertions improve confidence in the resulting test and provide a basis for coverage analysis..."
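The test assertion model described above can be sketched as a small data structure. This is a minimal illustration, not the normative OASIS model: the field names follow the core components the guidelines discuss (identifier, normative source, target, predicate, prescription level), but the exact structure and the "widget" specification are invented for the example.

```python
from dataclasses import dataclass

# Illustrative sketch of a test assertion record; field names are
# assumptions based on the components named in the guidelines.
@dataclass
class TestAssertion:
    ta_id: str                 # unique identifier for the assertion
    normative_source: str      # the specification statement being tested
    target: str                # the class of artifact the assertion applies to
    predicate: str             # the condition that must hold for conformance
    prescription_level: str = "mandatory"  # strength of the source requirement

# Hypothetical assertion against an imaginary "widget" specification:
ta = TestAssertion(
    ta_id="widget-TA-1",
    normative_source="Section 3.1: a widget MUST declare a version attribute",
    target="widget document",
    predicate="the root element carries a non-empty 'version' attribute",
)
print(ta.ta_id, ta.prescription_level)
```

Writing assertions in this shape is what makes the distributed test development described above possible: each organization can take a disjoint set of assertion ids and produce test cases that trace back to the same specification statements.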

See also: the announcement


Summary of W3C Workshop on Speaker Biometrics and VoiceXML 3.0
Judith Markowitz, Ken Rehor, Kazuyuki Ashimura (eds), W3C Workshop Report

W3C has now published a summary and full minutes of the "Workshop on Speaker Biometrics and VoiceXML 3.0." The workshop was held in Menlo Park, California, USA, on 5-6 March 2009. Participants from 15 organizations focused discussion on Speaker Identification and Verification (SIV) functionality within VoiceXML 3.0, and on identifying and prioritizing directions for that functionality.

The major "takeaways" from the Workshop were confirmation that SIV fits into the VoiceXML space and the creation of the "Menlo Park Model", a VoiceXML architecture with SIV support. The Working Group will continue to discuss how to incorporate the requirements expressed at the Workshop into VoiceXML 3.0 and improve the specification.

Workshop participants clarified why SIV functionality should be added to VoiceXML: (1) The system would be more responsive, so VoiceXML could shorten customer-perceived latency and provide performance benefits to users; (2) It would be easier for developers to build applications, because the programming interface would be consistent with the way they use other VoiceXML resources and low-level operations would be hidden from them; (3) Adding SIV to a standard would make it portable and facilitate integration with the Web model, because it makes SIV applications consistent with that model and provides economies of scale in hosted environments; (4) Standardization of an easy-to-use API would minimize vendor lock-in and grow the market; (5) Support in VoiceXML enables SIV use (without the application server) with intermittent/offline connectivity.

See also: Voice Extensible Markup Language (VoiceXML) 3.0


Optional XML in Relational Databases: Using JAXB and Java Annotations
Stephen B Morris, IBM developerWorks

"Part 1 of this two-part article series demonstrated how to use JAXB to transform from the XML domain into Java code. This notion of inter-domain transformation is useful in many fields of study—for example, moving from the time domain into the frequency domain in the field of signal processing. A related example is that of XSLT, where a stylesheet is used to transform XML into some other text format, such as HTML.

Moving from XML data into relational data might seem a bit convoluted. You start with XSD and XML files and transform them with JAXB into corresponding Java classes. Then, you use ORM technology to populate the database. Isn't this a lot of work? Clearly, in this article, the problem domain is very small. But if you scale things up to an average-sized enterprise application, you'll typically have a great many XML objects. These are the business domain objects, and defining them in XML makes a lot of sense. For one thing, XML definition allows for object definition by non-programmers. The other big benefit is that you can then use JAXB to flawlessly transform the XML into Java entities. Once the XML objects are in the Java domain, the power of annotations comes to the fore, allowing you to minimally modify the Java classes to make them into persistent entities. From here, it's only a short trip to getting the data into a relational database..."
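The pipeline the article describes (XML documents to generated Java classes via JAXB, then to a relational database via ORM annotations) can be sketched in miniature with the Python standard library instead of Java: parse XML into plain records, then persist them relationally. The element names, columns, and data below are invented for illustration; this is an analogy to the article's approach, not its actual JAXB code.

```python
import sqlite3
import xml.etree.ElementTree as ET

# Hypothetical business-domain XML, standing in for the article's XSD-backed input.
XML = """<customers>
  <customer id="1"><name>Ada</name></customer>
  <customer id="2"><name>Grace</name></customer>
</customers>"""

# Step 1: XML domain -> object domain (JAXB's role in the article).
rows = [(int(c.get("id")), c.findtext("name"))
        for c in ET.fromstring(XML).findall("customer")]

# Step 2: object domain -> relational domain (the ORM's role).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT)")
db.executemany("INSERT INTO customer VALUES (?, ?)", rows)
print(db.execute("SELECT name FROM customer ORDER BY id").fetchall())
# -> [('Ada',), ('Grace',)]
```

The two hops mirror the article's point: once the documents are objects, persistence is routine, and the schema-to-class step is the only part that needs generated code.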


BACnet Protocol Well Positioned for Smart Grid Initiatives
Staff, ASHRAE Announcement

"The BACnet committee and its working groups recently met at ASHRAE's Annual Conference in Louisville, KY, to discuss how BACnet technologies can be used to aid development of standards to help Smart Grid efforts led by National Institute of Standards and Technology (NIST), as required by the Energy Independence and Security Act (EISA) of 2007. The committee's Utilities Integration Working Group, which has been engaging utility companies and working with national labs on grid-related technologies, including real-time pricing and automated demand response, is being re-chartered as the Smart Grid Working Group (SG-WG).

The standards group is making an update to the network security specifications for the BACnet protocol. The committee moved forward for publication an addendum that adds state-of-the-art digital signatures and encryption (SHA-256/HMAC and AES) to enable the creation of FIPS-compliant secure communications. This technology will be available on all BACnet media types and joins the capabilities of the certificate-based SSL/TLS that can be employed when using BACnet Web Services (BACnet/WS). Together, these technologies will serve the high security needs of the Smart Grid initiatives... During the conference, the committee advanced ten addenda to final publication stage, created four new addenda for first public review, and revised six addenda for additional public review..."
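The signature half of the addendum's SHA-256/HMAC scheme can be sketched with the Python standard library. The key, the message framing, and the request text below are invented and bear no relation to the actual BACnet packet layout; the sketch only shows the general sign-and-verify pattern such a mechanism relies on.

```python
import hmac
import hashlib

# Hypothetical shared key and message; real BACnet security defines its own framing.
key = b"shared-network-key"
message = b"ReadProperty,device:42,analog-input:3"

# Sender attaches an HMAC-SHA256 tag computed over the message.
signature = hmac.new(key, message, hashlib.sha256).hexdigest()

def verify(key: bytes, message: bytes, signature: str) -> bool:
    expected = hmac.new(key, message, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)  # constant-time comparison

print(verify(key, message, signature))         # True for an untampered message
print(verify(key, message + b"x", signature))  # False once the payload changes
```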

See also: BACnet specification references


Unique Identifier Quandary Exemplifies Health Net Obstacles
Greg Goth, IEEE Internet Computing

"One industry with perhaps the most to offer humanity via ubiquitous networked records—healthcare—lags far behind most others in adopting network technologies... The question of unique patient identifiers is proceeding along lines dictated by given nations' healthcare policies. Nations with single-payer insurance systems can base the digital identifier on already existing IDs. In the UK, for example, the 10-digit patient identifier issued by its National Health Service is mandated to become the default identification number by 18 September 2009, even though some local identifiers are also in place. The UK's National Patient Safety Agency (NPSA) said the mandate would go far in reducing medical errors caused by ambiguous identification within local identification schemes...

John Casillas, founding director of the Medical Banking Project, which encourages leveraging the existing financial network into healthcare applications, says the banking industry has spent more than a decade constructing interbank networks for every category of user, from large corporate transfers to consumer account balancing, and is poised to enter the healthcare market...

Financial service technology vendors are beginning to take notice of this potential market. In February, payment technology vendor Metavante announced its Wealthcare personal health record, which integrates with key data sources, including claims information received directly from payers, transactional feeds from benefit accounts, electronic medical record data, medication history, lab results, and home-monitoring systems..."

See also: The Medical Banking Project


WSO2 Looks to Make SOA Easier: Carbon Core Standalone
Paul Krill, InfoWorld

Open source SOA software vendor WSO2 has launched a stand-alone version of Carbon Core, which has served as the heart of Carbon, the company's componentized SOA framework built on the OSGi specification. Featuring capabilities for the user interface as well as for management, clustering, security, and logging, Carbon Core leverages OSGi component integration.

Developers can build composite services, the company said. Previously, Core could only be implemented by deploying WSO2 products such as its enterprise service bus, application server, or registry. The stand-alone Core lets developers bypass these products and deploy only the Carbon SOA components they want...

From the announcement: "The WSO2 Carbon Core is the heart of WSO2 Carbon, the industry's only fully componentized service-oriented architecture (SOA) platform, which is built on the OSGi specification. Using the standalone WSO2 Carbon Core—and any combination of the 100-plus components that comprise the WSO2 Carbon SOA platform—developers have unprecedented flexibility to create exactly the composite services they want. Moreover, because the hot-pluggable WSO2 Carbon components all work together automatically, developers can innovate new SOA solutions that are reliable, sustainable, and easy to maintain or grow... In addition to launching the Carbon Core, WSO2 is releasing new versions of its WSO2 WSAS and WSO2 ESB. WSO2 WSAS 3.1 offers enhanced security and run-time performance. WSO2 ESB 2.1 offers complete REST support, along with enhancements to the sequence editor, service-level policy support, and eventing..."

See also: the WSO2 announcement


Thales Key Manager Lowers Barriers to Encryption
Alex Woodie, IT Jungle

Thales next month will begin delivery of Thales Encryption Manager for Storage (TEMS), a new appliance-based key management offering designed to lower the barriers to encryption by making it easier for organizations to safeguard their encryption keys. By using key management standards, like the new Key Management Interoperability Protocol (KMIP) unveiled earlier this year, TEMS will eliminate the need for organizations to use multiple key management systems for different applications and platforms...

With the tide of data breaches and identity theft around the world continuing to rise, IT shops everywhere are looking to encryption as a way to safeguard their valuable data. Unfortunately, while industry mandates are pushing organizations to employ data encryption, the security practice is not as widespread as it could be, due to the real and perceived difficulties associated with managing the keys that encrypt and decrypt the data... Several groups of security experts and IT vendors are addressing the dilemma by proposing and developing a series of standards for the handling and management of encryption keys. Instead of requiring each embedded or stand-alone encryption application to have its own key management interface, the applications would just support a standard protocol or specification, and basically outsource the key management function to an application or device that's dedicated to that task... This is the thrust behind KMIP, a new encryption key management standard that was proposed by a group of vendors in February 2009.
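The delegation pattern described above can be illustrated with a toy sketch: applications hold only key identifiers and ask one central manager for key material, instead of each maintaining its own key store. The class and method names are invented; the real KMIP specification defines wire-level operations between clients and key servers, not a Python API.

```python
import os
import uuid

class KeyManager:
    """Toy stand-in for a dedicated key management service."""

    def __init__(self):
        self._keys = {}  # key id -> key material, held only by the manager

    def create_key(self, bits: int = 256) -> str:
        key_id = str(uuid.uuid4())
        self._keys[key_id] = os.urandom(bits // 8)
        return key_id                 # applications store only the identifier

    def get_key(self, key_id: str) -> bytes:
        return self._keys[key_id]     # a real server would enforce policy here

manager = KeyManager()
kid = manager.create_key()
assert len(manager.get_key(kid)) == 32  # one 256-bit key, one authority
```

The point of the standard is that the `create_key`/`get_key` boundary becomes a common protocol, so any encryption application can talk to any conforming key manager.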

See also: the OASIS KMIP specification activity


Semantic Web Technology to Get Update
Paul Krill, InfoWorld

"SPARQL, the query technology for the Semantic Web, is set for improvements for application development via a proposal before the World Wide Web Consortium (W3C). The Semantic Web enables more refined groupings of data. SPARQL, considered the query mechanism for the Semantic Web, has been used in applications such as complicated mashups that query data, said Ivan Herman, W3C Semantic Web Activity lead. SPARQL Query 1.1 is part of a proposal put forward by W3C earlier this month. The new version of SPARQL is anticipated for release in the fall of 2010...

Herman: "At the moment, many application patterns involve issuing repeated queries: issue a query, take the result, construct a new query based on that result and issue this new query, etc. Each query involves a communication on the wire, because the query engine might be somewhere on the Web... In the new environment, it will be possible to describe these features through standard means and client programs will be able to automatically find and interpret these descriptions and adapt their behavior accordingly..."
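The round-trip pattern Herman describes can be made concrete: each step issues a query, reads the result, and builds the next query string from it, costing one network exchange per step. The endpoint URL, IRIs, and query text below are invented; a real client would POST each string to a SPARQL endpoint and parse the bindings it returns.

```python
# Hypothetical endpoint; no network traffic happens in this sketch.
ENDPOINT = "http://example.org/sparql"

def first_query() -> str:
    return "SELECT ?person WHERE { ?person a <http://example.org/Author> }"

def followup_query(person_iri: str) -> str:
    # Built from the previous result -> a second trip over the wire.
    return (f"SELECT ?title WHERE "
            f"{{ <{person_iri}> <http://example.org/wrote> ?title }}")

q1 = first_query()
# ... q1 would be sent to ENDPOINT; suppose one binding comes back:
person = "http://example.org/people/Tolkien"
q2 = followup_query(person)
print(q2)
```

The SPARQL 1.1 work described above aims to reduce exactly this chattiness, by letting clients discover endpoint capabilities and express more of the computation in a single query.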

See also: SPARQL Query Language for RDF


An Extensible Markup Language (XML) Configuration Access Protocol (XCAP) Diff Event Package
Jari Urpalainen and Dean Willis (eds), IETF Internet Draft

The SIP Events framework (RFC 3265) describes subscription and notification conventions for the Session Initiation Protocol (RFC 3261). The Extensible Markup Language (XML) Configuration Access Protocol (XCAP) allows a client to read, write and modify XML-formatted application usage data stored on an XCAP server. While XCAP allows authorized users or devices to modify the same XML document, XCAP does not provide an effective mechanism (beyond polling) to keep resources synchronized between a server and a client. This memo defines an "xcap-diff" event package that, together with the SIP event notification framework and the XCAP diff format, allows a user to subscribe to changes in an XML document, and to receive notifications whenever the XML document changes.

There are three basic features that this event package enables: (1) First, a client can subscribe to a list of XCAP documents' URLs in a collection located on an XCAP server. This allows a subscriber to compare server resources with its local resources using the URLs and the strong entity tag (ETag) values of XCAP documents, which are shown in the XCAP Diff format, and to synchronize them. (2) Second, this event package can signal a change in those documents in one of three ways. The first mode only indicates the event type and does not include document contents, so the subscriber uses HTTP to retrieve the updated document. The second mode includes document content changes in notification messages, using the XML-Patch-Ops format with minimal notification size. The third mode also includes document content changes in notification messages with the same XML-Patch-Ops format, but is more verbose, and shows the full HTTP version-history. (3) Third, the client can subscribe to specific XML elements or attributes (XCAP components) showing their existing contents in the resulting XCAP Diff format notification messages...
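The first mode above (event type only, no document contents) can be sketched with the standard library: the notification reports per-document ETags, and the client compares them against its local copies to decide what to re-fetch over HTTP. The notification XML below is a simplified illustration, not a literal example from the draft, and the document selectors and ETag values are invented.

```python
import xml.etree.ElementTree as ET

# Simplified xcap-diff notification: two documents with their current server ETags.
NOTIFICATION = """<xcap-diff xmlns="urn:ietf:params:xml:ns:xcap-diff">
  <document sel="resource-lists/users/sip:alice@example.com/index"
            new-etag="7ahggs"/>
  <document sel="rls-services/users/sip:alice@example.com/index"
            new-etag="ffl19x"/>
</xcap-diff>"""

NS = "{urn:ietf:params:xml:ns:xcap-diff}"

# The client's last-known ETags (only the first document is up to date).
local_etags = {"resource-lists/users/sip:alice@example.com/index": "7ahggs"}

stale = [d.get("sel")
         for d in ET.fromstring(NOTIFICATION).iter(f"{NS}document")
         if local_etags.get(d.get("sel")) != d.get("new-etag")]
print(stale)  # documents whose server ETag differs -> retrieve via HTTP
```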

See also: the IETF Session Initiation Protocol (SIP) Working Group


XBRL US Pacific Rim Technology Workshop to Feature Case Studies and Advanced Topics in XML Development
Staff, XBRL US Announcement

XBRL US, a nonprofit consortium for XML business reporting, announced that detailed case studies and advanced topics in XML and XBRL development will be presented at its upcoming Pacific Rim Technology Workshop to be held at Hitachi Data Systems Headquarters in Santa Clara, California, USA, on July 28-30, 2009.

Key topics that will be covered include: (1) Practical, hands-on taxonomy maintenance workshop; (2) Versioning discussion panel—the business requirements and proposed solutions; (3) Tagging approaches, with examples of three different methods; (4) Data quality and validation; (5) Database and business intelligence: discussion of the challenges in storing, querying and retrieving XBRL data to maximize comparability of information...

XBRL (Extensible Business Reporting Language) is a royalty-free, open specification for software that uses XML data tags to describe business and financial information for public and private companies and other organizations. XBRL US is the non-profit consortium for XML business reporting standards in the United States and is a jurisdiction of XBRL International. It represents the business information supply chain, including accounting firms, software companies, financial databases, financial printers and government agencies. Its mission is to support the implementation of XML business reporting standards through the development of taxonomies relevant for use by US public and private sectors, working with a goal of interoperability between sectors, and by promoting adoption of these taxonomies through the collaboration of all business reporting supply chain participants.

See also: the XBRL Technology Workshop Agenda


The Metalink Download Description Format
Anthony Bryan, Tatsuhiro Tsujikawa (et al, eds.), IETF Internet Draft

An updated version of The Metalink Download Description Format has been published as an IETF Internet Draft. The document defines Metalink Documents, which use an XML-based download description format. Appendix B provides the RELAX NG Compact Schema. The layout and content of the document relies heavily on work pioneered in the Atom Syndication Format as specified in IETF RFC 4287.

"Metalink is an XML-based document format that describes a file or lists of files to be added to a download queue. Lists are composed of a number of files, each with an extensible set of attached metadata. For example, each file can have a description, checksum, and list of URIs that it is available from.

The primary use case that Metalink addresses is the description of downloadable content in a format so download agents can act intelligently and recover from common errors with little or no user interaction necessary. These errors can include multiple servers going down and data corrupted in transmission..."
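A download agent's use of such a document can be sketched with the standard library: read one file entry with its checksum and mirror URIs, so the agent can fail over between mirrors and verify the result. The instance below is illustrative rather than schema-validated, and the file name, hash value, and mirror URLs are invented.

```python
import xml.etree.ElementTree as ET

# Minimal Metalink-style document: one file, one checksum, two mirrors.
DOC = """<metalink xmlns="urn:ietf:params:xml:ns:metalink">
  <file name="example.iso">
    <hash type="sha-256">0f3a...</hash>
    <url>http://mirror-a.example.org/example.iso</url>
    <url>http://mirror-b.example.org/example.iso</url>
  </file>
</metalink>"""

NS = "{urn:ietf:params:xml:ns:metalink}"
entry = ET.fromstring(DOC).find(f"{NS}file")
mirrors = [u.text for u in entry.findall(f"{NS}url")]
checksum = entry.find(f"{NS}hash")

# If mirror-a fails mid-transfer, the agent retries against mirror-b and
# checks the completed file against the declared hash -- the error recovery
# the format is designed to enable.
print(entry.get("name"), len(mirrors), checksum.get("type"))
```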

Note: Specification source files (including XML format) are available from the SourceForge SCM Repositories; discussion is available from logs of the Google Groups Metalink Discussion.

See also: Atom references


Sponsors

XML Daily Newslink and Cover Pages sponsored by:

IBM Corporation: http://www.ibm.com
Microsoft Corporation: http://www.microsoft.com
Oracle Corporation: http://www.oracle.com
Primeton: http://www.primeton.com
Sun Microsystems, Inc.: http://sun.com

XML Daily Newslink: http://xml.coverpages.org/newsletter.html
Newsletter Archive: http://xml.coverpages.org/newsletterArchive.html
Newsletter subscribe: newsletter-subscribe@xml.coverpages.org
Newsletter unsubscribe: newsletter-unsubscribe@xml.coverpages.org
Newsletter help: newsletter-help@xml.coverpages.org
Cover Pages: http://xml.coverpages.org/



Document URI: http://xml.coverpages.org/newsletter/news2009-07-08.html  —  Legal stuff
Robin Cover, Editor: robin@oasis-open.org