The OASIS Cover Pages: The Online Resource for Markup Language Technologies
Last modified: August 18, 2008
XML Daily Newslink. Monday, 18 August 2008

A Cover Pages Publication http://xml.coverpages.org/
Provided by OASIS and Sponsor Members
Edited by Robin Cover


This issue of XML Daily Newslink is sponsored by:
IBM Corporation http://www.ibm.com



Public Review for UN/CEFACT XML Naming and Design Rules Version 3.0
Michael Rowell (ed), UN/CEFACT Applied Technologies Group Draft

On behalf of the UN/CEFACT Applied Technologies Group, ATG2 Project Team Leader Mark Crawford announced the availability of a First Public Review for the "UN/CEFACT XML Naming and Design Rules Version 3.0" specification. NDR-ODP5-Public Review is open for comment through 20-September-2008. Mark writes: "ATG2 is pleased to announce that the NDR v3p0 is now at ODP5 Public Review. This version incorporates CCTS 3.0 and sets the stage for the forthcoming UCM ("UN/CEFACT Context Methodology Technical Specification"). It has been developed in partnership with representatives from a number of standards development organizations who have committed to its adoption: [e.g., ACORD, CIDX, GS1, HR-XML, OASIS Universal Business Language (UBL) Technical Committee, Open Application Group - OAGi]. This "XML Naming and Design Rules" specification defines an architecture and set of rules necessary to define, describe and use XML to consistently express business information exchanges. It is based on the World Wide Web Consortium suite of XML specifications and the UN/CEFACT Core Components Technical Specification. The specification will be used by UN/CEFACT to define XML Schema and schema documents which will be published as UN/CEFACT standards. It will also be used by other Standards Development Organizations who are interested in maximizing inter- and intra-industry interoperability. This technical specification has been developed to provide for XML standards-based expressions of semantic data models representing business information exchanges. It can be employed wherever business information is being shared in an open environment using XML Schema to define the structure of business content. It describes and specifies the rules and guidelines UN/CEFACT will use for developing XML schema and schema documents based on CCTS-conformant artefacts and information models developed in accordance with the UN/CEFACT CCTS Technical Specification Version 3.0.
The intended audience for this UN/CEFACT specification includes: (1) Members of the UN/CEFACT Applied Technologies Group who are responsible for development and maintenance of UN/CEFACT XML Schema; (2) The wider membership of the other UN/CEFACT Groups who participate in the process of creating and maintaining UN/CEFACT XML Schema definitions; (3) Designers of tools who need to specify the conversion of user input into XML Schema definitions adhering to the rules defined in this document; (4) Designers of XML Schema definitions outside of the UN/CEFACT Forum community. These include designers from other standards organizations and companies that have found these rules suitable for their own organizations.
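The flavor of the naming rules can be sketched with a small example. The function below derives an UpperCamelCase XML element name from a CCTS-style dictionary entry name by dropping separators; this is a deliberate simplification of the actual NDR v3.0 rules (which also handle qualifiers, representation terms, and abbreviations), and the sample entry name is invented for illustration.

```python
import re

def element_name(dictionary_entry_name: str) -> str:
    """Derive an UpperCamelCase element name from a CCTS-style
    dictionary entry name such as "Address. Street. Name".
    Illustrative only; the real NDR v3.0 rules are more detailed."""
    # Split on periods, underscores, and whitespace
    words = re.split(r"[.\s_]+", dictionary_entry_name.strip())
    # Capitalize each word and concatenate (UpperCamelCase)
    return "".join(w.capitalize() for w in words if w)

print(element_name("Address. Street. Name"))  # AddressStreetName
```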

See also: UN/CEFACT Applied Technologies Group


NIST Revises Guidance For Assigning FISMA Security Categories
William Jackson, Government Computer News

The National Institute of Standards and Technology (NIST) has updated its guidelines for mapping information in government information systems to categories that specify the types of security controls the data requires. The Federal Information Security Management Act requires that agencies assign levels of risk to information and information systems based on the likelihood and impact of exposure, modification or loss, and link the level of risk to appropriate security controls. The two-volume Special Publication 800-60 Revision 1, "Guide for Mapping Types of Information and Information Systems to Security Categories," is a revision of guidelines published in 2004. NIST also released for public comment a draft interagency report with test requirements for validating products for the Security Content Automation Protocol (SCAP). Draft NIST Interagency Report (IR) 7511, "Security Content Automation Protocol Validation Program Test Requirements," describes the requirements that products must meet to achieve SCAP validation. Independent laboratories accredited for SCAP testing by the NIST National Voluntary Laboratory Accreditation Program conduct the validations. SCAP is a NIST specification for expressing and manipulating security data in standardized ways. It can enumerate product names and vulnerabilities, including software flaws and configuration issues; identify the presence of vulnerabilities; and assign severity scores to software flaws. 
The specifications that comprise SCAP are as follows: (1) Common Vulnerabilities and Exposures (CVE), a dictionary of names for publicly known security-related software flaws; (2) Common Configuration Enumeration (CCE), a dictionary of names for software security configuration issues (e.g., access control settings, password policy settings); (3) Common Platform Enumeration (CPE), a naming convention for hardware, operating system (OS), and application products; (4) Extensible Configuration Checklist Description Format (XCCDF), an XML specification for structured collections of security configuration rules used by OS and application platforms; (5) Open Vulnerability and Assessment Language (OVAL), an XML specification for exchanging technical details on how to check systems for security-related software flaws, configuration issues, and patches; (6) Common Vulnerability Scoring System (CVSS), a method for classifying characteristics of software flaws and assigning severity scores based on these characteristics.
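To make the enumerations above concrete, here is a minimal sketch of what CVE and CPE names look like in practice. The regular expressions are simplified approximations of the naming formats as they stood in 2008, not authoritative validators, and the CPE pattern covers only the 2.x URI form.

```python
import re

# Simplified, illustrative patterns:
#   CVE-YYYY-NNNN... (CVE identifier)
#   cpe:/{part}:{vendor}:{product}[:version...] where part is h, o, or a
CVE_RE = re.compile(r"^CVE-\d{4}-\d{4,}$")
CPE_RE = re.compile(r"^cpe:/[hoa](:[^:]*){1,6}$")

print(bool(CVE_RE.match("CVE-2008-1447")))                # True (the 2008 DNS flaw)
print(bool(CPE_RE.match("cpe:/o:microsoft:windows_vista")))  # True
print(bool(CVE_RE.match("CCE-1234-5")))                   # False: a CCE, not a CVE
```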

See also: SCAP Validation Program Test Requirements


NIST Digital Signatures Documents Available for Review
Sara Caswell, NIST Announcement

"NIST has revised the first drafts of Special Publication (SP) 800-106 "Randomized Hashing for Digital Signatures", and SP 800-107 "Recommendation for Applications Using Approved Hash Algorithms." The revisions have been made has after receiving comments from many public and private individuals and organizations. NIST also would like to announce that FIPS 198-1 ("The Keyed-Hash Message Authentication Code - HMAC") has already been approved and it is also freely available online." [1] SP 800-106 ("Randomized Hashing for Digital Signatures," Draft 2) "provides a technique to randomize messages that are input to a cryptographic hash function during the generation of digital signatures using the Digital Signature Algorithm (DSA), Elliptic Curve Digital Signature Algorithm (ECDSA) and RSA. Approved cryptographic hash functions for Federal government use are specified in Federal Information Processing Standard (FIPS) 180-3, the Secure Hash Standard (SHS). Digital Signatures shall be generated as specified in FIPS 186-3, the Digital Signature Standard (FIPS 186-3). Collision resistance is a required property for the cryptographic hash functions used in Digital Signature Applications. The intent of this randomization method is to strengthen the collision resistance provided by the cryptographic hash functions in digital signature applications without any changes to the core cryptographic hash functions and digital signature algorithms. A message will have a different digital signature each time it is signed if it is randomized with a different random value. The randomization method specified in this Recommendation is an approved randomized hashing method." The comment period for SP 800-106 closes on September 5, 2008. 
[2] SP 800-107, edited by Quynh Dang, provides security guidelines for achieving the required or desired security strengths when using cryptographic applications that employ the approved cryptographic hash functions specified in Federal Information Processing Standard (FIPS) 180-3, such as digital signature applications, Keyed-Hash Message Authentication Codes (HMACs) and Hash-based Key Derivation Functions (HKDFs). A hash algorithm is used to map a message of arbitrary length to a fixed-length message digest. Federal Information Processing Standard (FIPS) 180-3, the Secure Hash Standard (SHS), specifies five approved hash algorithms: SHA-1, SHA-224, SHA-256, SHA-384, and SHA-512. Secure hash algorithms are typically used with other cryptographic algorithms. Cryptographic hash functions that compute a fixed-length message digest from arbitrary-length messages are widely used for many purposes in information security. The comment period for SP 800-107 closes on October 9, 2008.
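The fixed digest lengths of the five FIPS 180-3 algorithms, and an HMAC computation in the style of FIPS 198-1, can be sketched with Python's standard library; the key and message below are arbitrary examples, not values from either publication.

```python
import hashlib
import hmac

# The five FIPS 180-3 approved hash algorithms and their
# fixed digest sizes in bytes: 20, 28, 32, 48, 64
for name in ("sha1", "sha224", "sha256", "sha384", "sha512"):
    h = hashlib.new(name, b"a message of arbitrary length")
    print(name, h.digest_size)

# A keyed-hash message authentication code (HMAC, per FIPS 198-1)
# over SHA-256; key and message are arbitrary example values
tag = hmac.new(b"example-secret-key", b"message to authenticate",
               hashlib.sha256).hexdigest()
print(len(tag))  # 64 hex characters = 32-byte digest
```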

See also: FIPS PUB 198-1 Final (HMAC)


ISO Rejects National Bodies' Appeals Against OOXML
David Meyer, ZDNet Blog

The International Organization for Standardization has thrown out four appeals against its ratification of Microsoft's Office Open XML document format. The appeals were made by South Africa, Brazil, India and Venezuela, and generally centred on perceived improprieties in the Office Open XML (OOXML) voting process, along with unresolved technical issues. ISO announcement #1151 of 2008-08-15 ("ISO and IEC Members Give Go Ahead on ISO/IEC DIS 29500") says: "The two ISO and IEC technical boards have given the go-ahead to publish ISO/IEC DIS 29500, Information technology—Office Open XML formats, as an ISO/IEC International Standard after appeals by four national standards bodies against the approval of the document failed to garner sufficient support. None of the appeals from Brazil, India, South Africa and Venezuela received the support for further processing of two-thirds of the members of the ISO Technical Management Board and IEC Standardization Management Board, as required by ISO/IEC rules governing the work of their joint technical committee ISO/IEC JTC 1, Information technology. According to the ISO/IEC rules, DIS 29500 can now proceed to publication as an ISO/IEC International Standard. This is expected to take place within the next few weeks on completion of final processing of the document, and subject to no further appeals against the decision. The adoption process of Office Open XML (OOXML) as an ISO/IEC Standard has generated significant debate related to both technical and procedural issues which have been addressed according to ISO and IEC procedures. Experiences from the ISO/IEC 29500 process will also provide important input to ISO and IEC and their respective national bodies and national committees in their efforts to continually improve standards development policies and procedures."

See also: the ISO announcement


W3C Advances Pronunciation Lexicon Specification (PLS) Version 1.0
Paul Bagshaw, Daniel C. Burnett (et al, eds), W3C Technical Report

Members of the W3C Voice Browser Working Group have approved the publication of "Pronunciation Lexicon Specification (PLS) Version 1.0" as a Proposed Recommendation. Changes are documented in Appendix D. A companion "PLS 1.0 Implementation Report" with detailed test results is available; it includes 78 assertions and corresponding tests. Comments are welcome through 18-September-2008. The Pronunciation Lexicon Specification (PLS) provides the basis for describing pronunciation information for use in speech recognition and speech synthesis, and for tuning applications, e.g., for proper names that have irregular pronunciations. Details: "The accurate specification of pronunciation is critical to the success of speech applications. Most Automatic Speech Recognition (ASR) and Text-To-Speech (TTS) engines internally provide extensive high quality lexicons with pronunciation information for many words or phrases. To ensure a maximum coverage of the words or phrases used by an application, application-specific pronunciations may be required. For example, these may be needed for proper nouns such as surnames or business names. The Pronunciation Lexicon Specification (PLS) is designed to enable interoperable specification of pronunciation information for both ASR and TTS engines. The language is intended to be easy to use by developers while supporting the accurate specification of pronunciation information for international use. The language allows one or more pronunciations for a word or phrase to be specified using a standard pronunciation alphabet or if necessary using vendor-specific alphabets. Pronunciations are grouped together into a PLS document which may be referenced from other markup languages, such as the Speech Recognition Grammar Specification (SRGS) and the Speech Synthesis Markup Language.
In its most general sense, a lexicon is merely a list of words or phrases, possibly containing information associated with and related to the items in the list. This document uses the term "lexicon" in only one specific way, as "pronunciation lexicon". In this particular document, "lexicon" means a mapping between words (or short phrases), their written representations, and their pronunciations suitable for use by an ASR engine or a TTS engine. Pronunciation lexicons are not only useful for voice browsers; they have also proven effective mechanisms to support accessibility for persons with disabilities as well as greater usability for all users. They are used to good effect in screen readers and user agents supporting multimodal interfaces..."
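A minimal PLS 1.0 document along the lines described above might look like the following; the lexeme and its IPA pronunciation are invented for illustration. The sketch parses the document with Python's standard library to confirm it is well-formed XML.

```python
import xml.etree.ElementTree as ET

PLS_NS = "http://www.w3.org/2005/01/pronunciation-lexicon"

# Illustrative PLS 1.0 lexicon: one lexeme mapping a written form
# (grapheme) to an IPA pronunciation (phoneme); the entry is invented
pls_doc = """<lexicon version="1.0" xmlns="{ns}"
         alphabet="ipa" xml:lang="en-US">
  <lexeme>
    <grapheme>Newslink</grapheme>
    <phoneme>ˈnjuːzlɪŋk</phoneme>
  </lexeme>
</lexicon>""".format(ns=PLS_NS)

root = ET.fromstring(pls_doc)
print(root.tag)  # {http://www.w3.org/2005/01/pronunciation-lexicon}lexicon
```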

See also: the PLS 1.0 Implementation Report


Portal for the Olympic Games: Geographic Information for the Millions
Franz-Josef Behr and Yuan Ying, GIM International

This article describes the creation of the www.geobeijing.info portal, which provides information and links relevant to Beijing, the Olympic Games, and the ISPRS Congress. Users can search for streets and hotels, and access related information using side panels with hyperlinks. Venue locations, photos and traffic information are offered for the ISPRS Congress. During the 29th Olympiad and Paralympics in Beijing, a corps of over 1.5 million volunteers will provide visitor support. Some tasks, such as giving information on transport and geographical orientation, can be performed by geo-browsers. The authors developed a portal (www.geobeijing.info) for guiding visitors to Beijing during the ISPRS Congress and the Olympic Games. The maps are based on Google Maps satellite images, overlaid with additional information... Information given by commercial geo-browsers such as Google, Yahoo!, Microsoft or MapQuest on streets, venues etc. is not detailed enough to guide Olympic Games 2008 and ISPRS Congress visitors properly through Beijing. Technological developments in online mapping tools, however, enable developers to construct portals superimposed on geo-browsers and to add theme-specific information. Google Maps, written in JavaScript, discloses its classes, variables and internal structure in an application programming interface (API). Developers can produce client-side scripts and server-side hooks that allow expanded and customised features (mashups) based on the API... Encoding of subway and venue information is done in XML, the meta-language used to define other (geo-related) languages such as Geography Markup Language (GML) or GeoRSS. Most programming languages support creation and parsing of XML data. JavaScript Object Notation (JSON) is intensively used throughout the web and is capable of combining a variety of features. Several of Google's web-based applications and services provide feed data in JSON format.
The advantage, as compared to XML, is the compactness of data representation. On the other hand, one should be cautious with user-entered data, since it could contain malicious code. JSON has been extended into a geographical dialect, GeoJSON, intended to standardise the transfer of spatial features and how this can be instantiated on the client side. For bulk loading of geo-data, CSV (comma separated values) is used; this format is simple and supported by almost all spreadsheets and database management systems, is a pseudo standard even for GIS systems, and can be easily generated and evaluated on the client side. In this project some data has been encoded in a two-fold way. After digitisation, geometrical properties are converted into OGC's WKT format. The resulting string is embedded in a CSV-encoded string, together with information about id, name, layer name, geometry type and bounding box...
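The two-fold encoding described above can be sketched as follows; the coordinates, feature name, and CSV field layout are invented for illustration and are not taken from the geobeijing.info data.

```python
import csv
import io
import json

# A point feature as GeoJSON (compact and JSON-native)
feature = {
    "type": "Feature",
    "geometry": {"type": "Point", "coordinates": [116.391, 39.907]},
    "properties": {"name": "Example venue"},
}
print(json.dumps(feature))

# The same geometry as OGC WKT, embedded in a CSV row alongside
# id, name, layer name, geometry type, and bounding box
wkt = "POINT (116.391 39.907)"
row = ["1", "Example venue", "venues", "point",
       "116.391,39.907,116.391,39.907", wkt]
buf = io.StringIO()
csv.writer(buf).writerow(row)
print(buf.getvalue().strip())
```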

See also: Geography Markup Language (GML)


Web Leaders Seek JavaScript Harmony
Darryl K. Taft, eWEEK

The Ecma International standards body scraps its ECMAScript 4 effort to pursue a project called Harmony that aims to mend fences between warring parties. With Microsoft and Yahoo on one side and Adobe, Mozilla, Google and Opera seemingly on another, Ecma moved to take the middle road. ECMAScript 4 was to become JavaScript 2. The specification that was to lead to a revised JavaScript language has been taken off the table in favor of a more harmonious approach to reaching standardization for the next version of the language that will take the Web forward. The Ecma International standards body, which oversees the ECMAScript language, has decided to scrap the ECMAScript 4.0 standardization effort to focus on a new project known as Harmony. Ecma is also working on a specification called ECMAScript 3.1, which was to be an interim spec before ECMAScript 4 was released. ECMAScript 4 was to become JavaScript 2, but no more. "First, the difference between ECMAScript 3.1 and ECMAScript 'Harmony' should be made clear. 3.1 is a 'bug fix' for the current JavaScript," said Alex Russell, co-creator of the Dojo Toolkit and a member of the Ecma technical committee working on the specification. "Harmony will pick up from 3.1 and try to introduce many of the types of features that were slated for ES4 but with different syntax and from a different approach. This is great news for everyone since it means that the standards body is going to be working toward a future [that] is deemed 'good' or 'bad' based on what's good for the language as it will exist in Web browsers. There is likely a mandate for the language outside of the browser environment, but designing the next language in a vacuum of real-world users of new syntax was going to hound the ES4 effort. That risk is now gone." [...] Dave McAllister, an Adobe engineer familiar with the process, said in an Aug. 15 blog post: Unfortunately, as is the case with many standards, the situation became a tug of war.
Standards aren't just about the good of the community; they are also now recognized as competitive advantages. A new standard for ECMAScript thus became mired in a morass of bickering, infighting, and sometimes out and out name calling; the politics of competition. It became clear that members could not arrive at the consensus needed to allow a decade of advancements to be incorporated into the next generation of ECMAScript... The memo of Brendan Eich ("ECMAScript Harmony") provides an 'Executive Summary' and 'Detailed Statement'.


W3C Last Call Review for XProc: An XML Pipeline Language
Norman Walsh, Alex Milowski, Henry Thompson (eds), W3C Technical Report

W3C announced the publication of a Last Call Working Draft for the specification "XProc: An XML Pipeline Language." This document was produced by the W3C XML Processing Model Working Group which is part of the XML Activity. The Working Group considers this specification complete and finished. The review period for this document ends on 26-September-2008. "XProc: An XML Pipeline Language" defines a language for describing operations to be performed on XML documents. An XML Pipeline specifies a sequence of operations to be performed on a collection of XML input documents. Pipelines take zero or more XML documents as their input and produce zero or more XML documents as their output. A pipeline consists of steps. Like pipelines, steps take zero or more XML documents as their inputs and produce zero or more XML documents as their outputs. The inputs of a step come from the web, from the pipeline document, from the inputs to the pipeline itself, or from the outputs of other steps in the pipeline. The outputs from a step are consumed by other steps, are outputs of the pipeline as a whole, or are discarded. There are two kinds of steps: atomic steps and compound steps. Atomic steps carry out single operations and have no substructure as far as the pipeline is concerned, whereas compound steps control the execution of other steps, which they include in the form of one or more subpipelines. The pipeline document determines how the steps are connected together inside the pipeline, that is, how the output of one step becomes the input of another. How inputs are connected to XML documents outside the pipeline is implementation-defined. How pipeline outputs are connected to XML documents outside the pipeline is implementation-defined. An example is provided for a simple, linear XInclude/Validate pipeline that consists of two atomic steps, XInclude and Validate with XML Schema. 
The pipeline itself has two inputs, 'source' (a source document) and 'schemas' (a sequence of W3C XML Schemas). The XInclude step reads the pipeline input 'source' and produces a result document. The Validate with XML Schema step reads the pipeline input 'schemas' and the result of the XInclude step and produces its own result document. The result of the validation, 'result', is the result of the pipeline. For consistency across the step vocabulary, the standard input is usually named 'source' and the standard output is usually named 'result'.
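An XProc pipeline document for the XInclude/validate example described above might look like the string below, modelled on the draft's own example; element and port names follow the Last Call draft and could differ in the final Recommendation. The sketch parses the document to confirm it is well-formed and lists the two atomic steps.

```python
import xml.etree.ElementTree as ET

# Sketch of the linear XInclude + XML Schema validation pipeline;
# p:pipeline implicitly declares the 'source' and 'result' ports
pipeline_doc = """<p:pipeline name="example" version="1.0"
            xmlns:p="http://www.w3.org/ns/xproc">
  <p:input port="schemas" sequence="true"/>

  <p:xinclude/>

  <p:validate-with-xml-schema>
    <p:input port="schema">
      <p:pipe step="example" port="schemas"/>
    </p:input>
  </p:validate-with-xml-schema>
</p:pipeline>"""

root = ET.fromstring(pipeline_doc)
# Collect the step elements (skipping the port declaration)
steps = [el.tag.split("}")[1] for el in root
         if el.tag.split("}")[1] not in ("input", "output")]
print(steps)  # ['xinclude', 'validate-with-xml-schema']
```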

See also: Glassfish XML Pipeline Processor


First Edition of ISO/IEC 24754: Minimum Requirements for Specifying Document Rendering Systems
Secretariat, ISO/IEC JTC 1/SC 34 Announcement

A communication from the JISC Secretariat of ISO/IEC JTC 1/SC 34 [Toshiko Kimura] announces that the first edition of ISO/IEC 24754 was published on 2008-08-15. "The Secretariat wishes to take this opportunity to thank the Project editor (Mr. Keisuke KAMIMURA, Dr. Soon-Bum LIM), WG 2 Convener (Dr. Yushi KOMACHI) and all the WG 2 experts who have participated in the development of this standard. Document ISO/IEC JTC 1/SC 34 N1057 presents this 'Notice of Publication' for "ISO/IEC 24754:2008, Information Technology — Document Description and Processing Languages — Minimum Requirements for Specifying Document Rendering Systems." Summary: "When a structured document is interchanged between an originator and a recipient, the recipient refers to the style specifications that the originator provides to reconstruct the presentation. However, when the recipient does not have sufficient rendering functionality, it may fail to reconstruct the presentation output as the originator expected. In order to preserve presentation output in the course of interchange, the originator and recipient need to negotiate over functionalities referring to the specifications of document rendering systems. To satisfy this requirement, this standard provides the minimum requirements for specifying document rendering systems and document formats. This International Standard can apply to the document processing environment, where a document is given in a logically structured format which is expressed by a structure markup language, and the visual representation of the document is described by means of the external style and layout specifications which a style and layout specifications language provides. The visual representation of the given document is generated when the style and layout specifications are applied to the logical structure by a document rendering system. 
This International Standard provides an abstract list of the features that a document rendering system may have, thus providing a frame of reference against which the user and implementer can compare the features of a document rendering system. However, this International Standard does not direct how each document rendering system should behave. It provides the minimum requirements for specifying the features of a document rendering system which transforms formatting objects to rendering output. It may be used as a frame of reference against which the user, implementer, or software agent may compare the features of a document rendering system. According to these requirements, the user may express what he or she expects of a document rendering system, the implementer may describe the functionality and capability of the document rendering system that he or she implements, and the software agent may negotiate a minimum set of functionality and capability that are shared across different document rendering system implementations..."

See also: the Final Committee Draft (ISO/IEC FCD 24754)


USB 3.0 One Step Closer to Reality
Andy Patrizio, InternetNews.com

Intel has announced the Extensible Host Controller Interface (xHCI) draft specification, revision 0.9, is now available for review, making USB 3.0 one step closer to reality. The news comes just ahead of the Intel Developer Forum (IDF) in San Francisco next week. The USB 3.0 architecture was first demonstrated at last year's IDF show and is on track for finalization this year, with products hitting the market in 2009. The chief improvement in USB 3.0 will be a ten-fold increase in throughput to 4.8 gigabits per second, or 600 megabytes per second. At that speed, USB 3.0, also known as "SuperSpeed USB," would be faster than the Serial ATA (SATA) controllers used on today's hard drives. This could have a major impact on the market, says Gartner analyst Steve Kleynhans, who noted that USB 2.0 was fast enough to bring about cheap, mass market external storage... USB 2.0 led to the ability to use digital cameras and camcorders, since before it there wasn't a way to connect such digital devices to the PC to get pictures and video off, at least with any great speed. So who knows what 3.0 will allow, said Kleynhans... Kleynhans said the speedy USB 3.0 port would not be a threat to SATA but could render IEEE 1394, a.k.a. FireWire, obsolete. That spec is trying to keep up the race, with a new specification, IEEE 1394-2008, released earlier this month that offers 3.2 Gb/sec of throughput. Intel's xHCI is something of vital interest to component designers, because it describes the hardware/software interface between system software and the host controller hardware for USB 3.0 and USB 2.0. This saves third parties from having to make their own controller; they can just use Intel's, which is available royalty-free... In addition to the speed boost, USB 3.0 will feature a new form of power draw that will be much more merciful on laptops and battery-powered devices. USB 2.0, the current spec, is what's called poll-driven.
When a USB device is plugged into a port, the computer keeps polling the port, which is a drain on batteries.
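The throughput figures quoted above are self-consistent, as a quick arithmetic check shows: 4.8 gigabits per second divided by 8 bits per byte is 600 megabytes per second, a ten-fold jump over USB 2.0's 480 Mb/s.

```python
# Throughput figures as quoted in the article
usb3_bits_per_sec = 4_800_000_000   # 4.8 Gb/s ("SuperSpeed USB")
usb2_bits_per_sec = 480_000_000     # 480 Mb/s (USB 2.0 "Hi-Speed")

# 4.8 Gb/s over 8 bits per byte = 600 MB/s
print(usb3_bits_per_sec // 8 // 1_000_000)      # 600
# The advertised ten-fold increase over USB 2.0
print(usb3_bits_per_sec // usb2_bits_per_sec)   # 10
```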

See also: the Intel announcement


Sponsors

XML Daily Newslink and Cover Pages sponsored by:

IBM Corporation http://www.ibm.com
Oracle Corporation http://www.oracle.com
Primeton http://www.primeton.com
Sun Microsystems, Inc. http://sun.com

XML Daily Newslink: http://xml.coverpages.org/newsletter.html
Newsletter Archive: http://xml.coverpages.org/newsletterArchive.html
Newsletter subscribe: newsletter-subscribe@xml.coverpages.org
Newsletter unsubscribe: newsletter-unsubscribe@xml.coverpages.org
Newsletter help: newsletter-help@xml.coverpages.org
Cover Pages: http://xml.coverpages.org/




Document URI: http://xml.coverpages.org/newsletter/news2008-08-18.html
Robin Cover, Editor: robin@oasis-open.org