XML Daily Newslink. Monday, 05 March 2007

A Cover Pages Publication http://xml.coverpages.org/
Provided by OASIS and Sponsor Members
Edited by Robin Cover


This issue of XML Daily Newslink is sponsored by:
IBM Corporation http://www.ibm.com



DITA Version 1.1 Specification Released for Public Review
Staff, OASIS Announcement

Members of the OASIS Darwin Information Typing Architecture (DITA) Technical Committee have published a Version 1.1 Committee Draft specification for public review. The public review period ends May 04, 2007. The DITA Version 1.1 release consists of an architectural specification, a language specification, a set of DTDs, and an equivalent set of Schemas. The Darwin Information Typing Architecture (DITA) is "an XML-based, end-to-end architecture for authoring, producing, and delivering readable information as discrete, typed topics. It is designed to support managing readable information; reusing information in many different combinations and deliverables; creating online information systems such as User Assistance (help) or web resources; and creating minimalist books for easier authoring and use." It supports: (1) topic-oriented authoring: creating a unit of information for a single subject, where topics can then be assembled into help systems or books that require a particular selection and organization of subjects; (2) information typing: identifying the type of topic, such as task, concept, reference, example, and so on; (3) specialization: extensibility with inheritance, which allows the creation of new types that inherit processing rules from existing types. For example, API documentation is a particular kind of reference information and requires more specific rules and descriptive markup than a generic reference type; as a result, topics from different domains, with different markup and markup rules, can be built together into one help file, Web site, or book. The DITA language reference describes the elements that make up the topic DTD and its initial, information-typed descendants: concept, reference, task, and glossentry. It also describes the DITA map DTD and its current specialization (bookmap), as well as various topic-based and map-based DITA domains. The separate DITA Architectural Specification includes detailed information about DITA specialization, when to use each topic type, how topics and maps interact, details of complex behaviors such as conref and conditional processing, and many other best practices for working with DITA.
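
To make the topic and map concepts concrete, here is a minimal sketch of a DITA task topic and a map that assembles it with other topics into one deliverable. The file names, ids, and content are invented for illustration and are not taken from the DITA 1.1 release.

  <!-- Hypothetical DITA task topic (install-widget.dita) -->
  <task id="install-widget">
    <title>Installing the widget</title>
    <taskbody>
      <prereq>Download the installation package.</prereq>
      <steps>
        <step><cmd>Unpack the archive.</cmd></step>
        <step><cmd>Run the setup program.</cmd></step>
      </steps>
      <result>The widget is ready to use.</result>
    </taskbody>
  </task>

  <!-- Hypothetical DITA map assembling topics into one deliverable -->
  <map title="Widget User Guide">
    <topicref href="widget-overview.dita" type="concept"/>
    <topicref href="install-widget.dita" type="task"/>
    <topicref href="widget-reference.dita" type="reference"/>
  </map>

A specialized topic type (for example, an API reference specialized from the generic reference type) would reuse this structure while adding its own markup and processing rules, which is what allows topics from different domains to be built into a single help file, Web site, or book.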

See also: DITA references


W3C Announces Semantic Web Services (SWS) Testbed Incubator Group
Staff, W3C Announcement

The mission of the newly formed W3C SWS Testbed Incubator Group, part of the W3C Incubator Activity, is to develop a standard methodology for evaluating semantic web services based upon a standard set of problems, and to develop a public repository of such problems. The SWS Testbed Incubator Group is sponsored by W3C Members Wright State University, Stanford University, DERI University of Innsbruck, and the National University of Galway, Ireland. The Semantic Web Services Testbed Incubator will benefit from interactions with technical experts in the SWS Challenge Workshop, the OASIS Semantic Execution Environment (SEE) Technical Committee, DERI Innsbruck, and the Stanford Logic Group. From the Charter: "The approach is to analyze the process, infrastructure, and results of the Semantic Web Services Challenge workshops. In particular, we hope to establish a set of standard problems related to the use of Semantic Web Service technologies. Part of each problem description will also be a set of Web Service Interface Definitions (in WSDL). Along with the set of problems, we will develop a methodology for automatically verifying the messages that are exchanged in order to solve a particular problem. A reference implementation of the infrastructure can then automatically verify a problem solution by analysis of web service message exchanges. Additionally, a peer-review-based methodology for determining the software engineering value of the technologies used to solve the problems shall be established. The rationale for the incubator is to standardize the evaluation of emerging semantic web service technologies. There are many claims for such technologies in academic workshops and conferences. However, there is no scientific method of comparing the actual functionalities claimed. More important, there is no way for industry to evaluate the robustness and applicability of these technologies. Progress in scientific development and in industrial adoption is thereby hindered. The SWS Challenge Workshops provide a major first step toward filling these lacunae. The workshops are an ongoing experiment in developing a methodology for evaluating the functionality (versus performance) of semantic web service technologies. Based upon our first year's experience with a small group, we feel that we can now involve a larger community in this development, with the aim of standardizing the methodology and infrastructure."

See also: the W3C Incubator Activity


Enterprise Resource Planning (ERP) Meets SOA
Joab Jackson, Government Computer News

The worlds of enterprise resource planning and service-oriented architecture are coming together. Three of the major ERP software vendors are moving their platforms to ones that support Web services. Oracle Corp. is rolling out its Fusion platform, which updates the PeopleSoft HR software with Web services interfaces. Already, its Fusion Middleware allows users to build their own composites, or applications that reuse already-existing functionality in other programs, according to Wayne Bobby, vice president for solutions for finance and administration at Oracle Federal. Likewise, SAP AG of Walldorf, Germany, has migrated its MySAP ERP software to a new Web services-based platform called NetWeaver. It is now exposing all the core functionality as Web services. So far, more than 1,500 functions are available. "We are going to expose every single element of our solution as a Web service," said David Ditzel, director of public services technology solutions for the company. In a similar move, CGI Inc. of Montreal has migrated its federal ERP software, called Momentum, to a Java 2 Enterprise Edition-based platform, allowing developers to easily hook their own J2EE applications into CGI's software. Michael Grim, head of public services business development for SAP, noted that SAP has long published the interfaces to its software under the name Business Application Programming Interfaces, or BAPI. It also has offered a Cobol-like programming language called Advanced Business Application Programming (ABAP). What is new is that the company is using industry standards. Grim admits the old interfaces were 'SAP-centric.' Using NetWeaver, an organization could work with the SAP software through Web services and the Java programming language, capabilities that are more broadly available. So a process such as combining Adobe's PDF support software with MySAP has grown dramatically easier, Ditzel claimed. Oracle offers a similar approach with its Fusion Middleware, allowing customers to build unique applications by using Web services and J2EE specifications. Such a composite-based approach is the basis for SOA. Last summer, OASIS formally defined the concept in its SOA Reference Model as 'a paradigm for organizing and utilizing distributed capabilities that may be under the control of different ownership domains.'


Novell Ships Translator for OpenXML as Fruit of Microsoft Partnership
John Fontana, Network World.com

Amid the simmering debate over open file formats, Novell has released a translator [OpenOffice.org OpenXML Translator] that lets users open and save Microsoft's OpenXML files (.docx) in versions of the OpenOffice.org word processing program. Novell's translator tool follows two similar tools introduced last month by Sun and by a group funded by Microsoft that developed an open-source translator. Novell's translator is the first by-product of the wide-ranging technology partnership it signed with Microsoft in November. The pair promised to develop translators to make it easier to work with OpenXML and the Open Document Format (ODF). Besides the word processing translator, others for presentations and spreadsheets are in the works. The OpenXML translator provides support for opening and saving Microsoft Open XML-formatted word processing documents in OpenOffice.org. OpenXML is the default format in Office 2007, and ODF is supported by OpenOffice.org; Novell includes its version of OpenOffice.org in its SUSE Linux Enterprise Desktop 10. The Novell translator requires one of four versions of OpenOffice.org: version 2.0.4 or later of the Novell edition for Windows, version 2.0.4 or later for SUSE Linux Enterprise Desktop 10 and openSUSE 10.1, the OpenOffice.org 2.0 package for SUSE Linux 9.3 and 10.0, and the OpenOffice.org 2.0.4 package for openSUSE 10.2... Microsoft is working hard to back interoperability. The open-source translator developed by engineers funded by Microsoft was posted last month on the open-source site SourceForge under an open source Berkeley Software Distribution (BSD) license. But ODF backers are also working hard to ensure users can work with both document formats. Last month, Sun released the StarOffice 8 Conversion Technology Preview, a plug-in for Microsoft Word 2003.

See also: the download page


Avaya Talks Up Voice as a Service: New VoIP Offering Melds Voice, SOA
Eric Knorr, InfoWorld

The great thing about SOA is that when you add a service, you instantly increase the potential of any application that can consume that service. Today, Avaya threw a new set of VoIP services into the enterprise SOA mix in the form of its CEBP (Communications Enabled Business Processes) solution. For most enterprises, the benefits of VoIP have been limited to the cost savings of retiring legacy phone systems and the added productivity of a few voice/data applications, almost exclusively for call centers. CEBP exposes an array of voice communications features as Web services, which can be called upon by any number of applications, from IT operations to supply chain management. Avaya sees CEBP as a way for enterprises to optimize business processes and respond swiftly to events. For example, based on certain criteria, a temporary interruption at a manufacturing plant might cause SAP R/3 to ping CEBP, which would send an automated advisory message with a return receipt to the appropriate manager via phone, e-mail, or SMS. In the event of a total meltdown, the Notify & Conference feature could be invoked, initiating voice calls to a preset list of people and dropping them all into an emergency conference call. In other instances, CEBP could merely ensure that composite workflow applications, oft cited as a big payoff of SOA, aren't stopped in their tracks by people who aren't sitting at a computer and can't respond promptly. Avaya is not the first vendor to provide a Web services platform for voice communications capabilities. Last September, BlueNote Networks introduced its SessionSuite SOA Edition, designed to enable developers to embed telephony capabilities in a range of applications. But Avaya's large customer base and professional services group could help jump-start the use of voice-based Web services.


Zend Goes Straight to The PHP Core
Sean Michael Kerner, InternetNews.com

For more than a decade, PHP developers have had little choice in where they actually get the latest version of PHP: either from PHP.net or as a package from a Linux distribution. Zend, the commercial backer of PHP, is now offering another choice. Zend Core 2.0, the first Zend Core release for the broader PHP community, is a PHP distribution that benefits from additional testing, bundled features, applications, and support, including an update service. And though it extends PHP in a number of ways, it's still open source. "Zend Core is our version of PHP; it's a PHP distribution, and it's best to look at it as related to PHP the way Red Hat Linux is related to Linux," according to Mark de Visser, Zend chief marketing officer. "Yes, you can get these things yourself from the various open source sources, but it's more convenient and there are benefits to getting it from Zend, where we assemble the whole stack for you." Zend Core 2.0 includes the latest PHP 5.2.1 version, which benefits from the joint work between Zend and Microsoft to improve PHP on Microsoft Windows. PHP 5.2.1 was officially released by the PHP.net open source community on Feb. 8. Zend had previously offered versions of Zend Core specifically for Oracle and IBM DB2. Additionally, Zend Core 2.0 includes MySQL 5.0, which makes PHP-enabling the open source database easier than if PHP and MySQL were installed separately.

See also: Zend Core


SOA Composite Business Services: The Common Event Infrastructure (CEI)
Javier Garcia, DuoWei Sun, Zhi Gan; IBM developerWorks

This article is the fourth in a series that considers the development of composite applications to enable business services. In order to determine if a composite application is meeting the stated business goals, the application needs to be measurable. This article examines how to develop measurable composite applications with the help of three reusable artifacts that are based on the Common Event Infrastructure. It describes the role of business-level events and presents three artifacts you can use to generate, query, and view business-level events in the context of a composite application. With the help of these reusable artifacts, your composite application can be measurable, thus laying the groundwork for improving the entire business process. CEI processes Common Base Events (CBEs) as defined by the CBE specifications. CEI provides a standard XML-based format for events, which enables generic mechanisms to both log and query events. Built-in support for CEI in WebSphere Integration Developer generates CBE events. For example, you can select a Business Process Execution Language (BPEL) activity and then select activity-started or activity-stopped events, which automatically generate CBEs. However, you might need to generate business-level events that do not correspond to WebSphere Integration Developer events. You can use the Business Process Engine (BPE) API for this. However, because this API exposes the CBE format as well as other CEI-specific attributes, the relevant parts of the API are contained in an easy-to-use set of artifacts that support logging, querying, and viewing events.
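
As a rough sketch (not taken from the article), a business-level event serialized in the Common Base Event XML format might look something like the following. The component names, the order id, and the extended data element are hypothetical, and the exact element and attribute details should be checked against the CBE specification.

  <!-- Hypothetical business-level Common Base Event emitted by a composite application -->
  <CommonBaseEvent extensionName="OrderSubmitted"
                   creationTime="2007-03-05T10:15:00Z"
                   severity="10">
    <sourceComponentId component="OrderProcessingService"
                       subComponent="submitOrder"
                       componentIdType="Application"
                       componentType="CompositeApplication"
                       location="appserver01"
                       locationType="Hostname"/>
    <extendedDataElements name="orderId" type="string">
      <values>A-12345</values>
    </extendedDataElements>
  </CommonBaseEvent>

Because every event shares this common structure, generic CEI mechanisms can log and query business-level events (for example, all OrderSubmitted events in a given time window) without knowing anything about the application that emitted them.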


Conference Information Data Model for Centralized Conferencing (XCON)
O. Novo, G. Camarillo (et al., eds), IETF Internet Drafts

Members of the IETF Centralized Conferencing (XCON) Working Group have released an -04 version of the "Conference Information Data Model for Centralized Conferencing (XCON)" specification. Conference objects are a fundamental concept in centralized conferencing, as described in the XCON Conferencing Framework specification. A conference object contains data that represents a conference during each of its various stages (e.g., reserved, started, running, ended). A conference object contains the core information of a conference (capabilities, membership, roles, call control signaling, media, etc.) and specifies who can manipulate that information, and in what way. Conference objects are instantiations of the conference information data model defined in this document. This document defines an Extensible Markup Language (XML)-based conference information data model for centralized conferencing (XCON). A conference object at a conference server, which can be manipulated using a conference control protocol, represents a particular instantiation of this data model. The conference information data model defined in this document is an extension of (and thus compatible with) the model specified in the Session Initiation Protocol (SIP) Event Package for Conference State. In accordance with the XCON framework document, the conference object is a logical representation of a conference instance. The conference information schema contains core information that is utilized in any conference. It also contains the variable information part of the conference object. The specification defines some document fragments in RELAX NG format, and an XML-encoded example is provided for a conference information document. The IETF XCON working group was chartered to develop a standardized suite of protocols for tightly-coupled multimedia conferences, where strong security and authorization requirements are integral to the solution. Tightly-coupled conferences have a central point of control and authorization (known as a focus), so they can enforce specific media and membership relationships and provide an accurate roster of participants. The media mixing or combining function of a tightly-coupled conference need not be performed centrally, however. This effort is intended to enable interoperability in a commercial environment which already has a number of non-standard implementations using some of the protocols.
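
For orientation, the SIP conference event package that this data model extends represents a conference roughly as follows. This fragment is a simplified illustration with invented URIs and display names, not an excerpt from the -04 draft; the XCON data model adds its own elements on top of this base.

  <!-- Simplified conference information document (SIP Event Package for Conference State) -->
  <conference-info xmlns="urn:ietf:params:xml:ns:conference-info"
                   entity="sips:conf233@example.com"
                   state="full" version="1">
    <conference-description>
      <display-text>Weekly project review</display-text>
    </conference-description>
    <users>
      <user entity="sip:alice@example.com" state="full">
        <display-text>Alice</display-text>
      </user>
      <user entity="sip:bob@example.com" state="full">
        <display-text>Bob</display-text>
      </user>
    </users>
  </conference-info>

The XCON data model builds on this base, adding the policy and control information described in the draft, so that a conference control protocol can manipulate the conference object held at the server, including who may change it and in what way.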

See also: the WG Charter


New Tools Show True Potential of Grid-Based Data Mining
Staff, DataMiningGrid Project

Developers announced the release of DataMiningGrid Version 1.0 beta as of March 3, 2007. It provides WSRF-compliant tools and services for data mining in grid computing environments, based on Globus Toolkit 4, Condor, and Triana. The distribution is available from SourceForge. The partners in this effort were the University of Ulster (United Kingdom), the University of Ljubljana (Slovenia), the Technion (Israel), the Fraunhofer Institute (Germany), and automobile company DaimlerChrysler. The release is general-purpose software facilitating user-friendly, Grid-based data mining; it must be used in conjunction with Globus Toolkit version 4, the Condor local scheduler (optional), and the Triana Workflow Editor and Manager. The real power of Grid computing lies in sharing resources across a network. These can be CPU cycles, storage, peripherals, network bandwidth, data, and software. Ultimately, this will lead to the grand goal envisioned by Grid researchers, in which grid users will be able to seamlessly access and harness geographically widely distributed computing resources as if they were using a local system. By using a series of mature or near-mature tools to manage issues like scheduling, workflow management, and data access and integration, DataMiningGrid does not reinvent the wheel and focuses on the core problem: extracting relevant information from vast data sets across a grid... "The Data Mining Tools and Services for Grid Computing Environments (DataMiningGrid) Consortium is developing tools and services for deploying data mining applications on the grid. Future and emerging complex problem-solving environments are characterised by increasing amounts of digital data and rising demands for coordinated resource sharing across geographically dispersed sites. Next generation grid technologies are promising to provide the necessary infrastructure facilitating seamless sharing of computing resources. Currently there exists no coherent framework for developing and deploying data mining applications on the grid. The DataMiningGrid project addresses this gap by developing generic and sector-independent data-mining tools and services for the grid. The DataMiningGrid project was formed as a shared-cost Specific Targeted Research Project (STREP) granted by the European Commission (grant no. IST-2004-004475), and part of the Sixth Framework Programme of the Information Society Technologies Programme (IST)."

See also: the announcement


Push for Open Access to Research
Michael Geist, BBC News

In this article, Internet law professor Michael Geist takes a look at a fundamental shift in the way research journals become available to the public. Last month five leading European research institutions launched a petition that called on the European Commission to establish a new policy that would require all government-funded research to be made available to the public shortly after publication. That requirement, called an open access principle, would leverage widespread internet connectivity and low-cost electronic publication to create a freely available virtual scientific library open to the entire globe. Despite scant media attention, word of the petition spread quickly throughout the scientific and research communities. Within weeks, it had garnered more than 20,000 signatures, including those of several Nobel prize winners and of 750 education, research, and cultural organisations from around the world. In response, the European Commission committed more than 51 million euros towards facilitating greater open access through support for open access journals and for the building of the infrastructure needed to house institutional repositories that can store the millions of academic articles written each year. The European developments demonstrate the growing global demand for open access, a trend that is forcing researchers, publishers, universities, and funding agencies to reconsider their role in the creation and dissemination of knowledge. In many countries, government funding agencies in the sciences, social sciences, and health sciences dole out hundreds of millions of dollars each year to support research at national universities. University researchers typically published their findings in expensive, peer-reviewed publications, which were purchased by those same publicly funded universities. The model certainly proved lucrative for large publishers, yet resulted in the public paying twice for research that it was frequently unable to access. The Directory of Open Access Journals, a Swedish project that links to open access journals in all disciplines, currently lists more than 2,500 open access journals worldwide featuring a library in excess of 127,000 articles. Moreover, the cost of establishing an open access journal has dropped significantly.


Sponsors

XML Daily Newslink and Cover Pages are sponsored by:

BEA Systems, Inc.       http://www.bea.com
IBM Corporation         http://www.ibm.com
Primeton                http://www.primeton.com
SAP AG                  http://www.sap.com
Sun Microsystems, Inc.  http://sun.com

XML Daily Newslink: http://xml.coverpages.org/newsletter.html
Newsletter Archive: http://xml.coverpages.org/newsletterArchive.html
Newsletter subscribe: newsletter-subscribe@xml.coverpages.org
Newsletter unsubscribe: newsletter-unsubscribe@xml.coverpages.org
Newsletter help: newsletter-help@xml.coverpages.org
Cover Pages: http://xml.coverpages.org/


