The OASIS Cover Pages: The Online Resource for Markup Language Technologies
Last modified: March 12, 2007
XML Daily Newslink. Monday, 12 March 2007

A Cover Pages Publication http://xml.coverpages.org/
Provided by OASIS and Sponsor Members
Edited by Robin Cover


This issue of XML Daily Newslink is sponsored by:
SAP AG http://www.sap.com



Last Call: The Atom Publishing Protocol
Joe Gregorio and Bill de hOra (eds), IETF Internet Draft

The Internet Engineering Steering Group (IESG) announced that it has received a request from the IETF Atom Publishing Format and Protocol WG (ATOMPUB) to consider "The Atom Publishing Protocol" as a Proposed Standard. The document is in Last Call review, and the IESG plans to make a decision in the next few weeks. Public comments are solicited on this action and may be sent to the IETF mailing lists by 2007-03-26. Appendix B supplies the RELAX NG Compact Syntax Grammar for the Atom Protocol.

"The Atom Publishing Protocol is an application-level protocol for publishing and editing Web resources using HTTP and XML 1.0. The protocol supports the creation of Web resources and provides facilities for: (1) Collections: Sets of resources, which can be retrieved in whole or in part; (2) Services: Discovery and description of Collections; (3) Editing: Creating, updating and deleting resources. The Atom Publishing Protocol is different from many contemporary protocols in that the server is given wide latitude in processing requests from clients. Atom Protocol Document formats are specified in terms of the XML Information Set, serialized as XML 1.0.

The Atom Publishing Protocol uses HTTP methods to author Member Resources as follows: GET is used to retrieve a representation of a known resource. POST is used to create a new, dynamically-named resource. When the client submits non-Atom-Entry representations to a Collection for creation, two resources are always created: a Media Entry for the requested resource, and a Media Link Entry for metadata (in Atom Entry format) about the resource. PUT is used to update a known resource. DELETE is used to remove a known resource.

The Atom Protocol only covers the creation, update and deletion of Entry and Media resources. Other resources could be created, updated, and deleted as the result of manipulating a Collection, but the number of those resources, their media types, and the effects of Atom Protocol operations on them are outside the scope of this specification. Along with operations on Member Resources, the Atom Protocol defines Collection Resources for managing and organizing Member Resources. Collections are represented as Atom Feed documents, which contain the IRIs of, and metadata about, the Collection's Member Resources. The Atom Protocol does not make a structural distinction between Feeds used for Collections and other Atom Feeds. The only mechanism that this specification supplies for indicating a Collection Feed is its appearance in a Service Document."

The IESG is responsible for technical management of IETF activities and the Internet standards process. As part of the ISOC, it administers the process according to the rules and procedures that have been ratified by the ISOC Trustees. The IESG is directly responsible for the actions associated with entry into and movement along the Internet "standards track," including final approval of specifications as Internet Standards.
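The HTTP method mapping described above can be sketched in a few lines. The collection URI, entry fields, and helper names below are illustrative, not part of the specification; a real client would send the resulting request over HTTP with a library of its choice.

```python
# Sketch of the method mapping the Atom Publishing Protocol uses for
# Member Resources. The URIs and helper names here are hypothetical.
import xml.etree.ElementTree as ET

ATOM_NS = "http://www.w3.org/2005/Atom"

def make_entry(title, content):
    """Build a minimal Atom Entry document for submission to a Collection."""
    ET.register_namespace("", ATOM_NS)
    entry = ET.Element(f"{{{ATOM_NS}}}entry")
    ET.SubElement(entry, f"{{{ATOM_NS}}}title").text = title
    ET.SubElement(entry, f"{{{ATOM_NS}}}content").text = content
    return ET.tostring(entry, encoding="unicode")

def app_request(operation, uri, body=None):
    """Map a protocol operation to the HTTP method and media type it uses."""
    methods = {
        "retrieve": ("GET", None),     # fetch a known resource
        "create":   ("POST", "application/atom+xml;type=entry"),
        "update":   ("PUT", "application/atom+xml;type=entry"),
        "delete":   ("DELETE", None),  # remove a known resource
    }
    method, media_type = methods[operation]
    headers = {"Content-Type": media_type} if media_type else {}
    return {"method": method, "uri": uri, "headers": headers, "body": body}

entry = make_entry("Hello", "A first post")
req = app_request("create", "http://example.org/blog/", body=entry)
```

POSTing a non-Atom body (say, an image) to the same Collection URI is what triggers the Media Entry / Media Link Entry pair the draft describes.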

See also: the IESG web site


Linking Service to Open Access Repositories
Shigeki Sugita, Shin Kataoka (et al.), D-Lib Magazine

Link resolvers are extremely effective tools that offer appropriate means of obtaining primary documents from licensed e-journals, open access journals, library print holdings, and Interlibrary Loan requests. Link resolver software and services are now available from a number of vendors and have been deployed in many types of organizations all over the world. However, link resolvers have not been offering satisfactory article-level resolution for Open Access (OA) documents that have been accumulated in repositories such as arXiv and RePEc, and in institutional repositories (IR). In May 2006, five universities and an institute in Japan initiated the Access path to Institutional Resources via link resolvers (AIRway) Project. The Openly Informatics Division of OCLC, the vendor of the Cate link resolver software, joined this project to implement the strategy in a production link resolver.

The link resolver we used in this demonstration project is the production version of Cate, an OpenURL solution produced by OCLC Openly Informatics. Cate is implemented in a three-tier architecture. The back end is a MySQL 4.1 database running on a Linux server; the middleware is an XML data source implemented in a Java Servlet running in the Tomcat Servlet container on the same server as MySQL. The front-end link resolver servlet runs in Tomcat on a second server and uses an open-source dynamic text engine called "JSText" to combine data from the data source with presentation templates specified in the server instance. To keep the link resolver extremely responsive, we used a JavaScript-based technique to incorporate an IR query into the user's result page. In this technique, a remote JavaScript file is invoked to write dynamic content into the user's web page. In our case, the JavaScript is composed by a separate Cate thread that retrieves an XML response from the IR server, then invokes XSLT on the server side to format the JavaScript response.
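The remote-JavaScript technique can be illustrated with a short sketch. This is not Cate's actual code, the IR response format shown is hypothetical, and ElementTree stands in for the XSLT step the article mentions; the point is the shape of the trick: parse the repository's XML and emit a document.write() payload the browser executes in place.

```python
# Sketch of the technique: turn a (hypothetical) IR XML response into a
# javascript payload that writes article links into the user's result page.
import json
import xml.etree.ElementTree as ET

def ir_links_to_js(ir_xml):
    """Extract (title, url) pairs from the IR response and wrap the
    resulting HTML fragment in a document.write() call."""
    root = ET.fromstring(ir_xml)
    links = [(item.findtext("title"), item.findtext("url"))
             for item in root.findall("item")]
    html = "".join(f'<li><a href="{url}">{title}</a></li>'
                   for title, url in links)
    # json.dumps yields a safely quoted javascript string literal.
    return f"document.write({json.dumps('<ul>' + html + '</ul>')});"

sample = """<results>
  <item><title>Preprint</title><url>http://ir.example.edu/123</url></item>
</results>"""
payload = ir_links_to_js(sample)
```

The user's page would pull this payload in via a `<script src=...>` tag pointing at the resolver, which is what keeps the main result page responsive while the IR query runs separately.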


Setting the Foundations of Digital Libraries
Leonardo Candela, Donatella Castelli (et al.), D-Lib Magazine

The term "Digital Libraries" corresponds to a very complex notion with several diverse aspects and cannot be captured by a simple definition. A robust model of Digital Libraries encapsulating the richness of these perspectives is required. This need has led to the drafting of "The Digital Library Manifesto", the aim of which is to set the foundations and identify the cornerstone concepts within the universe of Digital Libraries, facilitating the integration of research results and proposing better ways of developing appropriate systems. The Manifesto is a result of the collaborative work of members of the European Union co-funded DELOS Network of Excellence on Digital Libraries. It exploits the collective understanding that has been acquired, over more than a decade, on Digital Libraries by European research groups active in the Digital Library field, both within DELOS and outside, as well as by other groups around the world. This article presents the core parts of the Manifesto that introduce the entities of discourse of the Digital Library universe. There are three distinct notions of "systems" developed along the way: Digital Library, Digital Library System, and Digital Library Management System; these correspond to three different levels of conceptualization of the universe of Digital Libraries... Despite the seeming richness and diversity of existing digital libraries, in actuality, there is only a small number of core concepts defined by all systems. These concepts are identifiable in nearly every Digital Library currently in use. They serve as a starting point for any researcher who wants to study and understand the field, for any system designer and developer intending to construct a Digital Library, and for any content provider seeking to expose its content via digital library technologies. Six core concepts provide a foundation for Digital Libraries. 
Five of them appear in the definition of Digital Library: Content, User, Functionality, Quality, and Policy; the sixth one emerges in the definition of Digital Library System: Architecture.
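As a toy illustration of the distinction drawn above, the Manifesto's two lower levels of conceptualization can be modeled as data structures. The concept names follow the article; the fields, types, and defaults are purely illustrative.

```python
# Illustrative only: the six core concepts named in the article, and the
# idea that a Digital Library System adds Architecture to the five
# concepts that define a Digital Library.
from dataclasses import dataclass, field

CORE_CONCEPTS = ["Content", "User", "Functionality",
                 "Quality", "Policy", "Architecture"]

@dataclass
class DigitalLibrary:
    """Defined by five of the six core concepts."""
    content: list = field(default_factory=list)
    users: list = field(default_factory=list)
    functionality: list = field(default_factory=list)
    quality: dict = field(default_factory=dict)
    policy: dict = field(default_factory=dict)

@dataclass
class DigitalLibrarySystem(DigitalLibrary):
    """The concrete level adds the sixth concept: Architecture."""
    architecture: str = "unspecified"
```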


Microsoft Guns Open XML onto ISO Fast Track
Eric Lai, ComputerWorld

The International Organization for Standardization (ISO) agreed Saturday to put Open XML, the document format created and championed by Microsoft Corp., on a fast-track approval process that could see Open XML ratified as an international standard by August. That's despite lingering opposition to Open XML by several key voting countries, including some whose governments are moving forward to adopt the alternative Open Document Format for Office Applications (ODF), which the ISO approved as a standard last year. According to an e-mail sent Saturday by Lisa Rajchel, the secretariat of ISO's Joint Technical Committee (JTC-1) on Information Technology, the Open XML proposal, along with comments and criticism by nations that have already reviewed it, will be put through the ISO's five-month balloting process. The e-mail did not give a date when the proposal would officially be put on a ballot and distributed to all 157 countries represented in the ISO, though it is likely to happen this week, according to Stacy Leistner, director of communications at the American National Standards Institute (ANSI), which is assisting the ISO on this issue. Microsoft did not immediately return an e-mailed request for comment. IBM, through a spokesman, declined to comment. IBM has consistently opposed Open XML's approval, and Microsoft has accused IBM, which is supporting ODF in applications such as Lotus Notes and Workplace, of inappropriate meddling in Open XML's approval process. Rajchel wrote that she decided to move Open XML forward after consulting with staff at the Information Technology Task Force. She did not mention that the 6,000-page proposal, submitted by another standards body, Ecma International, had garnered comments and criticism from 20 of the 30 countries sitting on the JTC-1 committee.


Open-Xchange (Partially) Embraces GPL
Sean Michael Kerner, InternetNews.com

Open source isn't always synonymous with collaborative community development, even when it comes to open source collaboration applications. Open-Xchange is hoping to change that for its open source collaboration suite with the launch of a new community project partially licensed under the GPL. The new collaboration project comes on the heels of Open-Xchange's recent big ISP win with 1&1 Internet and will see an open source project set up around the Open-Xchange 1&1 MailXchange server. The server will be released under the GPL, while the AJAX user interface for the server is being made available under the Creative Commons Attribution-NonCommercial-ShareAlike 2.5 license. Just don't expect to take the newly available components and directly clone what 1&1 has. "Open-Xchange makes a clear distinction between the source code related to the program and digital content/trademarks/Java browser script code," said Paul Sterne, CFO and general manager of the Americas for Open-Xchange. "The source code of the project, program and digital content, are freely available to use, share and change/remix." Sterne added that right now, Open-Xchange has released the source code to two program components: the collaboration server and the administration module. The third and fourth program components, the installer for Ubuntu and the Wiki OXtender, will be released as soon as they have been vetted by the community.


WS-Policy 1.5 Goes to CR
Chris Ferris, Blog

The W3C Web Services Policy 1.5 - Framework and Attachment specifications have transitioned to Candidate Recommendation status. This latest milestone for WS-Policy is right on schedule (amazingly enough) with the WG's charter. This puts us on track to reach Proposed Recommendation in July 2007, also consistent with our chartered schedule, and Recommendation shortly thereafter. The Candidate Recommendation phase is also known in W3C circles as the "Call for Implementations" (CFI) phase. We already have two published endpoints doing interoperability testing of the set of test scenarios that the WG has agreed upon as the exit criteria for the CFI phase. We will be holding an interop workshop coincident with the upcoming WS-Policy WG face-to-face meeting later this month in Sunnyvale, CA, and another in May in Ottawa. It should be noted that participation in this interoperability workshop is not limited to WG members, nor is it limited to W3C member companies. If you are interested in bringing an implementation to the interop, please don't hesitate to drop me a note and I can provide you with the specifics. Of course, there is still much to be done. The WG is working on the next drafts of its Primer and Assertion Author Guidelines documents, as well as the WSDL 1.1 Element Identifiers specification.
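For readers new to the framework, a minimal WS-Policy 1.5 expression looks like the following sketch. The assertion element comes from a made-up namespace, since real assertions are defined by domain specifications (e.g., WS-SecurityPolicy), not by the framework itself.

```xml
<!-- A policy with two alternatives: one requiring the (hypothetical)
     ex:RequireTransportSecurity assertion, and one empty alternative
     imposing no requirements. A requester satisfies the policy by
     meeting all the assertions in exactly one alternative. -->
<wsp:Policy xmlns:wsp="http://www.w3.org/ns/ws-policy"
            xmlns:ex="http://example.org/assertions">
  <wsp:ExactlyOne>
    <wsp:All>
      <ex:RequireTransportSecurity/>
    </wsp:All>
    <wsp:All/>
  </wsp:ExactlyOne>
</wsp:Policy>
```

Interop scenarios like those mentioned above exercise how implementations normalize, intersect, and attach expressions of this form.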

See also: the W3C news item


Windows Workflow Foundation and BPEL
David Chappell, Blog

Two things are worth pointing out about Microsoft's recent announcement of BPEL support in Windows Workflow Foundation (WF). First, it's not a surprise. The company has been talking about its intent to do this since WF went public in the fall of 2005. The only real surprise is that it's taking so long. This delay is probably indicative of the second point, which is that no one should interpret the announcement as an embrace of BPEL-based development by Microsoft. True, WF's BPEL activities will let developers create workflows that can be directly exported as standard BPEL. But the developer sees those workflows in the usual WF way, i.e., as .NET-based code, rather than as XML-based BPEL. Similarly, any imported BPEL workflows will be converted into WF's internal representation. Like BizTalk Server today, WF treats BPEL as a way to move process logic between different workflow engines, not as an executable format (and certainly not as a development language). If the popularity of BPEL in BizTalk is any indication, we shouldn't expect widespread use of WF's BPEL support. I very rarely run across organizations that are using BPEL with BizTalk Server today, and I remain skeptical about BPEL ever achieving widespread popularity. Adding the ability to export and import BPEL workflows to WF—and thus to Windows itself—will help WF in situations where support for BPEL is a political necessity. Yet I'll be surprised if it becomes a widely used aspect of WF applications.

See also: the OASIS WSBPEL TC


Microsoft: Make Our HD Photo Format a Standard
Stephen Shankland, ZDNet News

If you love something, set it free. Such is the reasoning behind a step Microsoft plans to announce Thursday: it will submit its HD Photo image format to a standards body. Making HD Photos a neutral industry standard, not just a Microsoft technology, is a significant step in the company's ambitious plan to establish a higher-quality replacement for today's ubiquitous JPEG standard. "Microsoft...intends to standardize the technology and will be submitting HD Photo to an appropriate standards organization shortly," the company said in a statement. The company plans to announce the move Thursday at the Photo Marketing Association trade show in Las Vegas. Microsoft isn't commenting on its motives, but the standardization move follows earlier lowering of barriers. In November, it liberalized the licensing policy—dropping fees, for example. At that time, it adopted the neutral HD Photo name instead of the Microsoft-centric Windows Media Photo, though Windows Vista uses the older name. And the company has said HD Photo technology is covered by the Open Specification Promise, an agreement under which Microsoft pledges not to assert its patent rights, which makes it more palatable to potential rivals—in particular open-source programmers. Standards bodies can be a mixed blessing for technology companies. On the one hand, they can build broad industry support for a technology, enabling different companies' products to work better together. Ideally, standards rise above a particular company's agenda to reflect the needs and experience of several companies. On the other hand, the consortia that create and approve standards are notoriously sluggish, especially when compared to the fast-moving computer industry, as Josh Weisberg, Microsoft's director of digital imaging evangelism, observed in January. And standards efforts aren't immune to competitive jockeying: Microsoft has faced obstacles trying to standardize its OOXML office document file formats. 
Microsoft has put years of research into HD Photo and knows it has years more work to create a JPEG alternative, much less a replacement. The company knows it has to woo partners from every corner of the industry, including camera makers and those who build photo printing kiosks.

See also: the announcement


Rights in the PREMIS Data Model
Karen Coyle, Report for the U.S. Library of Congress

The PREMIS standard contains a rights entity that allows the association of rights with specific digital preservation actions. This paper looks at the various definitions of rights and the state of rights metadata, and surveys legislative actions taking place in many nations that will provide a legal standing for digital preservation activities. The Preservation Metadata: Implementation Strategies (PREMIS) Working Group developed the Data Dictionary for Preservation Metadata, a specification containing a set of "core" preservation metadata elements that has broad applicability within the digital preservation community. It constructed a data model that defined entities involved in the preservation process and their relationships. One of the important entities in this data model is rights statements, which specify terms and conditions for using the objects in a preservation repository. The PREMIS Working Group chose to consider only rights required for preservation activities in scope for its work, rather than rights for access. In order to make progress, the Group included minimal metadata that a repository needs to know about the rights to preserve digital objects. Rights in PREMIS take the form of structured permission statements, which are defined in terms of preservation actions. The group felt that, as the laws were clarified in terms of preservation rights and permissions were better understood, this section of the PREMIS data dictionary could be expanded.

See also: XML and DRM


Sponsors

XML Daily Newslink and Cover Pages are sponsored by:

BEA Systems, Inc. http://www.bea.com
IBM Corporation http://www.ibm.com
Primeton http://www.primeton.com
SAP AG http://www.sap.com
Sun Microsystems, Inc. http://sun.com

XML Daily Newslink: http://xml.coverpages.org/newsletter.html
Newsletter Archive: http://xml.coverpages.org/newsletterArchive.html
Newsletter subscribe: newsletter-subscribe@xml.coverpages.org
Newsletter unsubscribe: newsletter-unsubscribe@xml.coverpages.org
Newsletter help: newsletter-help@xml.coverpages.org
Cover Pages: http://xml.coverpages.org/



Document URI: http://xml.coverpages.org/newsletter/news2007-03-12.html
Robin Cover, Editor: robin@oasis-open.org