The OASIS Cover Pages: The Online Resource for Markup Language Technologies
Last modified: April 01, 2008
XML Daily Newslink. Tuesday, 01 April 2008

A Cover Pages Publication http://xml.coverpages.org/
Provided by OASIS and Sponsor Members
Edited by Robin Cover


This issue of XML Daily Newslink is sponsored by:
BEA Systems, Inc. http://www.bea.com



Web Services for Remote Portlets (WSRP) Specification Version 2.0
Rich Thompson (ed), OASIS Committee Specification Approved as OS

OASIS announced that the membership has voted to approve Version 2.0 of the "Web Services for Remote Portlets Specification" as an OASIS Standard, updating the WSRP Version 1.0 OASIS Standard published in August 2003. The goal of the specification is to enable an application designer or administrator to pick from a rich choice of compliant remote content and application providers, and integrate them with just a few mouse clicks and no programming effort. The OASIS WSRP Technical Committee was chartered to standardize presentation-oriented Web services for use by aggregating intermediaries, such as portals. The TC members work to simplify the integration effort so that applications can quickly exploit new web services as they become available. WSRP layers on top of the existing web services stack, utilizing existing web services standards, and will leverage emerging web service standards (such as policy) as they become available. The interfaces defined by this specification use the Web Services Description Language (WSDL). WSRP Version 2.0 extends the Version 1.0 definitions to support more advanced use cases, providing: (1) coordination between components; (2) the ability to move customized portlets across registration and machine boundaries; (3) a mechanism for describing protocol extensions; (4) support for leasing of resources; (5) in-band means of getting resources; (6) support for the CC/PP protocol (device characteristics). WSRP Version 2.0 consists of a prose specification that describes the web service interface exposed by all instances of compliant Producers, together with the semantics required of both the service and its Consumers, plus a WSRP version 2 XML schema, WSRP version 2 portTypes (WSDL), and WSRP version 2 bindings (WSDL).
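The shape of such a WSDL-described interface can be pictured with a fragment like the following. This is an illustrative sketch only, not the normative WSRP v2 WSDL; the element and message names are simplified, though the Markup interface and its getMarkup operation do correspond to interfaces the specification defines.

```xml
<!-- Illustrative sketch only: simplified names, not the normative
     WSRP v2 portType/binding definitions. -->
<wsdl:definitions xmlns:wsdl="http://schemas.xmlsoap.org/wsdl/"
                  xmlns:tns="urn:example:wsrp-sketch">
  <wsdl:portType name="WSRP_v2_Markup_PortType">
    <!-- A Consumer (e.g. a portal) asks a Producer for a portlet's markup -->
    <wsdl:operation name="getMarkup">
      <wsdl:input  message="tns:getMarkupRequest"/>
      <wsdl:output message="tns:getMarkupResponse"/>
    </wsdl:operation>
  </wsdl:portType>
</wsdl:definitions>
```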

See also: the announcement


Working Group Formed to Support ODRL Service (ODRL-S) Profile
G.R. Gangadharan and Renato Iannella (eds), Working Draft

On behalf of the ODRL Initiative, Renato Iannella announced the formation of a new ODRL Services (ODRL-S) Profile Working Group, chartered to develop the semantics for licensing Service-Oriented Computing (SOC) services. The Open Digital Rights Language (ODRL) Initiative is an international effort aimed at developing and promoting an open standard for rights expressions. ODRL is intended to provide flexible and interoperable mechanisms to support transparent and innovative use of digital content in publishing, distributing, and consuming digital media across all sectors and communities. The new profile will build upon prior work completed at the University of Trento on service licensing. The WG will develop an ODRL Profile that extends the ODRL language to support the SOC community requirements. The profile will address the core semantics for the licenses to enable services to be used, reused, and amalgamated with other services. By expressing the license terms in ODRL, greater features can be supported, such as automatically detecting conflicts in service conditions, and making explicit all requirements and conditions. ODRL-S is designed as a complementary language to describe licensing clauses of a service in machine-interpretable form. The salient features of ODRL-S are as follows: (1) ODRL-S unambiguously represents service licensing clauses, based on a formalization of licensing clauses; (2) ODRL-S is a simple yet powerful and fully extensible language; (3) ODRL-S can specify licenses at the service level and the service operation level; (4) ODRL-S can be used with any existing service description standards and languages; (5) ODRL-S is developed as a profile fully compatible with ODRL for describing a service license.
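To give a sense of what an ODRL-based service license might look like, here is a rough sketch in ODRL 1.1-style XML. The service URI is invented, and the actual ODRL-S vocabulary for the SOC domain was still under definition by the Working Group at this time, so treat this purely as an illustration of the expression style.

```xml
<!-- Rough ODRL 1.1-style sketch; the ODRL-S service vocabulary was
     still being defined by the WG, and the service URI is invented. -->
<o-ex:rights xmlns:o-ex="http://odrl.net/1.1/ODRL-EX"
             xmlns:o-dd="http://odrl.net/1.1/ODRL-DD">
  <o-ex:agreement>
    <o-ex:asset>
      <o-ex:context>
        <o-dd:uid>http://example.org/services/geocoder</o-dd:uid>
      </o-ex:context>
    </o-ex:asset>
    <!-- Permissions over the service (e.g. invoking or composing it)
         would be expressed here using terms the ODRL-S profile adds. -->
    <o-ex:permission>
      <o-dd:execute/>
    </o-ex:permission>
  </o-ex:agreement>
</o-ex:rights>
```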

See also: XML and DRM


W3C XML Query Working Group Invites Comment on XQuery Working Drafts
Staff, W3C Announcement

Members of the W3C XML Query Working Group have published two First Public Working Drafts: "XQuery Scripting Extension 1.0" and "XQuery Scripting Extension 1.0 Use Cases." The XQuery Scripting Extension 1.0 specification defines an extension to "XQuery 1.0: An XML Query Language" (W3C Recommendation) and "XQuery Update Facility 1.0" (W3C Candidate Recommendation). Expressions can be evaluated in a specific order, with later expressions seeing the effects of the expressions that came before them. This specification introduces the concept of a block with local variable declarations, as well as several new kinds of expressions, including assignment, while, continue, break, and exit expressions. The "Use Cases" document provides the usage scenarios that motivate the changes developed in the XQuery Scripting Extension (XQSE).
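The scripting style described above, with blocks, local variables, assignment, and sequential evaluation, can be sketched roughly as follows. The exact keywords may differ from the First Public Working Draft, so read this as an illustration of the concepts rather than normative XQSE syntax.

```xquery
(: Illustrative sketch of XQSE-style scripting; exact keywords may
   differ from the First Public Working Draft. :)
block {
  declare $total as xs:integer := 0;
  declare $i as xs:integer := 0;
  while ($i < 5) {
    set $i := $i + 1;          (: later expressions see this effect :)
    set $total := $total + $i;
  };
  $total                        (: the block evaluates to 15 :)
}
```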

See also: the W3C XML Query web site


Finding the Right ID
Richard Adhikari, Redmond Developer News

As Microsoft looks to advance its interoperability initiative, CardSpace (the company's identity-management framework) promises to play a key role in providing authentication between Windows and .NET-based applications on the one end, and the Web, open source technology and other key enterprise software platforms on the other. Microsoft lowered a key barrier by adding support for the recently upgraded industry standard OpenID specification into its CardSpace client identity-management framework. Still, it could be some time before developers are called on to use OpenID and CardSpace for cross-platform enterprise applications. CardSpace is a key component of Microsoft's .NET Framework 3.5 and is supported in Internet Explorer 7 and Windows. It's built largely on Microsoft Windows Communication Foundation (WCF), serving as the identity provider. While OpenID provides single sign-on to social networking sites and blogs—letting users log in one time to employ a public persona across multiple sites—it's not robust enough to support government applications, casual Web surfing, financial transactions or private data access. Microsoft's Chief Identity Architect Kim Cameron has said in his Identity Weblog that the company is interested in OpenID as part of a spectrum of solutions. But Cameron has written that unlike redirection protocols such as SAML, WS-Federation and OpenID, CardSpace limits the amount of personal information users need to give out, making Web surfing more secure. Microsoft describes CardSpace as an identity selector—the user creates self-issued cards and associates a limited set of identity data with each. The CardSpace user interface is security-hardened, and the user decides what information will be provided.

See also: Windows CardSpace


Tim Berners-Lee and Distinguished Faculty to Present at LinkedData Planet
Ken North, Conference Announcement

Ken North has provided updated information about the summer LinkedData Planet Conference. Sir Tim Berners-Lee, Director of the W3C, will deliver a keynote, and a distinguished faculty will present a content-rich technical program in New York City (June 17-18, 2008). Besides the keynote, there will be a Linked Data Workshop and a Power Panel. The conference is co-chaired by Bob DuCharme and Ken North. The evolution of the current Web of "linked documents" to a Web of "linked data" is steadily gaining mindshare among developers, architects, systems integrators, users, and the more than 200 software companies developing semantic web-oriented solutions. Organizations such as Adobe, Google, OpenLink Software, Oracle, the W3C, and the grassroots Linking Open Data community have actively provided technology and thought leadership during the embryonic stages of this evolutionary transition. Notable examples on the Web today include DBpedia, the Zoominfo search engine, the Bambora travel recommendation site, a number of social networking sites, numerous semantic web technology-based services, various linked data browsers, SPARQL query language and protocol-compliant data servers and data management systems, and a growing number of web sites exposing machine-readable data using microformats, RDFa, and GRDDL. The LinkedData Planet audience will include system architects, enterprise architects, web site designers, software developers, consultants, and technical managers, all looking to learn more about linking the growing collection of available data sources and technologies to get more value from their data for their organizations.

See also: the detailed program listing


Semantic Web in the News
Tim Berners-Lee, Blog

The Semantic Web has been in the news a bit recently. There was the buzz about Twine, a "Semantic Web company", getting another round of funding. Then, Yahoo announced that it will pick up Semantic Web information from the Web, and use it to enhance search... Text search engines are of course good for searching the text in documents, but the Semantic Web isn't text documents, it is data. It isn't obvious what the killer apps will be—there are many contenders. We know that the sort of query you do on data is different: the SPARQL standard defines a query protocol which allows application builders to query remote data stores. So that is one sort of query on data which is different from text search. One thing to always remember is that the Web of the future will have BOTH documents and data. The Semantic Web will not supersede the current Web. They will coexist. The techniques for searching and surfing the different aspects will be different but will connect. Text search engines don't have to go out of fashion... The Media Standards Trust is a group which has been working with the Web Science Research Initiative [...] to develop ways of encoding the standards of reporting that a piece of information purports to meet: "This is an eye-witness report"; or "This photo has not been massaged apart from: cropping"; or "The author of the report has no commercial connection with any products described"; and so on. Like Creative Commons, which lets you mark your work with a licence, the project involves representing social dimensions of information. And it is another Semantic Web application. In all this Semantic Web news, though, the proof of the pudding is in the eating. The benefit of the Semantic Web is that data may be re-used in ways unexpected by the original publisher. That is the value added.
So when a Semantic Web start-up either feeds data to others who reuse it in interesting ways, or itself uses data produced by others, then we start to see the value of each bit increased through the network effect.

See also: the W3C Semantic Web Activity


Ocean Scientists Embrace OGC Standards
Mark Reichardt, OGC President's Message

"The Earth's largest ecosystem, the ocean, is studied by specialists from a range of scientific disciplines. Despite the ocean's apparent vastness, human activities have had a profound effect on ocean systems, and in turn changes in the ocean system have comparably profound effects on the weather and climate. The ocean system indirectly determines human impacts from a growing list of societal activities—land development, agriculture, coastal development, sewage outflow, energy production and fishing, to name a few... The ocean science community is advancing a significant body of work to understand and address ocean-related issues. Their findings are important in efforts to strike a balance between protection of ocean systems and human exploitation of ocean resources. Given the magnitude and complexity of the issues, ocean research programs have much to gain by improving their ability to share ocean data, which almost always has spatial context. Not surprisingly, the oceans research community is aggressively implementing and using OGC standards to improve organizational, regional and global capabilities to access, process, integrate and apply ocean information, including real time sensor data. [This article provides a] partial list of ocean science programs and projects using OGC standards. Almost all of these efforts involve multiple government agencies, universities and research centers, and many of these programs and projects are working together... A major international program, known as the Global Earth Observing System of Systems (GEOSS), is being advanced by the Group on Earth Observations (GEO). Ocean observing and prediction is a major component of GEOSS. 
The OGC has contributed to GEOSS objectives through its involvement as a participating organization in GEO, through a series of GEOSS demonstrations conducted in partnership with IEEE Geoscience and Remote Sensing and ISPRS, and through the recent GEOSS Architecture Implementation Pilot, which has brought together technical contributions from over 120 organizations. The "GEOSS Report on Progress 2007" noted that the development of interoperability in the GEOSS was ahead of schedule. The work of OGC alliance partners is also important in addressing the interoperability needs of the ocean science community. The OASIS Common Alert Protocol (CAP) standard, for example, has elements that are harmonized with OGC standards, and CAP is growing in importance for issuing warning messages in emergency situations. It is being applied in some of the ocean science activities listed [here]."

See also: OGC Standards


DMTF Chairman: New Possibilities in FY 2008
Mike Baskey, Management Matters Now

DMTF Chairman Mike Baskey provides an update on Distributed Management Task Force activities: "During the past year, we've continued to streamline the processes both within our organization and in our work with alliance partners. We are also developing a Conformance Program that will enable customers to test conformance with the set of standards that DMTF and our alliance partners are defining. Moreover, we expect to launch several key initiatives this fiscal year. In addition to the great work within the System Virtualization, Partitioning, and Clustering (SVPC) working group around models and profiles, we expect to publish the Open Virtualization Format (OVF) specification for virtual appliances. Another DMTF initiative focuses on federation of CMDBs (configuration management databases); we expect a preliminary release of the CMDBf standard this year as well. The CMDBf work within DMTF will connect our organization to the Information Technology Infrastructure Library (ITIL) and related process management space to increase the relevance of the work we do in this area. A third DMTF initiative involves power and energy management and ties into our collaborative work with The Green Grid. This important development will improve energy efficiency in the data center, which has great social significance as we wrestle with the challenges in that domain... DMTF will also continue to make significant strides in the areas of server and desktop management -- particularly in the integration of Web services into those and other related device management initiatives. In addition, a greater degree of interoperability and conformance testing/certification will become a reality in this coming year—a very exciting milestone for our organization.
We're also moving forward in getting more of the DMTF specifications submitted to the International Standards Organization (ISO), an increasingly important requirement as we expand our role in the world of international standards and our industry ecosystem..."

See also: DMTF Technical Committee reorganization


Effective, Agile, and Connective
David Burdett, SAP Info International

Composite applications built from predefined enterprise services form the core of enterprise service-oriented architecture (enterprise SOA). Ultimately the goal of enterprise SOA is composition of any service implemented on any technology by any business partner anywhere in the world. Open, standards-based technology is a key factor in achieving this level of interoperability—similar to plugging a telephone into the wall. Some of the standards needed relate to the technology used to implement enterprise SOA, while others define business semantics and the languages used to describe them... In enterprise SOA, business semantics consist of definitions of enterprise services and business processes. These definitions must be described in a manner that allows the technology layer of the architecture to use them to good effect. There are three types of definition languages, for processes, service interfaces, and message content. Process definition languages define the sequence and conditions in which the steps in a business process occur. With machine-readable definitions, a business process platform can ensure that the steps are followed correctly. The need for this ability is related to the way businesses work—reacting to an event with an activity. An event can be almost anything—contact with a customer or supplier or reception of an order or an invoice. Enterprises need a way to describe—clearly and unambiguously—how the events that occur relate to activities in the business. The most important standard for defining processes is Business Process Modeling Notation (BPMN). It provides a business-oriented, graphical way of identifying events and describing activities in easy-to-understand diagrams. Process definition is a critically important area for enterprise SOA, and BPMN delivers good business value... Message definition languages are used to define the structure and content of the data that an enterprise service sends, receives, or consumes. 
For example, they define that the same field always has the same name in all messages. The languages also describe how to combine fields into larger structures, how to specialize or extend fields and messages to meet specific needs, and how to represent the message as an XML schema, for example. [A] leading standard language for message definition is the UN/CEFACT Core Components Technical Specification (CCTS). UN/CEFACT is the organization that also developed the international version of EDI. CCTS provides a rigorous methodology for defining data unambiguously and includes rules about how to convert language-neutral definitions into XML. Clear, consistent definitions of the messages used by enterprise services deliver business value.
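The idea of one shared, consistently named field definition reused across messages can be pictured with a small XML Schema fragment. The names here are invented for illustration; CCTS defines its own core component types and naming and design rules rather than these exact declarations.

```xml
<!-- Invented names for illustration; CCTS defines its own core
     component types and naming rules, not these declarations. -->
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema"
           elementFormDefault="qualified">

  <!-- One shared definition of an amount, reused wherever money appears -->
  <xs:complexType name="AmountType">
    <xs:simpleContent>
      <xs:extension base="xs:decimal">
        <xs:attribute name="currencyCode" type="xs:string" use="required"/>
      </xs:extension>
    </xs:simpleContent>
  </xs:complexType>

  <!-- Larger message structures combine the shared field definitions -->
  <xs:element name="Invoice">
    <xs:complexType>
      <xs:sequence>
        <xs:element name="TotalAmount" type="AmountType"/>
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>
```

Because every message that carries a monetary value reuses AmountType, the field has the same name, type, and currency attribute everywhere it appears, which is the kind of consistency the article attributes to message definition languages.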

See also: Article Part 1


Sponsors

XML Daily Newslink and Cover Pages are sponsored by:

BEA Systems, Inc.    http://www.bea.com
EDS                  http://www.eds.com
IBM Corporation      http://www.ibm.com
Primeton             http://www.primeton.com
SAP AG               http://www.sap.com
Sun Microsystems     http://sun.com

XML Daily Newslink: http://xml.coverpages.org/newsletter.html
Newsletter Archive: http://xml.coverpages.org/newsletterArchive.html
Newsletter subscribe: newsletter-subscribe@xml.coverpages.org
Newsletter unsubscribe: newsletter-unsubscribe@xml.coverpages.org
Newsletter help: newsletter-help@xml.coverpages.org
Cover Pages: http://xml.coverpages.org/



Document URI: http://xml.coverpages.org/newsletter/news2008-04-01.html
Robin Cover, Editor: robin@oasis-open.org