XML Daily Newslink. Thursday, 18 February 2010

A Cover Pages Publication http://xml.coverpages.org/
Provided by OASIS and Sponsor Members
Edited by Robin Cover


This issue of XML Daily Newslink is sponsored by:
Microsoft Corporation http://www.microsoft.com



OASIS Presents Ground Floor Briefing on WS-Calendar Technical Activity
Staff, OASIS Announcement

WS-Calendar is a new effort to formalize and standardize the communication of schedule and interval information for Web services. This free webinar, held Friday, February 19, 2010, from 11:00 AM to 12:00 PM EST, will provide background and discussion for all those considering participation in the OASIS WS-Calendar Technical Committee. WS-Calendar will address needs common to the enterprise, to facilities operations, to financial instruments, and to Smart Grid operations. The webinar will explain the advantages of defining a standard information model for use in Web services communications within and between domains.

The webinar will report on the status of recent efforts within the IETF to update iCalendar and related standards (iTIP and iMIP), and will describe ongoing work to standardize XML representations not only of iCalendar but also of calendar-to-calendar interactions. It will also feature a discussion of the anticipated uses of WS-Calendar within other standards efforts, including WSDM, BPEL, EDXL, and oBIX, and will explore scenarios in Smart Grid, social networking, and the operation of electric vehicles.
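For a rough sense of the kind of schedule and interval information at issue, the sketch below builds a minimal iCalendar (RFC 5545) event in Python. This is an illustration only: WS-Calendar itself is expected to define XML representations carrying equivalent fields, not this text format, and the event details are invented for the example.

```python
from datetime import datetime, timedelta

def make_vevent(uid, summary, start, duration):
    """Build a minimal iCalendar VEVENT describing one scheduled interval.

    Illustrative only: WS-Calendar is expected to define an XML
    representation of this same information, not this text format.
    """
    fmt = "%Y%m%dT%H%M%SZ"  # iCalendar UTC date-time form
    end = start + duration
    lines = [
        "BEGIN:VCALENDAR",
        "VERSION:2.0",
        "PRODID:-//example//ws-calendar-sketch//EN",
        "BEGIN:VEVENT",
        f"UID:{uid}",
        f"DTSTART:{start.strftime(fmt)}",
        f"DTEND:{end.strftime(fmt)}",
        f"SUMMARY:{summary}",
        "END:VEVENT",
        "END:VCALENDAR",
    ]
    return "\r\n".join(lines) + "\r\n"

# Example: a one-hour demand-response interval, the sort of schedule a
# Smart Grid service might need to communicate to a building.
print(make_vevent("dr-event-001@example.org", "Curtail load 10%",
                  datetime(2010, 2, 19, 16, 0, 0), timedelta(hours=1)))
```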

Anyone considering participation in the OASIS WS-Calendar Technical Committee should register to attend, including those involved in domains where the common exchange of scheduling information is critical, such as energy transmission, business process applications, emergency management, and smart buildings.

Webinar presenters include: (1) Dave Thewlis, Executive Director of CalConnect, the Calendaring and Scheduling Consortium, and convener of the OASIS WS-Calendar Technical Committee; (2) Toby Considine, chair of the OASIS oBIX Technical Committee and a member of the OASIS Energy Interoperation and Energy Market Information Exchange (eMIX) Technical Committees.

Update 2010-02-23: A recording of the 19-February-2010 webinar is available, referenced from the OASIS Webinars page. Format: .wmv. Extent: 25.4 MB. Duration: 41 minutes.

See also: the WS-Calendar TC Overview


W3C Issues Last Call for Comments on Revised RIF Production Rule Dialect
Christian de Sainte Marie, Gary Hallmark, Adrian Paschke (eds), W3C Technical Report

Members of the W3C Rule Interchange Format (RIF) Working Group have issued a new Last Call Working Draft for the specification "RIF Production Rule Dialect". During the Candidate Recommendation implementation phase of the Rule Interchange Format (RIF), the Working Group discovered a problem with the design of the Production Rule Dialect. This problem is addressed with a new Last Call Working Draft that changes the way actions are handled to more closely match existing production rule engines. Comments and RIF implementation reports are now invited through March 05, 2010.

The "RIF Production Rule Dialect" specification defines the production rule dialect of the W3C rule interchange format (RIF-PRD), a standard XML serialization format for production rule languages. The production rule dialect is one of a set of rule interchange dialects that also includes the RIF Core dialect (RIF-Core) and the RIF basic logic dialect (RIF-BLD). RIF-Core, the core dialect of the W3C rule interchange format, is designed to support the interchange of definite Horn rules without function symbols ("Datalog").

RIF-Core is intended to be the common core of all RIF dialects, and it has been designed, in particular, to be a useful common subset of RIF-BLD and RIF-PRD. RIF-PRD includes and extends RIF-Core, and, therefore, RIF-PRD inherits all RIF-Core features. These features make RIF-PRD a Web-aware (even a semantic Web-aware) language. However, it should be kept in mind that RIF is designed to enable interoperability among rule languages in general, and its uses are not limited to the Web. This document targets designers and developers of RIF-PRD implementations. A RIF-PRD implementation is a software application that serializes production rules as RIF-PRD XML (producer application) and/or that deserializes RIF-PRD XML documents into production rules for consumer applications...
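For a concrete, if greatly simplified, sense of what a producer application does, the sketch below serializes a single production rule into an XML document flavored after RIF-PRD. The element names and the textual condition/action strings are placeholders for illustration only; the normative XML syntax is the one defined in the RIF-PRD specification.

```python
import xml.etree.ElementTree as ET

RIF = "http://www.w3.org/2007/rif#"  # RIF namespace

def serialize_rule() -> str:
    """Producer-side sketch: emit a simplified, RIF-PRD-flavored document for
    the rule "if a customer has gold status, assert a 10% discount".
    Element names and the textual condition/action are illustrative only."""
    ET.register_namespace("", RIF)
    doc = ET.Element(f"{{{RIF}}}Document")
    payload = ET.SubElement(doc, f"{{{RIF}}}payload")
    group = ET.SubElement(payload, f"{{{RIF}}}Group")
    rule = ET.SubElement(group, f"{{{RIF}}}Rule")
    ET.SubElement(rule, f"{{{RIF}}}if").text = 'status(?customer "gold")'
    ET.SubElement(rule, f"{{{RIF}}}then").text = "Assert(discount(?customer 10))"
    return ET.tostring(doc, encoding="unicode")

print(serialize_rule())
```

A consumer application would perform the inverse step, parsing such a document back into rules for its own production rule engine.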

RIF has focused on exchange rather than trying to develop a single one-size-fits-all rule language because, in contrast to other Semantic Web standards such as RDF, OWL, and SPARQL, it was immediately clear that a single language would not cover all popular paradigms of using rules for knowledge representation and business modeling. Even rule exchange alone was quickly recognized to be a daunting task. Known rule systems fall into three broad categories: first-order, logic-programming, and action rules. These paradigms share little in the way of syntax and semantics. Moreover, there are large differences between systems even within the same paradigm... The family of RIF dialects is intended to be uniform and extensible. RIF uniformity means that dialects are expected to share as much as possible of the existing syntactic and semantic apparatus...

See also: the RIF Call for Implementations


Web 3.0: The Dawn of Semantic Search
James Hendler, IEEE Computer

"Previous IEEE Computer columns have discussed the status of the Semantic Web, and particularly its applied use in Web applications, increasingly coming to be known as Web 3.0. I'm happy to say that development and deployment continue apace, and that those of us who know where to look see a lot of progress... One of the difficulties in explaining Web 3.0 is that, unlike the original Web browser or later Web 2.0 systems, Semantic Web technology tends to be an infrastructure technology. While Web companies are working to produce new and scalable tools, academic researchers are pushing the size and speed of Semantic Web back-end operations...

The first generation of enterprise Web 3.0 systems uses behind-the-scenes 'structural' semantics to extend their current capabilities: for example, taxonomies with simple properties that can be used to relate terms to each other or to integrate terminologies from multiple sites. This sort of 'controlled vocabulary' has been around for a long time, but emerging technologies allow it to be more easily integrated with Web development...
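As a small illustration of this kind of 'structural' semantics, the sketch below builds a tiny SKOS-style controlled vocabulary that relates two terms from different sites. It assumes the third-party rdflib package is installed, and the example vocabulary URIs are invented.

```python
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF

# SKOS is a common W3C vocabulary for controlled vocabularies and taxonomies.
SKOS = Namespace("http://www.w3.org/2004/02/skos/core#")
EX = Namespace("http://example.org/vocab/")  # hypothetical site vocabulary

g = Graph()
g.bind("skos", SKOS)

# Two terms used by different sites, modeled as concepts in a shared taxonomy:
g.add((EX.Laptop, RDF.type, SKOS.Concept))
g.add((EX.Laptop, SKOS.prefLabel, Literal("Laptop")))
g.add((EX.Notebook, RDF.type, SKOS.Concept))
g.add((EX.Notebook, SKOS.prefLabel, Literal("Notebook computer")))

# A simple property relating the terms lets an application treat
# "Notebook computer" results as "Laptop" results, and vice versa.
g.add((EX.Notebook, SKOS.exactMatch, EX.Laptop))

print(g.serialize(format="turtle"))
```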

The most important area where we'll see these technologies on the Web is in the growing area of semantic search engines. These include systems that try to augment general searches as well as systems that are trying to literally change the search experience. While the internal details of most of these systems are still proprietary, in general they appear to combine a pragmatic approach to natural-language processing with a lightweight semantics that lets them better collect and process information about specific areas.

An important use of semantics in search is to draw on domain knowledge in areas where searches are difficult. The T2 engine uses semantic technologies not only to find recipes from sites but also to filter them by several categories, including cooking time, dietary options, and cuisine. Semantic search techniques that use domain knowledge clearly would change the search experience if widely deployed. However, it will take time and a combination of human and machine effort to cover the enormous diversity of Web domains. T2's solution to this problem is to provide the means for people to create these mappings using social, wiki-like mechanisms, thus extending the search engine's reach..."
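A toy sketch of the domain-aware filtering described above follows; the field names and recipes are invented for illustration and are not T2's actual data model.

```python
# Toy sketch: once recipes carry lightweight domain metadata, search can
# filter on meaning rather than on keywords alone. Field names are invented.
recipes = [
    {"title": "Lentil soup", "cook_minutes": 35,
     "diet": {"vegetarian", "vegan"}, "cuisine": "Indian"},
    {"title": "Coq au vin", "cook_minutes": 120,
     "diet": set(), "cuisine": "French"},
    {"title": "Caprese salad", "cook_minutes": 10,
     "diet": {"vegetarian"}, "cuisine": "Italian"},
]

def search(recipes, max_minutes=None, diet=None, cuisine=None):
    """Return recipe titles matching whichever structured constraints are given."""
    results = []
    for r in recipes:
        if max_minutes is not None and r["cook_minutes"] > max_minutes:
            continue
        if diet is not None and diet not in r["diet"]:
            continue
        if cuisine is not None and r["cuisine"] != cuisine:
            continue
        results.append(r["title"])
    return results

print(search(recipes, max_minutes=60, diet="vegetarian"))
# ['Lentil soup', 'Caprese salad']
```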


Cloud-Based EMR Vendor Expects Flood of New Hires
Marianne Kolbasuk McGee, Informationweek

"Practice Fusion is the provider of free, web-based e-medical records doing lots of hiring these days, adding about two or three people a week, and expecting to hire about 100 new employees this year—which in total will more than triple the company's current headcount of 40, according to CEO Ryan Howard. Practice Fusion's hiring is "across the board," from executives (chief operating officer) to account managers, marketing people, developers and support reps...

Practice Fusion is catching on with doctors because it's offered via the Internet—and especially because it's free... Many doctors in small offices don't want to worry about installing and maintaining software in their practices, and they don't want to lay out a lot of (any) money to begin digitizing their medical records... Practice Fusion has been signing up about 100 new doctors a day to its web-based services, and so far has about 29,000 users in 20,000 doctor practices across the country..."

See also: XML in Clinical Research and Healthcare


Constrained Application Protocol (CoAP) Requirements and Features
Zach Shelby, Michael Stuber, Don Sturek (et al., eds), IETF Internet Draft

IETF has published an initial -00 version of "CoAP Requirements and Features", which considers the requirements and resulting features needed for the design of the Constrained Application Protocol (CoAP). Starting from requirements for energy and building automation applications, the basic features are identified along with an analysis of possible realizations. The goal of the document is to provide a basis for protocol design and related discussion.

Introduction: "The use of web services on the Internet has become ubiquitous in most applications, and depends on the fundamental Representational State Transfer (REST) architecture of the web. The proposed Constrained RESTful Environments (CoRE) working group aims at realizing the REST architecture in a suitable form for the most constrained nodes (e.g. 8-bit microcontrollers with limited RAM and ROM) and networks (e.g. 6LoWPAN).

One of the main goals of CoRE is to design a generic RESTful protocol for the special requirements of this constrained environment, especially considering energy and building automation applications. The result of this work should be a Constrained Application Protocol (CoAP) which easily translates to HTTP for integration with the web while meeting specialized requirements such as multicast support, very low overhead, and simplicity.

CoAP must support the manipulation of simple resources on constrained nodes and networks. The architecture requires push, pull, and notify approaches to manipulating resources. CoAP will be able to create, read, update, and delete a Resource on a Device... It must define a mapping from CoAP to an HTTP REST API; this mapping will not depend on a specific application and must be as transparent as possible, using standard protocol response and error codes where possible... The core CoAP functionality must operate well over UDP, and UDP must be implemented on CoAP Devices. There may be optional functions in CoAP (e.g., delivery of larger chunks of data) which, if implemented, are implemented over TCP... CoAP must also define how to advertise or query for a Device's description. This description may include the device name and a list of its Resources, each with a URL, an interface description URI (pointing, e.g., to a Web Application Description Language (WADL) document), and an optional name or identifier..."
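The sketch below gives a rough flavor of the application-independent, transparent CoAP-to-HTTP mapping the draft calls for. The method pairings and the gateway helper are assumptions made for the example; the real mapping is part of the CoAP design work, not something the draft has fixed.

```python
# Illustrative sketch of a transparent CoAP-to-HTTP mapping for CRUD
# operations on a resource. The pairings shown are assumptions for the
# example, not the mapping defined by the (still to be designed) protocol.
COAP_TO_HTTP_METHOD = {
    "GET": "GET",        # read a resource
    "POST": "POST",      # create a resource
    "PUT": "PUT",        # update a resource
    "DELETE": "DELETE",  # delete a resource
}

def proxy_request(coap_method: str, device_path: str) -> str:
    """Translate a CoAP request into the equivalent HTTP request line,
    as a gateway between a constrained network and the web might do."""
    http_method = COAP_TO_HTTP_METHOD.get(coap_method)
    if http_method is None:
        raise ValueError(f"no HTTP mapping for CoAP method {coap_method!r}")
    return f"{http_method} {device_path} HTTP/1.1"

print(proxy_request("GET", "/sensors/temperature"))
# GET /sensors/temperature HTTP/1.1
```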

See also: Smart Energy Requirements for 6LowApp


Securing HTTP State Management Information
Gonzalo Salgueiro and Paul Jones (eds), IETF Internet Draft

"This memo provides a simple method for providing a reasonable level of security when exchanging state management information through HTTP in situations where TLS is not employed.

In spite of the fact that we have HTTPS (HTTP over TLS) for securing communication between HTTP User Agents (i.e., web browsers) and web servers, there are many web applications and web sites that rely on insecure connections to exchange state management information in the form of HTTP URL parameters or cookies that could allow rogue entities to gain access to protected resources. Even in environments where secure connections are used for initially authenticating users, the sessions established and associated with the User Agent often use a simple cookie exchange over an insecure connection for subsequent information exchanges, thus securing only the user's password, but not the session itself. This allows HTTP sessions to be hijacked by any entity that can observe the state management information...

One could use HTTPS everywhere on the Internet, but there are reasons why that is not always desired or preferred... In practice, the use of HTTPS requires a unique IP address per URL. Using HTTPS consumes more processing time and resources, an issue that is only compounded when there are several small transactions over separate connections. Using HTTPS on the Internet requires the purchase of digital certificates and, depending on one's environment, this can be costly. Installing and updating digital certificates takes time, thus increasing Total Cost of Ownership (TCO). Expired certificates drive visitors away in fear due to security warnings presented by web browsers... Encrypting the entire session prevents routers or other devices from efficiently compressing otherwise highly compressible plain ASCII text over low bit-rate links.

For one or more of these stated reasons, many web applications exchange state management information that should be secured over insecure connections. Therefore, application developers need a method of providing an acceptable level of security for selected state management information that does not require the use of HTTPS..."
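To give a flavor of what such a method might look like, the sketch below HMAC-signs a cookie value with a server-held secret so tampering can be detected without TLS. This is a generic illustration of the idea, not the scheme defined in the draft, and on its own it does not prevent replay of an observed cookie.

```python
import hashlib
import hmac
import secrets
from typing import Optional

# Generic sketch: bind selected state management information (a cookie
# value) to a server-held secret with an HMAC. Illustrative only; not the
# exact mechanism in the Salgueiro/Jones draft.
SERVER_SECRET = secrets.token_bytes(32)  # per-deployment secret key

def sign_cookie(value: str) -> str:
    """Append an HMAC-SHA256 tag so the server can detect tampering."""
    mac = hmac.new(SERVER_SECRET, value.encode("utf-8"), hashlib.sha256)
    return f"{value}.{mac.hexdigest()}"

def verify_cookie(signed: str) -> Optional[str]:
    """Return the original value if the tag checks out, otherwise None."""
    value, _, received_mac = signed.rpartition(".")
    expected = hmac.new(SERVER_SECRET, value.encode("utf-8"), hashlib.sha256)
    return value if hmac.compare_digest(expected.hexdigest(), received_mac) else None

cookie = sign_cookie("session-id=abc123")
assert verify_cookie(cookie) == "session-id=abc123"  # intact cookie accepted
assert verify_cookie(cookie + "x") is None           # tampered cookie rejected
```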


Sponsors

XML Daily Newslink and Cover Pages sponsored by:

IBM Corporation          http://www.ibm.com
Microsoft Corporation    http://www.microsoft.com
Oracle Corporation       http://www.oracle.com
Primeton                 http://www.primeton.com
Sun Microsystems, Inc.   http://sun.com

XML Daily Newslink: http://xml.coverpages.org/newsletter.html
Newsletter Archive: http://xml.coverpages.org/newsletterArchive.html
Newsletter subscribe: newsletter-subscribe@xml.coverpages.org
Newsletter unsubscribe: newsletter-unsubscribe@xml.coverpages.org
Newsletter help: newsletter-help@xml.coverpages.org
Cover Pages: http://xml.coverpages.org/



Document URI: http://xml.coverpages.org/newsletter/news2010-02-18.html
Robin Cover, Editor: robin@oasis-open.org