A Cover Pages Publication http://xml.coverpages.org/
Provided by OASIS and Sponsor Members
Edited by Robin Cover
This issue of XML Daily Newslink is sponsored by:
ISIS Papyrus http://www.isis-papyrus.com
Headlines
W3C Rule Interchange Format (RIF) Advances to Proposed Recommendation
Staff, W3C Announcement
Members of the W3C Rule Interchange Format (RIF) Working Group have published six Proposed Recommendation RIF specifications. Together, they allow systems using a variety of rule languages and rule-based technologies to interoperate with each other and with Semantic Web technologies. Public review is invited through June 08, 2010. The RIF Working Group has also published a new version of "RIF Test Cases", and three other Working Drafts: "RIF Overview", "RIF Combination with XML Data", and "OWL 2 RL in RIF". RIF implementation information is also available, including thirteen implementation reports received by the WG.
Three of the drafts define XML formats with formal semantics for storing and transmitting rules: (1) The "RIF Production Rule Dialect (PRD)" is designed for the kinds of rules used in modern Business Rule Management systems. (2) The "RIF Basic Logic Dialect (BLD)" is a foundation for Logic Programming, classical logic, and related formalisms. (3) The "RIF Core Dialect" is the common subset of PRD and BLD, useful when having a ubiquitous platform is paramount. The other drafts include: (4) "RIF Datatypes and Builtins (DTB)", which specifies the datatypes and standard operations (modeled on XPath Functions) available in all RIF dialects; (5) "RIF RDF and OWL Compatibility", which specifies how RIF works with RDF, RDFS, OWL 1, and OWL 2; and (6) "RIF Framework for Logic Dialects (FLD)", which provides a mechanism for specifying extended dialects, beyond BLD, when more expressive power is required...
The approach taken by the Working Group was to design a family of languages, called dialects, with rigorously specified syntax and semantics. The family of RIF dialects is intended to be uniform and extensible. RIF uniformity means that dialects are expected to share as much as possible of the existing syntactic and semantic apparatus. Extensibility here means that it should be possible for motivated experts to define a new RIF dialect as a syntactic extension to an existing RIF dialect, with new elements corresponding to desired additional functionality. These new RIF dialects would be non-standard when defined, but might eventually become standards. Because of the emphasis on rigor, the word format in the name of RIF is somewhat of an understatement. RIF in fact provides more than just a format. However, the concept of format is essential to the way RIF is intended to be used. Ultimately, the medium of exchange between different rule systems is XML, a format for data exchange.
Recognizing that RIF rules should be able to interface with RDF and OWL ontologies, the RIF Working Group has also defined the necessary concepts to ensure compatibility of RIF with RDF and OWL. RIF, RDF, and OWL are exchange languages with dissimilar syntaxes and semantics. How, then, should RIF rules refer to RDF and OWL facts, and what is the logical meaning of the overall language? RIF-RDF and OWL Compatibility defines just that. The basic idea is that RIF uses its frame syntax to communicate with RDF/OWL. These frames are mapped onto RDF triples and a joint semantics is defined for the combination..."
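The frame-to-triple correspondence described above (a RIF frame s[p -> o] read as the RDF triple s p o) can be sketched in Python. This is an illustrative reading only, not the normative W3C mapping, and the prefixed names below are hypothetical:

```python
# Illustrative sketch: a RIF frame subject[p1 -> o1, p2 -> o2, ...]
# corresponds to the RDF triples (subject, p1, o1), (subject, p2, o2), ...
def frame_to_triples(subject, slots):
    """Read a RIF frame as a list of RDF (subject, predicate, object) triples."""
    return [(subject, p, o) for p, o in slots.items()]

# Hypothetical prefixed names, for illustration only
triples = frame_to_triples(
    "ex:book1",
    {"dc:title": "RIF Primer", "dc:creator": "ex:alice"},
)
```

The joint semantics defined by the Working Group then interprets such triples and the originating frames together, so rules and ontology facts can refer to the same resources.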
See also: the W3C RIF Working Group
Google Coding Tool Advances Cloud Computing
Stephen Shankland, CNET News.com
Google has released a programming tool to help move its Native Client project — and more broadly, its cloud-computing ambitions — from abstract idea to practical reality. The new Native Client software developer kit, though only a developer preview version, is designed to make it easier for programmers to use the Net giant's browser-boosting Native Client technology...
To let people download Native Client modules from Web pages without security problems, NaCl prohibits various operations and confines NaCl program modules to a sandbox with restricted privileges. NaCl lets programmers write in a variety of languages, and a special compiler converts their work into the NaCl modules. The ultimate promise of NaCl is that Web-based applications could run much faster than those of today that typically use JavaScript or Adobe Systems' Flash. If Google can attract developers, the Web and cloud computing could become a much more powerful foundation for programs. The NPAPI Pepper project and NaCl SDK show one of Google's biggest challenges in bringing its cloud-computing vision to reality, though: getting others to come along for the ride. To make NaCl real, Google must convince programmers to use the software, convince browser makers to include it or at least support it as a plug-in, and convince the general public to upgrade their browsers to use it..."
From the Blog article: "Today, we're happy to make available a developer preview of the Native Client SDK — an important first step in making Native Client more accessible as a tool for developing real web applications. When we released the research version of Native Client a year ago, we offered a snapshot of our source tree that developers could download and tinker with, but the download was big and cumbersome to use. The Native Client SDK preview, in contrast, includes just the basics you need to get started writing an app in minutes: a GCC-based compiler for creating x86-32 or x86-64 binaries from C or C++ source code, ports of popular open source projects like zlib, Lua, and libjpeg, and a few samples that will help you get started developing with the NPAPI Pepper Extensions. Taken together, the SDK lets you write C/C++ code that works seamlessly in Chromium and gives you access to powerful APIs to build your web app...
To get started with the SDK preview, grab a copy of the download at code.google.com/p/nativeclient-sdk. You'll also need a recent build of Chromium started with the '--enable-nacl' command-line flag to test the samples and your apps. Because the SDK relies on NPAPI Pepper extensions that are currently only available in Chromium, the SDK won't work with the Native Client browser plug-ins..."
See also: the Blog article
Updated Atom Link Extensions Specification Supports Hash Attributes
James M. Snell (ed), IETF Internet Draft
An updated version of the Atom Link Extensions specification has been published, adding support for several hash attributes. Broadly, the specification adds attributes to the Atom Syndication Format (RFC 4287) 'link' and 'content' elements that may be used to express additional metadata about linked resources.
Computing Hash Digests: "When the resource referenced by 'atom:link' or 'atom:content' elements is retrievable using HTTP, hash digest values are computed by first performing an HTTP GET request on the URL specified by the '@href' or '@src' attributes, extracting the returned entity-body, then following the steps specified in Section 14.15 of RFC 2616. It should be noted, however, that there are a variety of factors that influence whether the entity-body returned by the HTTP GET will yield a hash digest value matching that specified by a hash attribute contained by the 'atom:link' or 'atom:content' elements. Accordingly, hash attribute values must be considered to be strictly advisory and cannot be used reliably as an end-to-end integrity check." Support is now provided for several hash digests via the 'md5', 'sha1', 'sha224', 'sha256', 'sha384', and 'sha512' attributes... The 'md5' attribute specifies an MD5 digest (RFC 1864) of the resource identified by the 'atom:link/@href' or 'atom:content/@src' attributes. The value is represented as a sequence of 32 hexadecimal digits. The 'md5' attribute MAY appear as a child of the 'atom:link' and 'atom:content' elements.
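The digest computation above can be sketched in Python. This is an illustrative sketch only: it hashes a supplied entity-body directly rather than performing the HTTP GET, and reports the 32-hex-digit form used by the 'md5' attribute:

```python
import hashlib

def advisory_md5(entity_body: bytes) -> str:
    """Compute the advisory 32-hex-digit MD5 value for an entity-body.

    In the draft, the entity-body would come from an HTTP GET on the
    link's @href/@src URL; the resulting value is strictly advisory,
    not a reliable end-to-end integrity check.
    """
    return hashlib.md5(entity_body).hexdigest()

value = advisory_md5(b"hello world")
# A consumer would compare this against atom:link/@md5, treating a
# mismatch as advisory information rather than proof of corruption.
```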
The 'etag' Attribute specifies an Entity Tag (RFC 2616) for the resource identified by the atom:link or atom:content element. The 'etag' attribute MAY appear as a child of the atom:link and atom:content elements... Note that HTTP defines the Entity Tag production such that quotes are significant. For example, the values "W/xyzzy" and W/"xyzzy" represent two distinct Entity Tags, the former being considered a "strong" entity tag, the latter a "weak" entity tag. The 'etag' attribute value MUST include the appropriate double quotation marks.
The 'modified' Attribute specifies the date and time when the resource identified by the atom:link or atom:content element was last modified. The value MUST conform to the "date-time" production defined by RFC 3339. An uppercase "T" character MUST be used to separate date and time, and an uppercase "Z" character MUST be present in the absence of a numeric time zone offset. The 'modified' attribute MAY appear as a child of the atom:link and atom:content elements. In RELAX NG compact syntax: 'modified = attribute modified { xsd:dateTime }'...
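A conforming 'modified' value, with the uppercase "T" separator and "Z" designator, can be produced with Python's standard library; the link element and URL below are hypothetical illustrations:

```python
from datetime import datetime, timezone
from xml.etree.ElementTree import Element, tostring

def rfc3339_utc(dt: datetime) -> str:
    """Format an aware datetime as RFC 3339 UTC: uppercase 'T' and 'Z'."""
    return dt.astimezone(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")

# Hypothetical atom:link carrying the 'modified' extension attribute
link = Element("link", {
    "href": "http://example.org/doc",
    "modified": rfc3339_utc(datetime(2010, 5, 14, 12, 0, tzinfo=timezone.utc)),
})
print(tostring(link, encoding="unicode"))
```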
See also: the editor's blog article for revision -03
Microsoft Releases Office 2010
Paul McDougall, InformationWeek
Microsoft has formally launched the newest version of its Office productivity suite with an eye to protecting its turf on the desktop while forging ahead on the Internet, where the company is battling Google for the increasingly important Web-based computing market. Microsoft Office 2010 includes updated versions of traditional desktop applications like Word, Excel, and PowerPoint, as well as Internet versions of the apps that Microsoft has labeled Office Web. Office Web apps are free for consumers and also available at no charge to Microsoft's corporate software subscribers...
Along with Office 2010, Microsoft also released the SharePoint 2010 collaboration suite. Office 2010 offers a number of enhancements over its predecessor, Office 2007. It includes new broadcast capabilities for PowerPoint, an auto-preview function in Word, and a new feature in Excel, dubbed Sparklines, that generates trend visualization graphics. The Office Web apps, meanwhile, offer collaboration tools that let users share documents over the Internet. Microsoft needs to expand its presence in Web-based, or cloud, computing, as rival Google is making strides with its online Google Apps suite. Even though administrators at Yale and the University of California, Davis, recently decided Google Apps wasn't for them, the product is being used by an increasing number of budget-conscious educational institutions and government agencies..."
From the Microsoft announcement: "Effective today, Office Mobile 2010 will be available for free via Windows® Phone Marketplace for all Windows Mobile 6.5 phones with a previous version of Office Mobile. People using Office Mobile 2010 can perform lightweight editing of Office documents and take notes on the go. With Office Mobile, people can work with Office documents stored on their phone, attached to an e-mail, and can browse, edit, and update documents stored on a Microsoft SharePoint 2010 site.... Microsoft's signature productivity technologies are available in the cloud, offering unprecedented choice and flexibility for IT departments when purchasing and deploying solutions. Office Web Apps will now be available to all Office volume licensing customers. In addition, customers will be able to purchase a subscription to Office Web Apps as part of Microsoft Online Services, Microsoft's cloud-based applications...
SharePoint 2010 delivers on the promise of flexible deployment options: Use Sandboxed Solutions to limit custom code's central processing unit (CPU) time, Microsoft SQL Server execution time, and exception handling. Plus, use these same technologies to deploy custom code to SharePoint Online. Through design integration with Visual Studio 2010, developers can use familiar application development tools to create, package and debug SharePoint solutions. It also includes rich application programming interfaces and support for Open XML, Microsoft Silverlight, Representational State Transfer (REST) and Language-Integrated Query (LINQ), which help developers build applications quickly. Developers can also build applications that connect to line-of-business data, use custom workflows, and provide business intelligence data and dashboards to an entire organization..."
See also: the Microsoft announcement
Securing Elasticity in the Cloud
Dustin Owens, ACM Queue Distributed Computing
"Elasticity, in my very humble opinion, is the true golden nugget of cloud computing and what makes the entire concept extraordinarily evolutionary, if not revolutionary. NIST's definition of elasticity is as follows: 'Capabilities can be rapidly and elastically provisioned, in some cases automatically, to quickly scale out and rapidly released to quickly scale in. To the consumer, the capabilities available for provisioning often appear to be unlimited and can be purchased in any quantity at any time.' When elasticity is combined with on-demand self-service capabilities, it could truly become a game-changing force for IT...
It has also become clear that several monumental security challenges — not to mention many monumental nonsecurity-related challenges, not least of which are full functionality availability and how well an organization's environment is prepared to operate in a distributed model — now come into play and will need to be addressed in order for the elasticity element of cloud computing to reach its full potential. Most of the dialogue I am engaged in with customers today and that I see in publicized form, however, is simplistically centered on security challenges with IT outsourcing in general. These are challenges that have existed for some time in the predecessor models mentioned earlier: who within an outsourcer is able to access a customer's data, perimeter security considerations when outsourcing, DOS/DDOS (denial of service/distributed denial of service), resource starvation, and compliance challenges with where data is stored or backed up.
Cloud providers will need to be prepared to account for and show how their particular services are able to control vulnerabilities such as the earlier example and keep similar vulnerabilities yet to be discovered from having devastating impacts on their customers. Perhaps more importantly, critical infrastructure could be subject to insurmountable risk and/or loss of sensitive information if providers lack the necessary controls. As services offered from the cloud continue to mature and expand, the threat posed is not limited to unauthorized information access but may include any cloud-provided computing systems (i.e., virtual servers, virtual desktops, etc.).
The promise of what elastic cloud-computing could do for the IT world, however, is extremely invigorating and certainly worth pursuing. It can only be hoped that organizations already taking this path or seriously considering doing so will take the time to fully appreciate the security challenges facing them and whether or not adoption at this point fits into their risk tolerance. Certainly, keeping these and other security challenges in mind while assessing how a prospective cloud provider can address these concerns (and at what cost and with what deployment constraints) should be a critical business objective..."
Security Technology: Worst-Case Thinking
Bruce Schneier, Security and Security Technology Blog
"At a security conference recently, the moderator asked the panel of distinguished cybersecurity leaders what their nightmare scenario was. The answers were the predictable array of large-scale attacks: against our communications infrastructure, against the power grid, against the financial system, in combination with a physical attack.
I didn't get to give my answer until the afternoon: 'My nightmare scenario is that people keep talking about their nightmare scenarios.' There's a certain blindness that comes from worst-case thinking. An extension of the precautionary principle, it involves imagining the worst possible outcome and then acting as if it were a certainty. It substitutes imagination for thinking, speculation for risk analysis, and fear for reason. It fosters powerlessness and vulnerability and magnifies social paralysis. And it makes us more vulnerable to the effects of terrorism.
Worst-case thinking means generally bad decision making for several reasons. First, it's only half of the cost-benefit equation. Every decision has costs and benefits, risks and rewards. By speculating about what can possibly go wrong, and then acting as if that is likely to happen, worst-case thinking focuses only on the extreme but improbable risks and does a poor job at assessing outcomes. Second, it's based on flawed logic. It begs the question by assuming that a proponent of an action must prove that the nightmare scenario is impossible. Third, it can be used to support any position or its opposite. If we build a nuclear power plant, it could melt down. If we don't build it, we will run short of power and society will collapse into anarchy...
When someone is proposing a change, the onus should be on them to justify it over the status quo. But worst-case thinking is a way of looking at the world that exaggerates the rare and unusual and gives the rare much more credence than it deserves. It isn't really a principle; it's a cheap trick to justify what you already believe. It lets lazy or biased people make what seem to be cogent arguments without understanding the whole issue. And when people don't need to refute counterarguments, there's no point in listening to them..."
Sponsors
XML Daily Newslink and Cover Pages sponsored by:
IBM Corporation | http://www.ibm.com |
ISIS Papyrus | http://www.isis-papyrus.com |
Microsoft Corporation | http://www.microsoft.com |
Oracle Corporation | http://www.oracle.com |
Primeton | http://www.primeton.com |
XML Daily Newslink: http://xml.coverpages.org/newsletter.html
Newsletter Archive: http://xml.coverpages.org/newsletterArchive.html
Newsletter subscribe: newsletter-subscribe@xml.coverpages.org
Newsletter unsubscribe: newsletter-unsubscribe@xml.coverpages.org
Newsletter help: newsletter-help@xml.coverpages.org
Cover Pages: http://xml.coverpages.org/