The OASIS Cover Pages: The Online Resource for Markup Language Technologies
Last modified: January 12, 2010
XML Daily Newslink. Tuesday, 12 January 2010

A Cover Pages Publication
Provided by OASIS and Sponsor Members
Edited by Robin Cover

This issue of XML Daily Newslink is sponsored by:
Sun Microsystems, Inc.

What CMIS Will Do for Content Integration
Andrew Conry-Murray, Intelligent Enterprise

"In a recent InformationWeek Analytics/Intelligent Enterprise Enterprise Content Management survey of 276 business IT professionals, 59% of respondents said their ECM systems could play an expanded role in the enterprise—if they could be more easily integrated with third-party applications. That's a big if. Linking apps to an ECM platform is expensive, involving extensive internal development or third-party specialists. That limits the number of apps—and users—that can take advantage of the content stored in these platforms.

The Content Management Interoperability Services (CMIS) standard [is] a Web services specification that lets competing content management systems share information. Backed by leading vendors, including Alfresco, EMC, IBM, Microsoft, Open Text, Oracle and SAP, CMIS aims to pry open proprietary ECM repositories, thus making the content stored in those repositories more available to applications and end users. For instance, enterprise search software or collaboration platforms could use the standard to find and share content. Web 2.0 mashups could be written to pull information from multiple repositories using CMIS as a common interface, instead of creating custom connections...

For companies that have multiple ECM platforms deployed, CMIS can serve as a bridge between apps and those disparate repositories. And according to our survey, 58% of organizations have two to three ECM platforms in house already, while 18% have four or more... CMIS can spur development of new apps across a variety of verticals. For instance, as the healthcare industry digitizes patient records and stores more diagnostic information, such as MRI scans and X-rays, as digital files, the need to manage that content and make it accessible to the right person at the right time becomes critical. This confluence of applications and content repositories goes a long way toward explaining the makeup of the major backers of CMIS...
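CMIS exposes repositories through a SQL-like query language alongside its Web-services and AtomPub bindings. As a rough illustration of the "common interface instead of custom connections" idea (not drawn from the article), the sketch below assembles a CMIS-style query statement and a query URL; the endpoint path, parameter names, and repository address are illustrative assumptions, not any specific product's API.

```python
from urllib.parse import urlencode

def build_cmis_query(properties, doc_type, where=None):
    """Assemble a statement in the SQL-like CMIS Query Language."""
    stmt = f"SELECT {', '.join(properties)} FROM {doc_type}"
    if where:
        stmt += f" WHERE {where}"
    return stmt

def query_url(repository_base, statement, max_items=25):
    """Build a hypothetical query URI against a repository endpoint."""
    params = urlencode({"q": statement, "maxItems": max_items})
    return f"{repository_base}/query?{params}"

# The same statement could be sent to any CMIS-compliant repository.
stmt = build_cmis_query(
    ["cmis:name", "cmis:contentStreamMimeType"],
    "cmis:document",
    where="cmis:name LIKE '%.pdf'",
)
print(stmt)
print(query_url("https://ecm.example.com/cmis", stmt))
```

In a real deployment the statement would be posted to the repository's query service; only the query language itself is standardized here, which is the point of the sketch.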

[Note: the results of the survey are available in a free online report 'Silo to Gold Mine: What CMIS Can, and Can't, Do for ECM Integration'.]

See also: CMIS references

Taxonomy of Cloud Computing Benefits
William Vambenepe, Blog - IT Management in a Changing IT World

"One of the heavily discussed Cloud topics in early 2009 was a Cloud Computing taxonomy. Now that this theme has died down (with limited results), and to start 2010 in a similar form, here is a proposal for a taxonomy of the benefits of Cloud Computing... Just like the original Cloud Computing taxonomy only had three layers (IaaS/PaaS/SaaS), so does this taxonomy of Cloud benefits. The point of this blog post is to promote the third layer. I describe layers 1 and 2 mainly to better call out what's specific about layer 3...

Layer 1 (infrastructure: 'let someone else do it'). This is the bare-bottom, inherent benefit of Cloud Computing: you don't have to deal with the hardware. In practice, it means: no need to worry about power/cooling, on-demand provisioning/deprovisioning, economies of scale... Layer 2 (management: 'let a program do it'). More specifically, more automated IT management. This does not require Cloud Computing (you can have a highly automated IT management environment on premise), but the move to Cloud Computing is the trigger that is making it really happen. While this capability is not an inherent benefit of Cloud Computing, the Cloud makes it needed, easier, and more beneficial...

Layer 3 (applications: 'do it right'). In short, use the move to the Cloud as an opportunity to fix some of the key issues of today's applications. Think of the Cloud switch as a second Y2K, 10 years later: like in 2000, not only are there things that the transition requires you to fix, there are also many things that aren't exactly required to fix but still make sense to fix as part of the larger modernization effort. Of course the Cloud move is missing that ever-so-valuable project management motivator of a firm deadline, but hopefully competitive pressure can play a similar role.

What are these issues? Here is a partial list: (1) Security: at least authentication and authorization. We have SSO/Federation systems, both enterprise-type and Web-centric, and they often suck in practice, whether it's because of the protocols, the implementations, the tools, or the mindset. Plus, there are too many of them. As applications gained mouths and ears and started to communicate with one another, the problem became obvious. If, in the Cloud, you also want them to grow legs and be able to move around (wholly or in parts), then it really, really has to get fixed. (2) Get remote application interfaces right. It's been discussed, manifesto'ed, buried and lampooned many times before. Call it SOAP, zenSOAP, REST, practical REST or whatever you want. Just make sure that all important functions and data are accessible via clear, documented, consistent, easy-to-use, on-the-wire interfaces. Once we have these interfaces, and only then, can we worry about reliably composing/orchestrating applications that cross organizational boundaries. (3) Clean up the incestuous relationship between an application and its data. (4) Deliver application-centric IT management. (5) Fault-tolerance and disaster recovery. It is too often lacking (or untested, which amounts to the same thing) for applications that are just below the perceived threshold of requiring it to be done right. That threshold needs to be lowered, and the move to the Cloud can be used to make this possible..."
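Point (2) in the list above is about making every important function reachable through a small, documented, consistent on-the-wire surface. A minimal sketch of that idea, purely illustrative (the routes and handlers are hypothetical, and a real service would of course sit behind HTTP):

```python
import json

# Each important function gets exactly one documented remote address.
def list_orders(_params):
    return {"orders": [1, 2, 3]}

def get_status(params):
    return {"id": params["id"], "status": "shipped"}

# (method, path) -> handler: the whole remote surface, visible in one place.
ROUTES = {
    ("GET", "/orders"): list_orders,
    ("GET", "/orders/status"): get_status,
}

def handle(method, path, params=None):
    """Dispatch a request to its handler; unknown routes fail explicitly."""
    handler = ROUTES.get((method, path))
    if handler is None:
        return 404, json.dumps({"error": "not found"})
    return 200, json.dumps(handler(params or {}))

code, body = handle("GET", "/orders/status", {"id": 7})
```

The design choice the blog argues for is exactly this visibility: once the surface is explicit and consistent, composing applications across organizational boundaries becomes a matter of wiring, not archaeology.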

See also: Brenda Michelson on Enduring Aspects of Cloud Computing

Transition from XSLT 1.0 to XSLT 2.0
Kurt Cagle

"Not surprisingly, adoption of XSLT has remained very limited among traditional developers. The year 2007 saw the release of what was, for all intents and purposes, the second generation of the XML core technologies. The first generation appeared over the course of about three years, from 1998 for the XML standard itself and 1999 for XPath and XSLT 1.0 to 2001 for the release of XML Schema. XSLT 1.0 in particular was a game changer for the technology, as it took a radically different approach to programming—creating a language written using nothing but markup that attempted to match XPath patterns and then, passing the XML nodes in question to a template to create new content.

This approach was extraordinarily powerful—because of the recursive nature of the templates, an XSLT stylesheet could transform anything from nearly flat database records to very deep documents with equal ease, could use wildcard matches to create generalized templates, and could additionally invoke functions from within XPath expressions for more specialized processing. This meant that XSLT stylesheets began to take on the status of a secret weapon for XML developers, programs that could, in a few hundred lines of code, outdo imperative programs two or three times their size...
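The match-and-recurse mechanism described above can be mimicked outside XSLT. The Python sketch below is only an analogy, assuming a toy document and plain tag-name matching in place of real XPath patterns: each node is dispatched to a template, and templates recurse into their children much as xsl:apply-templates does.

```python
import xml.etree.ElementTree as ET

def apply(node, templates):
    """Dispatch a node to its template; '*' acts like XSLT's match='*'."""
    template = templates.get(node.tag, templates.get("*"))
    # The second argument lets a template recurse into the node's children,
    # mimicking xsl:apply-templates from inside a template body.
    return template(node, lambda n=node: "".join(
        apply(child, templates) for child in n))

templates = {
    "title": lambda n, rec: f"<h1>{n.text}</h1>",
    "para":  lambda n, rec: f"<p>{n.text}</p>",
    "*":     lambda n, rec: rec(),   # wildcard: just process children
}

doc = ET.fromstring("<doc><title>Hi</title><para>Body</para></doc>")
print(apply(doc, templates))
```

Because the wildcard template simply recurses, the same template set handles flat and deeply nested documents alike — which is the property the article credits for XSLT's power.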

Even among XML developers who recognized the value of XSLT, the language had a reputation for being cumbersome, and a number of efforts emerged to improve the standard via a somewhat vague and amorphous extension mechanism. [These issues] led to a decision by the W3C to go back to the drawing board, especially as there had been an increasing call by XML database vendors to provide a more comprehensive standardized query language that would do for XML what SQL did for relational databases: create a unified query language that could be used to query large datasets, return results, and manipulate them in different ways to generate new output. In 2002, the XPath working group was given a mandate to produce a 2.0 version of the language, one that could be used by both a revised XSLT language and by the proposed XQuery language ... and thus began a major rethinking about data models..."

See also: the W3C XSLT Version 2.0 Recommendation

IETF Creates Smart Power Directorate to Review Internet Grid Protocols
Russ Housley, IETF Announcement

In a posting to the IETF-Announce mailing list, IETF General Area Director Russ Housley reported on the creation of a new Smart Power Directorate. The Smart Power Directorate will provide review and coordination on the use of Internet protocols to provide smart grid communications.

According to the published charter: "There are many initiatives to couple delivery of electricity with the data communications used to control and monitor the electricity generation, delivery infrastructure, and consuming devices. The U.S. Smart Grid is one such initiative. A smart grid includes an intelligent monitoring system that keeps track of all electricity flowing in the system, and it employs two-way communications to reduce cost, improve reliability, and increase transparency. Recognizing that a smart grid has many similarities to the 'Internet of Things', developments associated with the use of Internet protocols to connect many devices without human users can be applied without enhancement to a smart grid...

The goal is to point other organizations to relevant IETF documents and provide review of documents from other organizations that depend upon Internet protocols. If the directorate identifies a gap that requires new work in the IETF, the directorate will raise the issue with the IESG. The directorate will not develop new protocols or enhance existing ones. This review and coordination will require expertise from all of the Areas in the IETF..."

Housley reports: "Fred Baker, as the liaison to the US Smart Grid Interoperability Panel (SGIP), will play a significant role for the Directorate. In addition, I am still in the process of gathering people to represent each of the IETF Areas. I would like to see a non-IESG member loosely representing each IETF Area. Representatives have been identified for the General, Internet, and Routing Areas..."

See also: the IETF announcement

Cloud Security: Crypto Services and Data Security in Windows Azure
Jonathan Wiggs, MSDN Magazine

"In today's drive toward service-oriented architecture and solutions, few can consider doing business without cloud applications... The whole point of security and especially cryptography is to make your information and processes very hard to gain access to. We can define 'hard' as meaning that it is beyond the capability of any adversary to break into such a system for the duration of the life of that data or process. This is, however, a relative definition based on the requirements of the application or data being used. That is why [we] continued to emphasize the need of constant evaluation of security and cryptographic requirements...

The isolation of data and services in a multi-tenant environment such as Windows Azure is one of the major concerns of anyone who has an eye toward using private data. As with any new platform, security and cryptography features will continue to evolve in the Windows Azure platform. Microsoft has taken great pains to not only provide a secure, isolated environment, but also to expose what it has done to allow for public certification of these measures.

This article introduces some of the basic concepts of cryptography and related security within the Windows Azure platform. It makes heavy use of Cryptographic Service Providers (CSPs), which are implementations of cryptographic standards, algorithms and functions presented in a system program interface. For the purposes of this article we use the symmetric encryption algorithm provided by the Rijndael cryptography class... The Windows Azure SDK extends the core .NET libraries to allow the developer to integrate and make use of the services provided by Windows Azure. Access to the CSPs has not been restricted within Windows Azure projects and services. This means much of your development around encrypting and decrypting data will remain the same with regard to the assemblies you're accustomed to using. However, there are changes in the underlying architecture, issues of when or where to encrypt data and where and how to persist keys...." [Note: the January 2010 issue of MSDN Magazine includes several articles on Cloud Computing.]
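The article's examples use the .NET Rijndael/AES classes, which are not reproduced here. As a neutral illustration of one key-management concern it raises — where keys come from and how to keep them separate — the sketch below derives distinct purpose-specific keys from a single master secret using an HKDF-style extract-and-expand construction (RFC 5869). The tenant and purpose labels are hypothetical, and a real deployment would pull the master key from a key store.

```python
import hmac
import hashlib

def hkdf_sha256(master_key: bytes, salt: bytes, info: bytes,
                length: int = 32) -> bytes:
    """HKDF (RFC 5869) extract-and-expand, standard library only."""
    # Extract: concentrate the master key's entropy into a pseudorandom key.
    prk = hmac.new(salt, master_key, hashlib.sha256).digest()
    # Expand: stretch the PRK into 'length' bytes bound to 'info'.
    okm, block, counter = b"", b"", 1
    while len(okm) < length:
        block = hmac.new(prk, block + info + bytes([counter]),
                         hashlib.sha256).digest()
        okm += block
        counter += 1
    return okm[:length]

# Distinct keys for distinct purposes, all derived from one master secret.
master = b"\x00" * 32  # placeholder only; never hard-code a real key
data_key = hkdf_sha256(master, b"tenant-42", b"data-encryption")
mac_key  = hkdf_sha256(master, b"tenant-42", b"integrity")
assert data_key != mac_key
```

Binding each derived key to a purpose string means a compromise of one derived key does not expose the others, which is one practical answer to the article's "where and how to persist keys" question.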

See also: the MSDN Magazine article on Cloud Storage

Rich Internet Applications Using ZK: An Open Source Ajax Framework
Sachin Mahajan, IBM developerWorks

"This article introduces ZK and provides a real-world example of its use running on Apache Tomcat and connecting to a MySQL database. ZK, an open source Asynchronous JavaScript + XML (Ajax) framework written in Java code, lets you write a Web 2.0-enabled, rich Internet application without writing a single line of JavaScript code. Typical Ajax frameworks like Dojo have JavaScript libraries that expose certain API's for making "Ajaxified" calls. ZK, on the other hand, uses a meta-definition based on XML to define the user interface. Translation to HTML code then occurs when this page is requested by the client...

You can think of ZK as being analogous to Ajax without JavaScript. It is composed of an Ajax-based, event-driven engine, a rich set of XHTML and XUL components, and a markup language called ZUML, which is used for creating feature-rich user interfaces. The business logic can be written in Java code directly integrated into your application and triggered by events or components. The most powerful feature of ZK is its rich set of control libraries for user interface development... Because of all this, ZK is becoming a popular choice for developing low-cost, rich Internet applications...

ZK is a direct Ajax implementation—or in other words, a server-centric model. This is unlike other frameworks that expose you to the painful details of making Ajax calls. Additionally, Ajax calls require extensive use and knowledge of JavaScript for manipulating the Document Object Model (DOM) on the browser (client) and synchronizing data during client/server communication. ZK shields you from these complexities and lets you focus on the business logic..."
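The server-side translation model described above — an XML UI definition in, HTML out — can be sketched in miniature. This is not ZK's actual rendering pipeline; the tag names and the TAG_MAP table are simplified stand-ins for real ZUML components, which carry far richer behavior and attributes.

```python
import xml.etree.ElementTree as ET

# Hypothetical mapping from UI-markup tags to HTML elements.
TAG_MAP = {"window": "div", "button": "button", "label": "span"}

def render(node):
    """Recursively translate a UI-definition element tree into HTML."""
    tag = TAG_MAP.get(node.tag, "div")
    inner = (node.text or "") + "".join(render(c) for c in node)
    title = node.get("title")
    heading = f"<h2>{title}</h2>" if title else ""
    return f"<{tag}>{heading}{inner}</{tag}>"

zuml = '<window title="Hello"><button>Say Hi</button></window>'
print(render(ET.fromstring(zuml)))
```

The point of the model is that the developer authors only the XML definition; the framework owns the translation to HTML and the Ajax plumbing behind it.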

See also: ZK resources

VMware Acquires Zimbra to Expand vCloud Collaboration Software Portfolio
Staff, VMware Announcement

"VMware, the global leader in virtualization solutions from the desktop through the datacenter and to the cloud, today announced that it has entered into a definitive agreement to acquire Zimbra, a leading vendor of email and collaboration software, from Yahoo! Inc. Under the terms of the agreement, VMware will purchase all Zimbra technology and intellectual property. Yahoo! will have the right to continue to utilize the Zimbra technology in its communications services, including Yahoo! Mail and Yahoo! Calendar. This acquisition will further VMware's mission of taking complexity out of the datacenter, desktop, application development and core IT services, and delivering a fundamentally more efficient and new approach to IT. Zimbra is a leading open source email and collaboration solution with over 55 million mailboxes. As an independent Yahoo! product division, Zimbra achieved 2009 mailbox growth of 86% overall and 165% among small and medium business customers.

Based on a modern, flexible architecture designed for virtualization and cloud-scale infrastructure, the Zimbra technology provides substantially lower total cost of ownership than traditional solutions. Zimbra products offer a full enterprise feature set, excellent interoperability with legacy email environments and have been deployed across small and large environments; as on-premise software at thousands of small and medium businesses, distributed enterprises, and as a hosted service at major service providers such as Comcast and NTT Communications..."

Brian Byun, Vice President and General Manager, Cloud Services at VMware: "Over the coming years, we expect more organizations, especially small and medium size businesses, to increasingly buy core IT solutions that deliver cloud-like simplicity in end-user and operational experience. Zimbra is a great example of the type of scalable 'cloud era' solutions that can span smaller, on-premise implementations to the cloud. It will be a building block in an expanding portfolio of solutions that can be offered as a virtual appliance or by a cloud service provider."

See also: Darryl Taft's eWEEK article

Dino Dai Zovi: Chrome Sets Browser Security Standard
Gregg Keizer, ComputerWorld

"All browser makers should take a page from Google's Chrome and isolate untrusted data from the rest of the operating system, a noted security researcher said: Dino Dai Zovi, a security researcher and co-author of The Mac Hacker's Handbook, believes that the future of security relies on 'sandboxing,' the practice of separating application processes from other applications, the operating system and user data... He sees browser sandboxing as an answer to the flood of exploits that have overwhelmed users in the past year.

In a blog article, Dai Zovi described sandboxing, as well as the lesser security technique of 'privilege reduction,' as 'moving the bull (untrusted data) from the china shop (your data) to the outside where it belongs (a sandbox).' The idea behind sandboxing is to make it harder for attackers to get their malicious software onto machines. Even if an attacker was able to exploit a browser vulnerability and execute malware, he would still have to exploit another vulnerability in the sandbox technology to break into the operating system and, thus, get to the user's data...

Chrome has included sandboxing since its September 2008 debut. And while Dai Zovi considers it easily the leader in security because of that, other browsers have made, or will make, their own stabs at reducing users' risks. For example, Microsoft's Internet Explorer 7 (IE7) and IE8 on Vista and Windows 7 include a feature dubbed 'Protected Mode,' which reduces the privileges of the application so that it's difficult for attackers to write, alter or destroy data on the machine, or to install malware. But it's not a true sandbox as far as Dai Zovi is concerned. Currently, Mozilla's Firefox, Apple's Safari and Opera Software's Opera lack any sandboxing or privilege reduction features..."

See also: Ian Hickson's MIME-type proposal for 'text/sandboxed-html'


XML Daily Newslink and Cover Pages sponsored by:

IBM Corporation
Microsoft Corporation
Oracle Corporation
Sun Microsystems, Inc.
