The OASIS Cover Pages: The Online Resource for Markup Language Technologies
Last modified: October 02, 2008
XML Daily Newslink. Thursday, 02 October 2008

A Cover Pages Publication http://xml.coverpages.org/
Provided by OASIS and Sponsor Members
Edited by Robin Cover


This issue of XML Daily Newslink is sponsored by:
Primeton http://www.primeton.com



Document Design Matters
Erik Wilde and Robert J. Glushko, CACM Preprint

This article was published in Communications of the ACM (CACM), Volume 51, Issue 10 (October 2008), pages 43-49. "The classical approach to the data aspect of system design distinguishes conceptual, logical, and physical models. Models of each type or level are governed by metamodels that specify the kinds of concepts and constraints that can be used by each model; in most cases metamodels are accompanied by languages for describing models. For example, in database design, conceptual models usually conform to the Entity-Relationship (ER) metamodel (or some extension of it), the logical model maps ER models to relational tables and introduces normalization, and the physical model handles implementation issues such as possible denormalizations in the context of a particular database schema language. In this modeling methodology, there is a single hierarchy of models that rests on the assumption that one data model spans all modeling levels and applies to all the applications in some domain.

The 'one true model' approach assumes homogeneity, but this does not work very well for the Web. The Web, as a constantly growing ecosystem of heterogeneous data and services, has challenged a number of practices and theories about the design of IT landscapes. Instead of being governed by 'one true model' used by everyone, the underlying assumption of top-down design, Web data and services evolve in an uncoordinated fashion. As a result, a fundamental challenge with Web data and services is matching and mapping local and often partial models that not only are different models of the same application domain, but also differ, implicitly or explicitly, in their associated metamodels... This article has extended API design matters into the world of resource orientation and REST as its currently most popular exponent.

While function-oriented API design matters are mostly about the design of a concrete API, document design matters in the world of resource orientation have a quite different emphasis. This emphasis is mostly on the models governing the design of an actual API; that is, the metamodels for these APIs. The design of REST APIs has more facets than only the selection of a particular metamodel for the representation of resources, but this is the most fundamental choice, and it directly affects the usability and accessibility of such an API... We hope that this critical gap for XML-based resource orientation will soon be filled by some yet-to-be-invented language, capable of representing conceptual models for XML. Such a language would make it possible to better describe exchange models for resource-oriented APIs by supporting an easy way of generating schemas (logical models) for defining the representation (the markup) of exchange models."
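The conceptual/logical split the authors describe can be made concrete with a small sketch. The hypothetical "Order" entity below (all names invented for illustration) is rendered under two logical models: normalized relational tables and a hierarchical XML document. The content is identical; mapping between the two means reconciling different metamodels, which is exactly the problem the article raises for uncoordinated Web data.

```python
import sqlite3
import xml.etree.ElementTree as ET

# Logical model 1: relational tables (flat rows linked by a foreign key).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT)")
db.execute("CREATE TABLE items (order_id INTEGER, sku TEXT, qty INTEGER)")
db.execute("INSERT INTO orders VALUES (1, 'ACME')")
db.executemany("INSERT INTO items VALUES (?, ?, ?)",
               [(1, "X-100", 2), (1, "X-200", 1)])

# Logical model 2: an XML document (nested elements and attributes).
order = ET.Element("order", id="1")
ET.SubElement(order, "customer").text = "ACME"
items = ET.SubElement(order, "items")
for sku, qty in [("X-100", 2), ("X-200", 1)]:
    ET.SubElement(items, "item", sku=sku, qty=str(qty))

# Same conceptual content, different metamodels: extracting the line items
# requires a join-style query in one model and tree navigation in the other.
rows = db.execute("SELECT sku, qty FROM items WHERE order_id = 1").fetchall()
xml_items = [(i.get("sku"), int(i.get("qty"))) for i in order.iter("item")]
print(rows == xml_items)  # True: the content matches, the models differ
```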

See also: the document from the ACM Digital Library


The Future of XForms
Philip Fennell, O'Reilly Broadcast

Some of the recent talk on the Mozilla XForms Project's mailing list has been about the winding down of effort on the Mozilla XForms plug-in. There has been praise for the efforts of those developers involved in the project, and quite rightly so. However, some people may be seeing this as a bad sign for XForms in general... the effort to bring XForms to the wider web appears to be going into ubiquity-xforms. As a pure client-side, cross-platform/cross-browser implementation this, in my opinion, is where the effort should be going for now. It obviously lowers the barrier to adoption because there is no requirement for plug-ins, although it does require JavaScript to be enabled in your browser; but that is a much smaller hurdle for most people. Ultimately XForms needs to go the way of SVG and become embedded into browsers. The ASV plug-in was a flagship for SVG which provided a reference for other implementations; it showed that SVG was worth having and undoubtedly drove the desire to embed this functionality in all but one of the modern browsers. The Mozilla XForms plug-in, and for that matter FormsPlayer too, has been at the vanguard of XForms development, performing the same role as the ASV plug-in of getting a technology out there for people to play with. Although at the time of writing the Mozilla plug-in is not a 100% complete implementation of the XForms 1.1 recommendation, it is as good a reference as there is, due to its excellent integration with the host page and browser. I believe we're entering a transition phase where XForms plug-ins give way to other, more universal, implementations and in the fullness of time the functionality becomes embedded. This change of emphasis is good for the future of XForms and good for the future of the web too...

See also: XML and Forms


New AJAX Support For Data-Driven Web Apps
Bertrand Le Roy, MSDN Magazine

AJAX is an exciting Web platform for many reasons. Using AJAX, many tasks that were traditionally performed on the server happen in the browser instead, resulting in fewer round-trips to the server, lower bandwidth consumption, and faster, more responsive Web UIs. While these outcomes are the result of offloading a good deal of work to the client, the browser still isn't the environment of choice for many developers who would rather have the full power and flexibility of server apps at their disposal. The solution employed until now has involved the UpdatePanel control, which has allowed developers to build AJAX applications while still retaining the full array of server tools. But UpdatePanel carries a lot of weight from the traditional postback model: an UpdatePanel request is still a full postback. In fact, using UpdatePanel, the whole form (including ViewState) is posted to the server, almost the entire page lifecycle gets executed there, and the rendering still happens on the server. Obviously, this method defeats one of the main reasons for moving to AJAX. The only real savings here are that XmlHttpRequest is used instead of a regular HTTP POST request and only updated parts of the page and the ViewState are sent back to the client. Thus, the response is much smaller, but the request isn't. A pure AJAX approach will almost always perform better than the UpdatePanel approach. In a purely AJAX solution, the rendering happens on the client and the server sends back only data, which is usually much smaller than the equivalent HTML. This approach can also substantially reduce the number of network requests: having the data on the client allows much of the application's UI logic to run in the browser. The main problem with the pure AJAX approach, however, is that the browser lacks the tools to turn data into HTML. 
Out of the box, it has only two crude facilities for doing so: 'innerHTML', which replaces the full contents of an element with the HTML string you provide, and the somewhat slower Document Object Model (DOM) APIs that operate on tags and attributes (similar in terms of abstraction level to HtmlTextWriter). In this article, I'll show three iterations of a page written with classic postback, then with UpdatePanel, and then using pure AJAX to illustrate how techniques employed on the server can sometimes perform better on the client. The first two examples can be built today with the publicly available ASP.NET 3.5 SP1, while the third version will use some of the new client features in ASP.NET 4.0.
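The payload argument above can be illustrated directly: for the same records, a rendered HTML fragment (the UpdatePanel-style response) carries markup overhead that a pure data response does not. The product listing below is hypothetical; the point is only the relative sizes.

```python
import json
from html import escape

# A hypothetical product listing of the kind a page fragment would render.
products = [{"name": f"Widget {i}", "price": 9.99 + i} for i in range(50)]

# UpdatePanel-style response: the server renders HTML and ships the markup.
html = "<table>" + "".join(
    f'<tr class="product-row"><td class="product-name">{escape(p["name"])}</td>'
    f'<td class="product-price">${p["price"]:.2f}</td></tr>'
    for p in products) + "</table>"

# Pure-AJAX-style response: the server ships only the data; the client
# is responsible for turning it into HTML.
data = json.dumps(products)

print(len(html), len(data))
print(len(data) < len(html))  # the data payload is smaller than its HTML
```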


Deploy an SCA Application Using the Tuscany Domain Manager
Alla Segal, Michael Beisiegel, J.-S. Delfino; IBM developerWorks

Service Component Architecture (SCA) lets you develop and assemble Service-Oriented Architecture (SOA) solutions composed of independent components, regardless of their implementation and environment. SCA is a major SOA initiative and is becoming an OASIS standard. Apache Tuscany provides an easy-to-use open-source infrastructure for the development and operation of SCA applications. The components in an SCA composite application can run on different nodes in a network. In Apache Tuscany, an SCA domain can be used to administer a set of nodes. In SCA, the definitions of composites, components, their implementations, and the nodes they run on belong to what's called an SCA domain. SCA implementations like Tuscany provide administration tools that let a system administrator manage the SCA artifacts in the domain. Using the domain gives you the flexibility to specify installation characteristics of nodes, such as host or port, at the time the nodes are added to the domain instead of in composite files. This article demonstrates how an application composed of a number of SCA components can be administered via an SCA domain. In this example, we run the Tuscany runtime in J2SE without a Web container or application server. For more complex configurations, an SCA domain can also include nodes that run IBM WebSphere Application Server, IBM WebSphere Process Server, Apache Geronimo, Apache Tomcat, or other application servers. A future related article will go into further detail about how to deploy composite applications to these different environments.
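The core SCA idea, components that declare services and references, wired together only at assembly time, can be sketched in a few lines. This is an illustration of the pattern, not the Tuscany API; all class and function names here are hypothetical.

```python
# Each component knows what it offers and what it needs, but not which
# concrete implementation will satisfy its references.

class CalculatorComponent:
    """Offers a 'calculate' service; declares an 'adder' reference."""
    def __init__(self):
        self.adder = None  # reference, injected during assembly

    def calculate(self, a, b):
        return self.adder.add(a, b)

class AdderComponent:
    """Offers an 'add' service; has no references."""
    def add(self, a, b):
        return a + b

def assemble_composite():
    # Plays the role of a .composite file: it decides which implementation
    # satisfies each reference, so the components stay independent.
    calc, adder = CalculatorComponent(), AdderComponent()
    calc.adder = adder
    return calc

calc = assemble_composite()
print(calc.calculate(2, 3))  # 5
```

Swapping AdderComponent for a remote proxy would change the wiring in assemble_composite only, which is the flexibility the domain-based deployment described above builds on.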

See also: the SCA Architecture Guide


A Comparison of JAX-RS Implementations
Mark Little, InfoQueue

There's a strange phenomenon regarding buses: you wait ages for one, then three come along at once! The same seems to be true for JAX-RS implementations. According to Stefan Tilkov's overview, "JAX-RS is an annotation-based API for implementing RESTful web services, based on HTTP, in Java. Essentially, classes and methods are annotated with information that enables a runtime to expose them as resources—an approach that is very different from the one exposed by the servlet programming model. A runtime that implements JAX-RS mediates between the HTTP protocol and the Java classes, taking into account URIs, requested and accepted content types, and HTTP methods. In addition to the Sun-provided reference implementation, Jersey, other implementations are available (in various stages of completion): as part of the popular Restlet framework, the JBoss RESTeasy project, and as part of the Apache CXF web services stack." At the moment we have these JAX-RS implementations: (1) CXF - a merger between XFire and Celtix—an Open Source ESB, sponsored by IONA and originally hosted at ObjectWeb; (2) Jersey - the JAX-RS Reference Implementation from Sun; (3) RESTEasy - JBoss's JAX-RS project; (4) Restlet - probably the first REST framework, which existed prior to JAX-RS. Irrespective of the various debates around REST, it cannot be denied that there is a need for REST support in Java, and JAX-RS is it. But if you're new to REST, which one of these implementations is the one for you? Solomon Duskis has set out to shine a light on that debate. [He considers] Maturity of the product, Server-side Integration Strategies, Java Client API, Configurability, Security, and Performance...
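JAX-RS itself is Java (@Path and @GET annotations on classes and methods), but the dispatch idea Tilkov describes, metadata on handlers letting a runtime map (HTTP method, URI) pairs onto them, can be sketched language-neutrally. The Python decorator below mirrors that idea only; it is not JAX-RS, and the route names are invented. A real JAX-RS runtime would also extract template parameters such as {id} from the path rather than take them as extra arguments.

```python
routes = {}

def resource(method, path):
    """Attach routing metadata to a handler, annotation-style."""
    def decorate(fn):
        routes[(method, path)] = fn
        return fn
    return decorate

@resource("GET", "/orders")
def list_orders():
    return ["order-1", "order-2"]

@resource("GET", "/orders/{id}")
def get_order(order_id):
    return f"order-{order_id}"

def dispatch(method, path, *args):
    # The runtime's job: mediate between the HTTP request and the
    # metadata-annotated handlers.
    return routes[(method, path)](*args)

print(dispatch("GET", "/orders"))          # ['order-1', 'order-2']
print(dispatch("GET", "/orders/{id}", 7))  # 'order-7'
```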

See also: Stefan Tilkov's JSR 311 overview


HUD Embraces SOA
Rutrell Yasin, Government Computer News

The U.S. Housing and Urban Development Department is moving toward a service-oriented architecture (SOA) that will aid the agency in its mission to promote responsible, sustainable home ownership and maximize options for safe and affordable housing for citizens and residents across the nation. Like many agencies, HUD is saddled with a multitude of legacy systems—200 of which are supported by multiple point products. The agency has a portal through which business partners can access information. HUD is building an SOA enterprise service bus that will allow users to more efficiently access services and applications from those legacy systems. HUD is using the Oracle Portal and SOA Suite to perform transactions with business partners, and Microsoft's SharePoint Server to do SOA-based collaboration internally. Using InfoPath and SharePoint, the agency has built a new application to reduce paper-based processes... This had been a manual process until now. The agency created an InfoPath form, which automatically pulls data from legacy systems that track all the public housing authorities. The form then generates letters, sends them electronically and tracks them through the collaboration tool. A HUD SOA success story is the creation of the National Housing Locator System, which helps disaster-affected victims by letting them search for temporary housing. After the devastation of Hurricane Katrina in 2005, federal officials realized the government didn't have a central database of temporary housing, especially housing that would accept vouchers for low-income residents or for the handicapped, Schlosser said. Most of the databases were regional. HUD was given the task of creating such a database. The agency identified a series of data providers who already had datasets with this type of information. HUD entered into legal agreements with them to ensure they will keep the data current and accurate.


Where Are All The RDF-based Semantic Web Apps?
Richard MacManus, ReadWriteWeb

In the latest issue of Nodalities, a magazine about the Semantic Web from UK company Talis, there is an article by Talis CTO Ian Davis about the state of Semantic Web applications. Davis says that we're still in "Generation Zero" of the Semantic Web, because there are relatively few compelling apps. Specifically he notes that "there are still only a handful of applications that incorporate RDF at their heart and none of these are using the full potential of the Semantic Web." RDF is the Semantic Web's equivalent of the Web's HTML—its chief characteristic is the ability to ascribe meaning to data. The few commercial RDF apps that Davis mentions are Twine, a beta knowledge management app and one of the few consumer Semantic Apps on the market today; Davis' own Talis (it has built a platform and apps, such as library management tools); and online reputation management tool Garlik. We also know of occasional RWW commenter Kingsley Idehen's company OpenLink Software, which is building some heavy-duty RDF applications. These can all be considered 'bottom up' Semantic Web apps. But most Semantic Apps today appear to be 'top down'... We've noted a lot of 'top down' Semantic Apps in recent times. We profiled 10 of them back in November, including Freebase, Powerset, hakia, AdaptiveBlue, TripIt and more... So, where are the commercial RDF apps? We know that Twine, Talis, Garlik and some others use RDF. But where are the other examples? [...] Let us know in the comments if you know of examples in any of the consumer, enterprise, health, or other markets. Finally, dare we pose this controversial question: has the 'top down' Semantic Web "won", such that we'll continue to see far more non-RDF apps in the commercial wild but relatively few RDF ones? Tim Berners-Lee and many other Semantic Web proponents will say a resounding 'no' to that. And OK, it will never be that black and white.
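RDF's "ability to ascribe meaning to data" comes from its core unit, the triple: (subject, predicate, object), where each term is a URI or a literal. A minimal sketch using only the standard library (production code would use an RDF library such as rdflib); the example.org URIs are hypothetical, while the predicates are real Dublin Core terms.

```python
triples = [
    ("http://example.org/book/1", "http://purl.org/dc/elements/1.1/title",
     '"Weaving the Web"'),
    ("http://example.org/book/1", "http://purl.org/dc/elements/1.1/creator",
     "http://example.org/person/timbl"),
]

def to_ntriples(triples):
    """Serialize (s, p, o) tuples in N-Triples syntax: URIs wrapped in
    angle brackets, literals (already quoted) passed through."""
    def term(t):
        return t if t.startswith('"') else f"<{t}>"
    return "\n".join(f"{term(s)} {term(p)} {term(o)} ." for s, p, o in triples)

print(to_ntriples(triples))
```

Because each statement names its predicate with a shared URI, any consumer that understands Dublin Core can interpret the data, which is the machine-readable "meaning" Davis is referring to.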

See also: Resource Description Framework (RDF)


OpenOffice.org Grows Up
Jason Brooks, eWEEK

Nine years after Sun Microsystems bought StarOffice, the resulting OpenOffice.org project is ready to roll out its 3.0 release. Enhanced format compatibility and features put it on par with Microsoft Office. When Sun Microsystems bought the little-known StarOffice productivity suite in 1999, and soon thereafter released the product's code base as open-source software, it was unclear how far the arguably quixotic initiative might reach—and what damage it could possibly wreak on Microsoft's ironclad grip on the office productivity market. Sun is now on the verge of a major 3.0 release of the project that grew up around that code base, OpenOffice.org. While OpenOffice.org hasn't achieved the same measure of mainstream adoption as its ideological cousin, the Firefox Web browser, the freely available office suite has helped advance the state of file format standardization, to the extent that Microsoft first developed its own open file format and is now prepared to include support for the ISO-standard OpenDocument format in Office 2007. I tested OpenOffice.org 3.0 in a near-final RC3 version, and was pleased with the progress that the project has made toward improving format compatibility and feature parity with Microsoft Office...

See also: the OpenOffice.org web site


Play it Safe on the Interactive Web
John Moore, Federal Computer Week Feature Article

The arrival of Web 2.0 tools on the government scene has created unprecedented forms of collaboration and communication, both inside agencies and between government and citizens. But that's not the only change the tools have brought. Blogs, wikis, social-networking sites, Really Simple Syndication (RSS) feeds, mashups and other Web 2.0 tools also add complexity and introduce risk to traditional information technology environments. Experts cite user-generated content, ample use of Extensible Markup Language (XML) and the ability to quickly combine data from a range of sources as attributes of Web 2.0 that could pose security issues. From an organizational perspective, Web 2.0 also threatens to upset the hierarchy of how information flows through organizations. However, security watchers differ on just how much risk this highly interactive iteration of Web technology creates. Some experts believe Web 2.0 harbors the same array of security vulnerabilities as the previous technology generation but also presents a few new twists. Government and industry executives believe some aspects of Web 2.0 technology call for special handling. Here are some guidelines on how to stay on the safe side. Rule 1: Isolate new ventures. Externally facing Web 2.0 sites and applications, in particular, call for an extra measure of security. Newly launched initiatives should be isolated from an organization's IT assets. Rule 2: Keep an eye on XML and other new programming techniques. Web 2.0 relies heavily on XML. For example, RSS feeds consist of XML-formatted files. Developers frequently use Asynchronous JavaScript and XML (AJAX) to build mashups and other interactive Web 2.0 applications. Such applications generate XML traffic as requests for data and responses flow between browser and server. Rule 3: Be careful whom you trust. Web 2.0 thrives on user-contributed content. But with input coming from a variety of sources, should government employees believe what they read?
Rule 4: Embed security in Web 2.0 development. Security is often an afterthought when deploying new technologies such as Web 2.0. Instead of retrofitting security into Web 2.0 applications, agencies can build security into systems from the beginning. OTDA implemented a Secure System Development Life Cycle road map to 'help ensure information security is kept in focus' throughout development...
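Rule 2's observation that "RSS feeds consist of XML-formatted files" can be seen in miniature below: a feed is just XML that the consuming application parses, so any Web 2.0 front end ingesting one is processing externally supplied markup. The feed content and example.gov URLs are invented for illustration.

```python
import xml.etree.ElementTree as ET

# A minimal, hypothetical RSS 2.0 feed as it would arrive over the wire.
rss = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Agency News</title>
    <item><title>Budget update</title><link>http://example.gov/1</link></item>
    <item><title>New portal</title><link>http://example.gov/2</link></item>
  </channel>
</rss>"""

# The consumer's side of the XML traffic: parse the markup, extract items.
root = ET.fromstring(rss)
titles = [item.findtext("title") for item in root.iter("item")]
print(titles)  # ['Budget update', 'New portal']
```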


Sponsors

XML Daily Newslink and Cover Pages sponsored by:

IBM Corporation http://www.ibm.com
Oracle Corporation http://www.oracle.com
Primeton http://www.primeton.com
Sun Microsystems, Inc. http://sun.com

XML Daily Newslink: http://xml.coverpages.org/newsletter.html
Newsletter Archive: http://xml.coverpages.org/newsletterArchive.html
Newsletter subscribe: newsletter-subscribe@xml.coverpages.org
Newsletter unsubscribe: newsletter-unsubscribe@xml.coverpages.org
Newsletter help: newsletter-help@xml.coverpages.org
Cover Pages: http://xml.coverpages.org/



Document URI: http://xml.coverpages.org/newsletter/news2008-10-02.html
Robin Cover, Editor: robin@oasis-open.org