XML Daily Newslink. Wednesday, 23 January 2008

A Cover Pages Publication http://xml.coverpages.org/
Provided by OASIS and Sponsor Members
Edited by Robin Cover


This issue of XML Daily Newslink is sponsored by:
BEA Systems, Inc. http://www.bea.com



Rails, REST, and Anarchist XML
Simon St. Laurent, O'Reilly Opinion

I've been digging deeper and deeper into Rails, concluding that after many years of frameworks offering me more headaches than benefits, Rails finally provides enough good for me to think it worth using a potentially constraining framework... Today, by default, Rails scaffolding does something really simple for XML: it lets you avoid thinking very hard about the XML coming in and going out. Now, I know, you could, if you wanted, create Builder templates that let you ensure that all of the XML going out was perfectly structured according to this or that specification, but you don't have to... Rails' RESTful scaffolding also accepts XML documents coming in through PUT or POST. Again, it's not advertising a formal schema (by default, anyway). Instead it seems to be doing something like Examplotron: look at a sample document, see what's there, and imitate it. Rails can get away with this for a couple of reasons. First, the type system within Rails is extremely simple, and not that hard to specify within documents. Second, Rails has a pretty thorough understanding within the framework of how data flows: all that ActiveRecord goodness ensures that Rails knows what's supposed to be in the data, and makes it more comfortable sending and receiving without an external set of checks. Perhaps my favorite part of this approach is that a lot of developers are fixated on the HTML side of things, but the scaffolding generates the XML side too. It comes for free. Left as it arrives, it's an opportunity to open a much wider set of services to XML manipulation, at zero cost to developers. (Well, except for some potential surprise if and when people start using their services that way.) It's taken a decade, and the Rails models are pretty much data-centric rather than the documents I love working with, but we may finally be reaching the point where XML is starting to behave the way Walter Perry said it should, unconstrained by data structures negotiated in advance...
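
As a rough sketch of the scaffolding behavior St. Laurent describes, the following Python snippet POSTs and GETs XML against a hypothetical Rails scaffold resource; the host, path, and element names are illustrative assumptions, not details taken from the article.

    import urllib.request

    # Hypothetical Rails scaffold resource; host, path, and element names are
    # assumptions for illustration only.
    BASE = "http://localhost:3000/articles"

    # POST a sample XML document; by default the scaffold infers attributes
    # from the elements it finds, much as Examplotron infers structure from a sample.
    payload = b"<article><title>Hello</title><body>First post</body></article>"
    req = urllib.request.Request(
        BASE + ".xml",
        data=payload,
        headers={"Content-Type": "application/xml"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        print(resp.status, resp.read().decode())

    # GET the collection back as XML, with no schema negotiated in advance.
    with urllib.request.urlopen(BASE + ".xml") as resp:
        print(resp.read().decode())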


Is Tomcat an Application Server?
Jeff Hanson, JavaWorld Magazine

"In this article I tackle the question of whether Tomcat is an application server. I start by explaining the distinctions between app servers, Web servers, and Java EE containers, and then look at some scenarios where a Web server like Tomcat could be used appropriately as an app server. I show a scaled architecture, starting with the sort of lightweight implementation where Tomcat shines, and concluding with a complex service-oriented architecture, where you would be better off with a full-fledged Java EE application server... Apache Tomcat can be used as an application server, especially for less complex Java EE Web applications. According to some figures, Tomcat is the Web/application server environment most used by Java developers. Tomcat's popularity is due to its ease of use and support for many features considered to be standard in a Java Web application environment, including WAR file deployment, JNDI resources, JDBC data sources, JSP support, session replication, virtual hosting support, clustering support, and JMX-based management and monitoring. Tomcat is also a favorite for Java enterprise development due to the fact that its runtime performance as a standalone server is very competitive. With Tomcat version 6, some new features have been added including asynchronous HTTP request handling via Comet, thread pool sharing, non-blocking connectors, enhanced JMX management and monitoring, Servlet 2.5, and JSP 2.1. Even with these new features, however, Tomcat does not support the entire Java EE stack. Where Tomcat and other Web servers fall short is in the area of features such as distributed transactions, EJBs, and JMS. Applications requiring support for these components are usually more at home in with a Java EE application server such as JBoss, Geronimo, WebLogic, WebSphere, or Glassfish. Many Java EE application servers actually use Tomcat as their Web container.


Perspective: Acid2, Acid3, and the Power of Default
Håkon Wium Lie, CNET News.com

"Acid2 is a complex Web browser test page that shows a smiley face when rendered correctly. The test, published by the Web Standards Project, has been a tremendous success in weeding out browser bugs that stop Web designers from reaching pixel perfection in their pages. Safari and Opera ship Acid2-compliant versions, and the upcoming Firefox 3 will also pass the test. Recently, Microsoft announced that Internet Explorer version 8 can render Acid2, and it showed a screenshot to back the claim. The news was received with joy and excitement in the Web-authoring community. Finally, it seems, Microsoft has decided to take Web standards seriously. Designers will no longer have to spend countless hours trying to get their pages to look right in Internet Explorer while adhering to standards. Unfortunately, I think that the celebration is premature. I predict that IE 8 will not pass Acid2, after all... Acid3 will follow in the footsteps of Acid1 and Acid2; it's a tough one-page test that displays a quirky graphic when rendered correctly. No browser will pass the test at the time of its release. All vendors are equally challenged. Whereas Acid2 was a static Web page, Acid3 will be a dynamic Web application. When browsers are improved to pass Acid3, it will become easier to write Web applications that work interoperably across browsers... The IE 8 team has shown that it can render Acid2 correctly. Now it's time for Microsoft to put its code to good use."

See also: the Microsoft blog


XmlTransform: A General-Purpose XSLT Pre-Processor
Michael Sorens, DevX.com

The general-purpose XML transformer and/or validator discussed in this article, named "XmlTransform," operates on an arbitrarily deep directory tree containing files you want to transform. As output it optionally generates multi-level indices and can even add navigational linkages. XmlTransform's validation capability is reasonably straightforward; it lets you ensure that the XML files used for a transformation are valid according to a specified XML Schema. You may elect to validate input files, output files (after transformation), or both. The program's transformation capability is more interesting. One common application of a transformation engine is as a pre-processor, a very handy thing indeed when designing web pages. Overall, because of the number of options and their effects on the output, XmlTransform does have a fairly steep learning curve, but if you have a problem to tackle that it can handle, it can be quite a time saver. In the real world, XmlTransform originally served to generate static pages on my open source web site. Rather than write in HTML, I can write pages in a shorthand custom XML dialect and let XmlTransform automatically take care of the fancy headers, footers, page linkages, copyright date, and so forth. But XmlTransform is useful in other situations as well. For example, it can act as a SQL documentation generator akin to NDoc (for C#) or JavaDoc (for Java).

See also: Adding Custom XML Documentation to SQL Code
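
To give a feel for the validate-then-transform workflow that XmlTransform automates, here is a minimal sketch using Python's lxml library; the schema, stylesheet, and directory names are assumptions for illustration, and the code is not XmlTransform itself.

    from pathlib import Path
    from lxml import etree

    # Hypothetical inputs: a schema for the custom dialect and an XSLT stylesheet
    # that expands it into full HTML (headers, footers, navigation, and so on).
    schema = etree.XMLSchema(etree.parse("page-dialect.xsd"))
    transform = etree.XSLT(etree.parse("page-to-html.xsl"))

    # Walk an arbitrarily deep input tree, validate each file, transform it,
    # and mirror the directory structure under an output root.
    src_root, out_root = Path("src"), Path("out")
    for src in src_root.rglob("*.xml"):
        doc = etree.parse(str(src))
        schema.assertValid(doc)            # validate the input file
        result = transform(doc)            # apply the stylesheet
        dest = out_root / src.relative_to(src_root).with_suffix(".html")
        dest.parent.mkdir(parents=True, exist_ok=True)
        dest.write_bytes(etree.tostring(result, pretty_print=True))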


Watching WADL
Mark Nottingham, Blog

I'm following the discussion of RESTful Web description in general, and WADL in particular, with both difficulty and interest. The first hurdle that a RESTful description format faces is probably the biggest; how it's used by clients. My experience is that WADL provides most of its value on the server-side (e.g., for in-development service modelling, documentation, and review, as well as limited code generation), but much discussion keeps on circling around to the client side, perhaps because of the well-worn footpath off the cliff that WSDL provided. If clients use a WADL file to generate static code that calls the described service without checking to see if the WADL has changed, they're going to be tightly coupled to the WADL definition, and therefore no better than WSDL or any other interface description. Blech. On the other hand, if you use the WADL at runtime to dynamically create the URLs and representations you send and parse the ones you receive, it's all good, and in this way WADL is acting like a Web format should—using hypertext (in this case, a generic XML format) as the engine of application state. In this way, it's no different than the APP service document format or HTML forms, except that WADL is less application-specific in the first case, and more flexible in the latter... I think there will be two ways to get a (somewhat) RESTful Web service into the world, for the foreseeable future. One will be to work with a group of people to identify a broad problem space, standardise one (or a few) media types, defining their semantics, and an application-specific format that glues them together into an interface. Atom Publishing Protocol is a great example of this, and it certainly has legs. The other will be to skip the huddle, define your own formats and semantics, and throw it over the wall, knowing that you'll get your problem solved in the short term, but without the considerable leverage of a widely adopted and understood format and interface...

See also: the WADL (Web Application Description Language) project site
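
As a small sketch of the runtime use Nottingham favors, the snippet below reads a WADL description and derives request URLs from it on the fly instead of generating static stubs; the file name is hypothetical, and the namespace URI should be adjusted to match the WADL draft the service actually publishes.

    import xml.etree.ElementTree as ET

    # Namespace of the WADL draft being consumed; adjust to match the service's file.
    WADL_NS = {"wadl": "http://wadl.dev.java.net/2009/02"}

    def list_resources(wadl_path):
        """Return (base URI, relative path) pairs discovered in a WADL document."""
        tree = ET.parse(wadl_path)
        pairs = []
        for resources in tree.findall(".//wadl:resources", WADL_NS):
            base = resources.get("base", "")
            for resource in resources.findall(".//wadl:resource", WADL_NS):
                pairs.append((base, resource.get("path", "")))
        return pairs

    # Build URLs dynamically from the description rather than baking them into code,
    # so a changed WADL changes the client's behavior at runtime.
    for base, path in list_resources("service.wadl"):   # hypothetical file name
        print(base.rstrip("/") + "/" + path.lstrip("/"))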


Web 2.0 Security
Shivaram H. Mysore, TrustStix White Paper

Web 2.0 is an umbrella term coined to cover technologies used to provide user-centric, Web-based services. Here, the services are architected and programmed so that they can be personalized and used dynamically. The architectural philosophy is called Service Oriented Architecture (SOA). This document covers security aspects of Web 2.0-based services. It provides a comprehensive list of threats that need to be considered for mitigation when deploying Web 2.0 services, and it offers ideas for mitigating those threats. The document is intended for CIOs and enterprise IT professionals (e.g., administrators, directors) who are planning, implementing, or deploying Web 2.0 services, and for network and system architects. The paper discusses several security threats, including Feed Injection; Authentication; Validation; Client Side Attacks: Cross-Site Scripting & Forgery; Client Side Attacks: Command Execution and Zones; and Client Side Attacks: Generic.
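
As one concrete flavor of the mitigations the paper catalogs, the short sketch below escapes untrusted feed content before embedding it in a page, blunting the feed injection and cross-site scripting threats listed above; it is a generic illustration, not an excerpt from the white paper.

    import html

    def render_feed_item(title, summary):
        """Escape untrusted feed fields before inserting them into markup."""
        return (
            "<div class='feed-item'>"
            f"<h3>{html.escape(title)}</h3>"
            f"<p>{html.escape(summary)}</p>"
            "</div>"
        )

    # A hostile feed entry: the script tag is neutralized rather than executed.
    print(render_feed_item("Breaking news", "<script>steal(document.cookie)</script>"))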


IBM Makes SOA Play with AptSoft Buy
Renee Boucher Ferguson, eWEEK

IBM is looking to beef up its service-oriented architecture portfolio by buying AptSoft, a private company that develops infrastructure software to help companies determine cause-and-effect relationships between business events. AptSoft, based in Burlington, Mass., develops software in the complex event processing (CEP) category, which works within a SOA (service-oriented architecture) framework to help trigger BPM (business process management) events. AptSoft's software spans a number of IBM's product portfolio groups, including WebSphere, Information Management, and Tivoli. IBM also plans to fold AptSoft into newer product categories that include RFID and Web 2.0 capabilities, areas where IBM has ramped up investment over the past couple of years. Complex event processing software helps companies identify patterns and establish connections between events. Once the software determines a trend in events, whether they occur over a millisecond or over hours or days, it initiates a business process trigger. AptSoft's namesake Director for CEP is a platform that runs on a company's network, where it monitors and correlates activities across applications, Web services, databases, and devices, according to AptSoft. Based on user-defined rules, the software detects events or patterns (a new customer is added, a product is sold but a shipment isn't scheduled, a prospect researches the same topic on a company's Web site for a week) and then orchestrates the disparate elements of the infrastructure to execute a process. Where AptSoft fits into IBM's SOA strategy is in its ability to enable a SOA framework that supports what AptSoft refers to as a new class of composite applications: applications made up of Web services, events from services, and events from an event cloud of people, devices, applications, databases, and networks.
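
To make the detect-a-pattern-then-trigger-a-process idea concrete, here is a toy rule in Python in the spirit of the article's example (a product is sold but no shipment is scheduled); the event fields and rule are invented for illustration and are unrelated to AptSoft Director's actual interfaces.

    # A stream of (event_type, order_id) pairs, as a CEP engine might consume it.
    events = [
        ("product_sold",       "order-1"),
        ("shipment_scheduled", "order-1"),
        ("product_sold",       "order-2"),   # sold, but never scheduled for shipment
    ]

    def unshipped_orders(events):
        """Correlate sale events with shipment events and flag unmatched sales."""
        shipped = {oid for kind, oid in events if kind == "shipment_scheduled"}
        return [oid for kind, oid in events
                if kind == "product_sold" and oid not in shipped]

    # A real engine would fire a business process trigger here; we just print.
    for oid in unshipped_orders(events):
        print("trigger follow-up process for", oid)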


Sponsors

XML Daily Newslink and Cover Pages are sponsored by:

BEA Systems, Inc.   http://www.bea.com
EDS   http://www.eds.com
IBM Corporation   http://www.ibm.com
Primeton   http://www.primeton.com
SAP AG   http://www.sap.com
Sun Microsystems, Inc.   http://sun.com

XML Daily Newslink: http://xml.coverpages.org/newsletter.html
Newsletter Archive: http://xml.coverpages.org/newsletterArchive.html
Newsletter subscribe: newsletter-subscribe@xml.coverpages.org
Newsletter unsubscribe: newsletter-unsubscribe@xml.coverpages.org
Newsletter help: newsletter-help@xml.coverpages.org
Cover Pages: http://xml.coverpages.org/



Document URI: http://xml.coverpages.org/newsletter/news2008-01-23.html
Robin Cover, Editor: robin@oasis-open.org