XML General Articles and Papers: Surveys, Overviews, Presentations, Introductions, Announcements
Other collections with references to general and technical publications on XML:
- XML Article Archive: [June 2003] [May 2003] [April 2003] [March 2003] [February 2003] [January 2003] [December 2002] [November 2002] [October 2002] [September 2002] [August 2002] [July 2002] [April - June 2002] [January - March 2002] [October - December 2001] [Earlier Collections]
- Articles Introducing XML
- Comprehensive SGML/XML Bibliographic Reference List
July 2003
[July 30, 2003] "MPEG Standard Addresses Rights." By Paul Festa. In CNET News.com (July 30, 2003). "The Moving Pictures Experts Group has completed an effort on two digital rights management technologies intended to increase the MPEG standard's appeal to the recording industry and Hollywood. MPEG announced the completion of parts 5 and 6 of MPEG-21, a member of the MPEG family of multimedia standards that defines how audio and video files can play in a wide range of digital environments. The digital rights management (DRM) capabilities are crucial to MPEG-21, as they are to other emerging multimedia standards, so that publishers in the recording and movie industries will adopt the standard without fear of losing control of copyrighted works. Part 5 of the standard, the Rights Expression Language (REL), lets multimedia publishers designate rights and permissions for how consumers can use their content. The REL expression 'play,' for instance, would let the consumer use the material in a 'read only' mode, while other expressions could allow more flexibility in playback and reproduction. REL also lets consumers establish privacy preferences for their personal data. Part 6 of the standard, the Rights Data Dictionary (RDD), defines terms that publishers can use when working with REL..." Other details are given in the announcement "MPEG Approves Another MPEG-21 Technology." See also the note on the relationship between MPEG-21 Part 5 (viz., Information technology -- Multimedia framework (MPEG-21) -- Part 5: Rights Expression Language ) and the XrML-based 'Rights Language' targeted for development within the OASIS Rights Language Technical Committee; the OASIS RLTC was formed in March 2002 to "use XrML as the basis in defining the industry standard rights language in order to maximize continuity with ongoing standards efforts..." 
Since (a) MPEG Part 5: Rights Expression Language is now (effectively) an ISO FDIS [Final Draft International Standard] and (b) is scheduled to become an ISO Standard in September 2003, and (c) no draft committee specification for an XrML-based rights language has been created within the OASIS RLTC, it appears that the MPEG-21 Part 5 document as an ISO Standard will become the reference standard for the strongly patented ContentGuard/Microsoft XrML rights language technology. References: (1) "MPEG Rights Expression Language"; (2) "Extensible Rights Markup Language (XrML)"; (3) "XML and Digital Rights Management (DRM)."
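To make the 'play' example above concrete, here is a minimal sketch of the grant structure an REL license conveys (a principal, a right such as 'play', and a resource). The element names below only approximate the style of MPEG-21 REL / XrML; the normative schema, namespaces, and conditions are defined by ISO/IEC 21000-5 and are not reproduced here.

```python
# Illustrative only: element names approximate an REL-style license
# (license -> grant -> principal / right / resource). The real MPEG-21
# Part 5 schema and namespaces are not shown; treat this as a sketch.
import xml.etree.ElementTree as ET

def make_play_grant(principal_id: str, resource_uri: str) -> ET.Element:
    """Build a toy 'grant' conveying a read-only 'play' right."""
    license_el = ET.Element("license")
    grant = ET.SubElement(license_el, "grant")
    ET.SubElement(grant, "keyHolder").set("licensePartId", principal_id)
    ET.SubElement(grant, "play")  # the right being granted
    ET.SubElement(grant, "digitalResource").set("uri", resource_uri)
    return license_el

lic = make_play_grant("consumer-1", "urn:example:track:42")
xml_text = ET.tostring(lic, encoding="unicode")
print(xml_text)
```

A publisher would issue one grant per permitted use; a consuming device checks for a matching grant before acting.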
[July 30, 2003] "Identifying and Brokering Mathematical Web Services." By Mike Dewar (Numerical Algorithms Group, NAG). In Web Services Journal Volume 3, Issue 8 (August 2003), pages 44-46. "The MONET project is a two-year investigation into how service discovery can be performed for mathematical Web services, funded by the European Union under its Fifth Framework program. The project focuses on mathematical Web services for two reasons: first, mathematics underpins almost all areas of science, engineering, and increasingly, commerce. Therefore, a suite of sophisticated mathematical Web services will be useful across a broad range of fields and activities. Second, the language of mathematics is fairly well formalized already, and in principle it ought to be easier to work in this field than in some other, less well-specified areas... MSDL is the collective name for the language we use to describe problems and services. Strictly speaking, it is not itself an ontology but it is a framework in which information described using suitable ontologies can be embedded. One of the main languages we use is OpenMath, which is an XML format for describing the semantics of mathematical objects. Another is the Resource Description Framework Schema (RDFS), which is a well-known mechanism for describing the relationship between objects. The idea is to allow a certain amount of flexibility and redundancy so that somebody deploying a service will not need to do too much work to describe it. An MSDL description comes in four parts: (1) A functional description of what the service does; (2) An implementation description of how it does it; (3) An annotated description of the interface it exposes; (4) A collection of metadata describing the author, access policies, etc... There are two main ways in which it is possible to describe the functionality exposed by a service. 
The first is by reference to a suitable taxonomy such as the 'Guide to Available Mathematical Software (GAMS)' produced by NIST, a tree-based system where each child in the tree is a more specialized instance of its parent... The second way to describe the functionality exposed by a service is by reference to a Mathematical Problem Library, which describes problems in terms of their inputs, outputs, preconditions (relationships between the inputs), and post-conditions (relationships between the inputs and outputs). The MSDL Implementation Description provides information about the specific implementation that is independent of the particular task the service performs. This can include the specific algorithm used, the type of hardware the service runs on, and so on... In addition, it provides details of how the service is used. This includes the ability to control the way the algorithm works and also the abstract actions that the service supports. While in the MONET model a service described in MSDL solves only one problem, it may do so in several steps. For example, there may be an initialization phase, then an execution phase that can be repeated several times, and finally a termination phase. Each phase is regarded as a separate action supported by the service... While WSDL does a good job in describing the syntactic interface exposed by a service, it does nothing to explain the semantics of ports, operations, and messages. MSDL has facilities that relate WSDL operations to actions in the implementation description, and message parts to the components of the problem description. In fact the mechanism is not WSDL-specific and could be used with other interface description schemes such as IDL... There are many other aspects of Web services -- not least the ability to negotiate terms and conditions of access, measure the quality of the actual service provided, and choreograph multiple services to solve a single problem -- that are still being worked out. 
The project partners' ultimate goal is to develop products and services based on the MONET architecture, but the viability of this depends to a large extent on solutions to the other emerging issues. While we are confident that this will happen, it is not yet clear what the timescale will be. The MONET project is currently building prototype brokers that can reason about available services using MSDL descriptions encoded in the W3C's OWL. We are also investigating the applicability of this technology to describing services deployed in the Open Grid Service Architecture (OGSA)..." [alt URL]
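The four-part MSDL description outlined above can be sketched as a simple data structure. This is not MSDL syntax (the real descriptions embed OpenMath and RDFS ontologies); the field names and sample values below are assumptions chosen purely to mirror the article's four parts.

```python
# Sketch of MSDL's four-part structure as the article describes it:
# functional description, implementation description, interface
# annotation, and metadata. Field names and values are illustrative.
from dataclasses import dataclass, field

@dataclass
class MSDLDescription:
    functional: dict      # e.g., GAMS taxonomy node or problem-library reference
    implementation: dict  # algorithm, hardware, supported actions/phases
    interface: dict       # maps WSDL operations/messages to problem parts
    metadata: dict = field(default_factory=dict)  # author, access policies...

desc = MSDLDescription(
    functional={"taxonomy": "GAMS/H2", "problem": "definite-integration"},
    implementation={"algorithm": "adaptive-quadrature",
                    "actions": ["initialize", "execute", "terminate"]},
    interface={"operation:integrate": "problem:inputs"},
    metadata={"author": "example-author", "access": "public"},
)
print(desc.implementation["actions"])
```

A broker reasoning over such descriptions would match the functional part against a problem description, then use the interface part to bind the chosen service's WSDL operations.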
[July 30, 2003] "IBM, CA Square Up to HP on Management." By Keith Rodgers. From LooselyCoupled.com (July 30, 2003). "IBM and Computer Associates teamed up at a key web services standards meeting yesterday [2003-07-29] in a surprise rebuff to a submission by Hewlett-Packard. At stake is the future development path of IT management software. Although the initial purpose of the rival proposals is merely to establish standards that govern web services manageability, the ultimate aim is to roll out the same standards as a foundation for the entire IT management spectrum -- not just management of web services, but management of other IT assets through web services. The established systems management giants are also hoping that, by shifting the emphasis back to the wider management framework, they can recapture the market advantage they've currently ceded in web services management to smaller specialist vendors. HP had grabbed headlines on July 21 [2003], when it formally announced it would submit its Web Services Management Framework to the Web Services Distributed Management (WSDM) committee of e-business standards body OASIS. The HP submission had the backing of eight other developers on the committee, including Sun, Oracle, BEA, Iona, Tibco and webMethods... rivals IBM and CA [have] joined forces with web services management specialist Talking Blocks to present their own vision, dubbed WS-Manageability, to the OASIS meeting... The WS-Manageability proposal stems from work that IBM, Computer Associates and Talking Blocks have done for another standards group, the W3C Web Services Architecture Working Group. A primary concern is to make full use of other emerging 'WS-*' web services standards, such as WS-Policy, that form part of the generic web services platform. 
Any proposed management standard should not stray from its core management mission, they warn, either into defining elements of the generic infrastructure, or into specifying aspects of management applications... The irony of this particular standards battle is that none of the big three systems management vendors -- IBM, HP and CA -- can claim to be leading the field in web services management. It is specialists such as Actional, Amberpoint, Talking Blocks and Infravio who have been making all the running in terms of delivering production software into user deployments, with each of them able to point to several reference customers. [However,] by emphasizing management through web services, the established systems management giants can gain recognition in the web services arena while broadening the issue out to play to their own strengths..." See also: (1) the presentations "Web Services Manageability" and "Management and Web Service Management," referenced in the following bibliographic entry; (2) the HP framework proposal, "HP Contributes Web Services Management Framework Specification to OASIS TC."
[July 29, 2003] "Web Services Distributed Management (WSDM) TC Submission: Web Services Manageability." By Heather Kreger (IBM), Igor Sedukhin (Computer Associates), and Mark Potts (Talking Blocks). PDF from source .PPT. July 24, 2003. 10 pages. Posted to the OASIS WSDM TC list by Ellen Stokes (IBM). Prose adapted from the slides: "#2: As to background, the design started with active involvement of the authors on W3C Web Services Architecture Working Group Management Task Force. To avoid fragmentation of Management Standards the team co-authored a specification to facilitate development of consensus among management vendors. They considered concepts from existing work on Web services as a management platform. The specification is the agreed minimum sufficient set of information that makes Web service endpoints manageable. #3: As to main considerations, the design does not imply the implementation of the manageability or the manager. It captures manageability with XML and WSDL and is consistent with existing standards-based Web Service infrastructures. It is consistent with existing management models and standards, uses existing infrastructure mechanisms, has an easily extensible model, is easily implementable, reducing impact on Web service development. #4: As to the intention of the submission, the specification will define the model for the manageability of Web services and define access to this manageability. The access and model can be rendered in (a) WSDL 1.1; (b) GWSDL; (c) CIM Models; (d) WSDL 1.2. The specification identifies requirements for more general Web services standards, but does not define them. The team is submitting the Common Base Event specification (XML event format for management events)... #8: As to an extensible manageability, the topics are extensible (new topics can be created; any topic can have aspects added to them [i.e., define new properties, operations, and events]; and new aspects can be created). 
Manageability information is extensible. #9: As to infrastructure, the specification supports building WS-I basic profile compliant manageable Web services; it leverages non-management specific infrastructure available to us from: (a) WS* e.g., WS-addressing, WS-policy; (b) OGSI e.g., serviceData, notifications; (c) CMM e.g., lifecycle, relationships, metadata. It does not imply a specific management framework. The authorship team intends to submit this work to the OASIS WSDM TC..." See also the presentation "Management and Web Service Management" as posted 2003-07-29; this presentation "offers work to OASIS completed by IBM with contribution from CA and Talking Blocks... It details a frame of reference for Management Applications, Managers, Manageability using Web services and Manageability of Web services. The work also identifies the management concerns pertinent to each and the dependencies in terms of common description that are required..." [source .PPT, cache]
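The extensible topic/aspect model summarized in slide #8 can be illustrated with a small sketch: a topic groups aspects, and each aspect contributes properties, operations, and events. The class and attribute names below are assumptions for illustration, not the submission's actual vocabulary.

```python
# Sketch of the extensible topic/aspect model the slides describe.
# Names are illustrative; the real model is rendered in WSDL/XML.
class Aspect:
    def __init__(self, properties=(), operations=(), events=()):
        self.properties = list(properties)
        self.operations = list(operations)
        self.events = list(events)

class Topic:
    def __init__(self, name):
        self.name = name
        self.aspects = {}

    def add_aspect(self, name, aspect):
        # Extensibility point: any topic can have new aspects added,
        # each defining new properties, operations, and events.
        self.aspects[name] = aspect

identification = Topic("Identification")
identification.add_aspect("basic", Aspect(properties=["resourceId"]))
identification.add_aspect("vendorX", Aspect(events=["idChanged"]))
print(sorted(identification.aspects))
```

The point of the structure is that a management vendor can extend an endpoint's manageability without touching the base model, matching the slides' claim of "an easily extensible model."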
[July 29, 2003] "Introducing BPEL4WS 1.0. Building on WS-Transaction and WS-Coordination." By Dr. Jim Webber and Dr. Mark Little (Arjuna Technologies Limited). In Web Services Journal Volume 3, Issue 8 (August 2003), pages 28-33. With source code and 3 figures. "The value of BPEL4WS is that if a business is the sum of its processes, the orchestration and refinement of those processes is critical to an enterprise's continued viability in the marketplace. Those businesses whose processes are agile and flexible will be able to adapt rapidly to and exploit new market conditions. This article introduces the key features of Business Process Execution Language for Web Services, and shows how it builds on the features offered by WS-Coordination and WS-Transaction. The BPEL4WS model is built on a number of layers, each one building on the facilities of the previous. The fundamental components of the BPEL4WS architecture consist of the following: (1) A means of capturing enterprise interdependencies with partners and associated service links; (2) Message correlation layer that ties together messages and specific workflow instances; (3) State management features to maintain, update, and interrogate parts of process state as a workflow progresses; (4) Scopes where individual activities (workflow stages) are composed to form actual algorithmic workflows. We'll explore the features of this stack, starting with the static aspects of the application -- capturing the relationship between the Web services participating in workflows -- and on to the creation of workflows using the BPEL4WS activities... BPEL4WS is at the top of the WS-Transaction stack and utilizes WS-Transaction to ensure reliable execution of business processes over multiple workflows, which BPEL4WS logically divides into two distinct aspects. 
The first is a process description language with support for performing computation, synchronous and asynchronous operation invocations, control-flow patterns, structured error handling, and saga-based long-running business transactions. The second is an infrastructure layer that builds on WSDL to capture the relationships between enterprises and processes within a Web services-based environment. Taken together, these two aspects support the orchestration of Web services in a business process, where the infrastructure layer exposes Web services to the process layer, which then drives that Web services infrastructure as part of its workflow activities. The ultimate goal of business process languages like BPEL4WS is to abstract underlying Web services so that the business process language effectively becomes the Web services API. While such an abstract language may not be suitable for every possible Web services-based scenario it will certainly be useful for many, and if tool support evolves it will be able to deliver on its ambition to provide a business analyst-friendly interface to choreographing enterprise systems..." See also: "Introducing WS-Transaction Part II. Using Business Activities," in Web Services Journal Volume 3, Issue 7 (July 2003), pages 6-9. General references in "Business Process Execution Language for Web Services (BPEL4WS)." [alt URL]
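The "saga-based long-running business transactions" mentioned above have a simple conceptual core: run the workflow's activities in sequence, and if one fails, run the compensation handlers of the already-completed activities in reverse order. The sketch below illustrates that idea only; it is not BPEL syntax, and the activity names are invented.

```python
# Minimal saga sketch: each activity pairs a forward action with a
# compensation. On failure, completed work is compensated in reverse.
# This illustrates the concept behind BPEL4WS long-running transactions,
# not the BPEL4WS language itself.
def run_saga(activities):
    """activities: list of (do, compensate) callables."""
    done = []
    try:
        for do, compensate in activities:
            do()
            done.append(compensate)
    except Exception:
        for compensate in reversed(done):  # undo in reverse order
            compensate()
        return "compensated"
    return "completed"

log = []
step = lambda name: (lambda: log.append(name))
def fail():
    raise RuntimeError("step failed")

result = run_saga([
    (step("reserve-flight"), step("cancel-flight")),
    (step("reserve-hotel"), step("cancel-hotel")),
    (fail, step("noop")),
])
print(result, log)
```

In BPEL4WS terms, the compensation handlers would be attached to scopes, and the coordination of the undo work across services is what WS-Transaction's business activity protocol provides.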
[July 29, 2003] "Double Standards." By Sean Rhody (WSJ Editor-in-Chief). In Web Services Journal Volume 3, Issue 8 (August 2003), page 3. "In June I attended the JavaOne conference... and was reminded, once again, that the lack of a single standards body is a serious roadblock to implementation of Web services... I was further reminded of the mess we're in by some of the Web services presentations. While obviously biased toward Java (it was JavaOne, after all), what really got me was the way everyone needed to explain how this specification came from HP, that standard was developed by W3C, and OASIS has a competing specification to some other specification. It's clear that there are too many bodies producing standards, not to mention too many standards themselves. The Java model works somewhat better, with a single standards organization and the JSR process. Rather than develop competing specifications (SAML or WS-Security, for example), the JCP provides guidance from multiple companies toward the creation of a single standard that all Java vendors will comply with. No one has to decide whether to use BPML or BPEL, or the Java equivalent... I would propose that WS-I become the central Web services body, and that the members of the other bodies treat them as the Supreme Court of Web services. Once they rule on a specification, let there be no further disputes. Let's limit the number of specifications so the innovations can go toward making a smaller set of standards better. Of course the WS-I may not want to act as the final arbiter of Web services fate, and for various reasons, many vendors may not want the WS-I as currently constituted to be the sole determining body for Web services..." On WS-I, see "Web Services Interoperability Organization (WS-I)." [alt URL]
[July 29, 2003] "Microsoft Brings Secure Web Services Closer." By John McIntosh. In IT-Director.com (July 28, 2003). "As the noise of secure communications and identity management continues unabated and vendors clamour at the door, Microsoft's recent announcement of Web Services Enhancements 2.0 might have been missed. This is a significant announcement, not because it comes from Microsoft but because of what it potentially means to the Web Services market and the security market... WSE version 2.0 offers new security features that should simplify development and deployment of secure Web Service applications that span company boundaries and trust domains, connecting and exchanging information with customer and partner systems. According to the Company, WSE 2.0 means that developers can apply security policies to Web services with minimal lines of code and support interoperability across heterogeneous systems. WSE 2.0 does this by building on the security, routing and attachment capabilities of version 1.0 and adds a foundation for building applications based on Web services specifications published by Microsoft and its industry partners including WS-Security, WS-Policy, WS-SecurityPolicy, WS-Trust, WS-SecureConversation and WS-Addressing... There is within the .NET Framework and WSE 2.0 the ability to do many interesting things in terms of secure application development to support integration and federation of security through the value chain. WSE is important because it introduces for the first time the ability to test the theories behind emerging WS-Security standards. Essentially, is it possible to build a system that can securely expose internal systems to partners as Web services, leveraging existing technology investments to generate future revenue opportunities? Without the following new [WSE] capabilities, the answer to that question would probably be no..." 
See details in the news story "Security Featured in Microsoft Web Services Enhancements Version 2.0 Technology Preview."
[July 29, 2003] "Using the WS-I Test Tools." By Yasser Shohoud (Microsoft). July 24, 2003. 18 minutes. Tutorial prepared as an MSDN TV Episode; the presentation is played using the Microsoft Windows Media Player. Summary: "The Web Services Interoperability organization (WS-I) has published a draft version of the Basic Profile Test Tools. Yasser Shohoud shows how to use these tools to test your Web service for WS-I Basic Profile conformance." Details: A Beta Release of the WS-I Testing Tools was issued in April 2003 and is available in C# and Java. The WS-I testing tools are designed to help developers determine whether their Web services are conformant with Profile Guidelines. The WS-I Testing Working Group also published draft [June 26, 2003] versions of the WS-I Monitor Tool Functional Specification and WS-I Analyzer Tool Functional Specification. The WS-I Monitor Tool specification edited by Scott Seely (Microsoft) documents the message capture and logging tool. "This tool captures messages and stores them for later analysis. The tool itself will have to capture messages traveling over different protocols and transports. The first version of this tool will focus on being able to accurately capture HTTP based SOAP messages. Also, while many interception techniques are available, this implementation uses a man in the middle approach to intercept and record messages... The Monitor has two distinct sets of functionality: (1) It is responsible for sending messages on to some other endpoint that is capable of accepting the traffic while preserving the integrity of communication between the two endpoints. (2) It is responsible for recording the messages that flow through it to a log file. One can think of these two pieces as an interceptor and a logger. For this first version of the Monitor, the interceptor and logger functionality will exist in the same application. 
The working group recognizes that we may later desire to separate the interceptor and the logger into two, standalone entities. This design discusses how one would go about structuring an application today that should be able to be broken into separate pieces in future versions..." The WS-I Analyzer Tool specification edited by Peter Brittenham (IBM) documents "the design for Version 1.0 of the analyzer tool, which will be used for conformance testing of WS-I profiles. The purpose of the Analyzer tool is to validate the messages that were sent to and from a Web service. The analyzer is also responsible for verifying the description of the Web service. This includes the WSDL document that describes the Web service, and the XML schema files that describe the data types used in the WSDL service definition. The analyzer tool has a defined set of input files, all of which are used to verify conformance to a profile definition: Analyzer configuration file; Test assertion definition file; Message log file; WSDL for the Web service. The analyzer configuration file and test assertion definition file are described in greater detail in the subsequent sections of the document; the message log file contains the list of messages that were captured by the monitor tool..." See also the WS-I Basic Profile Version 1.0 (Working Group Approval Draft 2003/05/20) and the WS-I Testing Working Group Charter. General references in "Web Services Interoperability Organization (WS-I)."
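The interceptor/logger split the Monitor specification describes can be sketched in a few lines: forward each message to the real endpoint unchanged, and record the request/response pair for the Analyzer to examine later. The class below is an illustration of that structure, not the WS-I tool itself (which is a man-in-the-middle HTTP proxy for SOAP traffic, implemented in C# and Java).

```python
# Sketch of the Monitor's two roles: an interceptor that forwards
# traffic while preserving it intact, and a logger that records each
# exchange for later conformance analysis. Illustrative only.
class Monitor:
    def __init__(self, endpoint):
        self.endpoint = endpoint  # callable standing in for the real service
        self.log = []             # captured (request, response) pairs

    def handle(self, message):
        response = self.endpoint(message)  # forward, content preserved
        self.log.append((message, response))
        return response

echo_service = lambda msg: "<response>" + msg + "</response>"
mon = Monitor(echo_service)
reply = mon.handle("<request>ping</request>")
print(reply, len(mon.log))
```

Splitting the two roles into separate classes later (as the working group anticipates) would only require the logger to consume the same log format the Analyzer expects.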
[July 29, 2003] "Understanding XML Digital Signature." By Rich Salz. In Microsoft MSDN Library (July 2003). ['This article looks at the XML Digital Signature specification, explaining its processing model and some of its capabilities. It provides a low-level understanding of how the WS-Security specification implements its message security feature. The author surveys the XML DSIG specification using the schema definition to describe the features that are available and the processing that is required to generate and verify an XML DSIG document. He starts with the basic signature element (ds:SignedInfo), looks at how it incorporates references to application content to protect that content, and looks at part of the ds:KeyInfo element to see how an application can verify a signature, and perhaps validate the signer's identity. These three aspects provide the most basic and low-level components of protecting the integrity of XML content.'] "Digital signatures are important because they provide end-to-end message integrity guarantees, and can also provide authentication information about the originator of a message. In order to be most effective, the signature must be part of the application data, so that it is generated at the time the message is created, and it can be verified at the time the message is ultimately consumed and processed. SSL/TLS also provides message integrity (as well as message privacy), but it only does this while the message is in transit. Once the message has been accepted by the server (or, more generally, the peer receiver), the SSL protection must be 'stripped off' so that the message can be processed. As a more subtle point, SSL only works between the communication endpoints. If I'm developing a new Web service and using a conventional HTTP server (such as IIS or Apache) as a gateway, or if I'm communicating with a large enterprise that has SSL accelerators, the message integrity is only good up until the SSL connection is terminated. 
As an analogy, consider a conventional letter. If I'm sending a check to my phone company, I sign the check -- the message -- and put it in an envelope to get privacy and delivery. Upon receipt of the mail, the phone company removes the envelope, throws it away, and then processes the check. I could make my message be part of the envelope, such as by gluing the payment to a postcard and mailing that, but that would be foolish. An XML signature would define a series of XML elements that could be embedded in, or otherwise affiliated with, any XML document. It would allow the receiver to verify that the message has not been modified from what the sender intended. The XML-Signature Syntax and Processing specification (abbreviated in this article as XML DSIG) was a joint effort of the W3C and the IETF. It's been an official W3C Recommendation since February 2002. Many implementations are available..." See general references in "XML Digital Signature (Signed XML - IETF/W3C)."
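The core of XML DSIG's reference validation, described above, reduces to: each ds:Reference carries a digest of the (transformed, canonicalized) content, and the verifier recomputes and compares it. The sketch below shows only that digest-comparison half; real XML DSIG also canonicalizes ds:SignedInfo and verifies a public-key signature over it using the key conveyed in ds:KeyInfo, which is omitted here.

```python
# Reduced sketch of reference validation in XML DSIG: recompute the
# digest of the referenced content and compare it with the claimed
# DigestValue. Canonicalization and the signature over SignedInfo
# (checked with the ds:KeyInfo key) are not shown.
import base64, hashlib

def digest_value(content: bytes) -> str:
    # XML DSIG commonly uses SHA-1 digests, base64-encoded.
    return base64.b64encode(hashlib.sha1(content).digest()).decode()

def reference_valid(content: bytes, claimed_digest: str) -> bool:
    return digest_value(content) == claimed_digest

payload = b"<order id='7'>pay $10</order>"
dv = digest_value(payload)
print(reference_valid(payload, dv))
print(reference_valid(b"<order id='7'>pay $99</order>", dv))
```

This is why the signature travels with the application data, per the letter analogy: any change to the referenced content after signing changes the digest and the verification fails.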
[July 29, 2003] "Sun's Proposed New Web Services Standards." By Charles Babcock. In InformationWeek (July 29, 2003). "Sun is trying to initiate a new round of Web services with a proposal for a set of standards that work on top of XML and Web Services Description Language (WSDL). But Sun and its partners have yet to say to which standards body they will submit their proposed specification... Arjuna Technologies, Fujitsu Software, Iona Technologies, Oracle, and Sun have teamed up to propose that individual Web services be called up and combined to form 'composite applications.' Through Sun's proposed set of standards, such a composite application would be given a shared runtime environment that could determine the specific systems contributing to the service. It also would be given a coordination agent that made sure applications ran in the correct sequence and a transaction manager that supervised transactions across dissimilar applications. The proposed set is called Web Services-Composite Application Framework, or WS-CAF. Today's leading Web services handle such coordination issues 'in a very ad hoc manner, if at all,' says Mark Little, chief team architect for Arjuna. The proposed standards will take the guesswork and ambiguities out of how to coordinate services from scattered systems into one composite application, or new Web service, says Ed Julson, Sun's group manager of Web services standards and technology. The alternative, Julson says, is to go forward with competing methods of resolving service issues, as is the case with two of today's Web-services security standards: Web Services-Security proposed by IBM, Microsoft and VeriSign, and Web Services-Reliability proposed by Fujitsu, Hitachi, NEC, Oracle, Sonic Software, and Sun. Among the standards bodies that might receive the Sun proposal are the Oasis Open consortium of vendors setting XML standards; the World Wide Web Consortium; and the Internet Engineering Task Force. 
'From a pure technology standpoint, the group isn't breaking new ground,' says Stephen O'Grady of Red Monk, a market research group. Sun and partners are making use of existing technologies, sometimes already in use in deployed Web services, he says. But 'it's a novel and unique approach for creating composite applications composed of distinct Web services.' The most significant part of the proposal may prove to be the way it defines a way to manage transactions in the Web-services context, O'Grady says..." See: (1) the news story "Web Services Composite Application Framework (WS-CAF) for Transaction Coordination"; (2) the Arjuna announcement "Arjuna Enables Reliable Web Services-Based Business Applications with Arjuna XTS. Technology to Address the Reliable Coordination Issues Preventing the Early Adoption of Serious E-Business Solutions Through Web Services."
[July 29, 2003] "XHTML-Print." Edited by Jim Bigelow (Hewlett-Packard). W3C Last Call Working Draft, 29-July-2003. Produced by members of the W3C HTML Working Group as part of the W3C HTML Activity. The Last Call review period ends on 7-September-2003. Latest version URL: http://www.w3.org/TR/xhtml-print. Also in PDF. "XHTML-Print is a member of the family of XHTML Languages defined by the W3C Recommendation Modularization of XHTML. It is designed to be appropriate for printing from mobile devices to low-cost printers that might not have a full-page buffer and that generally print from top-to-bottom and left-to-right with the paper in a portrait orientation. XHTML-Print is also targeted at printing in environments where it is not feasible or desirable to install a printer-specific driver and where some variability in the formatting of the output is acceptable... XHTML-Print is not appropriate when strict layout consistency and repeatability across printers are needed. The design objective of XHTML-Print is to provide a relatively simple, broadly supportable page description format where content preservation and reproduction are the goal, i.e., 'Content is King.' Traditional printer page description formats such as PostScript or PCL are more suitable when strict layout control is needed. XHTML-Print does not utilize bi-directional communications with the printer either for capabilities or status inquiries. This document creates a set of conformance criteria for XHTML-Print. It references style sheet constructs drawn from CSS2 and proposed for CSS3 Paged Media as defined in the CSS Print Profile to provide a strong basis for rich printing results without a detailed understanding of each individual printer's characteristics. It also defines an extension set that provides stronger layout control for the printing of mixed text and images, tables and image collections. 
The document type definition for XHTML-Print is implemented based on the XHTML modules defined in Modularization of XHTML." Note: this specification is based "in large part on a work by the same name XHTML-Print from the Printer Working Group (PWG), a program of the IEEE Industry Standard and Technology Organization." See general references in "XHTML and 'XML-Based' HTML Modules."
[July 29, 2003] "Microsoft Plays Hiring Hardball." By Darryl K. Taft. In eWEEK Volume 20, Number 30 (July 28, 2003), pages 1, 16. "Like baseball's New York Yankees, Microsoft Corp. has been paying top dollar for top talent in an effort to dominate the new playing fields of XML and Web services. During the past 18 months, the Redmond, Wash., company has gobbled up some of the best-known XML, Web services and application development brains around. Most recently it hired Cape Clear Software Inc. Chief Technology Officer Jorgen Thelin, who last week announced he would be leaving the Web services infrastructure company to join Microsoft. The effort, which runs counter to Microsoft's traditional strategy of scooping up complementary companies, has concerned developers crying foul and claiming the company is only looking to improve its standing among standards groups... Not all developers are happy about the issue, saying that once again the company is using its might irresponsibly. 'Microsoft is trying to buy the standard; you own all the soldiers, and then you win,' said one industry insider, who requested anonymity. 'I have heard hallway grumblings about Microsoft trying to corner the market on Web services experts, especially from companies looking to hire people who can represent them on Web services committees,' said Iona Technologies plc. CTO Eric Newcomer, of Waltham, Mass..."
[July 29, 2003] "Sun, Oracle, Others Propose Transaction Specification. Middleware Vendors Publish Web Services Composite Applications Framework." By James Niccolai and Peter Sayer. In InfoWorld (July 28, 2003). "Sun Microsystems Inc., Oracle Corp., Fujitsu Software Corp., Iona Technologies PLC and Arjuna Technologies Ltd. published the Web Services Composite Applications Framework (WS-CAF), designed to solve problems that arise when groups of Web services are used in combination to complete a transaction or share information, the companies said in a joint statement Monday. They plan to submit WS-CAF to an industry standards group and will allow for its use on a royalty-free basis, moves intended to promote its broad use. The initiative appears to lack support so far from some key Web services players, however, including IBM Corp., BEA Systems Inc. and Microsoft Corp., which were not part of the announcement. Web services use standard technologies such as SOAP (Simple Object Access Protocol) and XML (Extensible Markup Language) to link disparate applications in a way that's supposed to be more affordable and flexible than using proprietary messaging systems. Some transactions, such as purchasing a book, are relatively simple to complete, in part because they can be finished instantaneously. Others, such as fulfilling a purchase order or completing an insurance claim, can take days or weeks to process and as such pose problems for Web services developers, the companies said. WS-CAF aims to solve those problems by defining a set of rules for coordinating transactions in such long-running business processes, the group said. WS-CAF actually is a collection of three specifications: Web Service Context (WS-CTX), Web Service Coordination Framework (WS-CF), and Web Service Transaction Management (WS-TXM)... The new specifications add to an already tangled clump of Web services coordination specifications with varying degrees of support from standards bodies.
One such, BPEL4WS (Business Process Execution Language for Web Services), has been adopted by OASIS, the Organization for the Advancement of Structured Information Standards, but it still leaves significant gaps that need to be filled, according to [Jeff] Mischkinsky..." See details in "Web Services Composite Application Framework (WS-CAF) for Transaction Coordination."
[July 28, 2003] "Composite Capability/Preference Profiles (CC/PP): Structure and Vocabularies." Edited by Graham Klyne (Nine by Nine), Franklin Reynolds (Nokia Research Center), Chris Woodrow (Information Architects), Hidetaka Ohto (W3C / Panasonic), Johan Hjelm (Ericsson), Mark H. Butler (Hewlett-Packard), and Luu Tran (Sun Microsystems). W3C Working Draft 28-July-2003. Latest version URL: http://www.w3.org/TR/CCPP-struct-vocab/. "This specification was originated by the W3C CC/PP Working Group and has now been passed to the W3C Device Independence Working Group to carry forward towards a Recommendation." Summary: "This document describes CC/PP (Composite Capabilities/Preference Profiles) structure and vocabularies. A CC/PP profile is a description of device capabilities and user preferences that can be used to guide the adaptation of content presented to that device. The Resource Description Framework (RDF) is used to create profiles that describe user agent capabilities and preferences. The structure of a profile is discussed. Topics include: (1) structure of client capability and preference descriptions, and (2) use of RDF classes to distinguish different elements of a profile, so that a schema-aware RDF processor can handle CC/PP profiles embedded in other XML document types. A CC/PP vocabulary consists of identifiers (URIs) used to refer to specific capabilities and preferences; the discussion covers: [1] the types of values to which CC/PP attributes may refer, [2] an appendix describing how to introduce new vocabularies, [3] an appendix giving an example small client vocabulary covering print and display capabilities, and [4] an appendix providing a survey of existing work from which new vocabularies may be derived... It is anticipated that different applications will use different vocabularies; indeed this is needed if application-specific properties are to be represented within the CC/PP framework.
But for different applications to work together, some common vocabulary, or a method to convert between different vocabularies, is needed. (XML namespaces can ensure that different applications' names do not clash, but they do not provide a common basis for exchanging information between different applications.) Any vocabulary that relates to the structure of a CC/PP profile must follow this specification. The appendices introduce a simple CC/PP attribute vocabulary that may be used to improve cross-application exchange of capability information, partly based on some earlier IETF work..."
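The profile structure described above can be sketched in RDF/XML. A minimal hypothetical profile with one component (the component, attribute names, and vocabulary URI are illustrative, not drawn from a standard vocabulary):

```xml
<?xml version="1.0"?>
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
         xmlns:ccpp="http://www.w3.org/2002/11/08-ccpp-schema#"
         xmlns:ex="http://example.com/vocab#">
  <rdf:Description rdf:about="http://example.com/profile#MyDevice">
    <!-- Each ccpp:component groups related capability attributes -->
    <ccpp:component>
      <rdf:Description rdf:about="http://example.com/profile#Hardware">
        <!-- The rdf:type lets a schema-aware processor recognize
             this node even when embedded in another XML document -->
        <rdf:type rdf:resource="http://example.com/vocab#HardwarePlatform"/>
        <ex:displayWidth>320</ex:displayWidth>
        <ex:displayHeight>200</ex:displayHeight>
      </rdf:Description>
    </ccpp:component>
  </rdf:Description>
</rdf:RDF>
```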
[July 28, 2003] "IBM To Update Portal With Document Management and Collaboration Technology. New WebSphere Software to Ship Next Month." By Elizabeth Montalbano. In CRN (July 28, 2003). "IBM has unveiled an upcoming version of its WebSphere portal that the company says has new capabilities for document management and collaboration. WebSphere Portal Version 5 will feature new functionality that allows solution providers to aggregate content from various back-end resources -- such as human resources, CRM or supply chain applications -- to a single portal, according to IBM. The new software also will include out-of-the-box document management tools that allow solution providers to integrate and manage information, such as a company's financial reports or sales documents, within the portal, IBM executives said. At the same time, the document management tools also will allow end users to view, create, convert and edit basic documents, spreadsheets and other files while working within the portal. In addition, WebSphere Portal Version 5 will include a new Collaboration Center, which leverages IBM Lotus software to allow portal users to interact with various collaborative applications, such as instant messaging, team workplaces and virtual meetings. Other new features in Version 5 include performance enhancements and a simplified installation process..."
[July 27, 2003] "Sun Proposes New Web Services Specifications." By Martin LaMonica. In CNET News.com (July 27, 2003). "Sun Microsystems and a handful of partners have announced they are seeking the approval of Web services specifications for coordinating electronic transactions. Sun, Oracle, Iona Technologies, Fujitsu Software and Arjuna Technologies will submit the specifications, the Web Services Composite Applications Framework (WS-CAF), to either the World Wide Web Consortium or the Organization for the Advancement of Structured Information Standards (OASIS) for development as standards in the next several weeks... WS-CAF, which comprises three individual specifications, proposes a mechanism for coordinating transactions across many machines in multistep business processes. The authors of the specifications hope simplified interactions between Web services will allow companies to assemble business applications with Web services more quickly. The WS-CAF specifications would create a prearranged way to configure systems so that Web services applications from different providers could share important transactional information. For example, administration tools based on WS-CAF would ensure that a consumer making vacation reservations online could coordinate bookings at three different Web sites for travel, car and hotel reservations at the same time. Current business systems have methods for sharing the status of ongoing transactions across different machines. The WS-CAF set of specifications seeks to improve interoperability by standardizing that capability among different providers, said Eric Newcomer, chief technology officer at Iona. The Sun-led group of companies intends to garner input from other IT providers through the standardization process, said Ed Jolson, Sun's group manager for Web services standards..." See details in "Web Services Composite Application Framework (WS-CAF) for Transaction Coordination."
[July 25, 2003] "The Future of XML Documents and Relational Databases. As New Species of XML Documents Are Emerging, Vendors Are Unveiling Increased RDBMS Support for XML." By Jon Udell. In InfoWorld (July 25, 2003). "Having absorbed objects, the RDBMS vendors are now working hard to absorb XML documents. Don't expect a simple rerun of the last movie, though. We've always known that most of the information that runs our businesses resides in the documents we create and exchange, and those documents have rarely been kept in our enterprise databases. Now that XML can represent both the documents that we see and touch -- such as purchase orders -- and the messages that exchange those documents on networks of Web services, it's more critical than ever that our databases can store and manage XML documents. A real summer blockbuster is in the making. No one knows exactly how it will turn out, but we can analyze the story so far and make some educated guesses. The first step in the long journey of SQL/XML hybridization was to publish relational data as XML. BEA Chief Architect Adam Bosworth, who worked on the idea's SQL Server implementation, calls it 'the consensual-hallucination approach -- we all agree to pretend there is a document.' XML publishing was the logical place to start because it's easy to represent a SQL result set in XML and because so many dynamic Web pages are fed by SQL queries. The traditional approach required programmatic access to the result set and programmatic construction of the Web page. The new approach materializes that dynamic Web page in a fully declarative way, using a SQL-to-XML query to produce an XML representation of the data and XSLT to massage the XML into the HTML delivered to the browser. Originally these virtual documents were created using proprietary SQL extensions such as SQL Server's 'FOR XML' clause. There's now an emerging ISO/ANSI standard called SQL/XML, which defines a common approach. 
SQL/XML is supported today by Oracle and DB2. It defines XML-oriented operators that work with the native XML data types available in these products. SQL Server does not yet support an XML data type or the SQL/XML extensions, but Tom Rizzo, SQL Server group product manager at Redmond, Wash.-based Microsoft, says that Yukon, due in 2004, will... Most of the information in an enterprise lives in documents kept in file systems, not in relational databases. There have always been reasons to move those documents into databases -- centralized administration, full-text search -- but in the absence of a way to relate the data in the documents to the data in the database, those reasons weren't compelling. XML cinches the argument. As business documents morph from existing formats to XML -- admittedly a long, slow process that has only just begun -- it becomes possible to correlate the two flavors of data..." See general references in "XML and Databases."
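The "consensual hallucination" Bosworth describes can be pictured concretely: a SQL/XML publishing query (using standard operators such as XMLELEMENT and XMLFOREST) or a SQL Server FOR XML query materializes a virtual document like the following from ordinary relational rows. The table and element names here are hypothetical:

```xml
<!-- Virtual document generated per query from, e.g., an "orders"
     table; no such document exists on disk -->
<orders>
  <order id="1001">
    <customer>Acme Corp</customer>
    <total currency="USD">149.95</total>
  </order>
  <order id="1002">
    <customer>Globex Inc</customer>
    <total currency="USD">89.50</total>
  </order>
</orders>
```

An XSLT stylesheet can then massage this XML directly into the HTML delivered to the browser, with no procedural page-construction code.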
[July 25, 2003] "Interwoven Unwraps Version 6. New Versions of Platform and Server Offer Content Control for Business Users." By Cathleen Moore. In InfoWorld (July 25, 2003). "Content management vendor Interwoven has launched new versions of its platform and content server with a focus on improving business usability. Interwoven 6 introduces a new customization framework, dubbed the ContentServices UI Toolkit, which is designed to enable customized user interfaces. The release also includes a Web services-based integration toolkit via the Interwoven ContentServices SDK 2.0. The toolkit allows quick and flexible integration with business applications, according to Interwoven officials in Sunnyvale, Calif. Addressing specific business needs, Interwoven 6 also includes offerings aimed at business functions such as sales, services, marketing, and IT. Based on Web services standards, the set of solutions includes a Digital Brand Management offering powered by MediaBin; Sales Excellence portals via integration with portals from IBM, BEA, SAP, Plumtree, Sun, and Oracle; and a Global Web Content Management offering for supporting hundreds of Web properties in multiple locations and languages. Meanwhile, Version 6.0 of the TeamSite content server aims to give business users more control over content via the user-friendly ContentCenter interface... The ContentCenter framework includes ContentCenter Standard, an interface designed for business users featuring customizable, portal-style content management components that allow users to easily add or modify content. The ContentCenter Professional is a power user interface containing advanced features such as branching, workflow, virtualization, versioning, security, and tool integration..." See details in the announcement: "Interwoven Releases TeamSite 6.0 - The New Benchmark in the Content Management Industry. 
New Release Empowers Enterprises to Boost Workforce Productivity and Enable Faster, Smarter Decision-Making with an All New, Easy, and Customizable User Experience."
[July 25, 2003] "Standards Stupidities and Tech's Future." By Charles Cooper. In CNET News.com (July 25, 2003). "The technology business may employ more brainy people on a per capita basis than any industry in the world. But when it comes to agreeing on technical standards, the behavior of some of these very bright people more resembles the plot in Dumb and Dumber. An outsider looking in would easily assume that, with all this intellectual firepower, these folks would understand their best interests and would be able to decide how to proceed without a major struggle... when it comes to figuring out the best way to get from A to Z, bruising (and pointless) Silicon Valley clashes over standards are the norm rather than the exception. Hang around this business long enough and you realize that, while the actors change, the script lines remain much the same. And each time one of these donnybrooks erupts, the protagonists say they are only working on behalf of what's good for customers... Back on Planet Earth, however, it rates as a monumental waste of time and energy. The latest bit of grandstanding involves the move to set standards for Web services, for which -- surprise, surprise -- Microsoft and Sun Microsystems are happily bickering and backstabbing each other. This is only the latest debacle in their decades-long rivalry, but it comes at a particularly inopportune time for IT customers who are debating where Web services should fit within their operations. Although the Web services world has been inching toward resolving a lot of its issues, this has been a slow-motion story. If Web services is ever going to live up to its hype, the technology industry needs to make sure that its programming standards guarantee the reliability of XML message transmissions. The two sides agree with that notion. From there, however, it's pistols at 20 paces... Microsoft is pushing something that it calls WS-ReliableMessaging, which was co-developed with IBM, BEA Systems and Tibco. 
Meanwhile, a competing specification called Web Services Reliable Messaging is being backed by Sun, Oracle, Fujitsu, Hitachi, NEC and Sonic Software..."
[July 24, 2003] "Eve Maler of Sun Microsystems Discusses the Future of Web Services Security." By Janice J. Heiss. From WebServices.org (July 24, 2003). In this article Janice J. Heiss speaks to Sun Microsystems' Eve Maler, vice-chair of the WS-I Basic Security Profile Working Group and currently coordinating editor of the SAML (Security Assertion Markup Language) Technical Committee, seeking an update on the development of Web services security. Maler: [As to when viable Web services security standards will be established] "It's best not to think in black and white terms. There are specifications appearing on the scene that attempt to secure different facets of Web services. As each specification becomes standardized and viable over time, the operation of Web services will be better protected... This may not be fully standardized until late in 2003 and it's important for this work to reflect a clear understanding of the problem space. And after that, there's going to be a lot more work on trust management. So improvements will occur as long as these processes take place in venues that allow the right experts to look at them... Traditional technologies won't always suffice [for Web services security]. First, the trust issues still haven't been fully solved in traditional computing; they haven't scaled to meet our expectations, and Web services present an opportunity to get this right. With Web services, end to end isn't the same as point to point. Messages are going between a requester and a responding service, but they may also pass through several intermediaries, and thus, several possible hubs. Therefore, a technology that focuses solely on securing the transport channel may not be sufficient. You need security technologies that persist past that transient part; without the XML security standards, they don't take advantage of the opportunities inherent in XML's granularity... 
Sun Microsystems is very concerned with the open specification of standards and the specification of systems that don't rely on a single hub to do all the jobs. We have heard some intimations that a system like Passport will ultimately be a federated system so that you won't always have to go through one Web site to start your journey online. That would be a good thing. The Liberty Alliance takes exactly this federated approach to managing and using your electronic identity. What's best is for all of the relevant security infrastructure for Web services to be standardized in an open venue to be seen by all the right eyes, and especially for the IPR (intellectual property rights) terms to be open enough so that implementations can be widely accepted. This is Sun's goal in participating in Web services security standardization, and it's the key for ensuring that no one company can create lock-in..." See: (1) "Security Assertion Markup Language (SAML)"; (2) security standards reference list at "Security, Privacy, and Personalization."
[July 23, 2003] "Why Choose RSS 1.0?" By Tony Hammond. In XML.com News (July 23, 2003). "RSS, a set of lightweight XML syndication technologies primarily used for relaying news headlines, has been adapted to a wide range of uses from sending out web site descriptions to disseminating blogs. This article looks at a new application area for RSS: syndicating tables of contents for serials publications. Serials newsfeeds -- especially scientific newsfeeds -- differ from regular newsfeeds in that a key requirement for the reader, or more generally for the consumer, of the feed is to be able to cite, or produce a citation for, a given article within the serial. This need for additional information exists across many types of publishing activities. A user may choose not to follow a link directly to some content for whatever good reason, such as preferring to access a locally stored version of the resource. This requires that rich metadata describing the article be distributed along with the article title and link to the article. The need to include metadata within the feed raises the following questions: (1) Which version of RSS best supports the delivery of metadata to users? (2) Which metadata term sets are best employed for supply to users? This article examines both of these issues and then considers how such extensions can actually be used in practice. The primary purpose of syndicating tables of contents for serials is to provide a notification service to inform feed subscribers that a new issue has been published. There are, however, secondary uses for such a syndication service -- that is, to provide access to archival issues resident within a feed repository. The hierarchical storage arrangements for archival issues suggest that one possible resource discovery mechanism might be to have feeds of feeds whereby a feed for an archival volume of issues would syndicate the access URIs for the feeds of the respective issues contained within that volume. 
This arrangement could even be propagated up the hierarchy whereby a subscription year for a given serial might contain the feed URIs for the volumes within that year, or that a serial feed might contain the feed URIs for the subscription years for that serial. Another way of using a feed of feeds would be for a publisher to publish an RSS feed of all sites that it wanted to syndicate. As an example of such a feed, Nature Publishing Group now offers one that delivers the access URIs for all its current production feeds..." Related news: "RSS 2.0 Specification Published by Berkman Center Under Creative Commons License." See general references in "RDF/Rich Site Summary (RSS)."
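The citation requirement described above is typically met by attaching metadata modules to each RSS 1.0 item. A sketch of one item from a serials table-of-contents feed, using the Dublin Core module plus PRISM-style publication terms (the article, identifiers, page numbers, and the PRISM namespace URI shown are illustrative):

```xml
<item rdf:about="http://example.org/journal/v42/i7/art03"
      xmlns="http://purl.org/rss/1.0/"
      xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
      xmlns:dc="http://purl.org/dc/elements/1.1/"
      xmlns:prism="http://prismstandard.org/namespaces/1.2/basic/">
  <title>An Example Research Article</title>
  <link>http://example.org/journal/v42/i7/art03</link>
  <!-- Citation metadata travels with the headline, so a reader can
       build a full citation without following the link -->
  <dc:creator>A. Author</dc:creator>
  <dc:date>2003-07-01</dc:date>
  <dc:identifier>doi:10.1000/example.12345</dc:identifier>
  <prism:volume>42</prism:volume>
  <prism:number>7</prism:number>
  <prism:startingPage>123</prism:startingPage>
</item>
```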
[July 23, 2003] "Extending RSS." By Danny Ayers. In XML.com News (July 23, 2003). "The boom of weblogs has boosted interest in techniques for syndicating news-like material. In response, a family of applications known as aggregators or newsreaders has been developed; these tools consume and display metadata feeds derived from the content. Currently there are two major formats for these data feeds: RSS 1.0 and RSS 2.0... The names are misleading -- the specifications differ not only in version number but also in philosophy and implementation. If you want to syndicate simple news items there is little difference between the formats in terms of capability or implementation requirement. However, if you want to extend into distributing more sophisticated or diverse forms of material, then the differences become more apparent. The decision over which RSS version to favor really boils down to a single trade-off: syntactic complexity versus descriptive power. RSS 2.0 is extremely easy for humans to read and generate manually. RSS 1.0 isn't quite so easy, as it uses RDF. It is, however, interoperable with other RDF languages and is eminently readable and processible by machines. This article shows how the RDF foundation of RSS 1.0 helps when you want to extend RSS 1.0 for uses outside of strict news item syndication, and how existing RDF vocabularies can be incorporated into RSS 1.0. It concludes by providing a way to reuse these developments in RSS 2.0 feeds while keeping the formal definitions made with RDF... RSS 1.0's strong point is its use of the RDF model, which enables information to be represented in a consistent fashion. This model is backed by a formal specification which provides well-defined semantics. From this point of view, RSS 1.0 becomes just another vocabulary that uses the framework. 
In contrast, outside of the relationships between the handful of syndication-specific terms defined in its specification, RSS 2.0 simply doesn't have a model. There's no consistent means of interpreting material from other namespaces that may appear in an RSS 2.0 document. It's a semantic void. But it doesn't have to be that way since it's relatively straightforward to map to the RDF framework and use that model. The scope of applications is often extended, and depending on how you look at it, it's either enhancement or feature creep. Either way, it usually means diminishing returns -- the greater distance from the core domain you get, the more additional work is required for every new piece of functionality. But if you look at the web as one big application, then we can get a lot more functionality with only a little more effort..." General references in "RDF/Rich Site Summary (RSS)."
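The reuse Ayers describes is mechanically simple: because RSS 2.0 permits elements from other namespaces inside an item, a vocabulary whose semantics are pinned down in RDF (such as Dublin Core) can be dropped into an RSS 2.0 feed unchanged. A minimal sketch (the feed content is illustrative):

```xml
<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Example Weblog</title>
    <link>http://example.org/blog/</link>
    <description>Demonstrates a namespaced module in RSS 2.0</description>
    <item>
      <title>Extending RSS</title>
      <link>http://example.org/blog/extending-rss</link>
      <!-- Dublin Core terms keep their RDF-defined semantics even
           though the surrounding document is not itself RDF -->
      <dc:creator>A. Blogger</dc:creator>
      <dc:date>2003-07-23</dc:date>
    </item>
  </channel>
</rss>
```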
[July 23, 2003] "HP Buys Swedish VoiceXML Company. HP to Expand OpenCall Unit With Purchase of PipeBeach." By Gillian Law. In InfoWorld (July 23, 2003). "Hewlett-Packard plans to expand its OpenCall business unit with the purchase of Swedish VoiceXML company PipeBeach... PipeBeach of Stockholm makes interactive voice products for speech-based information portals, such as sports and traffic information systems and phone banking. Ed Verney, director of interactive media platforms in HP's OpenCall unit, said HP has been working in the VoiceXML area for some time, but that it would have taken a further two years to develop products of a similar quality to PipeBeach's technology. HP will take PipeBeach's principal products, including SpeechWeb and SpeechWeb Portal, and integrate them into its own OpenCall suite of telecommunication software, it said in a statement. SpeechWeb is a VoiceXML platform that lets applications and services, located on standard Web servers, be accessed over the phone. It can automatically understand speech in 30 languages, and can also turn text in those languages into speech, according to PipeBeach's Web site. SpeechWeb Portal makes it easier to give access to different information databases through one phone number, and to personalize services, according to PipeBeach. A provider just has to link the SpeechWeb Portal software to a database to produce a voice service, Verney said. 'It's removed a lot of the guess-work'..." See: (1) details in the announcement "HP Acquires PipeBeach to Strengthen Leadership in Growing VoiceXML Interactive Voice Market. Standards-based Products from PipeBeach Bolster HP OpenCall Portfolio and Enhance HP's Ability to Deliver Speech-based Solutions."; (2) general references in "VoiceXML Forum."
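A VoiceXML platform like SpeechWeb fetches documents of this kind from an ordinary web server and renders them as a spoken dialog. A minimal hypothetical form (the grammar file and submission URL are placeholders, not PipeBeach artifacts):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<vxml version="2.0" xmlns="http://www.w3.org/2001/vxml">
  <form id="balance">
    <field name="account">
      <!-- The prompt is rendered via text-to-speech; the grammar
           constrains what the speech recognizer listens for -->
      <prompt>Please say your account number.</prompt>
      <grammar src="account.grxml" type="application/srgs+xml"/>
    </field>
    <block>
      <!-- Hand the recognized value back to the web application -->
      <submit next="http://example.com/balance" namelist="account"/>
    </block>
  </form>
</vxml>
```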
[July 22, 2003] Scalable Vector Graphics (SVG) 1.2. Edited by Dean Jackson (W3C). W3C Working Draft 15-July-2003. Latest version URL: http://www.w3.org/TR/SVG12/. Third public working draft of the SVG 1.2 specification, produced by the W3C SVG Working Group as part of the W3C Graphics Activity within the Interaction Domain. "This document specifies version 1.2 of the Scalable Vector Graphics (SVG) Language, a modularized language for describing two-dimensional vector and mixed vector/raster graphics in XML. This draft of SVG 1.2 is a snapshot of a work-in-progress. The SVG Working Group believes that most of the features here are complete and stable enough for implementors to begin work and provide feedback. Some features already have multiple implementations. [The WD] lists the potential areas of new work in version 1.2 of SVG and is not a complete language description. In some cases, the descriptions in this document are incomplete and simply show the current thoughts of the SVG Working Group on the feature. This document should in no way be considered stable. This version does not include the implementations of SVG 1.2 in either DTD or XML Schema form. Those will be included in subsequent versions, once the content of the SVG 1.2 language stabilizes. This document references a draft RelaxNG schema for SVG 1.1..." See details in the news story "New Scalable Vector Graphics 1.2 Working Draft Positions SVG as an Application Platform."
[July 22, 2003] "Dynamic Scalable Vector Graphics (dSVG) 1.1 Specification." Edited by Gordon G. Bowman. July 09, 2003. Copyright (c) 2003 Corel Corporation. See the expanded Table of Contents and file listing from the distribution package. "This specification defines the features and syntax for Dynamic Scalable Vector Graphics (dSVG), an XML language that extends SVG, providing enhanced dynamic and interactive capabilities that were previously only available via scripting. dSVG is a language for describing UI controls and behaviors in XML [XML10]. It contains eleven types of UI controls ('button', 'checkBox', 'radioButton', 'contextMenu', 'comboBox', 'listBox', 'listView', 'slider', 'spinBox', 'textBox' and 'window'), six categories of behaviors (DOM manipulation, viewer manipulation, coordinate conversion, constraints, flow control and selection ability), and two container elements ('action' and 'share'). dSVG UI controls have intrinsic states (up, down, hover, focus and disabled), which change according to mouse and keyboard events. Their appearances are defined in skins that are completely customizable. These skins can also contain dSVG constraints, which allow the UI controls to be 'intelligently' resized. SVG files with dSVG elements are interactive and dynamic. Behaviors can be directly or indirectly associated to SVG elements or to dSVG UI controls and triggered by specified events. Sophisticated applications of SVG are possible by use of a supplemental scripting language which accesses the SVG Document Object Model (DOM), which provides complete access to all elements, attributes and properties. A rich set of event handlers such as onmouseover and onclick can be assigned to any SVG graphical object. However, scripting has many downsides..." Note: The distribution file "contains the proposal submitted to the World Wide Web Consortium (W3C) SVG Working Group to enhance SVG's support of enterprise application development for dynamic interfaces. 
It is a technical specification intended for developers, the SVG community, and the SVG working group to access the content in the proposed changes. It also contains a test suite that includes code not intended for commercial purposes, but provided by Corel to help developers test the specification..." See: (1) details in Dynamic Scalable Vector Graphics (dSVG); (2) "Corel Smart Graphics Studio 1.1 Update Now Available."
[July 22, 2003] "SOAP Message Transmission Optimization Mechanism." Edited by Noah Mendelsohn (IBM), Mark Nottingham (BEA), and Hervé Ruellan (Canon). W3C Working Draft 21-July-2003. Latest version URL: http://www.w3.org/TR/soap12-mtom. ['The W3C XML Protocol Working Group has released the first public Working Draft of the SOAP Message Transmission Optimization Mechanism. Inspired by PASWA and enhancing the SOAP HTTP Binding, this technical report presents a mechanism for improving SOAP performance in the abstract and in a concrete implementation.'] "The first part of this document ('Abstract Transmission Optimization Feature') describes an abstract feature for optimizing the transmission and/or wire format of a SOAP message by selectively re-encoding portions of the message, while still presenting an XML Infoset to the SOAP application. This Abstract Transmission Optimization Feature is intended to be implemented by SOAP bindings, however nothing precludes implementation as a SOAP module. The usage of the Abstract Transmission Optimization Feature is a hop-by-hop contract between a SOAP node and the next SOAP node in the SOAP message path, providing no normative convention for optimization of SOAP transmission through intermediaries. Additional specifications could in principle be written to provide for optimized multi-hop facilities provided herein, or in other ways that build on this specification (e.g., by providing for transparent passthrough of optimized messages). The second part ('Inclusion Mechanism') describes an Inclusion Mechanism implementing part of the Abstract Transmission Optimization Feature in a binding-independent way. The third part ('HTTP Transmission Optimization Feature') uses this Inclusion Mechanism for implementing the Abstract Transmission Optimization Feature for an HTTP binding. 
This document represents a transmission optimization mechanism which was inspired by a similar mechanism in the PASWA document ('Proposed Infoset Addendum to SOAP Messages with Attachments'). The WG plans to work later on the other parts of that document (assigning media types to binary data in XML infosets and including representations of Web resources in SOAP messages) and to publish other drafts which will include such mechanisms... This specification has currently no well-defined relation with the 'SOAP 1.2 Attachment Feature' specification. However, it may be expected that this specification will supersede the SOAP-AF specification once this specification has reached a stable state..." See also "SOAP 1.2 Attachment Feature," W3C Working Draft 24-September-2002. General references in "Simple Object Access Protocol (SOAP)."
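At this stage the draft defines the mechanism abstractly, so the element names below are hypothetical (the approach was later standardized as XOP's xop:Include). The idea: the Infoset still logically contains the binary content, but on the wire an inclusion element points at an out-of-band MIME part instead of carrying base64 text:

```xml
<soap:Envelope xmlns:soap="http://www.w3.org/2003/05/soap-envelope">
  <soap:Body>
    <m:uploadPhoto xmlns:m="http://example.org/media">
      <m:image>
        <!-- Stand-in for the draft's Inclusion Mechanism element:
             the base64 character content is replaced by a reference
             to a binary MIME part elsewhere in the transmission -->
        <inc:Include href="cid:photo01@example.org"
                     xmlns:inc="http://example.org/2003/include"/>
      </m:image>
    </m:uploadPhoto>
  </soap:Body>
</soap:Envelope>
```

A receiving node reconstitutes the original Infoset, so the SOAP application never sees the optimization.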
[July 22, 2003] "Web Services Security, Part 4." By Bilal Siddiq. In O'Reilly WebServices.xml.com (July 21, 2003). "In this fourth article of the series, the author puts the pieces together to demonstrate the simultaneous use of all four of the XML security standards (XML signature, XML encryption, WSS, and SAML) in one application. He discusses two important and typical web services security application scenarios and presents two topics: first, how the different web services security standards work together in an XML firewall to protect SOAP servers; second, what types of security tokens you can use in WSS messages and how they relate to digital signatures and encrypted data... [In this series] We have discussed four XML security standards and two application scenarios (direct authentication and sharing of authentication data). Before we conclude the series, we would like to point to another important XML security standard being developed by the W3C, as well as two other application scenarios of web services security. We have also discussed cryptographic keys; in fact, the whole concept of security over the Internet is based on the use of cryptographic keys. The management of cryptographic keys is itself a whole topic, one of paramount importance. With that in mind, the W3C is currently developing an XML-based key management standard known as XML Key Management Services (XKMS). Refer to the XKMS page at the W3C for further details. Transactions are an important application of web services. WS-Transaction is an attempt to standardize the transactional framework in web services. You can download the WS-Transaction specification and check its security considerations section to see that WS-Transaction uses WSS to secure transactional web services. SOAP-based messaging is another important application of web services.
The ebXML Messaging Services (ebMS) standard by OASIS defines the messaging framework for web services. You can download the ebMS specification from the ebXML Messaging page to see how it uses XML signatures..." See also Part 3, Part 2, and Part 1 of this series.
[July 22, 2003] "Groove Tackles Project Management. Workspace Project Edition Taps TeamDirection Tools." By Cathleen Moore. In InfoWorld (July 22, 2003). "Groove Networks has rolled out a project management version of its desktop collaboration software designed to help distributed project teams work together more effectively. Groove Workspace Project Edition bundles project-based collaboration tools from Bellevue, Wash.-based TeamDirection. TeamDirection's Project offering includes project creation tools, status view, role-based permissions, and integration with Microsoft Project. TeamDirection Dashboard, meanwhile, provides cross-project views, filtering and sorting capabilities, and related discussion access. Because project management tools are separate from collaboration and communication products, cross-team and cross-company projects usually require the use of multiple, disconnected applications. This often forces project managers to manually re-enter project updates into a static project document, which is then distributed to team members, according to officials at Groove, in Beverly, Mass. To improve and simplify that process, the Project Edition of Groove Workspace lets project managers create a workspace, and add data manually or through a link to an existing project template or Microsoft Project plan. Team members, who are invited to the workspace via e-mail or instant messaging, each receive a shared, synchronized copy of the tools and project data. Groove software's multi-level presence awareness shows which team members are online and active in the workspace, allowing immediate decision making and problem resolution..." See the product description.
[July 22, 2003] "XML Watch: Tracking Provenance of RDF Data. RDF Tools Are Beginning to Come of Age." By Edd Dumbill (Editor and publisher, xmlhack.com). In IBM DeveloperWorks (July 21, 2003). ['When you start aggregating data from around the Web, keeping track of where it came from is vital. In this article, Edd Dumbill looks into the contexts feature of the Redland Resource Description Format (RDF) application framework and creates an RDF Site Summary (RSS) 1.0 aggregator as a demonstration.'] "A year ago, I wrote a couple articles for developerWorks about the Friend-of-a-Friend (FOAF) project. FOAF is an XML/RDF vocabulary used to describe -- in computer-readable form -- the sort of personal information that you might normally put on a home Web page, such as your name, instant messenger nicknames, place of work, and so on... I demonstrated FOAFbot, a community support agent I wrote that aggregates people's FOAF files and answers questions about them. FOAFbot has the ability to record who said what about whom... The idea behind FOAFbot is that if you can verify that a fact is recorded by several different people (whom you trust), you are more likely to believe it to be true. Here's another use for tracking provenance of such metadata. One of the major abuses of search engines early on in their history was meta tag spamming. Web sites would put false metadata into their pages to boost their search engine ranking... I won't go into detail on the various security and trust mechanisms that will prevent this sort of semantic vandalism, but I will focus on the foundation that will make them possible: tracking provenance... To demonstrate, I'll show you how to use a simple RSS 1.0 document as test data. Recently I set up a weblog site where I force my opinions on the unsuspecting public... 
Though RSS feeds of weblogs and other Internet sites are interesting from a browse-around, ego-surfing perspective, I believe the real value of a project like this is likely to be within the enterprise. Organizations are excellent at generating vast flows of time-sequenced data. To take a simple example, URIs are allotted for things like customers or projects, then RSS flows of activity could be generated and aggregated. Such aggregated data could then be easily sliced and diced for whoever was interested. For instance, administrators might wish to find out what each worker has been doing, project managers might want the last three status updates, higher-level management might want a snapshot view of the entire department, and so on. It is not hard to imagine how customer relationship management (CRM) might prove to be an area where tools of this sort would yield great benefits... The simple example demonstrated in this article only scratches the surface of provenance tracking with RDF. On the Web, where information comes from is just as important as the information itself. Provenance-tracking RDF tools are just beginning to emerge, and as they become more widely used they will no doubt become more sophisticated in their abilities. The Redland RDF application framework is a toolkit that's definitely worth further investigation. It has interfaces to your favorite scripting language; it runs on UNIX, Windows, and Mac OS X..." See general references in: (1) "Resource Description Framework (RDF)"; (2) "RDF Site Summary (RSS)."
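Redland exposes provenance through its contexts feature; the underlying idea -- store each statement together with the URI of the document that asserted it -- can be sketched with a minimal quad store. This is illustrative Python only, not the Redland API:

```python
class QuadStore:
    """Minimal provenance-aware triple store: each triple is kept together
    with the URI of the source it came from (its 'context')."""

    def __init__(self):
        self.quads = []

    def add(self, subject, predicate, obj, source):
        self.quads.append((subject, predicate, obj, source))

    def triples(self, subject=None, predicate=None, obj=None):
        """Yield (s, p, o, source) tuples matching the given pattern;
        None acts as a wildcard."""
        for s, p, o, src in self.quads:
            if ((subject is None or s == subject) and
                    (predicate is None or p == predicate) and
                    (obj is None or o == obj)):
                yield s, p, o, src

    def sources_for(self, subject, predicate, obj):
        """Who asserted this statement? This is the FOAFbot-style trust
        check: a fact corroborated by several sources is more credible."""
        return {src for s, p, o, src in self.triples(subject, predicate, obj)}
```

An aggregator would call `add()` once per statement per fetched RSS or FOAF document, then use `sources_for()` to weigh how many independent feeds back a given claim.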
[July 22, 2003] "IBM Adds Grid Computing to WebSphere." By Peter Sayer. In ComputerWorld (July 22, 2003). "IBM will add some grid-computing capabilities to the enterprise edition of its WebSphere Application Server, allowing companies to squeeze more performance from disparate Web applications running on clusters of servers through better load balancing, it announced Monday. 'This is something to bring grid capabilities to commercial customers. It's about the ability to balance Web server workloads in a more dynamic way than has ever been possible before,' said Dan Powers, IBM's vice president of grid computing strategy. Grid computing is seen as a way to deliver computing power to applications as it is needed, in much the same way that the power grid delivers electricity from many sources to where it is needed. In this case, rather than assigning fixed functions, such as serving up Web pages or handling back-office transactions, to particular machines in a cluster running WebSphere software, the software update allows each server to take on any task, depending on workload..."
[July 22, 2003] "WSRP: The Web Services Standard for Portals." By Lowell Rapaport. In Transform Magazine (July 2003). "Let's say you have a Web portal that distributes data originating from a remote third party. If the remote application is a Web service, then the portal application can address the service's API. Formatting and displaying the data returned from the Web service is the responsibility of the portal. This is fine if you have just one or two remote Web services to incorporate into your portal, but what happens if you have a dozen? The greater the number of Web services, the higher the cost of integration. Web Services for Remote Portals (WSRP) is an emerging standard designed to simplify integration. WSRP is expected to cut portal development costs by standardizing the way a remotely executed portlet integrates with a portal. WSRP specifies how a Web service downloads its results to a portal in HTML via the Simple Object Access Protocol (SOAP). The specification's goals are similar to JSR 168: both promote the standardization of portlets. 'JSR 168 and WSRP are closely related,' says Carol Jones, chief architect of the WebSphere Portal at IBM. 'Portlets written in JSR 168 will be WSRP compatible. 168 deals with the life cycle of a portlet -- what gets called when it's time for it to render itself. WSRP addresses how you take a portlet and use it on a different portal. Once WSRP is adopted, users should be able to take a portlet from one company's product and install it in another.' JSR 168 integrates portals and portlets at the application layer while WSRP works at the communications layer. Based on XML and SOAP, WSRP portlets are transferred over the Internet using the Hypertext Transfer Protocol and don't require any special programming or security changes to a consumer's firewall. If a JSR 168 portlet is run remotely, the consumer's firewall has to be modified to support distributed Java applications. WSRP is inherently distributed..."
See: (1) OASIS Web Services for Remote Portlets TC; (2) WSRP specification advances toward an OASIS Open Standard; (3) "JSR 168 Portlet API Specification 1.0 Released for Public Review." General references in "Web Services for Remote Portals (WSRP)."
[July 22, 2003] "Web Services Spending Down But Not Out." By Martin LaMonica. In BusinessWeek Online (July 22, 2003). ['A new Gartner survey finds that Web services projects remain a top priority for corporations despite budget cutbacks that are due to the economic downturn.'] "Shrinking IT budgets have forced corporations to cut back on Web services spending, but such projects still remain a top priority, according to a Gartner report released Wednesday. Web services is an umbrella term for a set of XML-based standards and programming techniques that make it simpler to share information between applications. Once touted as a boon to consumers conducting transactions with e-commerce providers, Web services have instead resonated with corporations as a relatively cost-effective way to integrate disparate systems. In an Internet survey of 111 North American companies, Gartner found that 48 percent of respondents have had to pare back spending on Web services application development projects because of the economic slowdown. A full one-third of survey participants said they are continuing to invest in Web services over the next two years despite the grim economic environment... The findings indicate that corporate America has a strong commitment to using Web services, according to Gartner analysts. Web services development projects are at the top of the list of company priorities and are one of the last budgets to be raided when budget cuts are made. The survey found that in the next 12 months, 39 percent of respondents plan to use Web services to share data between internal applications, such as sales automation and order management systems. And 54 percent expect to use Web services for both internal applications and to share information with outside business partners in the next year..." See details in the Gartner Survey announcement: "Gartner Survey Shows Despite U.S. Economic Slowdown Companies Continuing Web Services Development."
[July 22, 2003] "WSDL First." By Will Provost. In O'Reilly WebServices.xml.com (July 22, 2003). "Web services vendors will tell you a story if you let them. 'Web services are a cinch,' they'll say. 'Just write the same code you always do, and then press this button; presto, it's now a web service, deployed to the application server, with SOAP serializers, and a WSDL descriptor all written out.' They'll tell you a lot of things, but probably most glorious among them will be the claim that you can develop web services effectively without hand-editing SOAP or WSDL. Does this sound too good to be true? Perhaps the case can be made that in some cases SOAP has been relegated to the role of RPC encoding, that it's no more relevant to the application developer than IIOP or the DCOM transport. When it comes to WSDL, though, don't buy it. If you're serious about developing RPC-style services, you should know WSDL as well as you know WXS [W3C XML Schema]; you should be creating and editing descriptors frequently. More importantly, a WSDL descriptor should be the source document for your web service build process, for a number of reasons, including anticipating industry standardization, maintaining fidelity in transmitting service semantics, and achieving the best interoperability through strong typing and WXS. The willingness in some quarters to minimize the visibility of service description betrays a more basic and troubling bias, one which has to do with code-generation paths and development process. It assumes that service semantics are derived entirely from application source code. There are two viable development paths for RPC-style service development: from implementation language to WSDL and vice-versa. In fact, to start from the implementation language is the weaker strategy... WSDL first offers a clear advantage in interoperability of generated components. 
Under the WS-I Basic Profile, and in all typical practice, web services rely on WXS as the fundamental type model. This is a potent choice. WXS offers a great range of primitive types, simple-type derivation techniques such as enumerations and regular expressions, lists, unions, extension and restriction of complex types, and many other advanced features. To put it simply, WXS is by far the most powerful type model available in the XML world. It's more flexible than relational DDLs and much more precise and sophisticated than the type system of many programming languages. Why would we choose to use anything else to express service semantics? What good are WXS's advanced features if they can't be mapped to the implementation language?... For new service development, and even for most adaptations of existing enterprise code assets, the WSDL-to-Impl path is the most robust and reliable; it also fits the consensus vision for widely available services based on progressively more vertical standards. It does a better job of preserving service semantics as designed, and it offers best interoperability based on the rich type model of WXS..." General references in "Web Services Description Language (WSDL)" and "XML Schemas."
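As a rough analogy for what those simple-type facets buy you, here is a Python sketch of WXS-style restriction by pattern and enumeration. The helper and its names are invented for illustration; a real WSDL-first toolchain would enforce these facets from the schema itself:

```python
import re

def make_simple_type(base=str, pattern=None, enumeration=None):
    """Sketch of WXS simple-type restriction: a derived type is the base
    type narrowed by a regex pattern and/or an enumeration -- the kind of
    constraint a WSDL-first design can state precisely in the contract."""
    def validate(value):
        v = base(value)
        if pattern is not None and re.fullmatch(pattern, str(value)) is None:
            raise ValueError("%r does not match pattern %r" % (value, pattern))
        if enumeration is not None and v not in enumeration:
            raise ValueError("%r not in enumeration %r" % (value, enumeration))
        return v
    return validate

# e.g., a currency-code type: xs:string restricted by the pattern [A-Z]{3}
currency = make_simple_type(pattern=r"[A-Z]{3}")
```

Starting from the implementation language instead, such constraints usually degrade to a bare `string`, which is exactly the loss of service semantics the article warns about.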
[July 22, 2003] "Web Services and Sessions." By Sergey Beryozkin. In O'Reilly WebServices.xml.com (July 22, 2003). "Web services are becoming an important tool for solving enterprise application and business-to-business integration problems. An enterprise application is usually exposed to the outside world as a single monolithic service, which can receive request messages and possibly return response messages, as determined by some contract. Such services are designed according to the principles of a service-oriented architecture. They can be either stateless or stateful. Stateful services can be useful, for example, for supporting conversational message exchange patterns and are usually instance or session-based, but they are monolithic in the sense that the session instantiation is always implicit. In general, a service-oriented approach (simple interactions, complex messages) may be better suited to building stateful web services, especially in the bigger B2B world, where integration is normally achieved through an exchange of XML documents. Coarse-grained services, with their API expressed in terms of the document exchange, are likely to be more suitable for creating loosely coupled, scalable and easily composable systems. Yet there still exists a certain class of applications which might be better exposed in a traditional session-oriented manner. Sometimes a cleaner design can be achieved by assigning orthogonal sets of functionality to separate services, and thus using simpler XML messages as a result. Such web services are fine-grained... If you believe that for a particular use case a fine grained design can result in a better interface, and that a reasonable compromise with respect to those problems can be achieved, then such a route should at least be explored. It is likely we'll see some standardization efforts in this area of state and resource management in the near future. Meanwhile, this article will look at ways of building stateful web services.
In particular we highlight different ways of defining service references and identifying individual sessions..."
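A minimal sketch of the session-based pattern the article describes, with the returned session token standing in for the service reference a client would carry (for instance, in a SOAP header). Names and structure here are invented for illustration:

```python
import uuid

class CounterService:
    """Illustrative stateful, fine-grained service: beginSession
    instantiates per-client state explicitly, and later calls carry the
    session id so the service can address that instance."""

    def __init__(self):
        self._sessions = {}

    def begin_session(self):
        sid = str(uuid.uuid4())
        self._sessions[sid] = {"count": 0}
        return sid  # the session reference handed back to the client

    def increment(self, sid):
        state = self._sessions.get(sid)
        if state is None:
            raise KeyError("unknown or expired session")
        state["count"] += 1
        return state["count"]

    def end_session(self, sid):
        self._sessions.pop(sid, None)
```

The explicit `begin_session`/`end_session` pair is what distinguishes this from the monolithic style, where session instantiation is implicit in the first request.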
[July 21, 2003] "Introduction to JSR 168 - The Java Portlet Specification." From Sun Microsystems. Whitepaper. 19 pages. "The Java Specification Request 168 Portlet Specification (JSR 168) standardizes how components for portal servers are to be developed. This standard has industry backing from major portal server vendors. The specification defines a common Portlet API and infrastructure that provides facilities for personalization, presentation, and security. Portlets using this API and adhering to the specification will be product agnostic, and may be deployed to any portal product that conforms to the specification. An example, the Weather Portlet, is provided by Sun to demonstrate the key functionality offered by the Portlet API: action request handling, render request handling, render parameters, dispatching to JavaServer Pages (JSP) technology, portlet tag library, portlet URLs, portlet modes, portlet cache and portlet preferences... IT Managers benefit from the ability to support multiple portal products, thus accommodating the unique business needs of various departments and audiences. The compliant portlets can be deployed to all compliant portal frameworks without extensive engineering changes. For developers, the specification offers code reusability. Developers who want to portal enable their applications can create and maintain one set of JSR 168 compliant portlets. These portlets can be run on any JSR 168 Portlet Specification compliant portal server with few, if any, modifications.
The Portlet Specification addresses the following topics: The portlet container contract and portlet life cycle management; The definition of window states and portlet modes; Portlet preferences management; User information; Packaging and deployment; Security; JSP tags to aid portlet development..." See also the sample portlet code supplied by Sun. Details in the news story "JSR 168 Portlet API Specification 1.0 Released for Public Review." [cache]
[July 21, 2003] "Identity-Management Plans Draw Praise." By Steven Marlin. In InformationWeek (July 17, 2003). "Liberty Alliance and SAML earn plaudits from the Financial Services Technology Consortium for making single sign-on easier for customers. The Financial Services Technology Consortium, a financial-services research group, last week praised two identity-management proposals, Liberty Alliance and Security Assertion Markup Language, for sparing customers the chore of maintaining multiple sets of IDs and passwords. By supporting single sign-on, Liberty Alliance and SAML have the potential to advance Web services initiatives, the FSTC says. Web services -- online applications that invoke other applications via standard protocols -- now require that users authenticate themselves to each application, analogous to someone having to present a building pass at the front entrance to a building and then again at the elevator, the office door, the lavatory, etc. SAML, an XML-based specification of the Organization for the Advancement of Structured Information Standards (OASIS), defines messages known as assertions containing information such as whether a person has already authenticated himself and whether the person has authority to access a particular resource. By exchanging assertions, online applications verify that users are who they claim to be without requiring them to log in. Liberty Alliance, a 2-year-old project backed by 170 companies, has published a set of technical and business guidelines for a 'federated' identity model in which the user logs in once at the beginning of a transaction and SAML assertions provide authentication at the intermediate stages. By enabling companies to automate the task of authenticating customers, employees, suppliers, and partners, the Liberty Alliance and SAML remove an obstacle to the adoption of Web services.
Web services' potential can't be realized until organizations can manage trusted relationships without human intervention, says Michael Barrett, president of Liberty Alliance and VP of Internet strategy at American Express... A four-month review by the financial consortium concluded that Liberty Alliance and SAML have the potential to quell consumer fears over identity theft. The review was backed by Bank of America, Citigroup, Fidelity Investments, Glenview State Bank, J.P. Morgan Chase & Co., National City Bank, University Bank, and Wells Fargo Bank. Although banks have moved to protect themselves against attacks from hackers, viruses, and network sabotage, they've been poor at communicating the steps they've taken to protect customers from online fraud, says George Tubin, a senior analyst in TowerGroup's delivery-channels service..." See: (1) "Liberty Alliance Publishes Business Requirements and Guidelines for Identity Federation"; (2) general references in "Liberty Alliance Specifications for Federated Network Identification and Authorization."
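To give a flavor of what an assertion looks like on the wire, here is a toy Python builder for a SAML 1.x-style authentication assertion. It borrows the SAML element names but omits the IDs, validity conditions, and XML signature a real assertion carries, so treat it as a sketch only:

```python
import xml.etree.ElementTree as ET
from datetime import datetime, timezone

# SAML 1.x assertion namespace.
SAML = "urn:oasis:names:tc:SAML:1.0:assertion"

def make_authentication_assertion(issuer, subject_name):
    """Build a simplified SAML-style authentication assertion: the issuing
    authority states that the named subject authenticated at a given time."""
    ET.register_namespace("saml", SAML)
    assertion = ET.Element("{%s}Assertion" % SAML, Issuer=issuer)
    stmt = ET.SubElement(
        assertion, "{%s}AuthenticationStatement" % SAML,
        AuthenticationInstant=datetime.now(timezone.utc).isoformat())
    subject = ET.SubElement(stmt, "{%s}Subject" % SAML)
    ET.SubElement(subject, "{%s}NameIdentifier" % SAML).text = subject_name
    return ET.tostring(assertion, encoding="unicode")
```

A relying application would receive such an assertion from the identity provider and accept the subject as authenticated without prompting for another login, which is the single sign-on behavior the consortium praised.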
[July 21, 2003] "SPML Passes Demo As Multi-Platform Provisioning Specification." By Vance McCarthy. In Enterprise Developer News (July 15, 2003). "OASIS execs passed a hurdle last week, as they successfully demoed the Service Provisioning Markup Language (SPML) as an XML-derived standard for multi-platform provisioning during last week's Catalyst Conference. SPML 1.0 is an XML-derivative that proposes to enable organizations to automate, centralize, and manage the process of provisioning user access to internal and external corporate systems and data. SPML was designed to work with the W3C's recently ratified SOAP 1.2 and the OASIS SAML and WS-Security specifications. Just published on June 1, SPML is now out of OASIS technical committee consideration, and being reviewed by the OASIS membership at large, which could approve the standard in late August. In the demo, a fictitious PeopleSoft employee was remotely created, sending an SPML 'document' via SOAP to the PeopleSoft application. Before arriving directly at PeopleSoft, the document -- or the XML schema -- was sent through a messaging multiplexer, which created a duplicate (or 'sub-document') and sent it to other privileged systems. The implication is that vendor-specific adapters could be replaced by open, standard XML schema which would allow different enterprise systems to more easily, and cost-effectively, interoperate and keep one another in synch. Aside from PeopleSoft, supporters of SPML include BMC Software, BEA Systems, Novell, Sun Microsystems, Business Layers, Entrust, OpenNetwork, Waveset, Thor Technologies, and TruLogica... Other security standards in process at OASIS include WS-Security for high-level security services, XACML for access control, XCBF for describing biometrics data, and SAML for exchanging authentication and authorization information..."
See: (1) "OASIS Member Companies Host SPML Identity Management Interoperability Event"; (2) "Sun and Waveset Provide Identity Management Solution for PeopleSoft Using SPML"; (3) general references in "XML-Based Provisioning Services."
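The multiplexer step of the demo -- one provisioning request duplicated and delivered to every privileged system -- reduces to a small fan-out pattern. The sketch below uses invented names, with plain callables standing in for vendor-specific endpoints:

```python
def multiplex(spml_request, endpoints):
    """Duplicate one SPML-style provisioning request and deliver an
    independent copy to every registered system, keeping them in sync.
    'endpoints' maps a system name to a callable standing in for its
    delivery channel."""
    results = {}
    for system, deliver in endpoints.items():
        results[system] = deliver(dict(spml_request))  # fresh copy per system
    return results

class RecordingSystem:
    """Toy target system that records the accounts it was asked to create."""
    def __init__(self):
        self.accounts = []
    def __call__(self, request):
        self.accounts.append(request["identifier"])
        return "success"
```

The appeal is exactly what the article describes: instead of one vendor-specific adapter per target, every system consumes the same standard request document.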
[July 21, 2003] "XSLT Performance in .NET." By Dan Frumin. In O'Reilly ONDotnet.com (July 14, 2003). "The Microsoft .NET Framework brings with it many new tools and improvements for developers. Among them is a very rich and powerful set of XML classes that allow the developer to tap into XML and XSLT in their applications. By now, everyone is familiar with XML, the markup language that is the basis for so many other standards. XSLT is a transformation-based formatter. You can use it to convert structured XML documents into some other form of text output -- quite often HTML, though it can also generate regular text, comma-separated output, more XML, and so on... Before the Microsoft .NET Framework was released, Microsoft published the XML SDK, now in version 4.0. The XML SDK is COM-based, and so can be used from any development language, not just Microsoft .NET. Its object model is also a little different than the .NET implementation, and therefore requires a bit of learning to use. But in the end, the XML SDK can do the same things for XSLT that the .NET Framework offers. Which raises the question: how do these two engines compare to each other in performance? This article will answer that question... Looking at the results, we can see that in a single end-to-end operation, the cost of the COM overhead can offset the advantages gained in transformation. This is especially true for smaller XML files (20 to 40 nodes). However, the margin of difference grows as the input files grow in size and as the transformation grows in complexity. When dealing with these scenarios, developers should consider using MSXML as well as two techniques to optimize their applications. First, consider storing the XSLT transform objects (including IXSLProcessor) in some shared location (e.g., a static member) for future use. This eliminates the cost of creating and preparing the XSLT objects and allows for a reusable transformation object that can simply be applied to XML input. 
Second, developers should consider creating their own COM object garbage collector for the XML files, especially if they are large in size..."
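The first suggestion -- keep compiled transform objects in a shared location rather than recompiling per request -- is ordinary memoization. A language-neutral sketch in Python, where the `compile_fn` callable stands in for loading and compiling a stylesheet:

```python
import threading

class TransformCache:
    """Cache compiled transform objects in a shared location so the cost
    of creating and preparing them is paid once, not per request."""

    def __init__(self, compile_fn):
        self._compile = compile_fn   # e.g., loads + compiles a stylesheet
        self._cache = {}
        self._lock = threading.Lock()  # safe for concurrent request threads
        self.compilations = 0          # instrumentation to show reuse

    def get(self, stylesheet_path):
        with self._lock:
            if stylesheet_path not in self._cache:
                self.compilations += 1
                self._cache[stylesheet_path] = self._compile(stylesheet_path)
            return self._cache[stylesheet_path]
```

In the article's terms, the cached object corresponds to the prepared XSLT processor (e.g., an `IXSLProcessor` or a .NET `XslTransform`) held in a static member for reuse across requests.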
[July 21, 2003] "An XML Fragment Reader." By William Brogden. In XML.com (July 21, 2003). ['A lot of XML parsing deals with document fragments, as opposed to complete documents. Unfortunately, XML parsers prefer to deal with entire documents. A Java solution turns out to be simple and quite flexible. It enables you to combine many bits of XML formatted character streams to feed an XML parser.'] "With the release of the Java SDK 1.4, XML parser classes joined the standard Java release, creating a standard API for parser access. Thus, in the org.xml.sax package, you'll find the InputSource class. An InputSource object can feed a character stream to either a SAX or a DOM parser. You can create an InputSource from a Reader, the basic Java class for streams of characters. A workable plan of attack is to create a class extending Reader that can supply characters to an InputStream from a sequence of character stream sources..."
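The same trick is easy to reproduce in other languages: since a parser insists on one well-formed document, stream the fragments between a synthetic pair of open and close tags for a wrapper root element. A Python sketch (the wrapper element name is arbitrary):

```python
import io
import xml.etree.ElementTree as ET

def parse_fragments(fragments, root_name="fragments"):
    """Combine many XML fragments into a single parseable document by
    wrapping them in a synthetic root element, then parse the result."""
    stream = io.StringIO()
    stream.write("<%s>" % root_name)
    for frag in fragments:
        stream.write(frag)
    stream.write("</%s>" % root_name)
    stream.seek(0)
    return ET.parse(stream).getroot()
```

Each original fragment then appears as a child of the synthetic root, which the application can iterate over as usual.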
[July 21, 2003] "From XML to Wireless, Office Suites Move With the Times. Enhanced Basics and Added Features Change the Dynamics of Office Suites." By Cecil Wooley. In Government Computer News Volume 22, Number 19 (July 21, 2003). "Office suites are even more indispensable than paper at government agencies. Probably 90 percent of every work task starts within an office suite and combines elements of e-mail, word processing, databases, graphics, spreadsheets, networking, instant messaging and presentations... [In this review] I tested four leading office suites, grading them for quality, ease of use and price. I looked for applications that could interact with one another and, to a lesser extent, with programs that were not in suites. When multiple versions of a suite were available, I chose the version with the most components not designed for specific users -- for example, accountants or Web developers. Microsoft Office, the king of office suites, has by far the largest market share. Office 11, originally due last month, won't arrive until August because of the large number of new features. Microsoft Corp. submitted a late beta version of Office 11 for this review... Overall, Office 11 represents a minor upgrade. The functions that changed since Office XP, especially XML compatibility and the integration of SharePoint Team software, were necessary. The rest was mostly cosmetic. Office remains the top suite for good reason: simple functions, tight integration and excellent business tools for users at all skill levels. Corel WordPerfect Office 11, with mainstays WordPerfect 11, Quattro Pro 11, Presentations 11 and Paradox 10, has run a distant second to Microsoft Office 11 for some time. But since the early 1980s WordPerfect has retained many loyal government users, particularly in legal offices... The WordPerfect file format has changed little since Version 6.1, so archived data is still compatible.
There was even a tool to convert older documents to XML format plus an XML editor that made using the converted work simple. Corel apparently has embraced XML even more thoroughly than Microsoft has. Corel also stuck to its strengths. I could print documents with all their coding and save them in Adobe Portable Document Format without any additional steps. For the legal community, there's a wizard to draw up court pleadings... I found StarOffice completely functional though lacking many extras such as document sharing. The drawbacks: no contact manager, scheduler or e-mail client. You could always use Microsoft Outlook Express for e-mail, but that would mean adding programs to the suite and eliminating the plug-and-play advantage... If you need a basic office suite and have little to spend, StarOffice 6 can do the job. Just don't look for the extras that most users have come to expect from other office suites..."
[July 21, 2003] "BEA Ships WebLogic Platform 8.1. The Suite Includes BEA's Application Server and JRockit Java Virtual Machine." By James Niccolai. In ComputerWorld (July 18, 2003). "BEA Systems Inc. has announced the general availability of WebLogic Platform 8.1, the latest edition of its suite of Java server software for developing, deploying and integrating business applications. The suite includes BEA's application server and JRockit Java virtual machine, which were released in March 2003, as well as new editions of its portal server, integration server and Workshop development environment. The products can be downloaded together or separately from BEA's Web site..." See the announcement: "BEA WebLogic Platform 8.1 Ships. New Products Offer Faster Time to Value by Converging the Development and Integration of Applications, Portals and Business Processes."
[July 21, 2003] "EIPs More Compelling Than Ever." By Jim Rapoza. In eWEEK (July 21, 2003). "While interest in many enterprise applications has cooled in the last few years, companies remain hot on enterprise information portals. And why not? Portals provide the much-needed ability to integrate and unify access to a company's applications, back-end systems, data sources and content repositories. And unlike many other pricey enterprise applications, EIPs continue to show an excellent return on investment. However, although the attractiveness of portals hasn't changed much, the applications themselves -- as well as the companies that provide them -- have changed a great deal. In eWEEK Labs' last big comparison of EIPs almost two years ago, many of the products we reviewed were moving toward greater use of XML and Java. Based on the products we review here and on other recent stand-alone portal reviews, that move now appears to be complete. In fact, all six of the EIPs we tested this time around are based on Java server technology and use XML heavily in their data structures. Not surprisingly, then, they all did a good job of consuming and creating Web services during our tests. For this eValuation, eWEEK Labs tested many of the major EIPs, which have all been revised during the last few months: Art Technology Group Inc.'s ATG 6.0, BEA Systems Inc.'s WebLogic Portal 8.1, Computer Associates International Inc.'s CleverPath Portal 4.51, Plumtree Software Inc.'s Corporate Portal 5.0, Sybase Inc.'s Enterprise Portal 5.1 and Vignette Corp.'s Application Portal 4.5. We decided not to include in this review portals that are tightly tied to specific back-end applications, such as SAP AG's MySAP... Portal consolidation may be easier now, given that all the systems are similar in their underlying architecture. 
These similarities will also prove to be a boon to companies implementing EIPs: Just a couple of years ago, implementing portals often meant learning new portlet languages and dealing with unfamiliar server applications. Now, expertise in Java and XML is enough to develop for any portal application. Still, these are far from commodity products. Companies need to answer questions such as the following to ensure that the portal they're buying will meet their needs. Does the portal make application integration simple? Can multiple portal implementations work together? Does the portal integrate well with existing security infrastructures? Can portal systems be easily managed and monitored? When doing a large comparative review such as this one, one product sometimes jumps clearly to the fore -- either through superior capabilities in all areas or a high level of innovation. In our EIP review, no one product was clearly superior to the others, and all of the products did well in our tests. However, several of the products we tested excelled in specific areas. In development of portlets and Web applications, BEA's WebLogic Portal and its WebLogic Workshop provided one of the best environments we've seen for creating these applications. Plumtree Corporate Portal offered very high levels of customization and design flexibility. And Vignette's Application Portal provided the best and most detailed portal administration interface we've seen..." See the announcement on BEA: "BEA WebLogic Platform 8.1 Ships. New Products Offer Faster Time to Value by Converging the Development and Integration of Applications, Portals and Business Processes."
[July 21, 2003] "The Security Components Exchange Protocol (SCXP)." By Yixian Yang (Information Security Center, Beijing University of Posts and Telecom, BUPT). IETF Internet Draft. Reference: draft-yang-scxp-00. June 2003, expires December 2003. Section 7 supplies the SCXP XML DTDs (SCXP DTD, channelType Option DTD, channelPRI Option DTD). "This document describes the Security Components Exchange Protocol (SCXP), an application-level protocol for exchanging data between security components. SCXP supports mutual authentication, integrity, confidentiality and replay protection over a connection-oriented protocol. SCXP is designed on the Blocks Extensible Exchange Protocol (BEEP), and in a way it can be looked upon as a profile of BEEP. BEEP is a generic application protocol framework for connection-oriented, asynchronous interactions. Within BEEP, features such as authentication, privacy, and reliability through retransmission are provided. A chief objective of this protocol is to exchange data between security components..." See also: "Blocks eXtensible eXchange Protocol Framework (BEEP)."
[July 21, 2003] "SCO Takes Aim at Linux Users." By Stephen Shankland and Lisa M. Bowman. In CNET News.com (July 21, 2003). "SCO Group, a company that says Linux infringes on its Unix intellectual property, announced on Monday that it has been granted key Unix copyrights and will start a program to let companies that run Linux avoid litigation by paying licensing fees. The company, which is at the heart of a controversial lawsuit over Linux code, said it plans to offer licenses that will support run-time, binary use of Linux to all companies that use Linux kernel versions 2.4 and later. SCO sparked a major controversy in the Linux world in March, when it sued IBM, saying the company had incorporated SCO's Unix code into Linux and seeking $1 billion in damages. The company alleged, among other things, trade secret theft and breach of contract. SCO then updated its demands in June, saying IBM owed it $3 billion. In the meantime, it sent out letters to about 1,500 Linux customers, warning them that their use of Linux could infringe on SCO's intellectual property. The claim of copyrights on the Unix code in question may raise the stakes in the dispute. Some attorneys say a copyright claim, which was not included in the earlier allegations against IBM, could be easier for the company to prove. SCO said prices for licensing its Unix System V source code would be announced in coming weeks. Pricing will be based on the cost of UnixWare 7.1.3, the company's current Unix product. SCO, at least initially, isn't directly targeting home users of Linux, McBride said..."
[July 21, 2003] "XQuery and SQL: Vive la Différence." By Ken North. In DB2 Magazine (Quarter 3, 2003). "Sometimes SQL and XML documents get along fine. Sometimes they don't. A new query language developed by SQL veterans is promising to smooth things over and get everything talking again. It's impossible to discuss the future of the software industry without discussing XML. XML has become so important that SQL is no longer the stock reply to the question, 'What query language is supported by all the major database software companies?' The new kid on the block is XQuery, a language for running queries against XML-tagged documents in files and databases. A specification published by the World Wide Web Consortium (W3C) and developed by veterans of the SQL standards process, XQuery emerged because SQL -- which was designed for querying relational data -- isn't a perfect match for XML documents. Although SQL works quite well for XML data when there's a suitable mapping between SQL tables and XML documents, it isn't a universal solution. Some XML documents don't reside in SQL databases. Some are shredded or decomposed before their content is inserted into an SQL database. Others are stored in native XML format, with no decomposition. And the nature of XML documents themselves poses other challenges for SQL. XML documents are hierarchical or tree-structured data. They're self-describing in that they consist of content and markup (tags that identify the content). In SQL databases, such as DB2, individual rows don't contain column names or types because that information is in the system catalog. The XML model is different. As with SQL, schemas that are external to the content they describe define names and type information. However, it's possible to process XML documents without using schemas. XML documents contain embedded tags that label the content. But unlike SQL, order is important when storing and querying XML documents.
The nesting and order of elements in a document must be preserved in XML documents. Many queries against documents require positional logic to navigate to the correct node in a document tree. When shredding documents and mapping them to columns, it's necessary to store information about the document structure. Even mapping XML content to SQL columns often requires navigational logic to traverse a document tree. Other requirements for querying XML documents include pattern matching, calculations, expressions, functions, and working with namespaces and schemas... For these and other reasons, the W3C in 1998 convened a workshop to discuss proposals for querying XML and chartered the XML Query Working Group..." General references in "XML and Databases."
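The distinction the article draws -- positional, navigational access to ordered trees versus set-oriented SQL rows -- can be illustrated with a minimal sketch (using Python's standard-library XML parser rather than an XQuery engine; the document and its element names are invented for illustration):

```python
import xml.etree.ElementTree as ET

doc = """
<book>
  <chapter><title>Intro</title></chapter>
  <chapter><title>XQuery</title></chapter>
  <chapter><title>SQL Mapping</title></chapter>
</book>
"""

root = ET.fromstring(doc)

# Positional logic: element order is significant in XML, so
# "the second chapter" is a meaningful, stable query.
second = root.findall("chapter")[1]
print(second.find("title").text)   # XQuery

# Navigational logic: walk the tree to "shred" the hierarchy
# into ordered rows suitable for insertion into an SQL table --
# note that the position must be stored explicitly, since rows
# carry no inherent order.
rows = [(i + 1, ch.find("title").text)
        for i, ch in enumerate(root.findall("chapter"))]
print(rows)
```

Storing the position alongside each shredded row is exactly the kind of "information about the document structure" the article says shredding must preserve.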
[July 21, 2003] "Foundry Networks Launches XML Switching for Load Balancer. TrafficWorks Ironware OS is Used in its Upper-Layer Load Balancing Switches." By Matt Hamblen. In ComputerWorld (July 21, 2003). "Foundry Networks Inc. today will release a new version of its TrafficWorks Ironware operating system, which is used in its upper-layer load-balancing switches. A key ingredient of Version 9.0 is XML switching capability to control and direct traffic based on XML tags, which should make it easier to control e-commerce traffic over extranets connected to suppliers and customers, Foundry executives said. Among other features, 9.0 also includes an enhancement to provide denial-of-service (DOS) protection, which protects servers against TCP SYN and TCP ACK attacks, according to San Jose-based Foundry. This protection comes at 1.5 million packets/sec., a 15-fold increase over previous versions. Bryan A. Larrieu, vice president of voice, data and system security at CheckFree Corp. in Atlanta, has been testing Version 9.0 and is especially pleased with the DOS protection improvements. He said he has used prior versions and had hoped for such an enhancement..."
[July 21, 2003] "Content-Centric XML: Coming Soon to an Intranet Near You?" By Robert J. Boeri. In (July 20, 2003). "Content-centric XML hasn't followed its original five-year script. Celebrating its fifth birthday as a standard last February, XML was supposed to supplant HTML, shift the burden of processing Web sites from servers to underutilized client PCs, and achieve the holy grail of 'create once, reuse many times.' Although use of XML to transfer information between applications was one of the World Wide Web Consortium's original goals, emphasis was on content-centric XML: Web pages and documents. What happened...? Although XML originally emphasized text content, multimedia use is also increasing, especially on intranets. And now here's the XML-intranet connection. Intranets often provide employees with external newsfeeds. It's easy linking to these feeds, if you're satisfied with employees jumping outside the firewall or viewing them in a pop-up window. If that's all you want, then basic HTML (or XHTML) works fine. But if you'd like to store that news locally, index and search it, or contribute new content expressed in XML, consider NewsML. 'News Markup Language' is an XML schema conceived by Reuters, developed and ratified by the International Press Telecommunications Council, and increasingly a standard for composing and delivering news. NewsML provides a way to produce news and maintain its metadata, and it supports text and rich media. Intranets can use automated processes to deliver NewsML content to a wide variety of devices such as financial service desktops, Web sites, and mobile phones. Journalists can write news stories using standard XML authoring tools in several languages... Although NewsML isn't a one-size-fits-all model (no model is), its adoption is growing both by news organizations creating syndicated content as well as intranets delivering that content. 
NewsML works its magic with a carefully conceived schema that packages news items, regardless of language or media type, with robust metadata. News 'envelopes' contain one or more news items, which in turn contain one or more components in one or more written or spoken languages. Envelopes describe items with attributes like date and time sent, news service, and priority. Text, images, video, and sound can be packaged in an item as hyperlinks. And Gregor Geiermann, a consultant with NetFederation Interactive Media, actually uses NewsML and has success stories to tell..." General references in "NewsML."
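The envelope-plus-items packaging described above can be sketched with a pared-down, NewsML-flavored document (element names here are simplified for illustration and do not track the official IPTC schema):

```python
import xml.etree.ElementTree as ET

# One envelope carrying delivery metadata (date/time, priority)
# and two news items in different languages, echoing the
# "envelopes contain one or more news items" structure above.
feed = """
<NewsML>
  <NewsEnvelope>
    <DateAndTime>20030721T120000</DateAndTime>
    <Priority>4</Priority>
  </NewsEnvelope>
  <NewsItem lang="en"><headline>Markets rally</headline></NewsItem>
  <NewsItem lang="de"><headline>Maerkte erholen sich</headline></NewsItem>
</NewsML>
"""

root = ET.fromstring(feed)

# Envelope attributes describe the delivery, not the content.
priority = root.find("NewsEnvelope/Priority").text

# Items carry the content itself, keyed here by language.
headlines = {item.get("lang"): item.find("headline").text
             for item in root.findall("NewsItem")}
print(priority, headlines)
```

An intranet aggregator could route on the envelope metadata (priority, service, time sent) without parsing item content at all, which is one reason the envelope/item split is useful.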
[July 21, 2003] "Auto-ID Center Opens Demo Lab." By [RFID Journal Staff]. In RFID Journal News (July 11, 2003). ['The center today opened a robotic demonstration lab at its facility in Cambridge, England, to show off RFID's manufacturing capabilities.'] "Most of the focus on low-cost RFID has been on moving items from manufacturer to distribution center to store. Today, the Auto-ID Center opened a robotic demonstration at its facility in Cambridge, England, which shows the value of robots being able to identify unique items... The demonstration highlights automatic picking, placing, storage and flexible packaging. The lab has product bins where tagged items are stored before being packed. There is a packing area, where empty gift boxes come in, and a storage area for individual items that haven't been packed. A robot in the middle of the station can perform several different tasks. The robot chooses from a variety of Gillette products, including razors and deodorants, to assemble a gift pack. There are two different types of packaging. As a new package comes into the station, the RFID tag on it tells the robot what type of package it is and triggers the order... [In the Auto-ID Center's system] the RFID tag contains an EPC, a serial number that identifies the unique item. When a reader picks up an EPC code, it sends the number to a computer running something called a Savant. Savants are distributed software programs that manage data. They can, for instance, eliminate duplicate codes if two readers pick up the same item. The Savant sends the EPC to an Object Name Service, which is similar to the Web's Domain Name Service. ONS points the Savant to a Physical Markup Language (PML) server where data on the product is stored. PML is [based upon] XML, created by the Auto-ID Center to describe products in ways computers could understand and respond to. The PML server then sends instructions to the robot. 
Mark Harrison, a research associate at the Auto-ID Center, says the robot needs only to be connected to the Internet. Instructions can be sent from a PML server located literally anywhere in the world; to reduce latency, of course, it makes sense to use a PML server located fairly close to the robot. Harrison says that the interaction between the item and the robot happens quickly because only a small fragment of the PML file is actually sent to the robot..." Note, on the (evidently misplaced) concern for privacy, WRT RFID: "Big Brother's Enemy," by RFID Journal editor Mark Roberti. See: (1) Auto-ID Center website; (2) "Physical Markup Language (PML) for Radio Frequency Identification (RFID)."
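The EPC-to-Savant-to-ONS chain described above can be sketched as follows (all identifiers, the EPC format used here, and the registry contents are invented for illustration; real ONS resolution rides on DNS infrastructure):

```python
# Readers report EPCs; a Savant-like filter drops duplicates picked
# up by overlapping readers; an ONS-style lookup then maps an EPC
# manager prefix to the PML server holding the product description.

def savant_filter(reads):
    """Drop duplicate EPCs reported by overlapping readers."""
    seen, unique = set(), []
    for epc in reads:
        if epc not in seen:
            seen.add(epc)
            unique.append(epc)
    return unique

# ONS is analogous to the Web's DNS: it resolves the manager
# portion of an EPC to the host serving that product's PML data.
ONS_REGISTRY = {"0614141": "pml.example.com"}   # hypothetical entry

def ons_lookup(epc):
    manager_prefix = epc.split(".")[1]   # e.g., "urn:epc:id.0614141.100.1"
    return ONS_REGISTRY.get(manager_prefix)

reads = ["urn:epc:id.0614141.100.1", "urn:epc:id.0614141.100.1",
         "urn:epc:id.0614141.100.2"]
for epc in savant_filter(reads):
    print(epc, "->", ons_lookup(epc))
```

The robot in the demo sits at the end of this chain: once the PML server is resolved, only the small PML fragment relevant to the current task is pushed to it, which is why Harrison says the interaction is fast.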
[July 20, 2003] "Debate Flares Over Weblog Standards. Despite Technical Battles, Weblogs Prepare to Alter the Collaboration and Content Management Space." By Cathleen Moore. In InfoWorld (July 18, 2003). "Weblogs are poised to roil the status quo of enterprise collaboration and content management despite recent debate regarding the protocols underpinning the technology. Quietly flourishing for years with tools from small vendors, online personal publishing technology has skyrocketed in popularity during the past year, attracting serious interest from megaplayers such as AOL and Google. This summer, AOL plans to launch a Weblog tool dubbed AOL Journals, while Google continues to digest Pyra Labs, acquired earlier this year. Most Weblogs are currently fueled by RSS, known both as Really Simple Syndication and RDF (Resource Description Framework) Site Summary. Based on XML, RSS is a Web publishing format for syndicating content, and it is heralded for its simple yet highly effective means of distributing information online. Although not officially sanctioned by a standards body, the format enjoys wide adoption by RSS content aggregators and publishing systems. Media companies such as the BBC, The New York Times, and InfoWorld currently support RSS... Despite the undisputed popularity and proven utility of RSS, a new standard is emerging in an attempt to lay the foundations for the Weblog's future. Originally dubbed Echo and now rechristened as Atom, the effort is described as a grassroots, vendor-neutral push to address some of the limitations of RSS. Rather than adding to the existing RSS specification, development on these issues has splintered off into a separate effort due to disagreement among community members as to the purpose and direction of RSS. The idea is to build on the foundation of RSS, according to Anil Dash, vice president of business development at Six Apart, a San Francisco-based Weblog vendor.
'The reason there is a need for something else is that there are new types of data and richer and more complex connections we are trying to do that RSS is not meant to do,' Dash said. Critics charge that the multiple versions of RSS, the number of which ranges between two and five depending on whom you talk to, are causing confusion and are hindering interoperability. 'To date, people [involved with RSS] have failed to converge on one version and make the confusion go away,' Antarctica's Bray said. Other issues with RSS include the lack of an API component for editing and extending Weblogs. RSS uses separate APIs, metaWeblog and Blogger, which are controlled by Userland Software and Google, respectively. Atom will be necessary for enterprises that 'want interoperability or need to exchange data with someone who is outside the firewall,' Six Apart's Dash said..." General references in "RDF Site Summary (RSS)."
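For readers unfamiliar with the format under debate, a minimal RSS 2.0 feed and its consumption look like this (the feed content is invented; the channel/item structure follows the RSS 2.0 format):

```python
import xml.etree.ElementTree as ET

# A minimal RSS 2.0 feed: a channel element wrapping feed-level
# metadata and a list of item elements, each with title and link.
rss = """
<rss version="2.0">
  <channel>
    <title>Example Weblog</title>
    <item>
      <title>Debate flares over weblog standards</title>
      <link>http://example.com/2003/07/20/standards</link>
    </item>
  </channel>
</rss>
"""

channel = ET.fromstring(rss).find("channel")
for item in channel.findall("item"):
    print(item.find("title").text, "->", item.find("link").text)
```

The simplicity on display here is exactly what made RSS spread; the Atom effort's complaint is about what this picture leaves out -- an editing API and a single unambiguous version of the format, not the syndication model itself.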
[July 16, 2003] "XML Semantics and Digital Libraries." By Allen Renear (University of Illinois at Urbana-Champaign), David Dubin (University of Illinois at Urbana-Champaign), C. M. Sperberg-McQueen (MIT Laboratory for Computer Science), and Claus Huitfeldt (Department for Culture, Language, and Information Technology, Bergen University Research Foundation). Pages 303-305 (with 14 references) in Proceedings of the Third ACM/IEEE-CS Joint Conference on Digital Libraries (JCDL 2003, May 27-31, 2003, Rice University, Houston, Texas, USA). Session on Standards, Markup, and Metadata. "The lack of a standard formalism for expressing the semantics of an XML vocabulary is a major obstacle to the development of high-function interoperable digital libraries. XML document type definitions (DTDs) provide a mechanism for specifying the syntax of an XML vocabulary, but there is no comparable mechanism for specifying the semantics of that vocabulary -- where semantics simply means the basic facts and relationships represented by the occurrence of XML constructs. A substantial loss of functionality and interoperability in digital libraries results from not having a common machine-readable formalism for expressing these relationships for the XML vocabularies currently being used to encode content. Recently a number of projects and standards have begun taking up related topics. We describe the problem and our own project... Our project focuses on identifying and processing actual document markup semantics, as found in existing document markup languages, and not on developing a new markup language for representing semantics in general... XML semantics in our sense refers simply to the facts and relationships expressed by XML markup. It does not refer to processing behavior, machine states, linguistic meaning, business logic, or any of the other things that are sometimes meant by 'semantics'.
[For example:] (1) Propagation: Often the properties expressed by markup are understood to be propagated, according to certain rules, to child elements. For instance, if an element has the attribute specification lang='de', indicating that the text is in German, then all child elements have the property of being in German, unless the attribution is defeated by an intervening reassignment. Language designers, content developers, and software designers all depend upon a common understanding of such rules. But XML DTDs provide no formal notation for specifying which attributes are propagated or what the rules for propagation are. (2) Class Relationships and Synonymy: XML itself contains no general constructs for expressing class membership or hierarchies among elements, attributes, or attribute values -- one of the most fundamental relationships in contemporary information modeling. (3) Ontological variation in reference: XML markup might appear to indicate that the same thing is-a-noun, is-a-French-citizen, is-illegible, and has-been-copyedited; but obviously these predicates either really refer to different things or must be given non-standard interpretations. (4) Parent/Child overloading: The parent/child relations of the XML tree data structure support a variety of implicit substantive relationships... These examples demonstrate several things: what XML semantics is, that it would be valuable to have a system for expressing XML semantics, and that it would be neither trivial nor excessively ambitious to develop such a system. We are not attempting to formalize common sense reasoning in general, but only the inferences that are routinely intended by markup designers, assumed by content developers, and inferred by software designers...
The BECHAMEL Markup Semantics Project led by Sperberg-McQueen (W3C/MIT) grew out of research initiated in the late 1990s and is a partnership with the research staff and faculty at Bergen University (Norway) and the Electronic Publishing Research Group at the University of Illinois. The project explores representation and inference issues in document markup semantics, surveys properties of popular markup languages, and is developing a formal, machine-readable declarative representation scheme in which the semantics of a markup language can be expressed. This scheme is applied to research on information retrieval, document understanding, conversion, preservation, and document authentication. An early Prolog inferencing system has been developed into a prototype knowledge representation workbench for representing facts and rules of inference about structured documents." See general references in "XML and 'The Semantic Web'."
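The propagation example (1) above is easy to state but, as the authors note, not expressible in a DTD. A minimal sketch of what a processor must do instead -- compute each element's effective language by walking the tree (element names are illustrative; BECHAMEL itself uses Prolog, not Python):

```python
import xml.etree.ElementTree as ET

doc = """
<text lang="de">
  <p>Ein Absatz.
    <quote lang="en">An intervening reassignment.</quote>
  </p>
</text>
"""

def effective_langs(elem, inherited=None, out=None):
    """Assign each element the nearest enclosing lang attribute."""
    if out is None:
        out = {}
    # A local lang value "defeats" the inherited one, as in the
    # propagation rule described by the paper.
    lang = elem.get("lang", inherited)
    out[elem.tag] = lang
    for child in elem:
        effective_langs(child, lang, out)
    return out

print(effective_langs(ET.fromstring(doc)))
# {'text': 'de', 'p': 'de', 'quote': 'en'}
```

The point of the paper is that this propagation rule lives only in prose and in code like the above; nothing in the DTD itself says that lang propagates while, say, an id attribute does not.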
[July 16, 2003] "The XML Log Standard for Digital Libraries: Analysis, Evolution, and Deployment." By Marcos André Gonçalves, Ganesh Panchanathan, Unnikrishnan Ravindranathan, Aaron Krowne, and Edward A. Fox (Virginia Polytechnic Institute and State University, Blacksburg, VA); Filip Jagodzinski and Lillian Cassel (Villanova University, Villanova, PA). Pre-publication version of the paper delivered at JCDL 2003. Pages 312-314 (with 5 references) in Proceedings of the Third ACM/IEEE-CS Joint Conference on Digital Libraries (JCDL 2003, May 27-31, 2003, Rice University, Houston, Texas, USA). Session on Standards, Markup, and Metadata. ['The authors describe current efforts and developments building on our proposal for an XML log standard format for digital library (DL) logging analysis and companion tools. Focus is given to the evolution of formats and tools, based on analysis of deployment in several DL systems and testbeds.'] "In 2002 we proposed an XML log standard for digital libraries (DLs), and companion tools for storage and analysis. The goal was to minimize problems and limitations of web servers, search engines, and DL systems log formats (e.g., incompatibility, incompleteness, ambiguity). Accordingly, our new format and tools allow capturing a rich, detailed set of system and user behaviors supported by current DL systems. In this paper, we report advances based on analysis of experimentation and deployment in several DL systems and testbeds. We hope that discussion of this work will move the community toward agreement on some DL log standard, which is urgently needed to support scientific advance... Our next generation DL logger will enhance this communication by allowing direct, peer-to-peer communication between DL components and the (componentized) log tool.
Following the philosophy of the Open Archives Initiative, we intend to use standard (or slightly extended) lightweight protocols, to allow this direct communication, therefore promoting interoperability and reuse. In particular, the extended OAI (XOAI) set of protocols defined by the ODL approach, provides specialized OAI protocols for several DL services and can serve as a foundation for such communications... The design of the log analysis tools is highly object oriented, with little or no coupling between modules. The design makes modification and creation of new modules very easy. In the case where a novel statistic is required or in the case that a new XML format feature is added, a new module can be built and connected to the already existing set of modules. The modular design of the log analysis tools also will allow for more advanced analysis capabilities to be integrated into future versions. The current document search and browse output statistics provide information about the total number of hits for each document as well as a breakdown of hits based on aspects of the server domain... Our formats and tools have evolved to deal with the results of such experiments. With the interest demonstrated by many DLs and institutions (e.g., CiteSeer, MyLibrary, Daffodil) in adopting the format and tools, we expect soon to release stable versions of both. Once this phase is achieved, other research issues will become the focus of future efforts, such as richer analysis and evaluation, and efficient use of distributed storage..." 
See: (1) the Digital Library XML Logging Standard and Tools project website; (2) "An XML Log Standard and Tool for Digital Library Logging Analysis," in Proceedings of the Sixth European Conference on Research and Advanced Technology for Digital Libraries (Rome, Italy, September 16-18, 2002); (3) background in "Streams, Structures, Spaces, Scenarios, Societies (5S): A Formal Model for Digital Libraries" (Technical Report TR-03-04, Computer Science, Virginia Tech). [cache]
[July 16, 2003] "Logic Grammars and XML Schema." By C. M. Sperberg-McQueen (World Wide Web Consortium / MIT Laboratory for Computer Science, Cambridge MA). Draft version of paper prepared for Extreme Markup Languages 2003, Montréal. "This document describes some possible applications of logic grammars to schema processing as described in the XML Schema specification. The term logic grammar is used to denote grammars written in logic-programming systems; the best known logic grammars are probably definite-clause grammars (DCGs), which are a built-in part of most Prolog systems. This paper works with definite-clause translation grammars (DCTGs), which employ a similar formalism but which more closely resemble attribute grammars as described by [D. Knuth, 'Semantics of Context-Free Languages,' 1968] and later writers; it is a bit easier to handle complex specifications with DCTGs than with DCGs. Both DCGs and DCTGs can be regarded as syntactic sugar for straight Prolog; before execution, both notations are translated into Prolog clauses in the usual notation... Any schema defines a set of trees, and can thus be modeled more or less plausibly by a grammar. Schemas defined using XML Schema 1.0 impose some constraints which are not conveniently represented by pure context-free grammars, and the process of schema-validity-assessment defined by the XML Schema 1.0 specification requires implementations to produce information that goes well beyond a yes/no answer to the question 'is this tree a member of the set?' For both of these reasons, it is convenient to use a form of attribute grammar to model a schema; logic grammars are a convenient choice. In [this] paper, I introduce some basic ideas for using logic grammars as a way of animating the XML Schema specification / modeling XML Schema... 
The paper attempts to make plausible the claim that a similar approach can be used with the XML Schema specification, in order to provide a runnable XML Schema processor with a very close tie to the wording of the XML Schema specification. Separate papers will report on an attempt to make good on the claim by building an XML Schema processor using this approach; this paper will focus on the rationale and basic ideas, omitting many details..." See also the abstract for the Extreme Markup paper [Tuesday, August 5, 2003]: "The XML Schema specification is dense and sometimes hard to follow; some have suggested it would be better to write specifications in formal, executable languages, so that questions could be answered just by running the spec. But programs are themselves often even harder to understand. Representing schemas as logic grammars offers a better approach: logic grammars can mirror the wording of the XML Schema specification, and at the same time provide a runnable implementation of it. Logic grammars are formal grammars written in logic-programming systems; in the implementation described here, logic grammars capture both the general rules of XML Schema and the specific rules of a particular schema." Note: the paper is described as an abbreviated version of "Notes on Logic Grammars and XML Schema: A Working Paper Prepared for the W3C XML Schema Working Group"; this latter document (work in progress 2003-07) provides "an introduction to definite-clause grammars and definite-clause translation grammars and to their use as a representation for schemas." General references in "XML Schemas."
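The core idea -- a grammar rule that both recognizes a tree and computes information about it, rather than answering a bare yes/no -- can be sketched outside Prolog. This is not the DCTG formalism itself, just a small Python analogue of attribute-grammar-style schema assessment; the toy schema and tree encoding are invented:

```python
# A schema as a mapping from element name to allowed child names.
schema = {"book": ["title", "chapter"], "title": [], "chapter": []}

def assess(tree, schema):
    """Validate a (tag, children) tree and return an annotated copy."""
    tag, children = tree
    allowed = schema.get(tag)
    valid = allowed is not None and all(c[0] in allowed for c in children)
    assessed = [assess(c, schema) for c in children]
    valid = valid and all(r["valid"] for r in assessed)
    # The result carries more than set membership: per-node validity,
    # echoing the "information that goes well beyond a yes/no answer"
    # that schema-validity-assessment must produce.
    return {"tag": tag, "valid": valid, "children": assessed}

doc = ("book", [("title", []), ("chapter", []), ("footnote", [])])
report = assess(doc, schema)
print(report["valid"])   # False: <footnote> is not allowed under <book>
```

In a DCTG the recognition rules and the attribute computations are written in one declarative notation that mirrors the prose of the spec; the sketch above only shows why the output of assessment is richer than membership in a set of trees.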
[July 15, 2003] "Testing Structural Properties in Textual Data: Beyond Document Grammars." By Felix Sasaki and Jens Pönninghaus (Universität Bielefeld). [Pre-publication draft of paper published] in Literary and Linguistic Computing Volume 18, Issue 1 (April 2003), pages 89-100. "Schema languages concentrate on grammatical constraints on document structures, i.e., hierarchical relations between elements in a tree-like structure. In this paper, we complement this concept with a methodology for defining and applying structural constraints from the perspective of a single element. These constraints can be used in addition to the existing constraints of a document grammar. There is no need to change the document grammar. Using a hierarchy of descriptions of such constraints allows for a classification of elements. These are important features for tasks such as visualizing, modelling, querying, and checking consistency in textual data. We call a document containing descriptions of such constraints a 'context specification document' (CSD). We describe the basic ideas of a CSD, its formal properties, the path language we are currently using, and related approaches. Then we show how to create and use a CSD. We give two example applications for a CSD. Modelling co-referential relations between textual units with a CSD can help to maintain consistency in textual data and to explore the linguistic properties of co-reference. In the area of textual, non-hierarchical annotation, several annotations can be held in one document and interrelated by the CSD. In the future we want to explore the relation and interaction between the underlying path language of the CSD and document grammars..."
See: (1) the abstract for LitLin; (2) the research group's publication list; (3) the related paper "Co-reference annotation and resources: a multilingual corpus of typologically diverse languages", in Proceedings of the Third International Conference on Language Resources and Evaluation (LREC-2002); (4) related references in "Markup Languages and (Non-) Hierarchies." [source PDF]
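The co-reference consistency application mentioned above lends itself to a toy sketch: a constraint stated from the perspective of single elements and checked on top of (not written into) the document grammar. The attribute names ("id", "corefers") and the document are invented for illustration and are not the paper's CSD notation:

```python
import xml.etree.ElementTree as ET

doc = """
<text>
  <name id="n1">Anna</name>
  <pron corefers="n1">she</pron>
  <pron corefers="n9">her</pron>
</text>
"""

root = ET.fromstring(doc)

# Collect every declared id in the document.
ids = {e.get("id") for e in root.iter() if e.get("id")}

# Constraint, from each referring element's perspective: its
# corefers value must resolve to an existing id. A DTD can demand
# that the attribute be present, but not that it resolves.
dangling = [e.get("corefers") for e in root.iter()
            if e.get("corefers") and e.get("corefers") not in ids]
print(dangling)   # ['n9'] -- the inconsistent co-reference
```

The CSD approach generalizes this: such per-element constraints are collected in a separate document with its own path language, so the document grammar itself never has to change.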
[July 15, 2003] "Identifying Metadata Elements with URIs: The CORES Resolution." By Thomas Baker (Birlinghoven Library, Fraunhofer-Gesellschaft) and Makx Dekkers (PricewaterhouseCoopers). In D-Lib Magazine Volume 9, Number 7/8 (July/August 2003). ISSN: 1082-9873. "On 18-November-2002, at a meeting organised by the CORES Project (Information Society Technologies Programme, European Union), several organisations regarded as maintenance authorities for metadata elements achieved consensus on a resolution to assign Uniform Resource Identifiers (URIs) to metadata elements as a useful first step towards the development of mapping infrastructures and interoperability services. The signatories of the CORES Resolution agreed to promote this consensus in their communities and beyond and to implement an action plan in the following six months. Six months having passed, the maintainers of Global Information Locator Service (GILS), ONIX, MARC 21, CERIF, DOI, IEEE/LOM, and Dublin Core report on their implementations of the resolution and highlight issues of relevance to establishing good-practice conventions for declaring, identifying, and maintaining metadata elements more generally. In June 2003, the resolution was also endorsed by the maintainers of UNIMARC. After presenting the text of the CORES Resolution and its three 'clarifications', the article summarises the position of each signatory organisation towards assigning URIs to its metadata elements, noting any practical or strategic problems that may have emerged... The article closes with a few general observations about these first steps towards the clarification of shared conventions for the identification of metadata elements and perhaps, one can hope, towards the ultimate goal of improving interoperability among a diversity of metadata communities. 
In the six months since the signing of the CORES Resolution, the signatories have worked towards translating their commitments into practical URI assignment and persistence policies. Given the need to evaluate the impact of design decisions and to build consensus in the communities behind the standards, it was perhaps too ambitious to expect that policies could be finalised and URIs assigned within just six months. However, having such a short fuse for such a specific set of tasks has highlighted a number of areas where forms of good practice have yet to emerge... Beyond mandating the assignment of URIs to 'elements', the Resolution left it up to the signatories to decide exactly what that means in the context of a particular standard and which other entities, such as sets of elements or values in controlled vocabularies, should also be so identified. Some interesting questions have arisen in this regard: (1) Should the URI of an element reflect a hierarchical context within which it is embedded? (2) If organisation A creates a URI designating an entity maintained by organisation B, and organisation B then creates its own URI for the same entity, by what etiquette or mechanism can the redundant identifiers be cross-referenced or preferences declared? (3) If semantically identical elements are shared across multiple element sets maintained by an organisation, should they each be assigned a separate URI or share one common URI? (4) Should successive historical versions of an element share a single, unchanging URI, or should each version be assigned its own URI, perhaps by embedding a version number in the URI string? [...] The Resolution leaves it to the signatory organizations what the URIs should look like and explicitly says that no assumptions should be made that URIs resolve to something on the Web... The Resolution is silent about how the URIs assigned can be used in asserting semantic relationships between elements in different sets. 
URIs were seen as a useful common basis for asserting the relationship of elements in a diversity of applications to shared ontologies such as the Basic Semantic Register or the <indecs> Data Dictionary, or to formally express the relationship between two element sets in the machine-processable and re-usable form of an RDF schema. Facilitating the expression and processing of such assertions in the interest of interoperability between different forms of metadata was seen by its signatories as the longer-term significance of the CORES Resolution..."
[July 15, 2003] "Using the OAI-PMH ... Differently." By Herbert Van de Sompel (Digital Library Research and Prototyping, Los Alamos National Laboratory), Jeffrey A. Young (OCLC Office of Research), and Thomas B. Hickey (OCLC Office of Research). In D-Lib Magazine Volume 9, Number 7/8 (July/August 2003). ISSN: 1082-9873. "The Open Archives Initiative Protocol for Metadata Harvesting (OAI-PMH) was created to facilitate discovery of distributed resources. The OAI-PMH achieves this by providing a simple, yet powerful framework for metadata harvesting. Harvesters can incrementally gather records contained in OAI-PMH repositories and use them to create services covering the content of several repositories. The OAI-PMH has been widely accepted, and until recently, it has mainly been applied to make Dublin Core metadata about scholarly objects contained in distributed repositories searchable through a single user interface. This article describes innovative applications of the OAI-PMH that we have introduced in recent projects. In these projects, OAI-PMH concepts such as resource and metadata format have been interpreted in novel ways. The result of doing so illustrates the usefulness of the OAI-PMH beyond the typical resource discovery using Dublin Core metadata. Also, through the inclusion of XSL stylesheets in protocol responses, OAI-PMH repositories have been directly overlaid with an interface that allows users to navigate the contained metadata by means of a Web browser. In addition, through the introduction of PURL partial redirects, complex OAI-PMH protocol requests have been turned into simple URIs that can more easily be published and used in downstream applications... Through the creative interpretation of the OAI-PMH notions of resource and metadata format, repositories with rather unconventional content, such as Digital Library usage logs, can be deployed. 
These applications further strengthen the suggestion that the OAI-PMH can effectively be used as a mechanism to maintain state in distributed systems. [We] show that simple user interfaces can be implemented by the mere use of OAI-PMH requests and responses that include stylesheet references. For certain applications, such as the OpenURL Registry, the interfaces that can be created in this manner seem to be quite adequate, and hence the proposed approach is attractive if only because of the simplicity of its implementation. The availability of an increasing amount of records in OAI-PMH repositories generates the need to be able to reference such records in downstream applications, through URIs that are simpler to publish and use than the OAI-PMH HTTP GET requests used to harvest them from repositories. This article shows that PURL partial redirects can be used to that end..." General references in "Open Archives Metadata Set (OAMS)."
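The "simple URIs" point is easy to see in code: an OAI-PMH harvest is just an HTTP GET request whose query parameters carry the protocol verb and its arguments. A minimal Python sketch (the repository base URL and the resumption-token value are hypothetical) builds such requests:

```python
from urllib.parse import urlencode

BASE = "http://repository.example.org/oai"  # hypothetical repository

def oai_request(verb, **args):
    """Build an OAI-PMH HTTP GET request URL from a verb and arguments."""
    return BASE + "?" + urlencode({"verb": verb, **args})

# Initial harvest: Dublin Core records added since a given date.
# 'from' is a Python keyword, hence the dict unpacking.
first = oai_request("ListRecords", metadataPrefix="oai_dc",
                    **{"from": "2003-01-01"})

# Follow-up request using the resumptionToken returned in the response
more = oai_request("ListRecords", resumptionToken="token123")

print(first)
print(more)
```

It is precisely because such requests are long and parameter-laden that the article proposes PURL partial redirects as shorter, publishable aliases for them.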
[July 15, 2003] "IBM Introduces EPAL for Privacy Management." By John Fontana. In Network World (July 09, 2003). "IBM has introduced a set of tools that will help companies automatically set and manage privacy policies that govern access to sensitive data stored in corporate applications and databases. IBM's new XML-based programming language called Enterprise Privacy Authorization Language (EPAL) allows developers to build policy enforcement directly into enterprise applications. The move is another in a series by IBM to create a suite of tools and software to support identity management, a broad initiative that relies on user identity to control access and secure systems. EPAL allows companies to translate clearly stated privacy policies into a language a machine can read and act upon. 'You may have a policy that says your primary care physician can look at some private patient data, but only in specific situations,' says Arvind Krishna, vice president of security products for IBM. 'We don't know how to do that with technology, we need a common language. With EPAL, you can go from an English language description of a policy to an XML-based representation of that policy.' Krishna says the key is that privacy is based on the purpose for accessing the information and not just on an identity of the person seeking access. EPAL builds on current privacy specifications, namely the Platform for Privacy Preferences (P3P), which provides privacy controls for information passed between business applications and consumers with browsers. EPAL lets companies use those privacy controls internally with their corporate users. The language will be part of an infrastructure that will include monitors that are built into the interface of corporate applications and databases and perform the enforcement of policies. IBM will use its Tivoli Privacy Manager as a hub that the monitors plug into to check policies. 
The Privacy Manager will store policies, as well as log and audit access to data as a means to document policy enforcement..." See details and references in "IBM Releases Updated Enterprise Privacy Authorization Language (EPAL) Specification."
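EPAL itself is an XML vocabulary, not reproduced here; purely as an illustration of Krishna's point that access decisions hinge on the purpose for access rather than identity alone, the following Python sketch evaluates purpose-based rules. All rule fields, role names, and values are invented for this sketch and are not actual EPAL syntax.

```python
# Illustrative purpose-based access rules in the spirit of EPAL.
# Field names and values are invented, not EPAL vocabulary.
RULES = [
    {"user_role": "primary-care-physician", "data": "patient-record",
     "purpose": "treatment", "ruling": "allow"},
    {"user_role": "primary-care-physician", "data": "patient-record",
     "purpose": "marketing", "ruling": "deny"},
]

def evaluate(user_role, data, purpose):
    """Return the first matching ruling; deny by default."""
    for rule in RULES:
        if (rule["user_role"] == user_role and rule["data"] == data
                and rule["purpose"] == purpose):
            return rule["ruling"]
    return "deny"

# The same physician is allowed or denied depending on purpose alone.
print(evaluate("primary-care-physician", "patient-record", "treatment"))
print(evaluate("primary-care-physician", "patient-record", "marketing"))
```

Note that the two calls differ only in the stated purpose, which is exactly the distinction an identity-only access model cannot express.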
[July 15, 2003] "New OpenOffice on the Threshold." By David Becker. In CNET News.com (July 15, 2003). "The first major upgrade of OpenOffice moved a step closer with the introduction of a near-final version of the revamped open-source software. A 'release candidate' version of OpenOffice 1.1 is available now through the Web site of the organization behind the productivity package. With commercial software, the release candidate is the edition sent to manufacturers for distribution. But OpenOffice developers will make a few final tweaks to 1.1 before declaring a final version next month, said Sam Hiser, co-leader of the marketing project for OpenOffice.org. OpenOffice is the free, open-source sibling of Sun Microsystems' StarOffice, a software package that includes a word processor, spreadsheet application and other software tools. The package competes with Microsoft's dominant Office product, but can open and save files in Office formats. While it's dwarfed in market share terms by Microsoft Office, OpenOffice is slowly winning a following, thanks in part to its cost advantages and its ability to work with files created by Microsoft applications. Key additions to OpenOffice 1.1 include the ability to export files in the portable document format (PDF) created by Adobe Systems and in Macromedia's Flash animation format. Both standards are widely used by Web publishers and usually require the use of special authoring software... Version 1.1 also incorporates more support for XML (extensible markup language), the format increasingly embraced as the standard for exchanging data between disparate computing systems. Besides allowing people to save files in industry-standard XML, OpenOffice 1.1 is also designed to work with third-party 'schemas' (custom XML configurations), including those Microsoft plans to use in the upcoming version of Office. 
In addition, OpenOffice 1.1 offers support for non-Latin character sets, allowing easier creation of customized versions of OpenOffice for specific languages. The software is currently available in 30 languages, and another 60 localization projects are under way..." See: (1) "OpenOffice.org XML File Format"; (2) "XML File Formats for Office Documents"; (3) related news OpenGroupware.org Announces Open Source Project for Groupware Server Software.
[July 15, 2003] "Startup Unveils New Web-Services Language." By Charles Babcock. In InformationWeek (July 15, 2003). "A startup called Clear Methods has produced Water, an XML programming language, and a run-time environment for Water code, called Steam Engine, which it's offering as part of a pure Web-services platform... Not only can content be built in XML format and transferred with XML-based messaging, it also can be processed at its destination with XML commands and application code, says Clear Methods CEO Michael Plusch. 'If you have a syntax that's compatible with XML and document representation in XML, you can have a Web-services platform that just lives in XML,' he says, rather than the mix of programming and scripting languages that make up the typical Web site. One goal of XML as a programming language is to avoid the passing of XML data from Perl to Java to perhaps Visual Basic or C as it reaches its destination. Plusch and Clear Methods co-founder Christopher Fry previously founded the portal software firm Bowstreet Inc. They began Clear Methods, a six-employee company, in 2001. The Water syntax was composed and Steam was first deployed as a run-time environment in March 2002; this version is the debut of Water with Steam Engine 3.10. A document hand-off mechanism that's written in XML will be more versatile in handling XML data than a language like Java that struggles to make the connection. But Water and Steam Engine are intended as more than a handshake mechanism. Water is an object-oriented language, and programmers who learn it will build class libraries, a body of code from which a family of software objects may be rapidly built and modified. There are few users to date, but one of them, Ben Koo, an engineering doctoral candidate at MIT, says he's been using an early version of Water for two years in a research project he directs on how to model complex hardware and software systems..." See the Water website.
[July 15, 2003] "Microsoft Previews Upgraded Web Services Pack. Security to Get Boost." By Paul Krill. In InfoWorld (July 15, 2003). Microsoft has released "a preview of an upcoming update to its Web Services Enhancements (WSE) kit, focusing on security. Due for general release by the end of this year, the free kit for Visual Studio .Net users is intended to enable development of what Microsoft describes as advanced Web services. WSE 2.0 features security improvements and enables developers to build Web services that are compliant with a set of Web services specifications released by Microsoft and IBM, including WS-SecurityPolicy and WS-Addressing. These specifications, which have not yet been submitted to an industry standards organization, would receive a volume boost by developers who use Microsoft's kit... Microsoft is releasing an early version of the kit to give users and vendors time to review it and provide feedback. The new version builds on the security, routing, and attachment capabilities of Version 1.0 of WSE. Version 2.0 provides a message-based object model that supports multiple transports, including HTTP and TCP, and asynchronous and synchronous communications, according to Microsoft. In synchronous communications, messages are sent and the sender must wait for a reply, unlike asynchronous communications, in which a request can be sent and retrieved without waiting for a reply. Asynchronous communications is useful for long-running transactions such as with routing of payroll requests or purchase orders... Said Bill Evjen, technical director for Reuters, in Saint Louis: 'The technology is moving so fast that if we had to wait for a standards body like OASIS to approve them, we would really be behind the curve; Reuters is confident that the backing of vendors such as Microsoft and IBM give weight to the specifications'..." See details in the news story "Enhanced Adobe XML Architecture Supports XML/PDF Form Designer and XML Data Package (XDP)."
[July 15, 2003] "Microsoft Bolsters Web Services Security." By Martin LaMonica. In CNET News.com (July 15, 2003). "Microsoft has released a toolkit designed to help software programmers tighten security in Web services applications. The toolkit, called Web Services Enhancements (WSE) version 2, will let companies use the latest security capabilities from Microsoft and other software giants like IBM and Sun Microsystems... Eventually, Microsoft will add the capabilities to its Visual Studio.Net development tool and the .Net Framework, the software 'plumbing' needed to run Web services applications on Windows operating systems. Microsoft is using the latest Web services security mechanisms even though the various specifications are likely to change, according to Microsoft executives. However, the toolkit introduces a programming technique that will allow software developers and administrators to establish security policies that can be altered without having to rewrite existing code. For example, a company could write a policy that would give network administrators access to corporate servers during working hours, but not after-hours. Using the policy authoring mechanisms in the WS-Policy and WS-SecurityPolicy, a developer can alter the policy without having to completely rewrite the application code, noted Rebecca Dias, product manager for advanced Web services at Microsoft..." See details in the news story "Enhanced Adobe XML Architecture Supports XML/PDF Form Designer and XML Data Package (XDP)."
[July 15, 2003] "Computer Associates Tackles Web Services Management. Tool Released for Discovery, Monitoring." By Brian Fonseca. In InfoWorld (July 14, 2003). "In an effort to overcome complexities associated with Web services management, Computer Associates on Monday introduced Unicenter Web Services Distributed Management (WSDM) 1.0, a tool designed to automatically discover and monitor Web services... For the monitoring of Web services within .Net environments and support of ASP .Net, CA introduced Unicenter Management for .Net Framework 3.0. The tool offers service-level reporting, health and performance reporting, and capacity utilization, said Dmitri Tcherevik, vice president and director of Web services at Islandia, N.Y.-based CA. Meanwhile, Unicenter Management for WebSphere Release 3.5 and Unicenter Management for WebLogic 3.5 work within J2EE to discover deployed Web services and their interfaces. Tcherevik said WSDM can analyze information about services, servers, and applications surrounding Web services to enable customers to either take corrective action or allow Unicenter's automated 'self-healing' capability to resolve the problem without human intervention. Supporting both the J2EE and .Net environments, WSDM offers services controls that allow users to disable, enable, or redirect Web services. The product monitors service characteristics of Web services transactions. In effect, one can use WSDM to automatically set alert thresholds and offer centralized management... CA announced the release of eTrust Directory 4.1. The product offers a UDDI implementation to support Web services, featuring the ability to store, replicate, and distribute vast amounts of Web services data." See details in the announcement: "CA Ensures Performance, Reliability and Security of Web Services With New Unicenter and eTrust Solutions. 
Five Advanced Management and Security Offerings Enable IT Organizations To Optimize Service Levels for Enterprise and Customer-Facing Systems."
[July 15, 2003] "CA Unveils Web Services Management Technology." By Steven Burke and Heather Clancy. In InternetWeek (July 15, 2003). "Computer Associates International has announced several new products for facilitating Web services, including Unicenter WSDM, a new management product designed to monitor and track Web services. The product, currently in beta, allows solution providers to quickly respond to lowered service levels or interruptions across servers and storage networks. 'I don't have to have access to someone else's infrastructure to monitor the services, yet I can manage it,' said Dmitri Tcherevik, vice president and director of Web services at CA. CA CTO Yogesh Gupta demonstrated Unicenter WSDM (Web Services Distributed Management) during his CA World keynote Tuesday in Las Vegas. CA Chairman and CEO Sanjay Kumar said CA will not actively recruit partners for Unicenter WSDM, but existing Unicenter channel partners will have the ability to sell and support the product. Tcherevik said Unicenter WSDM is expected to be generally available by the end of the calendar year. The vendor will use the beta program to establish pricing strategy but likely will use a tiered approach based on CA's current FlexSelect program, he said. The software can be deployed in a stand-alone fashion, although additional features are available to those that use it with the Unicenter management console..." See the announcement: "CA Ensures Performance, Reliability and Security of Web Services With New Unicenter and eTrust Solutions. Five Advanced Management and Security Offerings Enable IT Organizations To Optimize Service Levels for Enterprise and Customer-Facing Systems."
[July 15, 2003] "Adobe Expands E-Forms Push." By David Becker. In CNET News.com (July 15, 2003). "Publishing software giant Adobe Systems on Tuesday announced a new electronic forms application that appears to be aimed at Microsoft's upcoming InfoPath product. The as-yet unnamed product, which Adobe plans to introduce next year, allows companies to create and distribute interactive forms using Adobe's portable document format (PDF) and Extensible Markup Language (XML), the fast-spreading standard behind Web services. XML support means data from forms designed with the software can be automatically sucked into back-end software, such as corporate databases and customer relationship management (CRM) systems, eliminating the costly data re-entry associated with paper forms. Microsoft is touting similar advantages for InfoPath, a part of the dramatically revamped Office productivity software line the company plans to introduce in a few months. InfoPath, formerly code-named XDocs, designs XML-based forms and ties them in to back-end software to automate data exchange and delivery. Key differences in the Adobe product include reliance on PDF, a widespread format that can be read by any device equipped with the free Adobe Reader software. InfoPath forms can be used only by those who buy the application. 'We're combining the XML advantage with the best of PDF as far as document integrity and ubiquity,' said Marion Melani, senior product marketing manager for Adobe's ePaper division. 'When you're conversing system to system, it's just an XML file. But the user gets the full PDF for the visual representation of that document.' InfoPath has the advantage of being tied to Microsoft's equally widespread Word word-processing application, said John Dalton, an analyst for Forrester Research. 'I have a feeling those (Word versus Adobe Reader) are almost red herrings in terms of advantages,' he said. 
The new Adobe software will also include simple tools for adding XML functions to existing PDF forms. Many financial, government and other institutions use PDF for electronic distribution of forms ultimately intended for printout..." See details in the news story "Enhanced Adobe XML Architecture Supports XML/PDF Form Designer and XML Data Package (XDP)."
[July 15, 2003] "Adobe Outlines XML Forms Plans." [Edited] By Patricia Evans. In The Bulletin: Seybold News and Views on Electronic Publishing Volume 8, Number 41 (July 16, 2003). "Seeking to counter the rumors that Microsoft's forthcoming XML forms product (now called InfoPath) means the death of PDF, Adobe this week departed from typical practice and discussed a product that's at least six months away from delivery. Adobe's forthcoming forms software will allow companies to create and distribute interactive forms using PDF and XML. Microsoft is touting similar features for InfoPath (formerly code-named XDocs), a part of the much-touted revamped Office suite the company plans to introduce in a few months. Adobe's product, which is slated to be released next year, relies on PDF, so forms can be viewed by anyone with the free Adobe Reader, whereas InfoPath forms can be used only by those who buy the application. PDF is also cross-platform, while Microsoft is tying its forms to its highly popular Word program and Windows platform. Adobe is working on form-design software that will include simple tools for adding XML functions to existing PDF forms. It features a universal client, which includes Adobe Reader, and also includes an intelligent document tier, which is where PDF documents with XML 'smarts' are created via the forms designer... InfoPath does not threaten the traditional uses of PDF in prepress and the graphic arts... But it very much threatens Adobe's plans to blend PDF and XML in electronic forms, the reason Adobe acquired Accelio, which used to go by the name of JetForms... By breaking tradition and preannouncing the product, Adobe has certainly landed a preemptive strike and alerted the market that when it comes to electronic forms, there will be more than one game in town..." See other details in the news story "Enhanced Adobe XML Architecture Supports XML/PDF Form Designer and XML Data Package (XDP)."
[July 15, 2003] "Liberty Alliance Offers Advice on External ID Federation. The Guidelines Explain How Companies Should Work Together on the ID Effort." By Scarlet Pruitt. In Computerworld (July 10, 2003). "Having already set forth the technical requirements needed to create a federated identity architecture, the Liberty Alliance Project released guidelines this week for how companies should include business partners and customers in their networks, saying it's crucial for the advancement of Web services. The group released the Liberty Alliance Business Guidelines document at the Burton Catalyst Conference in San Francisco on July 8, 2003 outlining how companies should ensure mutual confidence, risk management, liability assessment and compliance when considering wide-scale deployment of federated network identity. The guidelines come on the heels of the group's federated network-identity technical requirements, released last year, and the second set of recommendations, which is available for public review. The nonprofit group represents more than 170 companies and organizations working to develop and deploy open, federated network-identity standards. Members include companies such as Sun Microsystems Inc., SAP AG and American Express Co. The group's open standards for federated identity compete against Microsoft Corp.'s Passport service in the user authentication and identity management arena... The group claimed that extending access to customers, partners and suppliers is the next phase of Web services and advises companies to put processes in place that guard against losses due to identity fraud and leakage of information... The group is expected to release additional guidelines later this year..." See details in "Liberty Alliance Publishes Business Requirements and Guidelines for Identity Federation."
[July 15, 2003] "Microsoft Shifts TrustBridge Towards WS- Roadmap." By Gavin Clarke. In Computer Business Review Online (July 15, 2003). "Microsoft Corp is changing the direction of TrustBridge for Active Directory to fit the WS- roadmap jointly authored with IBM Corp, and is targeting a 2005 release date for the technology. TrustBridge has been re-worked to accept security tokens other than Kerberos, by adding support for WS-Security. TrustBridge will also support the related WS-Federation and WS-Trust among other WS- specifications. Microsoft has also put a rough date on TrustBridge's delivery. TrustBridge will ship in the 'wave' of Longhorn, the company's next planned operating system due in 2005, according to XML services architect John Shewchuk. It is not yet clear, though, whether TrustBridge will ship as a feature of Longhorn or a separate product, although Microsoft will offer developers greater access to WS- specifications with a second planned Web Services Enhancements (WSE) toolkit, due soon. TrustBridge was unveiled by Microsoft last July, but has not been spoken about publicly since. However, Microsoft lead product manager Michael Stephenson, speaking after last week's Burton Group's Catalyst 2003 conference, told ComputerWire: 'We are shifting direction on directory to build TrustBridge on top of interoperable identity standards.' As originally intended, TrustBridge would have enabled Active Directory to take a Kerberos security token and communicate with another Kerberos-based system, not necessarily Active Directory. Stephenson said, though, this required 'proprietary' work..." See also "Microsoft Revives TrustBridge for Web Services Role," by John Fontana.
[July 15, 2003] "Serialize XML Data. Saving XML Data Using DOMWriter in XML for the C++ Parser." By Tinny Ng (System House Business Scenarios Designer, IBM Toronto Laboratory). From IBM developerWorks, XML zone. July 15, 2003. ['IBM developer Tinny Ng shows you how to serialize XML data to a DOMString with different encodings. You'll also find examples that demonstrate how to use the MemBufFormatTarget, StdOutFormatTarget, and LocalFileFormatTarget output streams in XML4C/Xerces-C++.'] "Xerces-C++ is an XML parser written in C++ and distributed by the open source Apache XML project. Since early last year, Xerces-C++ has added an experimental implementation of a subset of the W3C Document Object Model (DOM) Level 3 as specified in the DOM Level 3 Core Specification and the DOM Level 3 Load and Save Specification. The DOM Level 3 Load and Save Specification defines a set of interfaces that allow users to load and save XML content from different input sources to different output streams. This article uses examples to show you how to save XML data in this way -- how to serialize XML data into different types of output streams with different encodings. Users can stream the output data into a string, an internal buffer, the standard output, or a file... For more details please refer to the W3C DOM Level 3 Load and Save Specification and the complete API documentation in Xerces-C++..." See: "Document Object Model (DOM) Level 3 Core Specification Version 1.0" (W3C Working Draft 09-June-2003) and "Document Object Model (DOM) Level 3 Load and Save Specification Version 1.0" (W3C Working Draft 19-June-2003). The LS specification "defines the Document Object Model Load and Save Level 3, a platform- and language-neutral interface that allows programs and scripts to dynamically load the content of an XML document into a DOM document and serialize a DOM document into an XML document..." General references in "W3C Document Object Model (DOM)."
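The article's own examples are C++ against the Xerces-C++ DOM Level 3 Load and Save interfaces; as a rough analogue using Python's standard-library DOM (not Xerces-C++), the same four output targets, a string, an in-memory buffer, standard output, and a local file, look like this:

```python
import io
import os
import sys
import tempfile
from xml.dom.minidom import parseString

doc = parseString('<greeting lang="en">Hello</greeting>')

# To a string (analogous to serializing to a DOMString)
text = doc.toxml()

# To an in-memory buffer with an explicit encoding
# (analogous to MemBufFormatTarget)
buf = io.BytesIO()
buf.write(doc.toxml(encoding="utf-16"))

# To standard output (analogous to StdOutFormatTarget)
doc.writexml(sys.stdout)

# To a file (analogous to LocalFileFormatTarget)
path = os.path.join(tempfile.gettempdir(), "greeting.xml")
with open(path, "wb") as f:
    f.write(doc.toxml(encoding="utf-8"))
```

The design point carries over directly: the serializer is parameterized by an output target and an encoding, so the same in-memory document can be written anywhere without touching the DOM itself.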
[July 15, 2003] "XML for Data: Reuse it or Lose it, Part 3. Realize the Benefits of Reuse." By Kevin Williams (CEO, Blue Oxide Technologies, LLC). From IBM developerWorks, XML zone. July 08, 2003. ['In the final installment of this three-part column, Kevin Williams looks at some of the ways you can take advantage of the reusable XML components that he defined in the previous two installments of this column. Designing XML with reusable components can, in many ways, create direct and indirect benefits; Kevin takes a quick look at some of the most important. You can share your thoughts on this article with the author and other readers in the accompanying discussion forum.'] "This column builds on the philosophy of XML reuse I described in the first two columns... The first benefit of using reusable components isn't necessarily a direct benefit of the design of XML structures that use components, but it is a natural outcome of the approach. To create components that can be reused, you need to capture solid semantics about those components. These semantics can be extended into the processing code itself to make the programmer's job easier... Another natural benefit of the component-based approach to XML design is the ability to reuse XSLT fragments to ensure a standardized presentation of information across many different documents. Again, this is a natural outcome of capturing good semantics and reusing elements and attributes whenever possible... Another benefit [Class-to-XML mapping (fragment serialization, deserialization)] begins to appear when higher-order elements are reused... By creating XML-aware classes, you can make it possible to reuse parsing and serialization code -- as long as you have properly reused the structures in your XML schemas... You'll find many benefits to designing XML schemas using reusable components. These benefits lead directly to shorter development cycles and simpler maintenance of code. 
If you are designing a large system with many different types of XML documents, taking the time to identify reusable components of those documents early in the development effort benefits that effort in the long term..." See Part 1 "XML Reuse in the Enterprise," which "looks at some of the historical approaches to reusing serialized data, and then shows how XML allows one to break from tradition and take a more flexible approach to document designs"; Part 2 "Understanding Reusable Components" describes "the types of components that can be reused in XML designs and provides examples of each in XML and XML Schema."
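As a hypothetical sketch of the "class-to-XML mapping" benefit Williams describes (the Address component and document names below are invented, not taken from the column), one XML-aware class can serve two different document types, so its parsing and serialization code is written exactly once:

```python
# Hypothetical sketch: one XML-aware component class reused by two
# document types, illustrating the fragment serialization/
# deserialization benefit the column describes.
import xml.etree.ElementTree as ET

class Address:
    """Reusable component: serializes and parses an <Address> fragment."""
    def __init__(self, street, city):
        self.street, self.city = street, city

    def to_element(self):
        el = ET.Element("Address")
        ET.SubElement(el, "Street").text = self.street
        ET.SubElement(el, "City").text = self.city
        return el

    @classmethod
    def from_element(cls, el):
        return cls(el.findtext("Street"), el.findtext("City"))

def make_invoice(addr):
    doc = ET.Element("Invoice")
    doc.append(addr.to_element())   # reuse, not reimplementation
    return doc

def make_shipping_notice(addr):
    doc = ET.Element("ShippingNotice")
    doc.append(addr.to_element())   # the same fragment code again
    return doc

addr = Address("1 Main St", "Springfield")
invoice = make_invoice(addr)
notice = make_shipping_notice(addr)

# Round-trip: the same parsing code works wherever <Address> appears.
parsed = Address.from_element(invoice.find("Address"))
```

The payoff grows with each additional document type that reuses the Address structure: round-tripping it is one method call rather than another fresh block of parsing code.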
[July 15, 2003] "Tip: Send and Receive SOAP Messages with SAAJ. Java API Automates Many of the Steps Required in Generating and Sending Messages Manually." By Nicholas Chase (President, Chase & Chase, Inc). From IBM developerWorks, XML zone. July 10, 2003. ['In this tip, author and developer Nicholas Chase shows you how to use the SOAP with Attachments API for Java (SAAJ) to simplify the process of creating and sending SOAP messages.'] "The foundation of Web services lies in the sending and receiving of messages in a standard format so that all systems can understand them. Typically, that format is SOAP. A SOAP message can be generated and sent manually, but the SOAP with Attachments API for Java (SAAJ) -- an offshoot of the Java API for XML Messaging (JAXM) -- automates many of the required steps, such as creating connections or creating and sending the actual messages. This tip chronicles the creation and sending of a synchronous SOAP message. The process involves five steps: (1) Creating a SOAP connection; (2) Creating a SOAP message; (3) Populating the message; (4) Sending the message; (5) Retrieving the reply. SAAJ is available as part of the Java Web Services Developer Pack 1.2. This package also includes a copy of the Tomcat Web server (so you can host your own service) and sample applications. Setting up the Java Web Services Developer Pack 1.2 is easy -- as long as you send your messages through the included Tomcat Web server. [Once installation is complete] you should be able to send a message from anywhere on your system using a standalone program... The simple application [developed in this article] just outputs the received message, but you can just as easily extract the information from the XML document. 
Also, while this tip demonstrates the synchronous sending and receiving of messages, the JAXM API, available as an optional download, allows for the use of a messaging provider for asynchronous delivery through the use of a ProviderConnection object rather than a SOAPConnection. The provider holds the message until it is delivered successfully. JAXM also allows for the use of profiles, which make it easy to create specialized SOAP messages such as SOAP-RP or ebXML messages..." [Note: 'The Java Web Services Developer Pack (Java WSDP) is an integrated toolkit that allows Java developers to build, test and deploy XML applications, Web services, and Web applications with the latest Web services technologies and standards implementations. Technologies in Java WSDP include the Java APIs for XML, Java Architecture for XML Binding (JAXB), JavaServer Faces, Web Services Interoperability Sample Application, XML and Web Services Security, JavaServer Pages Standard Tag Library (JSTL), Java WSDP Registry Server, Ant Build Tool, and Apache Tomcat container.']
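SAAJ is a Java API, but the message it assembles in steps 2 and 3 is ordinary SOAP 1.1 XML. As an illustrative sketch (the GetPrice payload element is invented), here is that envelope structure built with Python's standard library:

```python
# Sketch of the SOAP 1.1 envelope a SAAJ program assembles in steps
# 2-3 (create and populate the message). The body payload element
# names are invented for illustration.
import xml.etree.ElementTree as ET

SOAP_ENV = "http://schemas.xmlsoap.org/soap/envelope/"
ET.register_namespace("SOAP-ENV", SOAP_ENV)

envelope = ET.Element("{%s}Envelope" % SOAP_ENV)
ET.SubElement(envelope, "{%s}Header" % SOAP_ENV)
body = ET.SubElement(envelope, "{%s}Body" % SOAP_ENV)

# Application payload (hypothetical request element).
request = ET.SubElement(body, "GetPrice")
ET.SubElement(request, "Item").text = "SKU-1234"

message = ET.tostring(envelope, encoding="unicode")
print(message)
```

In SAAJ, step 4 would send an envelope like this over a SOAPConnection via HTTP POST, and step 5 would read the reply back as another envelope.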
[July 15, 2003] "Use RosettaNet Based Web Services, Part 1: BPEL4WS and RosettaNet. How to Instantly Add Years of E-Business Experience and Expertise to Your Web Services." By Suhayl Masud (Founder and Lead Architect, Different Thinking). From IBM developerWorks, Web services (July 15, 2003). ['While Web services are a gentle evolution of existing technology, they are a revolution in the way business can be represented in software. However, we cannot realize the full potential of Web services, or see their revolutionary nature, unless we start constructing partner-to-partner e-business dialogues that conduct real business transactions. This series of articles demonstrates the creation of a real e-business dialogue by leveraging the industry leading e-business process specifications from RosettaNet, and translating them to Web services using the expressive and flexible BPEL4WS.'] "The purpose of this series of articles is to demonstrate the true potential of Web services by creating an e-business dialogue that can be used to conduct real business. This e-business dialogue will be based on a real world business problem and the problem will be addressed by using a proven solution from RosettaNet. In this series, I will show you that the most important aspect of Web services is the e-business dialogue; I will explain what they are and how to construct them for business peers. In this first article in the series, I will cover the following: the true potential of Web services, understanding how to conduct e-business dialogues, advantages of leveraging RosettaNet, introduction to RosettaNet, and translating RosettaNet into Web services. In Parts 2 and 3, I will discuss choreography for Web services and construct a sample end-to-end e-business scenario that demonstrates the benefits of combining RosettaNet and BPEL4WS..." 
See also: (1) "Business Process with BPEL4WS"; (2) general references in "Business Process Execution Language for Web Services (BPEL4WS)" and "RosettaNet."
[July 14, 2003] "Sleepycat Boosts Database." By Lisa Vaas. In eWEEK (July 14, 2003). "Sleepycat Software Inc. last week tossed its open-source database into the XML ring with the release of code for Berkeley DB XML, a native XML database that's built on top of its open-source embedded database, Berkeley DB. Berkeley DB XML offers a single data repository for storage and retrieval of native XML and non-XML data, avoiding the XML conversion overhead that occurs with relational databases that have been retrofit with XML adapters, officials said. The database supports XPath 1.0, a World Wide Web Consortium standard language for addressing parts of an XML document. It offers flexible indexing, giving application developers the ability to control query performance and tune data retrieval... Having Berkeley DB as the base engine for the XML offering means that the new product will inherit advanced database features such as concurrent access, transactions, recovery and replication, officials said. It will scale up to 256 terabytes for the database and up to 4GB for individual keys and values... The release of the open source code heralds the end of a 12-month beta program that comprised some 5,000 companies, many of them huge names such as 3M Co., Amazon.com Inc., BEA Systems Inc., Lucent Technologies Inc.'s Bell Labs, The Boeing Co., Cisco Systems Inc., Hewlett-Packard Co., IBM and NEC America Inc. Those big names are testimony to the traction XML is gaining in the enterprise, said Sleepycat officials, in Lincoln, Mass. Sleepycat's software is sold using a typical open-source scheme: free to download and use or fee-based to ship a product whose source code is withheld. The company has 200 paying customers, according to officials..." See details in the news story "Sleepycat Software Releases Berkeley DB XML Native XML Database."
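Berkeley DB XML's query interface supports full XPath 1.0 over stored documents; as a minimal flavor of what "addressing parts of an XML document" means, here is the XPath subset available in Python's standard library (the parts document is invented):

```python
# Minimal flavor of XPath-style addressing. Berkeley DB XML evaluates
# full XPath 1.0 over stored containers; ElementTree implements only
# a subset, used here just to show the idea. Document is invented.
import xml.etree.ElementTree as ET

doc = ET.fromstring(
    "<parts>"
    "  <part id='p1' type='bolt'><qty>100</qty></part>"
    "  <part id='p2' type='nut'><qty>250</qty></part>"
    "</parts>"
)

# "Address a part of the document": all <part> elements of type 'nut'.
nuts = doc.findall(".//part[@type='nut']")
qty = int(nuts[0].findtext("qty"))
```

Per the article, in Berkeley DB XML the same kind of path expression runs against a whole container of documents, with the declared indexes used to tune query performance.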
[July 14, 2003] "Using WSDL in a UDDI Registry, Version 2.0." By John Colgrave (IBM) and Karsten Januszewski (Microsoft). Edited by Anne Thomas Manes and Tony Rogers (Computer Associates). Technical Note produced by the OASIS UDDI Specification Technical Committee. Document Identifier: uddi-spec-tc-tn-wsdl-v2. 42 pages. According to a posting from Tom Bellwood (Co-Chair, OASIS UDDI Specification TC), this document was approved as a Committee Technical Note. Send comments to uddi-spec-comment@lists.oasis-open.org. Summary: "This document is an OASIS UDDI Technical Note that defines a new approach to using WSDL in a UDDI Registry." From the Introduction: "The Universal Description, Discovery & Integration (UDDI) specification provides a platform-independent way of describing and discovering Web services and Web service providers. The UDDI data structures provide a framework for the description of basic service information, and an extensible mechanism to specify detailed service access information using any standard description language. Many such languages exist in specific industry domains and at different levels of the protocol stack. The Web Services Description Language (WSDL) is a general purpose XML language for describing the interface, protocol bindings, and the deployment details of network services. WSDL complements the UDDI standard by providing a uniform way of describing the abstract interface and protocol bindings of arbitrary network services. The purpose of this document is to clarify the relationship between the two and to describe a recommended approach to mapping WSDL descriptions to the UDDI data structures. Consistent and thorough WSDL mappings are critical to the utility of UDDI... 
The primary goals of this mapping are: (1) To enable the automatic registration of WSDL definitions in UDDI; (2) To enable precise and flexible UDDI queries based on specific WSDL artifacts and metadata; (3) To maintain compatibility with the mapping described in the Using WSDL in a UDDI Registry, Version 1.08 Best Practice document; (4) To provide a consistent mapping for UDDI Version 2 and UDDI Version 3; (5) To support any logical and physical structure of WSDL description. This mapping prescribes a consistent methodology to map WSDL 1.1 artifacts to UDDI structures. It describes an approach that represents reusable, abstract Web service artifacts (WSDL portTypes and WSDL bindings) and Web service implementations (WSDL services and ports). Tools can use this mapping to generate UDDI registrations automatically from WSDL descriptions. This mapping captures sufficient information from the WSDL documents to allow precise queries for Web services information without further recourse to the source WSDL documents, and to allow the appropriate WSDL documents to be retrieved once a match has been found. Given that the source WSDL documents can be distributed among the publishers using a UDDI registry, a UDDI registry provides a convenient central point where such queries can be executed... This document builds on Using WSDL in a UDDI Registry, Version 1.08, providing an expanded modeling practice that encompasses the flexibility of WSDL. The primary difference between this mapping and the one described in the existing Best Practice is that this mapping provides a methodology to represent individual Web services artifacts." References: (1) "Universal Description, Discovery, and Integration (UDDI)"; (2) "Web Services Description Language (WSDL)."
[July 14, 2003] "The Politics of Open-Source Software." By Declan McCullagh. In CNET News.com (July 14, 2003). "Mike Wendy, spokesman and policy counsel for the Initiative for Software Choice (ISC), says he just wants to make sure government agencies don't unduly favor open-source or free programs over proprietary software. 'We want a process that is not based on automatic preferences,' Wendy said. The ISC is by far the most vocal opponent of a growing trend: Legislation that, if enacted, would all but prohibit government agencies from purchasing proprietary software for their own use. The ISC asserts that such legislation could jeopardize the future of the worldwide commercial software industry. Because of the size of governments' ever-growing information technology budgets, billions of dollars are at stake. ('Open source' means that, at the very least, the source code is available, and 'free software' means that anyone who modifies the code may, if they distribute it, be required to disclose details of the modifications.) So far, the ISC says more than 70 such proposals have surfaced in U.S. state capitals and in about two dozen other countries. The reasons for the initiatives are complex and varied, but some governments have cited Microsoft's relatively expensive licensing terms. Other measures in some foreign countries are probably driven by issues such as anti-American sentiment... On Friday, Microsoft refused to disclose how much money, if any, it gives directly to the ISC. A Microsoft spokesman did tell me: 'Microsoft is a founding member and maintains a strong commitment to the ISC. This commitment is based on Microsoft's support of the initiative's belief that it is important to allow multiple software development, business and licensing models to compete on their merits and without government regulations that would seek to prefer one model over another'... Open-source evangelist Bruce Perens, on the other hand, says it's not that simple. 
He says the ISC is going beyond its 'neutrality' claim and is lobbying against open-source software. He's created a rival organization, Sincere Choice, to oppose it. 'I think that the reason they should be paid attention to is because they come out against stuff that is not a hard preference law,' Perens says. 'If they limited their objections to hard preference laws, there would be fewer problems with them...Hard preference bills are a red herring. They're very rarely offered seriously by any government. What's offered instead are bills that say, 'We'll consider open source.' The ISC comes in and says the next step is hard preferences. They paint a draconian scenario for something that isn't draconian.' Perens adds: 'Their message is not just that open-source preference laws are bad. They argue against open source in general'..."
[July 14, 2003] "Novell and Microsoft Embrace IDs." By Dennis Fisher. In eWEEK (July 07, 2003). "The decades-old rivalry between Microsoft Corp. and Novell Inc. is heating up again. This time, the companies are jockeying for position in a market that rapidly is becoming one of the key battlegrounds for winning and keeping customers: identity management. Novell, one of the established players in identity management, this week plans to unveil two additions to its already-large product portfolio in the space, Nsure Audit and a SAML (Security Assertion Markup Language) extension for iChain. The auditing piece addresses what customers say has been a glaring weakness in Novell's offerings -- that is, the ability to securely log and audit all user log-in activity on a system... Nsure Audit lets administrators obtain a record of any log-in transaction on their system and determine whether a user is violating security policy. The data is collected in a central log and can be used to send alerts to administrators to notify them of important events. Each log-in event is digitally signed, and any number of events can be strung together, all of which can help in forensic and nonrepudiation situations... The other new piece, the SAML extension for iChain, enables customers to pass user attributes among sites securely and map security assertions to individual user identities. The new offering also includes a SAML tool kit that can be used to add SAML capabilities to other applications. The Novell announcements come less than a week after Microsoft finally made its foray into the identity management arena with the release of MIIS (Microsoft Identity Integration Server) 2003. The offering includes a broad range of new capabilities, many of which rely on Active Directory and extend the functionality of the directory. But MIIS' main advancement is its ability to interoperate with third-party directories. 
The new server includes Directory Services Markup Language 2.0, which lets identity information be represented as XML data that can easily be used by other directories..." See: (1) "Microsoft Announces Release of Microsoft Identity Integration Server (MIIS) 2003"; (2) "Novell Helps Business Partners Securely Share Identity Information on the Web. Novell Ships the SAML Extension for Novell iChain, a Federated Identity Management Service."; (3) "Security Assertion Markup Language (SAML)."
[July 14, 2003] "Open-Source Groupware Aims at Exchange." By Mark Hachman. In eWEEK (July 11, 2003). "In the spirit of the OpenOffice assault on Microsoft Office, a group of independent developers has begun an open-source groupware project intended to provide an alternative to Microsoft's Exchange Server. According to the group, OpenGroupware.org was founded to create 'the leading open-source groupware server to integrate with the leading open-source office suite products and all the leading groupware clients running across all major platforms, and to provide access to all functionality and data through open XML-based interfaces and APIs.' The group said it intends to work with the OpenOffice.org developer team but maintain a separate hierarchy. OpenGroupware is not affiliated with any specific company. 'Just to be perfectly clear, this is an MS Exchange take-out,' wrote Gary Frederick, leader of the OpenOffice.org Groupware Project, in a statement. 'OGo is important because it's the missing link in the open-source software stack. It's the end of a decade-long effort to map all the key infrastructure and standard desktop applications -- including the Web server (Apache), the OS (GNU/Linux, the BSDs), the browser (Mozilla, Konqueror, Opera), the office suite (OpenOffice.org 1.1, KOffice, AbiWord), and the groupware applications (Evolution, KMail, Netscape/Mozilla Mail) -- to free software.' Subprojects by the OpenGroupware.org team will use either the General Public License (GPL) or a 'Lesser General Public License,' which allows users to modify, copy and redistribute certain libraries..." See (1) the announcement; (2) other details in the news story "OpenGroupware.org Announces Open Source Project for Groupware Server Software."
[July 14, 2003] "WS-I Discusses Interoperable Web Services." By Allison Taylor. In ComputerWorld Australia (July 14, 2003). "The speed and extent of Web services adoption depends on the success of making them interoperable, the president and chairman of the Web Services Interoperability Organization (WS-I) said at a roundtable discussion on Web services in Toronto last Thursday. Tom Glover and a slew of WS-I member companies including Microsoft Canada Corp., IBM Canada Ltd., Nortel Networks, NetManage Canada Inc., Hummingbird Ltd. and Cognos Inc., participated in the discussion which focused on the importance of interoperable Web services for the entire IT industry and how those standards should be made. The WS-I is a group of about 160 software companies working to identify Web services interoperability requirements and develop materials to address those needs. The member companies that comprise the WS-I try to understand how Web services are used throughout the industry and what the requirements are; then, as a community, they attempt to resolve challenges, define services and define how those services behave at an infrastructure level, Glover said. By taking the resources within IT and working together, the WS-I hopes to create a set of standards to help everyone understand what Web services look like, he added. As an example of the importance of developing standards, Glover highlighted the format battle between Betamax and VHS video tape, in which the market and public determined VHS the winner. 'This battle is not the model we want for Web services. It's not efficient and it costs too much. We don't want the market penalized but we want Web services to be understood,' he said, adding that standards would ensure the market completely understands Web services. 
Phil Edholm, chief technology officer and vice-president of network architecture for Nortel Networks, said there is great economic and productivity value in having interoperable Web services and, as such, it's critical to Web services that the WS-I succeed..." General references in "Web Services Interoperability Organization (WS-I)."
[July 14, 2003] "Creating Email Templates with XML." By Rafe Colburn. From O'Reilly ONJava.com (July 09, 2003). "One feature that seems to eventually creep into every web application is the ability to send email. Generally, it's a very specific kind of email, like a password reminder, welcome message, order confirmation, or receipt. Despite the fact that the content of these emails differs from application to application, the process of sending email rarely changes; you construct a message, give it to the mail server, and it gets delivered... When a password reminder is sent, the user's email address and password are usually fetched from some sort of repository where login information is stored. The subject and the body have to be merged with the data in the database, and have to be stored somewhere, as well. One of the great problems of application design in general is where to keep these sorts of strings. In many cases, such strings are stored in property files, which keeps them from littering your source code and makes localization easier. I've used this approach for storing templates for outgoing emails in many web applications, but unfortunately, it's quite flawed... One alternative approach is to use XML for the email templates, and that's the approach I'm going to discuss in this article. XML provides great flexibility in how you structure your templates and does not have the same strict formatting rules as property files do, so it's easier to maintain large strings. The main downside is that XML files can be trickier to deal with than property files. With a property file, it's easy to load the files and it's easy to access the properties once they've been loaded. On the other hand, it takes more work to load the XML file and process it using one of the many XML processing libraries provided for use with Java... 
This article and the accompanying code attempt to take as much pain as possible out of the process for you by providing a generic framework that enables you to create email templates using XML files and send them. In this framework, I'll be using the Commons Digester package from the Jakarta project for XML processing, and the JavaMail API to send the actual email..."
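The article's framework is Java (Commons Digester for parsing the templates, JavaMail for delivery). As a stand-in sketch of the core idea, an XML template merged with per-user data, here is a Python version; the template fields are invented for illustration:

```python
# Sketch of the XML email-template idea. The article's framework uses
# Java, Commons Digester, and JavaMail; this Python stand-in shows the
# same merge step with stdlib tools. Template fields are invented.
import xml.etree.ElementTree as ET
from string import Template

TEMPLATE_XML = """
<emailTemplate name="passwordReminder">
  <subject>Your password for $site</subject>
  <body>Hello $user, your password is: $password</body>
</emailTemplate>
"""

def render(template_xml, **values):
    """Parse the XML template, then merge placeholders with user data."""
    root = ET.fromstring(template_xml)
    subject = Template(root.findtext("subject")).substitute(values)
    body = Template(root.findtext("body")).substitute(values)
    return subject, body

subject, body = render(TEMPLATE_XML,
                       site="example.com", user="rafe", password="s3cret")
```

A real application would then hand the rendered subject and body to the mail API (smtplib here, JavaMail in the article's framework) for delivery.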
[July 14, 2003] "BEA Strong on WS-Security, Supports Liberty." By Gavin Clarke. In (July 10, 2003). "BEA Systems Inc's chief technology officer is steering a middle course on a jointly authored web services roadmap with IBM Corp and Microsoft Corp. Scott Dietzen urged customers to critically examine whether suppliers are committed to IBM and Microsoft's WS-Security specification, as he believes it is here to stay. Dietzen cautioned, though, work remained on WS-Security, adding the specification could learn a thing or two from the Liberty Alliance Project's activities. Speaking at the Burton Group's Catalyst Conference in San Francisco, California, Dietzen -- like IBM and Microsoft earlier in the week -- advocated convergence of Liberty with IBM and Microsoft's WS- roadmap of specifications. BEA is a member of Liberty but has provided input into WS- specifications via WS-Federation, announced last week. BEA also supports WS-Security and Security Assertion Markup Language (SAML), a cornerstone assertion used by Liberty, in products. Dietzen endorsed interoperability between Liberty and the WS-* stack but called convergence a 'good thing'. 'Our hope is we can converge these things around a single unified model. Not that there are competing stacks,' Dietzen said. He indicated, though, WS-Security remains some months from final completion. WS-Security is 'pretty close' with support for PKI, SAML, X.509, usernames and passwords, he said, but the specification requires 'more time to cook'..."
[July 14, 2003] "Thailand Links Up With Singapore Over ebXML." By Pongpen Sutharoj, Suchalee Pongprasert. In The Nation (July 14, 2003). "Thailand and Singapore are collaborating to set the standard of information interchange using ebXML to allow the two countries to do business by paperless processes. The move aims to encourage the exchange of data using ebXML in two main areas -- international trading and tourism. ebXML is the use of XML to standardise the secure exchange of business data. It is designed to create a global electronic market-place in which enterprises of any size, and in any location, can safely and securely transact business by the exchange of XML-based messages. Manoo Ordeedolachest, president of the Asian-Oceania Computing Industry Organisation (Asocio) said that the collaborating countries had set up an ebXML committee to oversee the development and define the standard of information exchange under a project called the XML industrial programme. Asocio is an ICT body which comprises 23 member countries, representing more than 10,000 ICT companies. He said that the committee had picked tourism as an industry model in the use of ebXML to do business between the countries. Singapore is working on the details of standards for data exchange. It is expected that standard will be completed before the annual Asocio meeting in Hanoi in November. In the tourism industry there are a large number of small and medium-sized businesses, but Manoo said that in the initial stage the data exchange would be used in the area of airline and hotel reservations as well as car rentals before expanding to other areas such as diving, yachts and restaurants. First adopted by Thailand and Singapore as a pilot project, the standard will be expanded to other Asocio member countries. He said that the standard would comply with the ebXML developed by the OpenTravel Alliance (OTA)... 
Manoo said the next step was for the committee to promote the use of the ebXML standard in international trade, allowing business to deal with government agencies paperlessly. Instead of submitting piles of documents for imports and exports, businesses could deal with, for example, the Customs Department electronically. The standard will be used as a foundation to further develop electronic government-related services in the two countries..." General references in "Electronic Business XML Initiative (ebXML)."
[July 12, 2003] "Burton Group Speakers Debate Web Services. Security Remains a Concern." By Paul Krill. In InfoWorld (July 11, 2003). "Speakers at the Burton Group Catalyst Conference 2003 debated the pros and cons of Web services-based application integration, illustrating the diversity of opinions still surrounding the technology... 'The reason that Web services will succeed is because we have total industry buy-in,' [Ann] Manes said. She also touted the Web Services Interoperability Organization's Basic Profile, saying it will provide for basic interoperability across any language or platform. Issues remain, however, such as scalability and performance, said Manes. She also stressed that users of Web services need to fit their systems with Web services management tools. But Web services will become part of the IT fabric, Manes said. 'Five years from now, you won't even think about Web services because it will become just part of your fabric,' in the same manner that enterprises use sockets, Manes said. The issue of security, long a concern with Web services, is being addressed via the WS-Security specification, she said. 'WS-Security is very close to being finished and it's a very strong specification,' Manes said. WS-Security is under the jurisdiction of the Organization for the Advancement of Structured Information Standards (OASIS). A Web services user speaking at the conference, however, expressed concerns with the current level of security in Web services. 'It's a miserable story and I think it's a reason why we don't do a whole lot outside our firewalls,' said David Cohen, a vice president of the technology architecture group at Merrill Lynch. 'To me, Web services is just a communications system,' Cohen said. 'It's just at a higher level of the network stack.' Merrill Lynch, however, is a 'huge champion of XML,' said Cohen. BEA Systems' Scott Dietzen, also CTO, stressed that Web technology providers need to get on the WS-Security bandwagon..." 
General references in "Web Services Security Specification (WS-Security)."
[July 12, 2003] "Microsoft Pitches Voice Specification. SALT Support Trumps Voice XML as Speech Server Sounds Return of Enterprise Voice." By Ephraim Schwartz. In InfoWorld (July 11, 2003). "Due for manufacturing release before mid-2004, the Microsoft Speech Server product will include a text-to-speech engine from SpeechWorks, Microsoft's own speech-recognition engine, and a telephony interface manager. The offering will also include middleware that is being designed in partnership with Santa Clara, Calif.-based Intel and Dallas-based Intervoice to connect the Microsoft product to an enterprise telephony infrastructure... The server's SALT (Speech Application Language Tags) voice browser sets Microsoft apart from the standards crowd. Rather than adhering to VXML (Voice XML) -- the current W3C standard for developing speech-based telephony applications -- Speech Server is compatible only with applications that use the specifications developed by the SALT Forum, of which Microsoft is a founding member... The SALT specification was originally targeted at the multimodal market for browsing the Web on handheld devices. The theory was that users required multiple ways to interface with smaller devices and that voice would be chief among them, but the market for multimodal handhelds has not materialized... Bill Meisel, a principal at TMA Associates, a leading speech technology research company based in Tarzana, Calif., said enterprise voice adoption will increase due to Microsoft's market influence. Yet, because Speech Server will compete directly with established VXML applications, Microsoft's actions will make speech technology adoption a more complex exercise for the enterprise, according to Meisel. Competing speech technology vendor IBM is a case in point. Big Blue supports VXML and the W3C standard, according to Gene Cox, director of mobile solutions at Armonk, N.Y.-based IBM. 
Cox said significant VXML applications already exist in the enterprise at companies such as AT&T, General Motors' OnStar division, and Sprint PCS. 'VXML conforms to all W3C royalty-free policies. But SALT is like Internet Explorer; it is free as long as you buy Windows,' Cox said. The debate over which technology to use will not be fought out at the customer level, said Forrester's Herrell, but rather by developers. Irvine, Calif.-based NewportWorks, an information service provider for the real estate industry, is one example of an IBM customer that will be hard to shift away from Voice XML. According to CEO Ken Stockman, the company could not exist without Voice XML. NewportWorks aggregates the data from the MLS (Multiple Listing Service), uses IBM's WebSphere Speech Server to convert the listings for voice access, and sells the service to real estate agencies... Stockman said the learning curve on VXML for developers was negligible. Microsoft, on the other hand, argues that Web developers don't want to learn a new language. Instead, they want SALT tag plug-ins for existing Web-based applications. According to Intervoice, the argument may be resolved through tools such as its Invision, which allows a developer to automatically generate VXML and to possibly generate SALT code in the future..." See details in "Microsoft Enhances Support for Speech Application Language Tags (SALT)."
[July 12, 2003] "Microsoft Plans Web Services Spec Meeting. WS-ReliableMessaging to be Scrutinized." By Paul Krill. In InfoWorld (July 11, 2003). "Microsoft on Tuesday at its Redmond, Wash., campus plans to hold a summit meeting to seek input on the Web Services Reliable Messaging (WS-ReliableMessaging) specification for Web services-based integration, according to sources familiar with Microsoft's plans... Microsoft and co-developers IBM, Tibco, and BEA Systems in March announced publication of WS-ReliableMessaging. It provides a protocol whereby Web services messages that are undelivered or duplicated can be detected, while messages that are received can be processed in the order in which they were sent. The specification has yet to be submitted to an industrywide standards organization, such as OASIS or the World Wide Web Consortium (W3C), which have been reviewing a wide array of Web services-related proposals. Sun Microsystems, for one, has been critical of IBM and Microsoft for what Sun claims have been attempts to promote in-house specifications as industry standards without going to a standards body... The issue was brought up again during a panel session at the Catalyst conference on Friday, when Microsoft and IBM officials were asked why they have not submitted the proposal to a standards body. But the officials defended their stance. 'We have a process we go through. In that particular specification, we're currently getting feedback, and at the right time we'll make it happen,' said Angel Luis Diaz, manager of Web services product management at IBM. No decision has been made on which standards body would receive the specification for consideration as an industry standard, Diaz said..." A 2003-07-06 message from Colleen Evans clarifies the purpose of the July 15, 2003 WS-ReliableMessaging workshop. 
Note: the Reliable Messaging Specification Workshop Agreement posted to the OASIS WSRM TC clarifies the intent of the WS-ReliableMessaging authors to provide RF license(s) for the spec when it is submitted to a standards body. The announcement says, in part: "Authors of the WS-Reliable Messaging (March 13, 2003) specification (the 'Specification') are hosting a one-day meeting on July 15, 2003 to discuss the Specification and general thoughts about the problem spaces addressed. This Workshop is an ad-hoc, open forum for 1) Specification authors to share background information on the design of the specification and to receive feedback and 2) software vendors to discuss their ideas about the practicality of implementing the Specification. The authors of the Specification intend to submit a revised version of the Specification to a standards body with a commitment to grant a royalty-free license to their necessary patents. We need assurance that your feedback and discussions are consistent with that goal..." See the posting of Colleen Evans via Mark D. Hansen. References: (1) "New Web Services Specifications for Reliable Messaging and Addressing"; (2) OASIS Web Services Reliable Messaging TC; (3) "Reliable Messaging." [ZIP source, cache]
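The core mechanism the specification describes, tagging each message with a sequence number so a receiver can detect duplicates and release messages in the order sent, can be sketched in a few lines of Python. This is an illustrative model only: the actual WS-ReliableMessaging protocol exchanges SOAP headers (Sequence, AckRequested, SequenceAcknowledgement) between endpoints, and the class and method names below are invented for the sketch.

```python
class ReliableReceiver:
    """Toy model of duplicate detection and in-order delivery via
    sequence numbers. Illustrative only -- not the WS-ReliableMessaging
    wire protocol, which defines SOAP headers rather than this API."""

    def __init__(self):
        self.next_expected = 1      # next sequence number to deliver
        self.buffer = {}            # out-of-order messages held back
        self.delivered = []         # messages released to the application

    def receive(self, seq, payload):
        # Duplicate detection: drop anything already delivered or buffered.
        if seq < self.next_expected or seq in self.buffer:
            return
        self.buffer[seq] = payload
        # In-order delivery: release every consecutive message we now hold.
        while self.next_expected in self.buffer:
            self.delivered.append(self.buffer.pop(self.next_expected))
            self.next_expected += 1

    def ack(self):
        # Cumulative acknowledgement: highest contiguous sequence received.
        return self.next_expected - 1

rx = ReliableReceiver()
for seq, msg in [(1, "a"), (3, "c"), (1, "a"), (2, "b")]:  # gap, duplicate
    rx.receive(seq, msg)
print(rx.delivered)  # -> ['a', 'b', 'c']
print(rx.ack())      # -> 3
```

Message 3 arrives before message 2 and message 1 arrives twice, yet the application sees each message exactly once and in order, which is the guarantee the specification's authors describe.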
[July 12, 2003] "Provisioning the Future at the Catalyst Conference." By Anne Chen. In eWEEK (July 11, 2003). "The Catalyst Conference, which focused solely on directory services just four years ago when I attended my first one, has become the place for identity management announcements and demonstrations of specifications and their interoperability. The conference, held in San Francisco this year, did not disappoint. While identity management was the headliner at this week's conference, the demonstration of the OASIS Service Provisioning Markup Language (SPML) could have a larger impact on enterprises in the near future... At the conference, ten members of OASIS publicly demonstrated the interoperability between security software products using SPML, as well as the specification's stability, for the first time... SPML is the XML-based framework for exchanging and administering user access rights and resource information across heterogeneous environments. In essence, the specification is designed to create interoperability across provisioning systems so they can talk to one another. Other applications could also use the specification to request provisioning commands... If you're having trouble selling provisioning to your CIO, consider the method used by Tom King, chief information security officer at Lehman Bros. King, who presented a case study on provisioning, said he manages the log-ins for 15,000 employees with 230,000 systems accounts. In the past, it cost Lehman Bros. $33 every time an account had to be changed. So although the costs of deploying a provisioning system were high, he said it only took one statement to get his CIO to sign off on the project: The ability to know what each employee is accessing is priceless..." See "OASIS Member Companies Host SPML Identity Management Interoperability Event."
[July 12, 2003] "Security This Week: Identity, Privacy, And Spam 'Sucker' Lines. IBM, BMC Bring Security Technologies to the Fore." By Paul Roberts. In InfoWorld (July 11, 2003). "With the U.S. markets in low gear following the July 4 holiday, this week saw empty cubicles, crowded beaches and a hodgepodge of IT security news. The Burton Group Catalyst Conference in San Francisco brought a spate of announcements from companies operating in the hot area of identity management and data privacy... Thor Technologies announced the availability of Xellerate adapters for Oracle's Internet Directory, a user identity repository for the Oracle9i platform. The adapters will enable companies using Oracle products such as Oracle9i Application Server, E-Business Suite or Collaboration Suite together with third party products to synchronize user identity and resource provisioning among the different applications, Thor said... IBM used the Catalyst show to introduce EPAL, the Enterprise Privacy Authorization Language, an XML (Extensible Markup Language) derivative that translates business-to-customer privacy protections for applications and databases within an enterprise... BMC Software unveiled enhancements to Control-SA, the Houston company's user-provisioning product. Control-SA will now support user provisioning based on the SPML (Service Provisioning Markup Language) open standard, developed by the Organization for the Advancement of Structured Information Standards, BMC said. SPML enables different organizations using heterogeneous provisioning products within a supply chain to provision employees on each others' systems. In addition to supporting SPML, BMC announced a new interface for the Control-SA product that allows the product to take better advantage of LDAP (Lightweight Directory Access Protocol) connectivity. The changes will allow customers to use LDAP to access identity and security information managed by Control-SA, BMC said.
Also on the identity management front, Entrust on Tuesday pulled the covers off Entrust GetAccess 7.0, the access control component of Entrust's Secure Identity Management Solution software. The new release performs at double the speed of earlier GetAccess releases and offers streamlined deployment and management..." See: (1) "BMC Software Enhances Provisioning Solution. Marked By Industry First, Enhancements Ease Provisioning Management Challenges."; (2) "IBM Releases Updated Enterprise Privacy Authorization Language (EPAL) Specification."
[July 11, 2003] "Microsoft Releases Speech Server Beta. Company Looks to Integrate Call Centers With Web." By Stephen Lawson. In InfoWorld (July 10, 2003). "Microsoft on Wednesday moved toward the integration of call centers and the Web with the release of the first public beta of its Microsoft Speech Server and a new beta version of its Speech Application Software Development Kit (SDK). The software platform is designed to host voice-based services similarly to the way Web servers host a company's Web site, as well as supporting 'multimodal' applications that take advantage of both voice and Web interfaces. It is based on SALT (Speech Application Language Tags), an extension of current scripting languages including HTML (Hypertext Markup Language) and XML. Companies that need call centers can cut costs by automating them on the server, said Xuedong Huang, general manager for Microsoft speech technologies. Among other things, the server can interpret callers' requests and provide recorded or synthesized responses. Developers also can integrate the voice-based services with Web-based applications that can continue to run on a Web server as they do now. For example, a caller could ask for a stock quote verbally and have it displayed on a handheld device, he said. The beta version of the server can deliver voice-only services to a wired phone and multimodal services to any device with a screen that uses either a wired or an IEEE 802.11 wireless LAN connection to the server. Other wireless technologies will be supported later, Huang said. The software includes a speech recognition engine for handling users' speech inputs and a prompt engine to bring up prerecorded prompts from a database to play for users. It also has a text-to-speech engine that can synthesize audible prompts from a text string when a prerecorded prompt is not available. In addition, it has a SALT Interpreter and other components to support services to callers...
The SALT Forum has submitted SALT 1.0 as a specification to the World Wide Web Consortium (W3C). The group has more than 70 members, including founding members Microsoft, Cisco Systems, Intel, Philips Electronics, SpeechWorks International and Comverse, Huang said. SALT is a more lightweight extension of current markup languages than is Voice XML, a specification being used by many voice-based services developers today, according to Mark Plakias, an analyst at Zelos Group Inc., in San Francisco. As a result, it allows companies to draw upon a larger pool of developers than does Voice XML, which is more familiar to developers of traditional IVR (interactive voice response) systems, he said..." See details in "Microsoft Enhances Support for Speech Application Language Tags (SALT)."
[July 11, 2003] "Meeting the Challenge: Web Services Technologies and Business Value." By Bob Sutor (IBM, Director of WebSphere Infrastructure Software). From WSReview.com (July 10, 2003). Focus: XML. ['Believe it or not, Web services can be defined without mentioning specific operating systems, programming languages, tools, databases or CRM systems. This article describes the business requirements that Web services are being designed to meet, and it may uncover places in your business where Web services are now or will soon be applicable.'] "... First of all, we start with an application that you want others to use. That is, you have a piece of software that initiates or accepts business transactions, provides or updates enterprise information, or perhaps manages the very systems and processes that make your business run. You may want to make this accessible to people in other parts of your organization, or a business partner, or a supplier, or a customer. We're really thinking here about software-to-software communication rather than the person-sitting-at-a-browser-talking-to-server-software situation, though it turns out that Web services can be used there as well... Web services reduces the complexity by bringing down the number of choices to a few manageable, best-of-breed technologies that can be used over and over again in different combinations. This means less training for your IT staff, greater efficiency in building new applications and significantly greater and faster interoperability... As Web services get increasingly deployed as part of what IBM calls 'e-business on demand,' we'll see greater automation and software that can dynamically locate and use the services they need. One of the best examples I can think of for this is that I may maintain a registry of my known suppliers with whom I've negotiated deals. The registry contains descriptions of Web services provided by my suppliers that allow me to request quotes as well as place and track orders. 
With this in place, I can automate a large portion of my procurement process, including negotiating with multiple suppliers to get what I need at the best price. It's easy to add new suppliers to such a system... Web services provides standards for an electronic envelope, a language for describing how you talk to a service and what it says back, plus techniques for publishing and discovering these descriptions. These features alone provide a tremendous amount of leverage and efficiency. Via these we can simplify the software development process, the communication infrastructure and the maintenance of our systems. The standards for these are well underway and some products like IBM's WebSphere are already in their second generation of support for them..."
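The supplier-registry scenario Sutor describes reduces to a discover-and-invoke loop: look up each known supplier's quote service, then order from the cheapest. The sketch below stands in for the registry and the quote services with plain Python callables; in a real deployment these would be registry lookups (e.g., UDDI), WSDL service descriptions, and SOAP calls, and all the names and prices here are invented for illustration.

```python
# Toy model of the procurement scenario: a registry maps each known
# supplier to a callable standing in for its "request a quote" Web service.
registry = {
    "acme": lambda item, qty: 1.10 * qty,     # stand-in quote services
    "globex": lambda item, qty: 0.95 * qty,
    "initech": lambda item, qty: 1.25 * qty,
}

def procure(item, qty):
    """Ask every registered supplier for a quote; order from the cheapest."""
    quotes = {name: quote(item, qty) for name, quote in registry.items()}
    best = min(quotes, key=quotes.get)
    return best, quotes[best]

supplier, price = procure("widget", 100)
print(supplier, price)  # -> globex 95.0
```

Because suppliers are just registry entries, adding a new negotiated supplier is a one-line change, which is the "easy to add new suppliers" property the article emphasizes.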
[July 11, 2003] "Computer Associates to Hand Off Open Security Exchange to IEEE. Move Makes Organization More Public, Company Says." By Paul Roberts. In InfoWorld (July 10, 2003). "Three months after launching a cross-industry group to develop standards for integrating physical and information technology (IT) security, Computer Associates International is handing off management of that group to the Industry Standards and Technology Organization (ISTO). The ISTO, which was spun off of the Institute of Electrical and Electronics Engineers Inc. (IEEE) in 1999, will assume administrative control of the Open Security Exchange (OSE), providing staff and resources to manage the finances and logistics of the group, according to Greg Kohn, director of industry programs at ISTO... CA has faced criticism over the makeup of the OSE. Detractors complained that the absence of any other software companies in the group made the OSE little more than a CA partnership program rather than an independent industry standards group. Speaking on Wednesday, Moritz acknowledged those criticisms. 'By moving (OSE) under the IEEE we're getting an acknowledgement that OSE is more broad than OPSEC (Open Platform for Security partner program) from Check Point -- that it's a broad market initiative and not just a CA thing,' he said. Under IEEE-ISTO guidance, software companies with an interest in participating can join the OSE effort, as well as hardware and physical control companies and enterprises with an interest in investing in the technology produced from OSE standards, Moritz said. IEEE-ISTO will help attract new members by being a central reference point for questions about the group and by helping with outreach, Kohn said... CA and OSE members scouted out various standards organizations before deciding to hand over control of the OSE to the IEEE-ISTO, Moritz said.
The Organization for the Advancement of Structured Information Standards (OASIS) and World Wide Web Consortium (W3C) were both considered, he said. IEEE-ISTO emerged as the best fit, Moritz said. The group's unique mission and legal status makes the IEEE-ISTO attractive to corporations that want to work on developing industry standards, according to Kohn. Unlike OASIS or the W3C, IEEE-ISTO takes a more hands-off approach to managing its standards groups, allowing them to set their own membership rules, organizational structure and timetable for delivering specifications. Other organizations are more likely to impose their own structure on member groups, he said..." See: OSE website.
[July 10, 2003] "New Group Releases Open Source Groupware. Project Called a Microsoft Exchange Replacement." By Paul Krill. In InfoWorld (July 10, 2003). "OpenGroupware.org (OGo) on Thursday [2003-07-10] announced its formation and the release of groupware server software under open source licenses. An official at OpenOffice.org, a complementary project for building an open source office suite, calls the OGo project a replacement for Microsoft Exchange. OGo's free software provides server components for office collaboration with the OpenOffice.org suite and various Linux and Windows groupware clients, according to OGo, whose software runs on Linux and Solaris. The groupware provides document-sharing capabilities for OpenOffice.org documents and enables collaboration between users of packages such as Microsoft Outlook, Ximian Evolution, Mozilla Calendar, and Glow, which is OpenOffice.org Groupware Project's client product... [Team lead Gary] Frederick in his statement said OGo is offering users a free solution for collaboration and document management that will 'far surpass the quality and level of collaboration found in Windows [through integration of Microsoft Office, Exchange Server, and SharePoint.]' OGo is dual-licensed under the open source Lesser General Public License (LGPL) and General Public License (GPL). OGo supports XML-based APIs, including XML-RPC. OGo uses a WebDAV-accessible relational database management system to make document storage accessible from the OpenOffice.org office suite, OGo said. OGo seeks to complement OpenOffice.org, which is building an international office suite that will run on major platforms and provide access to functionality and data through open-component based APIs and an XML-based file format, according to information culled from the two groups' Web sites..." See details in the news story "OpenGroupware.org Announces Open Source Project for Groupware Server Software."
[July 10, 2003] "Identity Management Front and Center At Catalyst." By Gregg Keizer. In TechWeb News (July 10, 2003). "Identity management got a series of shots in the arm on Tuesday as several companies made announcements at the Burton Group's Catalyst conference in San Francisco. Novell, which has been pushing hard into the identity management arena as it's seen its network software business languish, released Nsure Audit, a tracking application that companies can use to monitor all user log-in activity on a network, and the capping piece of its Nsure identity management platform... On the same playing field, Microsoft, IBM, and others used the Catalyst platform to announce a new Web services security specification. Called WS-Federation, the specification is the latest in a series of moves going back to 2002 in which the two companies have led efforts to define security, identity management, and trust standards in Web services... [Burton Group's Jamie] Lewis characterizes WS-Federation as a general purpose security specification, one that provides a 'way to exchange security tokens of any kind.' Products that comply with the spec, he said, will be able to authenticate users in transactions between different companies, business partners, for instance, which use different types of identity and authentication processes... The Liberty Alliance, another player in the security and identity management market, also made news Tuesday by releasing a set of guidelines it said would further the adoption of federated identity. The guidelines, which are available from the Liberty Alliance Web site, include a list of business requirements that the Alliance, which, like Microsoft and IBM, has offered up several Web services security and identity management specifications, believes essential before federated identity can bear fruit... Novell also announced it has posted a free SAML extension to its iChain identity management service.
The extension can be downloaded from Novell's Web site..."
[July 10, 2003] "A Survey of APIs and Techniques for Processing XML." By Dare Obasanjo. From XML.com (July 09, 2003). "In recent times the landscape of APIs and techniques for processing XML has been reinvented as developers and designers learn from their experiences and some past mistakes. APIs such as DOM and SAX, which used to be the bread and butter of XML APIs, are giving way to new models of examining and processing XML. Although some of these techniques have become widespread among developers who primarily work with XML, they are still unknown to most developers... This article provides an overview of the current landscape of techniques for processing XML and runs the gamut from discussing old mainstays, such as push model APIs and tree model APIs as exemplified by SAX and DOM, to newer participants in the XML world such as cursor APIs and pull model parsers as exemplified by the .NET Framework's XPathNavigator and the XmlPull API respectively... In a push model the XML producer (typically an XML parser) controls the pace of the application and informs the XML consumer when certain events occur. The classic example of this is the SAX API, where the XML consumer registers callbacks with the SAX parser, which invokes the callbacks as various parts of the XML document are seen. The primary advantage of push model APIs when processing XML is that the entire XML document does not need to be stored in memory, only the information about the node currently being processed is needed. This makes it possible to process large XML documents which can range from several megabytes to a few gigabytes in size without incurring massive memory costs to the application. During pull model processing, the consumer of XML controls the program flow by requesting events from the XML producer as needed instead of waiting on events to be sent to it.... 
Like push model parsers, pull model XML parsers operate in a forward-only, streaming fashion while only showing information about a single node at any given time. This makes pull-based processing of XML as memory efficient as push-based processing but with a programming model that is more familiar to the average programmer... A tree-based API is an object model that represents an XML document as a tree of nodes. The object model consists of objects that map to various concepts from the XML 1.0 recommendation such as elements, attributes, processing instructions and comments. Such APIs provide mechanisms for loading, saving, accessing, querying, modifying, and deleting nodes from an XML document. XML cursors are the newest class of APIs for processing XML. An XML cursor acts like a lens that focuses on one XML node at a time, but, unlike pull-based or push-based APIs, the cursor can be positioned anywhere along the XML document at any given time... Just like tree model APIs, an XML cursor allows one to navigate, query, and manipulate an XML document loaded in memory. However, an XML cursor does not require the heavyweight interface of a traditional tree model API, where every significant token in the underlying XML must map to an object... Object to XML Mapping APIs [are used because] it is often convenient to map the contents of an XML document to objects that better represent the data within the XML document than interacting with the data via an XML-based object model. Developers working in object oriented languages typically prefer working with information as objects and classes as opposed to attempting to extract information from untyped XML nodes... This article shows that processing XML isn't simply a choice of in-memory (DOM) versus streaming (SAX). Rather, it's a tradeoff between a number of choices with lots of small nuances..."
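The distinction the article draws between push (producer-driven callbacks), pull (consumer-driven streaming), and tree (in-memory object model) processing maps directly onto Python's standard library, which ships one API of roughly each kind. The sketch below uses SAX for push, iterparse for pull (an approximation of a true XmlPull-style parser, since it yields events the consumer requests on demand), and ElementTree for the tree model:

```python
import io
import xml.sax
import xml.etree.ElementTree as ET

DOC = "<order><item>pen</item><item>ink</item></order>"

# Push model: the parser drives, invoking our registered callbacks (SAX).
class ItemCounter(xml.sax.ContentHandler):
    def __init__(self):
        self.count = 0
    def startElement(self, name, attrs):
        if name == "item":
            self.count += 1

handler = ItemCounter()
xml.sax.parseString(DOC.encode(), handler)
print(handler.count)  # -> 2

# Pull model: the consumer drives, requesting events as needed (iterparse).
items = [el.text for ev, el in ET.iterparse(io.StringIO(DOC))
         if el.tag == "item"]
print(items)  # -> ['pen', 'ink']

# Tree model: whole document in memory, randomly navigable (ElementTree).
root = ET.fromstring(DOC)
print(root.find("item").text)  # -> pen
```

The push and pull versions never need the whole document in memory, which is the streaming advantage the article describes; the tree version pays the memory cost but allows arbitrary navigation and modification after parsing.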
[July 10, 2003] "Liberty Alliance Cites Business Issues in Network Identity. Group Publishes Guidelines." By Paul Krill. In InfoWorld (July 10, 2003). "The Liberty Alliance, in a document published this week, is emphasizing that business issues, not just technology, affect federated network identity. Released at the Catalyst conference in San Francisco, Liberty Alliance in its Business Guidelines document cites four business requirements to consider in the context of identity federation... Laws such as the Sarbanes-Oxley regulations pertaining to public companies and audit trails also factor into the equation with network identity, said Eric Norlin, an editor of the Business Guidelines document and director of communications at Ping Identity, in Denver. "Obviously, the game and the stakes are getting greater and greater now with the legislative requirements," Norlin said. Liberty Alliance plans to introduce future documents pertaining to major business issues and information sources to guide federated identity implementations in vertical industries such as health care and financial services. The next set of documents is due in late 2003, according to Liberty Alliance. The organization, which features member companies such as Hewlett-Packard, Sun Microsystems, SAP and American Express, focuses on developing specifications for federated identity, which involves the verification of identities of parties in e-commerce across the Internet. The real value of Web services will not be reached until companies can more securely and efficiently manage trusted relationships among partners, according to Liberty Alliance... Liberty Alliance on Tuesday also said that the Financial Services Technology Consortium (FSTC), in a report, concluded that the Liberty Alliance specifications and the OASIS Security Assertion Markup Language (SAML) proposal provide financial institutions with a standardized way to extend trusted relationships with customers and employees to third parties..."
See details in "Liberty Alliance Publishes Business Requirements and Guidelines for Identity Federation."
[July 10, 2003] "IBM, Microsoft Publish Web Services Identity Specification." By John Fontana. In Network World (July 08, 2003). "IBM and Microsoft on Tuesday published the fifth of an eventual seven specifications that will work in unison to help corporations deploy secure and interoperable Web services. The new WS-Federation specification is designed to standardize the way companies share user and machine identities among disparate authentication and authorization systems spread across corporate boundaries. RSA Security, BEA Systems and VeriSign helped the two vendors develop the specification. WS-Federation is the latest milestone in a roadmap IBM and Microsoft unveiled in April 2002 that introduced WS-Security as a foundation security protocol and six supporting protocols for building a Web services security framework... WS-Federation is the fifth published specification of the seven security protocols described in the roadmap. The last two, WS-Privacy and WS-Authorization, are due by the end of the year, according to IBM officials. The WS-Federation specification has three functional parts, including the Web Services Federation Language, which defines how different security realms broker identities, user attributes and authentication between Web services. The WS-Federation specification also includes Passive Requestor Profile, which describes how federation helps provide identity services to Web browsers and Web-enabled cell phones and devices; and Active Requestor Profile, which does the same for applications based on the Simple Object Access Protocol (SOAP) and other smart clients... IBM and Microsoft on Wednesday will present an interoperability demonstration using WS-Federation at this week's Burton Group Catalyst Conference in San Francisco. 
The demonstration will knit together an IBM identity platform including IBM's Tivoli Access Manager, IBM Directory Server and IBM WebSphere Portal -- all running on Linux with a Microsoft-centric identity platform, including Active Directory, BizTalk Server and .Net Framework... IBM and Microsoft hope to collect feedback from end users and independent software developers before eventually submitting the protocol to a standards body. The two said the process would follow a similar path as WS-Security, which went from published specification to OASIS submission in five months. IBM believes the WS-Security specifications, including WS-Federation, are beginning to line up with work being done by the Liberty Alliance, which is focused on creating a framework for federated identity management. The Liberty Alliance and the IBM/Microsoft juggernaut, however, remain on separate development paths, although the Liberty Alliance has incorporated WS-Security into its second-generation specification. Experts say Liberty and IBM/Microsoft need to combine forces at some point to create a single identity management framework. Neither IBM nor Microsoft are members of the Liberty Alliance..." See details in "Web Services Federation Language Provides Federated Identity Mapping Mechanisms."
[July 10, 2003] "Liberty Answers Microsoft Evangelism." By Gavin Clarke. In Computer Business Review Online (July 10, 2003). "The Liberty Alliance Project has insisted its specifications for federated network identity are modular and flexible, contrary to the claims of Microsoft Corp... Representatives for the organization spoke after IBM and Microsoft spent Wednesday morning evangelizing for WS-Federation, the latest milestone in the companies' WS- roadmap for secure, reliable web services messaging and identity. IBM and Microsoft demonstrated interoperability of WS-Federation at the Burton Group's Catalyst Conference in San Francisco, California... WS-Federation is part of a Microsoft and IBM roadmap that includes WS-Security, WS-Trust, WS-ReliableMessaging, WS-Transaction, WS-Coordination and WS-Policy. The companies have developed specifications, some of which still await submission to standards groups, with partners like Verisign Inc... Commenting on delivery of WS-Federation, Microsoft XML architect John Shewchuk told Catalyst delegates the companies had completed 'all the major pieces to build secure, reliable communications' for federated identity-based systems. He said the WS- specifications provide flexibility because they will work with customers' existing security assertions including SAML, Kerberos, X509, XrML, custom tokens, passwords and user names. Liberty, he pointed out, is based only on SAML. The WS- specifications also use multiple transport mechanisms, while Liberty uses Simple Object Access Protocol (SOAP). As such, Microsoft and IBM believe Liberty is a 'single solution' that should become an application leveraging the WS- roadmap, while the WS- technologies utilize Liberty's work on business and user requirements.
Countering Microsoft, though, Liberty pointed out that while it is based on SAML, SAML assertions can be written by Liberty, customers, or ISVs themselves to support existing security mechanisms such as Kerberos, PKI and passwords. Additionally, Liberty said there are no restrictions in its architecture that prevent the use of SMTP or any other transport mechanisms beyond the currently supported SOAP. 'We provide a modular and extensible architecture,' said Liberty Alliance board member Larry Abrahams. He added that Liberty's specifications were developed in response to the demands of customers -- a strong constituency in Liberty's membership -- to help ensure vendors who adopt the specifications are able to interoperate with each other... 'The whole point of Liberty is protocol and standards interoperability. It will let companies choose from vendors [products] with a greater degree of certainty that the specifications interoperate,' Abrahams said..." See: (1) "Liberty Alliance Publishes Business Requirements and Guidelines for Identity Federation"; (2) "Web Services Federation Language Provides Federated Identity Mapping Mechanisms."
[July 09, 2003] "Transclusion with XSLT 2.0." By Bob DuCharme. From XML.com (July 09, 2003). "Transclusion is a hypertext concept that began in the work of Ted Nelson, who coined the term 'hypertext'. Roughly speaking, transclusion is the inclusion of a resource, or part of a resource, potentially from anywhere in the world, within a new one. For example, the HTML img element is a form of transclusion. Nelson envisioned dynamic compound documents consisting entirely of pointers to pieces of other documents, with the compound ones automatically reflecting updates to the transcluded pieces. As he wrote in his book Literary Machines 93.1: 'Transclusion will be a fundamental service of tomorrow's literary computers, and a property of the documents they will supply. Transclusion means that part of a document may be in several places -- in other documents beside the original -- without actually being copied there.' Of course, at some level, information from one server must be copied from the transcluded document to show up on the screen of the user viewing the transcluding document; but if the copy happens at read time, every time, a compound document will still have the dynamic nature that Nelson envisioned. Has transclusion been implemented in any widely-used web technology? Transclusion-like capabilities are specified in the XInclude Candidate Recommendation, but this spec tells us that 'Simple information item inclusion as described in this specification differs from transclusion, which preserves contextual information such as style.' Even then, no XInclude implementation that I know of allows the inclusion of portions of other documents; utilities such as XIncluder only allow for the transclusion of entire documents. The XLink actuate="onLoad"/show="embed" combination describes something similar, but to my knowledge browser support for it is nonexistent... 
The Simplified Stylesheet Modules feature of XSLT 2.0 lets you embed an XSLT template rule in a document; if the embedded template rule calls XSLT 1.0's document() function and passes it the name of the document to transclude, the input document passed to the XSLT processor can be a dummy document... When the XSLT processors built into the major browsers implement XSLT 2.0 as they now implement 1.0, we'll have a lot of great new possibilities to work with, including the ability to include a remote document..." See related references in "XML Linking Language."
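Absent an XSLT 2.0 processor, the read-time transclusion idea DuCharme describes can be approximated in Python: re-fetch the source document on every render and splice the selected fragment into the compound document, so updates to the source show through automatically, per Nelson's model. This is a rough sketch under invented names (the `transclude` placeholder element and `fetch` callable are assumptions for illustration), not the article's actual mechanism, which is XSLT's document() function inside a simplified stylesheet module.

```python
import xml.etree.ElementTree as ET

# The "remote" source document; in practice this would be fetched by URL,
# freshly on every render, so the compound document tracks updates.
SOURCE = "<report><summary>Q2 sales rose 8%.</summary><detail>...</detail></report>"

def transclude(template_xml, fetch):
    """Replace each <transclude select='tag'/> placeholder with the
    matching element from a freshly fetched copy of the source."""
    compound = ET.fromstring(template_xml)
    source = ET.fromstring(fetch())          # re-read at read time
    for parent in list(compound.iter()):
        for i, child in enumerate(list(parent)):
            if child.tag == "transclude":
                parent[i] = source.find(child.get("select"))
    return ET.tostring(compound, encoding="unicode")

template = "<page><h1>Update</h1><transclude select='summary'/></page>"
print(transclude(template, lambda: SOURCE))
# -> <page><h1>Update</h1><summary>Q2 sales rose 8%.</summary></page>
```

Because the fragment is copied at read time rather than at authoring time, editing SOURCE changes what every compound document displays on its next render, which is the dynamic property Nelson's transclusion calls for.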
[July 09, 2003] "O'Reilly Gazes Into the Future of Open Source." By Peter Galli. In eWEEK (July 09, 2003). "There has been an enormous paradigm shift around open source, Tim O'Reilly, president of O'Reilly & Associates Inc., said in a keynote address here on Wednesday. Addressing hundreds of attendees at the O'Reilly Open Source Convention (Oscon) here, he said that the shift started when IBM introduced the first PC, with change in the PC industry now driven by low-cost hardware and the commodity model. Software has also become decoupled from the hardware, resulting in a power shift in the PC industry toward software firms and seeing Microsoft emerge as the most powerful company in the computer industry, he said. The new rules governing the Internet paradigm shift are based on the fact that an open architecture inevitably leads to interchangeable parts; competitive advantage and revenue opportunities move 'up the stack' to services above the level of a single device; information applications are decoupled from both hardware and software; and lock-in is based on data and not on proprietary software, he said. 'The deep trends shaping the future of all software can be summarized by three C's: software is becoming a commodity, it is being customized by users, and we are seeing network-enabled collaboration,' O'Reilly said... In his keynote address, Paul Buck, the director of IBM's Eclipse development, said Eclipse is a response to the fact that tools from different companies have not traditionally worked well together -- in fact tools from the same company often have not worked well together. 'We were as guilty of that as anyone, and realized that developers have better things to do than integrate tool sets,' he said. The goal for Eclipse, which IBM describes as an open-source tool framework for the enterprise, is to be a highly extensible platform with out-of-the-box solutions that allow developers to start building applications.
Eclipse is platform-centric rather than tool-centric, and gives users more control by allowing existing tools to be integrated seamlessly and new ones to be added..."
[July 09, 2003] "OASIS Takes the Wraps Off SPML." By Michael Singer. In InternetNews.com (July 09, 2003). "With a couple of keystrokes, a new employee is added to an HR system at one company. But in a simultaneous message blast, that same employee gets LDAP and other accounts established at several other companies. While the technique is currently used through the wisdom of proprietary agents, adaptors and connectors, an industry standards consortium Tuesday showed how it could be done using only an open standards-based variation of XML. Members of OASIS at the Burton Group's Catalyst conference demonstrated the interoperability of service provisioning using its latest specification: Service Provisioning Markup Language (SPML). The technology was designed to work with the World Wide Web Consortium's SOAP, the OASIS Standard SAML, the OASIS WS-Security specification, and other open standards. Championed by companies like BMC Software, PeopleSoft, Novell, BEA, Business Layers, Entrust, OpenNetwork, Waveset, Thor Technologies, TruLogica, and Sun Microsystems, the spec lets companies that wouldn't normally talk with each other provide external access to sensitive data and corporate systems while maintaining secure, federated identity management. 'IT management is in the buy cycle and at the core is the identity. The question is how does the identity get there and how do I subscribe to that identity,' OASIS member and SPML chairman Darran Rolls told internetnews.com. 'SPML is very much about managing accounts or subscriptions and this provides a standards-based model.' By comparison, previous protocols sometimes made communication difficult because each vendor had to come up with its own way of translating the documents. 
In addition to HR departments being able to coordinate with outside vendors, like a 401k management firm, Rolls says SPML could easily provide similar services for the financial community, airlines, and ISPs to name a few..." See details in "OASIS Member Companies Host SPML Identity Management Interoperability Event."
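The provisioning exchange described in the article is easy to picture as markup. A sketch in Python's standard library, with element and attribute names invented for illustration — the real schema is defined by the OASIS SPML specification, not here:

```python
# Illustrative only: a provisioning request in the general shape the article
# describes (add an account for a new hire). Element names are invented,
# not the normative SPML schema.
import xml.etree.ElementTree as ET

req = ET.Element('addRequest')
ident = ET.SubElement(req, 'identifier', {'type': 'EMAIL'})
ET.SubElement(ident, 'id').text = 'new.hire@example.com'   # hypothetical new employee

# Attributes the receiving system (an LDAP directory, a 401k provider, ...)
# would use to establish the account.
attrs = ET.SubElement(req, 'attributes')
for name, value in [('cn', 'New Hire'), ('system', 'LDAP')]:
    attr = ET.SubElement(attrs, 'attr', {'name': name})
    attr.text = value

request_xml = ET.tostring(req, encoding='unicode')
print(request_xml)
```

The point of SPML is precisely that both ends agree on one such shape in advance, so no per-vendor translation layer is needed.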
[July 09, 2003] "Microsoft Revives TrustBridge for Web Services Role." By John Fontana. In Network World (July 09, 2003). "Microsoft is dusting off its year-old and mostly forgotten TrustBridge technology and recasting it as middleware to support federation of identities across disparate platforms, company officials said Wednesday. Microsoft said at the annual Burton Group Catalyst Conference that TrustBridge will become a security server capable of producing a user authentication and authorization token in a variety of formats. It will also facilitate the sharing of that token across corporate boundaries. The server is a key part of Microsoft's effort to create an identity management framework that will work across disparate platforms. Just last week, the company unveiled its retooled meta-directory server, Microsoft Identity Integration Server, and released its Active Directory/Application Mode directory as part of its push into the identity management market... Despite Microsoft's newfound exuberance toward federated identity, it is still mum on how it might integrate with the Liberty Alliance, a group that is creating a framework for federated identity management... Microsoft plans to release TrustBridge around the time it ships the Longhorn version of the Windows operating system in 2005 or 2006. TrustBridge will require software on both the server and client side that supports WS-Security, an emerging Web services security protocol. Microsoft officials still would not confirm reports that they will have a server version of Longhorn. The company also would not say how TrustBridge would be packaged and licensed. TrustBridge was introduced more than a year ago as a bridge for sharing Kerberos authentication tickets among business partners. At the time, critics said the scope was too narrow, in that users had to have TrustBridge set up on both sides of the transaction and were required to use Kerberos. 
Microsoft said it has gone back to the drawing board with the advent of WS-Security and a number of supporting security protocols Microsoft has developed in conjunction with IBM, including WS-Policy and WS-Federation, which was announced this week... Another change in TrustBridge is the elimination of a proprietary technology Microsoft created to allow a message based on Simple Object Access Protocol (SOAP) to carry a security token in its header. The proprietary technology is being replaced by WS-Security, which allows security information to be attached to SOAP headers..." A news article by Michael Singer ("IBM-Microsoft Group Publishes WS-Federated Spec") quotes Microsoft director of Web services Marketing Steven Van Roekel as saying that "Microsoft is working on its own version of Liberty-like software running on top of the infrastructure. He said the platform would be a combination of its established Web services technologies like Passport, TrustBridge, and Windows Identity management software all working together under one yet-to-be-named umbrella brand." See "Web Services Security Specifications and TrustBridge" from MSDN Magazine (October 2002): "...In addition to this standards work, Microsoft is developing a set of technologies code-named 'TrustBridge,' which embrace WS-Security and other WS standards, and enable applications to use credentials created on a wide range of systems, including Active Directory, Microsoft .NET Passport, and other products that support WS-Security (for example, IBM middleware products). TrustBridge, which is planned to ship in 2003, will be able to federate with other enterprise or Internet-based authentication systems. This will make it possible for enterprises that choose to interoperate with the Passport system to allow their users to sign in with their existing credentials at Passport-participating sites. 
Microsoft development tools and the .NET Framework will add further support for these new interoperability standards so that developers can build applications that use WS-Security and related specifications from the WS-Security roadmap..." On WS-Federation: "Web Services Federation Language Provides Federated Identity Mapping Mechanisms."
[July 08, 2003] "Business Technology: Timing Is Everything In Making Jump To RFID." In InformationWeek (July 07, 2003). "'Now is the time. It's not too late. A year from now is too late.' The quotation comes from Mike DiYeso, chief operating officer of the Uniform Code Council, which 30 years ago began pushing universal product codes that have since become ubiquitous in things like scannable bar codes. DiYeso spoke those lines during a talk focusing on the rapidly emergent technology of RFID, or radio-frequency identification, at the recent Retail Systems Conference... I think it's just about impossible to overstate the business impact that RFID technologies could have if they're deployed thoughtfully and broadly. Efficiency, security, timeliness, paper reduction, accuracy, automation, visibility, tracking, load optimization, and collaboration are only the first of many areas that will be reshaped and probably dramatically improved via these tiny antenna-equipped chips. We first covered this new technology in September [2002] when researchers from the Auto-ID Center at MIT said that a combination of technological breakthroughs plus broader industry acceptance could push the unit price for these highly intelligent devices down to 5 cents or less..." Note: According to RFID Journal, "The Japanese government has allocated 950 to 956 MHz for RFID, paving the way for the global adoption of UHF tags for supply chain tracking. UHF is critical to the widespread adoption of RFID because it's the only frequency band that provides the extended read range needed to track goods in a supply chain setting. Most governments have already set aside 13.56 MHz for high-frequency RFID systems, which are suitable for applications where longer read ranges are not critical. But countries have not been able to harmonize the use of the UHF spectrum for RFID... 
The decision by the Japanese government is significant because it paves the way for a truly global system of tracking goods using UHF tags. The rest of Asia will likely follow in Japan's footsteps because Japan is such a powerful economic force in the region. China, in fact, is already looking to open up the UHF spectrum for RFID systems..." See: (1) "Microsoft Announces Commitment to Support Uniform Code Council In Commercialization of RFID Technology"; (2) "Physical Markup Language (PML) for Radio Frequency Identification (RFID)."
[July 08, 2003] "World Wide Web Consortium Process Document." Edited by Ian Jacobs (W3C). Produced by the W3C Advisory Board and reviewed by the W3C Members and Team. June 18, 2003. "The W3C Process Document describes the organizational structure of the W3C and the processes related to the responsibilities and functions they exercise to enable W3C to accomplish its mission. Much W3C work involves the standardization of Web technology. Working Groups carry out this work by building consensus around technical reports that become W3C Recommendations. This document describes processes for creating Recommendations, coordinating groups within W3C, and coordinating W3C work with that of other organizations. The Process Document formalizes the most important aspects of how W3C works to achieve such goals as: quality results, consensus about the results, responses to comments, fairness, and progress. Here is a general overview of what goes into developing a W3C Recommendation: (1) People generate interest in a particular topic (e.g., Web services). For instance, Members express interest in the form of Member Submissions, and the Team monitors work inside and outside of W3C for signs of interest. Also, W3C is likely to organize a workshop to bring people together to discuss topics that interest the W3C community. This was the case, for example, with Web services. (2) When there is enough interest in a topic (e.g., after a successful workshop and/or discussion on an Advisory Committee mailing list), the Director announces the development of a proposal for a new Activity or Working Group charter, depending on the breadth of the topic of interest. An Activity Proposal describes the scope, duration, and other characteristics of the intended work, and includes the charters of one or more Working Groups, Interest Groups, and possibly Coordination Groups to carry out the work... 
(3) There are four types of Working Group participants: the Chair, Member representatives, invited experts, and Team representatives. Team representatives both contribute to the technical work and help ensure the group's proper integration with the rest of W3C. The Working Group charter sets expectations about each group's deliverables (e.g., technical reports, test suites, and tutorials). (4) Working Groups generally create specifications and guidelines that undergo cycles of revision and review as they advance to W3C Recommendation status. The W3C process for producing these technical reports includes significant review by the Members and public, and requirements that the Working Group be able to show implementation and interoperability experience... The Process Document promotes the goals of quality and fairness in technical decisions by encouraging consensus, requiring reviews (by both Members and public) as part of the Recommendation Track, and through an appeal process for the Advisory Committee..." See the news item "W3C Process Document and Publication Rules Published": "The 18-June-2003 W3C Process Document is operative effective today... the document describes the structure and operations of the W3C. Among the changes are new document maturity levels, rules for amending Recommendations, and an enhanced liaison process for W3C work with partner organizations. The companion W3C Publication Rules have been updated and are public..."
[July 08, 2003] "Tools for the Code Generation." By Johanna Ambrosio. In Application Development Trends (July 01, 2003). "Model-Driven Architecture, or MDA, embodies the conundrum: Is the glass half full or is it half empty? Even though the MDA standard is still evolving, many products claim to be compliant with it and early adopters are developing apps with them. MDA vendors claim that today's products can generate between 40% and 80% of the completed code for a given app based on models created with UML, and customers and analysts back up those claims. MDA's purported benefits go beyond automatic code generation and the reduction of development costs, but those advantages are longer-term and most have yet to be proven outside of theoretical conversations. They include factors like eventual code and model reuse, and more effective fulfillment of user requirements. One advantage touted by the MDA camp is the ability to swap out underlying technologies -- OSs or languages, for example -- by simply revamping the platform-specific model and then regenerating the applications. Still, a split remains between current users of these products -- mostly architects who speak UML or another modeling language -- and the targeted group of developers who believe they can do a better job of writing apps than any code generator... Currently, there are at least 40 tools that incorporate at least one of the major aspects of MDA: UML-based modeling; transformation between the app's overall design models and the models that are specific to the underlying computing architecture (.NET, EJB and so on); and the generation of code in a specific language. Jon Siegel, the OMG's vice president of technology transfer, calls MDA adoption in the marketplace 'extremely enthusiastic,' and said there are another 'couple of dozen' products in the works. The OMG does not necessarily know about them all, he noted, because vendors need not license or pay to use OMG standards. 
Still, he said, MDA adoption has caught on faster than any other OMG standard... Bob Carasik, an enterprise architect at Wells Fargo in San Francisco, said that going with MDA is 'a matter of the project leadership making the decision to do so, and then making the tools available.' Wells Fargo is using MDA to convert interfaces written in CORBA to interfaces written in J2EE with XML messaging. Almost all the bank's apps use these interfaces, including mainframes, client apps that run various online apps and the corporate telephone response center. In the bank's case, in-house developers handcrafted the tools used to make the switch from CORBA middleware..." General references in "OMG Model Driven Architecture (MDA)."
[July 08, 2003] "CSS3 Basic User Interface Module." Edited by Tantek Çelik (Microsoft Corporation). W3C Working Draft 3-July-2003. Latest version URL: http://www.w3.org/TR/css3-ui. Last Call Working Draft produced by the W3C CSS Working Group. Public comments are invited through 31-July-2003. "CSS3 is a set of modules, divided up and profiled in order to simplify the specification, and to allow implementors the flexibility of supporting the particular modules appropriate for their implementations. This module describes selectors and CSS properties which enable authors to style user interface related states, element fragments, properties and values. Section 2.1 of CSS1 and Chapter 18 of CSS2 introduced several user interface related pseudo-classes, properties and values. Section 6.6.4 of Selectors also describes several additional user interface related pseudo-classes, and one pseudo-element. This working draft extends them to provide the ability, through CSS, to style elements based upon additional user interface states, to style fragments of user interface elements, and to alter the dynamic presentation of elements in ways previously only available through specific HTML4 elements and attributes... The working draft defines: (1) Pseudo-classes and pseudo-elements to style user interface states and element fragments respectively; (2) Additions to the user interface features in CSS Version 2.1; (3) The ability to style the appearance of various standard form elements in HTML4 and properties to augment or replace some remaining stylistic attributes in HTML Version 4; (4) Directional focus navigation properties; (5) A mechanism to allow the styling of elements as icons for accessibility... This document is a draft of one of the 'modules' for the upcoming CSS3 specification. It not only describes the user interface related properties and values that already exist in CSS1 and CSS2.1, but also proposes new properties and values for CSS3. 
The Working Group doesn't expect that all implementations of CSS3 will implement all properties or values. Instead, there will probably be a small number of variants of CSS3, so-called 'profiles'..." General references in "W3C Cascading Style Sheets."
[July 08, 2003] "Microsoft, IBM Advance Web Services Specification. Security is Focus of Proposal." By Ed Scannell. In InfoWorld (July 08, 2003). "Building on their previous efforts to create a framework for producing secure and interoperable Web services, IBM, Microsoft, and several other leading software companies will announce a specification intended to help corporate users simplify identity management. The proposed WS-Federation specification features a set of Web services technologies intended to give developers a standard way of adding security capabilities to any Web service they build. The specification defines mechanisms that allow developers to manage and establish trust relationships across companies and domains using a variety of different types of security solutions, including support for federated identities, according to company officials. 'This will let companies tie their identity systems to each other in a way that lets them trade information back and forth about users and systems and then federate that data across the Internet no matter what security infrastructure they are using,' said Steven VanRoekel, Microsoft's director of Web services... IBM and Microsoft officials will be accepting feedback on the specification from across the breadth of the development community and expect to present the completed specification before industry groups deliberating on Web services such as the Web Services Interoperability (WS-I) and others 'in the next several months.' During a keynote at the Burton Group's Catalyst conference in San Francisco on Tuesday, IBM will demonstrate early implementations of interoperability across IBM and Microsoft systems using WS-Federation. Norsworthy said IBM expects to deliver products based on the specification 'towards the end of this year'... 
Microsoft will also show off early versions of the specification at this week's conference and will also deliver initial products that take advantage of the completed specification by the end of 2003, according to VanRoekel..." See details in the news item "Web Services Federation Language Provides Federated Identity Mapping Mechanisms."
[July 08, 2003] "ID Management Sparks Catalyst. Microsoft, IBM Tackle Directories, Privacy." By Brian Fonseca. In InfoWorld (July 07, 2003). "Identity management and the growing importance of directories in that paradigm will underpin new vendor products being unveiled at this week's Burton Group Catalyst Conference in San Francisco. As Microsoft kicks its ID management efforts into high gear, vendors including IBM, Novell, Waveset, Critical Path, Neoteris, and Sun Microsystems will unwrap at the Catalyst show auditing, password management, compliance, and secure identity portal initiatives... At Catalyst, IBM will unveil an enterprise privacy language and toolkits to help developers build privacy into applications to ease ID management, sources said. Big Blue plans to tackle provisioning through its release of IBM Tivoli Identity Manager 4.5 next month. The product will boast enhanced customization features to automate business processes within enterprise environments, sources said. Novell will make noise at Catalyst this week with Novell Nsure Audit -- which provides secure logging, Web-based access control, and auditing -- as the newest addition to the Nsure secure identity management line. An SDK will complement the release to plug-in third-party applications. The Provo, Utah-based company also will release a secure ID management road map and a SAML (Security Assertion Markup Language) extension for iChain. Meanwhile, Waveset will unveil Waveset Lighthouse Directory Master, a cross-platform administration portal for directories. The product offers a single interface for multiple directories to migrate and manage disparate identity data into a consolidated environment. Waveset will also team with Sun at Catalyst to produce an ID management offering for PeopleSoft apps..." See: (1) "Sun and Waveset Provide Identity Management Solution for PeopleSoft Using SPML"; (2) "Novell Releases Roadmap for Secure Identity Management Success. 
Novell Offers New Architectural Guide to Help Customers Understand and Logically Deploy the Essential Building Blocks of Secure Identity Management."
[July 08, 2003] "In the Service of Cooperation." By Kendall Grant Clark. From O'Reilly WebServices.xml.com (July 08, 2003). ['Kendall Grant Clark discusses BPEL4WS and its "competitors", WS-Choreography and DAML-S.'] "When considering what the top of the web services stack might eventually look like, it becomes clear that the higher one goes, the closer one gets to 'the user' or 'the business process' and the further away one gets from the network and the machine. But the idealized web services stack is not only describable in terms of spatializing metaphors (viz., height and depth) but also in terms of stratifying ones (viz., layers and levels)... we are entering a period during which the various layers of the top of the web services stack are starting to be shuffled into place. I date the inauguration of this period from the point at which BPEL (BPEL4WS) was moved into an OASIS TC for formal standardization... The comment most frequently made in the wake of OASIS taking BPEL into its fold was that BPEL had already won the battle in this period of high-level web services shakeout. But that comment is as wrong as it is glib. And it's wrong for a variety of reasons: first, it's not clear that there has to be a single winner, since this part of the web services stack is likely to be as layered and variegated as every other part of the stack; second, it's not clear that the existing efforts are competing directly. If one assumes, as most commentators have so far, that the market is unified and cohesive and that the dominant factor is institutional backing, then BPEL seems like a clear, easy winner. BPEL is a fairly mature specification, with at least two nontrivial implementations (IBM's BPWS4J and Collaxa's Orchestration Server), and its main backers (the IBM-BEA-Microsoft troika) jointly own a market segment not easily distinguished from the entire market itself. 
In other, blunter words: in the area of 'enterprise computing and application integration,' what IBM-BEA-Microsoft want, they very often get... One broadening assumption is that the top of the web services stack is likely to be a variegated space. Under this assumption, it becomes very unlikely that BPEL could 'win' in such a way as to crowd out every other technology. It is unlikely that BPEL could crowd out all others because it is unlikely that any single technology could adequately fill every niche of such a variegated space..." General references in "Business Process Execution Language for Web Services (BPEL4WS)."
[July 08, 2003] "Vox Populi: Web Services From the Grassroots." By Rich Salz. From O'Reilly WebServices.xml.com (July 08, 2003). "Last month, Sam Ruby threw the blogging world into a tizzy when he created a wiki to serve as the home for a new syndication format and protocol. This month we'll take a look at the project -- the working name is 'Necho' but has been 'Echo' and 'Pie' at various times. We'll use it to motivate a look at tradeoffs in XML and web services design. 'Syndication' is the term used when a site makes an RSS ('Really Simple Syndication') document available at a URL... Interest in RSS has been waxing, perhaps because the commercial possibilities are starting to occur to some folks. I doubt it was altruism that made Ruby's boss assign him to this project full-time, for example. The canonical web services example is a stock quote service, and translating that into an RSS feed that reports price updates is an obvious thing to do. The Necho content element has a type attribute to contain the MIME-compatible content-type. This is brilliant, as it allows Necho to smoothly integrate with work on adding attachments to SOAP. It's also multicultural, allowing the xml:lang attribute to specify the language being used. And, finally, multiple content elements act as a MIME multipart/alternative construct, allowing an RSS reader to find the representation it can best support... Is this technically better than RSS? It clearly is better. The ambiguities are gone, the metadata is more precise, the ability to provide rich and accurate content is now provided, and the use of XML is quite clean. Unlike RSS, it's feasible to define a schema for Necho. DTDs, XML Schema, and Relax NG are all in the works. In other words, validation won't require a special-built validator. News aggregators and other RSS consumers -- if they are written as XML applications -- should have an easier job of presenting more information to their users. 
Generating a Necho feed does not look to be that much harder than generating an RSS feed, only requiring the tweaking of a few output statements or templates. Creating a Necho-to-RSS stylesheet in XSLT should be fairly straightforward. So from the technical front, it looks like everyone will win. Is it politically and socially better? The jury is still out. Radical format changes rarely win converts... Just because the full web services machinery (WSDL, WXS, all those WS-xxx specs) rides atop SOAP, that doesn't mean that SOAP itself should be avoided. As we'll see next time, using SOAP as the messaging envelope enables all those features but doesn't require them. And along the way, we'll discover where REST becomes less useful..." General references in "RDF Site Summary (RSS)."
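The content model Salz praises is concrete enough to sketch. In the Python fragment below, the element names follow the article's description (a content element with a MIME type attribute, xml:lang, and multiple siblings acting as alternatives) rather than any published Necho schema:

```python
# A feed entry with alternative content representations, per the article:
# each <content> carries a MIME type and an xml:lang, and multiple siblings
# act like a MIME multipart/alternative construct.
import xml.etree.ElementTree as ET

# ElementTree serializes the reserved XML namespace with the "xml" prefix.
XML_LANG = '{http://www.w3.org/XML/1998/namespace}lang'

entry = ET.Element('entry')
ET.SubElement(entry, 'title').text = 'ACME price update'
for mime, body in [('text/plain', 'ACME at 42.00'),
                   ('text/html', '<b>ACME</b> at 42.00')]:
    content = ET.SubElement(entry, 'content', {'type': mime, XML_LANG: 'en'})
    content.text = body   # the HTML variant is escaped on output

entry_xml = ET.tostring(entry, encoding='unicode')
print(entry_xml)
```

A reader walks the content siblings and picks the first type it can render — exactly the selection rule multipart/alternative mail readers already use.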
[July 08, 2003] "Managing the Web Services Flow. Confluent Core 3.0 Provides Centralized Control of Distributed Components." By Phillip J. Windley. In (July 03, 2003). "One of the chief differences between the decentralized computing model defined by Web services and distributed computing models of the past is the shift in component ownership. In distributed architectures, most of the interacting software components operated in a single trusted domain that was centrally managed. In the new decentralized model, interactions between components span organizational boundaries, making it difficult to manage, configure, monitor, and update the components from a single operations organization. Core 3.0 from Confluent Software is a Web services manager that tackles this problem by providing a single point of configuration for far-flung Web service components... The architecture of Confluent Core is as distributed as the services that it manages. Core works through a set of active intermediaries called 'gateways' or 'agents,' depending on how they are deployed. Gateways are proxy servers that intercept requests, enforce policies, and then forward requests to registered services. For a gateway to serve as the proxy for a service, clients must be directed at the gateway instead of the service. Agents, on the other hand, are policy-enforcement plug-ins that are deployed inside a SOAP container. Consequently, the client of the service is unaware of the message interception that is necessary to enforce policies... Core Analyzer aids in the collection and reporting of such data through the use of rules that reach inside the SOAP message body and apply Boolean expressions to the message parameters. Creating reports on these parameters is fairly easy. Even so, the rules are not a full-blown programming language, so analysis more complicated than filtering on Boolean conditions would require pulling the data out of the tool. 
In the same way that reports can be created based on the information inside the SOAP body, e-mail or SNMP alerts can be sent. Rules for reports and alerts can be triggered manually or scheduled for regular invocation. Message filtering is done offline on log files and thus does not place a load on the production system. Confluent Core provides a scaffolding for building reliable Web-services-based applications..."
[July 08, 2003] "Burton Group Adds Web-Services Group." By Charles Babcock. In InformationWeek (July 08, 2003). "Application Platform Strategies, a new business unit of the Burton Group, will focus on Web services... Web services are being elevated as a cure-all for companies' technology infrastructures. And while they have potential for simplifying some system interactions, they are not a panacea, says [Anne Manes,] the director of the new application-oriented analyst research group... Manes, a former analyst at Patricia Seybold Group, notes that the 'reference architecture,' or application development environment that the Burton Group recommends to a client, will vary, based on what's already present in a client's site. But small businesses' needs are very different from large ones, she adds. 'Most small businesses base themselves on Microsoft's product line,' while larger enterprises have a mix of IBM mainframes, Unix, and Java, along with Windows and the systems they have built themselves. As a result, different companies will work with different technology sets to get the applications and Web services that they need. In the future, developing applications to use Web services will mean giving management extensions to those apps so that they can yield information on their operations. Such extensions would supply data to Web-services management systems from companies such as Actional, Confluent, Flamenco Networks, Service Integrity, Swingtide, and Talking Blocks. In addition, management extensions will give 'visibility' into the message flow and transactions that are taking place. Manes has been a participant in standards development at the Web services directory group UDDI.org (Universal Description, Discovery and Integration); the XML standards group, OASIS; WS-I, the Web Services Interoperability vendor consortium; and Sun Microsystems' Java Community Process. 
The Application Platform Strategies unit is being added to the Burton Group's two existing units: Directory and Security Strategies, and Network and Telecom Strategies. Jamie Lewis, CEO of the Burton Group, notes that the line between networks, directories, and applications is blurring as they become more interdependent. 'The networking people are increasingly in the application layer, doing voice over IP' and other applications, he notes. So research on applications is not necessarily isolated from the Burton Group's findings on networks and directories, he says. The new unit won't attempt to deal with issues related to packaged applications, such as PeopleSoft's or SAP's, Lewis says..." See details in the announcement: "New Burton Group Research Provides Practical Advice to Technologists Implementing Application Platform Standardization. New Application Platform Strategies Research & Advisory Service Focuses on the Evolution of Enterprise Application Architectures."
[July 08, 2003] "Debunking SAML Myths and Misunderstandings. Getting to Know the Security Assertion Markup Language." By Frank Cohen (Founder, PushToTest). From IBM developerWorks, XML zone. July 08, 2003. ['At the beginning of 2003, the OASIS group approved the Security Assertion Markup Language (SAML) specification. With 55 individuals from 25 companies participating, one would think SAML does everything and would be well understood. Instead, misconceptions about SAML exist in the software development community. In this article, Frank Cohen details and debunks many of the myths and misunderstandings surrounding SAML.'] "As a newcomer, the Security Assertion Markup Language (SAML) specification is being compared to existing single-sign-on technology, authentication services, and directory services. SAML is the first of what will likely be many authentication protocols to leverage Web infrastructures, where XML data moves over HTTP protocols on TCP/IP networks. SAML was developed at the OASIS group as an XML-based framework for exchanging security information. SAML is different from other security approaches mostly due to its expression of security in the form of assertions about subjects. Other approaches use a central certificate authority to issue certificates that guarantee secure communication from one point to another within a network. With SAML, any point in the network can assert that it knows the identity of a user or piece of data. It is then up to the receiving application to decide whether it trusts the assertion. Any SAML-compliant software can then assert its authentication of a user or data. This is important for the coming wave of business workflow Web service standards where secured data needs to move through several systems for a transaction to be completely processed... SAML is one of many attempts to reduce the cost of building and operating information systems that interoperate between many service providers. 
In today's competitive, fast-moving environment, federations of enterprises emerge that provide interoperability to the user through a browser and Web-enabled application. For example, a travel Web site allows users to book airline tickets and car rentals without having to sign on more than once. Today, hordes of software developers, QA technicians, and IT managers are required to handle complex and unreliable back-end systems that provide federated security between enterprises. In a typical Web-enabled infrastructure, software running leading-edge enterprise systems needs to handle browser redirects between authorization servers, HTTP post commands between server domains, public key infrastructure (PKI) encryption and digital certificates, and a mutually agreed-upon mechanism that states the trust level for any given user or group. SAML shows software developers how to represent users, identifies what data needs to be transferred, and defines the process for sending and receiving authorization data. ... With many security-focused companies already shipping products, SAML is off to a good start. The SAML specification provides a good framework for designing Web-enabled, single-sign-on services among a group of federated services. Work continues at the SAML specification working group to rationalize the interoperability requirements between SAML and other emerging standards, including WS-Security..."
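The assertion model the article describes can be sketched in a few lines. The element names and namespace below follow the OASIS SAML 1.0/1.1 assertion schema, but the issuer, subject, and timestamps are hypothetical values invented for illustration:

```python
import xml.etree.ElementTree as ET

# A minimal, illustrative SAML 1.1-style authentication assertion.
# The namespace and element names come from the OASIS SAML schema;
# the issuer and subject values are hypothetical.
ASSERTION = """<saml:Assertion
    xmlns:saml="urn:oasis:names:tc:SAML:1.0:assertion"
    AssertionID="_a75adf55" Issuer="https://idp.example.com"
    IssueInstant="2003-07-08T12:00:00Z" MajorVersion="1" MinorVersion="1">
  <saml:AuthenticationStatement
      AuthenticationMethod="urn:oasis:names:tc:SAML:1.0:am:password"
      AuthenticationInstant="2003-07-08T12:00:00Z">
    <saml:Subject>
      <saml:NameIdentifier>alice@example.com</saml:NameIdentifier>
    </saml:Subject>
  </saml:AuthenticationStatement>
</saml:Assertion>"""

NS = {"saml": "urn:oasis:names:tc:SAML:1.0:assertion"}
root = ET.fromstring(ASSERTION)

# The receiving application reads the assertion and then decides for
# itself whether it trusts the asserting party.
subject = root.find(".//saml:NameIdentifier", NS).text
print(subject)  # alice@example.com
```

Note that nothing in the assertion itself forces acceptance; trust in the issuer is the receiving application's decision, which is exactly the point Cohen makes.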
[July 08, 2003] "ID Management Poised for Next Stage." By John Fontana. In Network World (July 07, 2003). "Existing identity management practices and standards in combination with Web services security protocols will provide needed protection to support distributed computing between corporations and their partners. That concept, and the ultimate benefits for corporate users, will be main themes at the annual Burton Group Catalyst Conference, which officials say could host 1,200 attendees this week in San Francisco. The conference also is expected to showcase vendor announcements of a number of identity management products, and OASIS will hold an interoperability test focused on Service Provisioning Markup Language (SPML) and announce ratification of the specification. The Catalyst conference, now on its 10th edition, has been at the forefront in espousing the benefits of directories and most recently the concept of a virtual enterprise network, in which network boundaries between companies are blurred. This year the focus is on identity management as a key to securing and managing the virtual enterprise network. Identity management is defined as a set of business processes and an infrastructure for the creation, maintenance and use of digital identities under strict policies and legal constraints. A milestone in the evolution of the virtual enterprise network concept is coming up on corporate IT executives who believe that digital identities and identity-based security and policies are fundamental for the next era of distributed computing based on Web services. 'It's fair to say we have exploited the existing generation of Web-enabled identity infrastructure about as well as is possible,' says Jamie Lewis, president of Burton Group. That infrastructure consists of directories, Web access management products for single sign-on, provisioning, and delegated and self-service administration. ... 
The need for this broader scope of identity management is being driven by corporate executives who see the value in digital identity as they tune IT systems to comply with recent legislation such as the Sarbanes-Oxley Act, the Gramm-Leach-Bliley Act and the Health Insurance Portability and Accountability Act... The fringes of identity convergence can be seen today between identity management standards and Web services protocols of the future. But they also point to possible fragmentation. The Liberty Alliance, which is developing a federated user-identity framework, now has as its foundation the Security Assertion Markup Language (SAML), an XML-based protocol for exchanging security information. The next step in the process, Lewis says, is to converge current efforts with the Web services security protocol WS-Security. The Liberty/SAML combination already has embraced WS-Security, which is being developed at OASIS. SPML will become another important ingredient in an identity management framework..." See "OASIS Member Companies Host SPML Identity Management Interoperability Event."
[July 07, 2003] "RSS Killed the Infoglut Star. At Last, A Way to Manage the Internet Information Overload." By Chad Dickerson. In InfoWorld (July 03, 2003). "Something profound is happening with the delivery of information online with tools that leverage RSS (Really Simple Syndication or RDF Site Summary, depending on whom you ask)... When I started using an RSS newsreader daily, some remarkable things happened that I didn't necessarily expect: I began to spend almost no time surfing to keep up with current technology information, and I was suddenly able to manage a large body of incoming information with incredible efficiency. My newsreader has become so integral that it's now sitting in my Windows startup folder along with my e-mail client and contact manager. I'm humming 'RSS Killed the Infoglut Star' when I fire up my RSS newsreader in the morning. My enthusiasm for RSS is relatively new. As I write this, the Echo project is developing; regardless of what happens there, I hope the spirit of simplicity behind RSS survives. After working with RSS as a syndication format in past jobs at media companies, I finally jumped in with both feet as an RSS consumer a few months ago, and I've never looked back. On a very simple level, leveraging RSS means getting the information I want when I want it, and even the stuff I'm not interested in can be dispensed in record time. In an age of spam and cold calls, this is just what the information-overload doctor ordered... 
RSS feeds are really just simple XML documents, but this superficial simplicity can make some think that RSS can't possibly be that useful. One description I've heard hits the nail on the head: RSS newsreaders are TiVos for the Web..." General references in "RDF Site Summary (RSS)."
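Those "simple XML documents" really are simple: a newsreader's core loop is little more than fetch, parse, and list the item titles. A minimal sketch, with a hypothetical RSS 2.0 feed inlined in place of a network fetch:

```python
import xml.etree.ElementTree as ET

# A minimal RSS 2.0 feed; the channel and item contents are hypothetical.
FEED = """<rss version="2.0">
  <channel>
    <title>Example Tech News</title>
    <link>http://news.example.com/</link>
    <description>An illustrative feed</description>
    <item><title>RSS Killed the Infoglut Star</title>
          <link>http://news.example.com/1</link></item>
    <item><title>Newsreaders as TiVo for the Web</title>
          <link>http://news.example.com/2</link></item>
  </channel>
</rss>"""

# Parse the feed and list item titles -- the scannable digest that
# replaces surfing each site by hand.
titles = [item.findtext("title") for item in ET.fromstring(FEED).iter("item")]
print(titles)
```

The format's flatness is the point: any tool that can parse XML can aggregate dozens of feeds into one scannable view.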
[July 07, 2003] "Access to Property Records is a Hit." By Dibya Sarkar. In Federal Computer Week (July 07, 2003). "By using Extensible Markup Language (XML) to integrate its legacy database of property records into a new Web-based application, Sacramento County's Clerk-Recorder division is providing a ROSI-er future for its customers. Through e-ROSI (Recorders Online Systems Index), lenders, banks, title companies and others who are trying to confirm the recording of documents can do so via the Internet. Usage of e-ROSI is accelerating. It was launched about a year ago and took about five months to receive its first 1 million hits, and then another three months to get the next 1 million, said Craig Kramer, assistant county clerk-recorder... 'What we have available on e-ROSI right now is the index for all recorded documents from 1965 to current -- somewhere in the neighborhood of 35 million records,' he said... The division provides computerized access to the grantee/grantor index within its office, but employees frequently received calls from financial institutions and others trying to find information. It was particularly difficult when callers didn't know the exact name or document number for the information they needed... 'Our attempt was to be able to put that general index out on the Internet free of charge so they could do their research at their leisure in their office and try to cut down the number of phone calls we would receive in our office,' Kramer said. That was important, because from 2000 to 2003 the volume of recording deeds rose 240 percent. He equated the service to a library locator record where users can search by the name of the grantor or grantee, given the year or decade, the type of document or a specific reference number... According to law, lenders have a limited time to file and confirm the recording of a deed of reconveyance -- a document for a paid-in-full mortgage. 
[Now] they can mail it in and then, within a few days, check online to see if it was recorded. In the past it would have taken longer... Kramer said the county finance department, which houses the Clerk-Recorder division, has subsequently unveiled a system to allow online access to property tax records..."
[July 07, 2003] "Web Services Are Poised To Change Business." By Tony Kontzer. In InformationWeek (July 01, 2003). ['While Web services are the odds-on favorite to help create business agility, IT departments aren't designed to change as quickly as the companies they support.'] "Web services are growing more important, but the business world has yet to take full advantage of their potential, said Chris Thomas, chief E-strategist for the Solutions Market Development Group at Intel. Thomas, who defines Web services simply as applications that interact with each other using Internet standards, said there is a huge opportunity in knowing how to design and use Web services to sell products. But first, he said, some evolution will have to occur. The current Web-services focus is on integration and transactions, but eventually the technology will be used for process automation and collaboration in a trusted computing environment. Thomas pointed to the nonprofit RosettaNet consortium, which has processed $5 billion worth of transactions in its effort to establish open standards for E-business processes, as an example of early Web-services success. He also noted that chemical provider Air Products and Chemicals Inc. has saved $1.5 million by using XML to eliminate one customer touch point in its transaction processes, and that it also has seen a 20% reduction in transactional errors. But Thomas said the real value of Web services will require a shift by IT departments to a model based on occasionally connected computers, or OCCs. He expects increasing mobility to spur a user interface evolution from HTML online forms to XML offline documents that won't require persistent connectivity to complete. Retailers such as Amazon.com Inc. and Lowes Food Stores Inc. already are working on making this happen, said Thomas. 'It's not an if, it's a when, as to whether you'll convert to an asynchronous model for mobile usage'..."
[July 07, 2003] "Grid Computing Edges Into the Mainstream." By Darryl K. Taft. In eWEEK (July 02, 2003). "The Globus Project Tuesday announced the official release of the Globus Toolkit 3.0 (GT3), which melds the worlds of grid computing and Web services and helps bring grid computing into more mainstream, commercial applications. Globus officials said GT3 features the first broad-scale implementation of the Open Grid Services Infrastructure (OGSI) 1.0. The Globus Project was instrumental in defining OGSI, which is part of the Global Grid Forum's Open Grid Services Architecture (OGSA). And while OGSA defines grid services, or Web services targeted to a specific thing, OGSI takes those services and sets forth core pieces of Web services functionality that Web services can use. From this, GT3 uses OGSI to deliver tools for monitoring resources; for Web services discovery, management and security; and for file transfer... Grid computing is the practice of using computers in the way utilities use power grids -- tapping the unused capacity of a vast array of linked systems. Users can then share compute power, databases and services online..." Note from the Globus Project announcement: "Today's official release of the Globus Toolkit 3.0 (GT3) is a milestone in the evolution of Grid computing, which lets people share computing power, databases, and other tools securely online across corporate, institutional, and geographic boundaries without sacrificing local autonomy. GT3 is the first full-scale implementation of the Open Grid Services Infrastructure (OGSI) version 1.0, a new specification that the Globus Project played a key role in defining. Previous versions of the Globus Toolkit have become central to hundreds of science and engineering projects on the Grid, and GT has been adopted for commercial offerings by major information technology companies. 
The toolkit's open-source software and services have transformed the way on-line resources are shared across organizations. OGSI is part of the Open Grid Services Architecture (OGSA) developed through the Global Grid Forum (GGF) to define Grid services, which are Web services that conform to a specific set of conventions. OGSI specifies a set of 'service primitives' that -- rather than stipulating precise services -- instead establish a nucleus of behavior common to all Grid/Web services that can be leveraged by meta- and system-level services. GT3 uses this specification to provide powerful tools for resource monitoring, discovery, management, security and file transfer..." See also the Globus Project website.
[July 07, 2003] "XML Publishing with Cocoon 2, Part 1." By David Cummings and Collin VanDyck. From O'Reilly ONJava.com (July 02, 2003). "Apache Cocoon is essentially an XML-publishing framework. That means that it facilitates the generation, transformation, and serialization (delivery) of XML content... Also, Cocoon is written in Java, giving it many benefits from the cross-platform and code manageability perspectives... Because Cocoon is built around a web context, the flow of control is centered around the familiar request-response web paradigm... Cocoon itself is built upon the Avalon Framework, which, to quote: 'defines relationships between commonly used application components, best-of-practice pattern enforcements, and several lightweight convenience implementations of the generic components.' Avalon makes it very easy to plug in custom components to other applications built upon the Avalon framework... You can think of a Cocoon pipeline as a series of ordered buckets into which certain requests may fall and be processed. A pipeline consists of one or more matchers that basically return true or false, with the true state implying processing by the components that are encapsulated by the matcher. To illustrate, a request is made. This request has a URI. Cocoon loads the sitemap, looks at the pipeline, and starts comparing the URI against each matcher (bucket) from top to bottom. When it reaches a matcher that evaluates to true, flow of execution transfers to the code inside of that matcher XML element... the matchers themselves are components that you can plug in and swap out, if you want to match against something other than the request URI... Cocoon uses SAX (Simple API for XML) transforms to enable this three-step process. Originally, Cocoon was structured around the DOM (Document Object Model) format, which was slower and used more memory. 
SAX is significantly faster and enables us to very easily make subtle changes to the XML during the XML publishing process..."
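The first-match-wins dispatch the authors describe can be modeled in a few lines. The patterns and component chains below are hypothetical; a real Cocoon sitemap expresses them as nested `<map:match pattern="...">` elements in XML rather than as a Python list:

```python
import fnmatch

# Toy model of Cocoon's pipeline dispatch: an ordered list of
# (matcher pattern, processing chain) "buckets". Patterns and
# component names here are hypothetical.
PIPELINE = [
    ("docs/*.pdf",  "generate -> transform(fo.xsl) -> serialize(pdf)"),
    ("docs/*.html", "generate -> transform(html.xsl) -> serialize(html)"),
    ("*",           "reader(static)"),   # catch-all fallback bucket
]

def dispatch(uri):
    """Compare the request URI against each matcher, top to bottom;
    the first matcher that evaluates true wins."""
    for pattern, processing in PIPELINE:
        if fnmatch.fnmatch(uri, pattern):
            return processing
    return None

print(dispatch("docs/guide.html"))
```

Ordering matters exactly as in a real sitemap: putting the catch-all matcher first would shadow every more specific bucket below it.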
[July 07, 2003] "Active Intermediaries Can Halt the Finger-Pointing. XML Data Flows and Active Intermediaries Fundamentally Alter the IT Political Landscape." By Jon Udell. In InfoWorld (July 03, 2003). "Most people agree that two unifying principles make Unix great: Everything looks like a file, including programs and devices, and everything can be pipelined by connecting the output of one thing to the input of another. These principles begat a philosophy: Make simple tools that do specific tasks well, and solve complex problems by assembling the simple tools like beads on a string. This worked well for decades, but now there's a major fly in the ointment. The data passing among the tools is not self-describing. If you use Grep to search a log file for entries on a given date and pipe the results to a report writer that grabs the third and fifth fields of each selected line of the report, there is no way for a filter inserted between the two to know that fields are separated by tabs or that the third field represents a discount and the fifth a total price... From 50,000 feet, it's fair to say that XML Web services do nothing more, and nothing less, than remove that fly from the ointment. The XML documents exchanged among Web services are self-describing. A discount field will be tagged as such. Any observer watching the data flow can pick out, examine, report on, or even modify the discount field with little (if any) special knowledge of the data. That intermediary might be a person running an XML-savvy tool such as InfoPath, or it might be a piece of software that intercepts and transforms the data. The nascent Web services industry has so far focused mainly on the technical implications of these 'active intermediaries.' They do make it vastly easier to integrate systems that pass around packets of self-describing data. But the reasons for this go beyond the regularity of XML data and the ubiquity of tools that can parse, search, and transform it. 
XML data flows fundamentally alter the political landscape of IT, shifting the locus of control away from the service endpoints and into the fabric of the network itself... Active intermediation is the ideal way to monitor and enforce a corporate or federal policy. To make that happen, the services we want to monitor and mediate must be able to speak XML. That's the easy part. The hard part will be trusting the intermediaries. We know how to do it: Use familiar PKI methods to secure channels and documents, and use newer techniques -- XML DSig (digital signature) and XML Encryption, for example -- to secure elements within documents. But the devil is still in the PKI details..."
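Udell's Grep example can be made concrete. In a positional format an intermediary must be told out-of-band which field is which; in XML the record is self-describing. The field layout and values below are hypothetical:

```python
import xml.etree.ElementTree as ET

# Positional data: an intermediary must know, by convention, that the
# third tab-separated field is the discount. Nothing in the data says so.
tab_line = "2003-07-03\tACME\t0.15\t12\t850.00"
discount_positional = tab_line.split("\t")[2]   # fragile: position 3 by agreement

# The same record as self-describing XML: any observer can find the
# discount with no special knowledge of the producer.
record = ET.fromstring(
    "<order date='2003-07-03' customer='ACME'>"
    "<discount>0.15</discount><quantity>12</quantity><total>850.00</total>"
    "</order>")
discount_tagged = record.findtext("discount")

print(discount_positional, discount_tagged)
```

An active intermediary inserted into the XML flow can report on or rewrite `<discount>` without knowing anything else about the document, which is precisely what the tab-separated pipeline cannot support.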
[July 07, 2003] "[Easy Change of Address] One-Stop Shop." By Chee Sing Chan. In ComputerWorld Hong Kong (July 02, 2003). "The HKSAR government's numerous departments make its underlying structure complex and mind-boggling. Even the government admits that. 'Most citizens may not know exactly which department does what,' said Steven Mak, deputy director of Information Technology Services Department (ITSD)... given the government structure, building a 'one-stop shop service' will not merely be an IT project, but will involve many political and organizational issues. The Easy Change of Address (ECOA) service was re-launched in April removing the need for an e-cert, which was required in its first incarnation. The issues overcome by the government in delivering this inter-departmental service indicate the challenges involved... The simple ECOA service allows citizens to update their contact addresses to 12 government departments and two business organizations simultaneously from a single Web site. The response so far seems to be positive. According to Stone, 1,500 transactions were completed within the first month, compared to a monthly average of 100 in the previous service. 'It's pretty encouraging, considering the previous application was not very satisfactory,' he said. Interestingly, technology was not the major hurdle in delivering this service. According to Tony Ma, chief operating officer of ESD Services Ltd, the government and commercial service portal manager, 'the technology involved in the [ECOA] service is relatively simple'. Ma's company created the service interface and the necessary eXtensible Markup Language (XML) messages are sent to each related government body. Upon receiving, 'it's up to each department to handle the XML message,' noted Ma. He added that while some departments automate XML data processing, others continue to update that information within their system manually. 
That is what happens when an XML message reaches the Water Supplies Department. Lorrience Chow, the department's treasury accountant/system, explained the department downloads relevant information from the ESDLife portal daily, and manually updates it into its system... Being one of the first cross-departmental applications, ECOA is just the first step in the government's goal to develop more one-stop services. 'Our goal is to allow transactions and information flow between departments to be as easy as sending e-mail,' Mak stated..."
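The fan-out pattern Ma describes -- one submission, one XML message per recipient body -- might look something like the sketch below. The element names, citizen reference, and department list are entirely hypothetical; the real ECOA message format is not published in the article:

```python
import xml.etree.ElementTree as ET

def build_ecoa_message(citizen_ref, new_address, departments):
    """Build a hypothetical change-of-address message listing the
    departments it should be fanned out to. Each recipient then
    processes the message automatically or manually, as ECOA does."""
    msg = ET.Element("changeOfAddress")
    ET.SubElement(msg, "citizenRef").text = citizen_ref
    ET.SubElement(msg, "newAddress").text = new_address
    for dept in departments:
        ET.SubElement(msg, "recipient").text = dept
    return ET.tostring(msg, encoding="unicode")

xml_msg = build_ecoa_message("HK-0001", "12 Nathan Road, Kowloon",
                             ["WaterSupplies", "InlandRevenue"])
print(xml_msg)
```

The design point the article makes survives even this toy version: the sender's job ends at producing a well-formed message; whether a recipient automates the update or keys it in by hand is invisible to the citizen.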
[July 07, 2003] "Bill Seeks Free Access to Federally Funded Research. Would Remove Copyright Protection from Publicly Funded Material." By Grant Gross. In InfoWorld (July 02, 2003). "Scientific research paid for by the U.S. government would be required to be given free to the public, under a bill introduced in Congress last week. Representative Martin Olav Sabo, a Minnesota Democrat, said he introduced the Public Access to Science Act (PASA) of 2003 because U.S. residents shouldn't have to pay twice -- once with tax dollars and a second time with subscription fees to scientific journals -- for research that improves their health or saves their lives... Sabo's bill, introduced on June 26 and supported by a group called the Public Library of Science, would amend U.S. copyright law so that copyright protection isn't available for research 'substantially funded' by the U.S. government. Without copyright protection, such research would be freely available in the public domain. The U.S. federal government spends about $45 billion a year on scientific and medical research, Sabo noted, and the Internet could allow instant access to the scientific research supported by the U.S. government. 'It defies logic to collectively pay for our medical research, only to privatize its profitability and availability,' Sabo added in his statement. The Public Library of Science, based in San Francisco, California, is planning to launch a new online scientific journal, called PLoS Biology, in October [2003]. The group's goal is to create a new model for scientific publishing that will make all scientific biomedical publications freely available online, according to information on the group's Web site. The group argues that most biological and medical research publications are now available over the Internet, but full access is limited to people who subscribe and can afford to pay the 'often exorbitant subscription fees,' according to the Web site. 
'Journal publishers often pocket excessive profits, while most of the public whose tax dollars went toward funding the research, and most of the scientists who did the research are unable to access the written results of the research'..."
[July 07, 2003] "Self-Enhancing Stylesheets." By Manfred Knobloch. From XML.com (July 02, 2003). "Developing new stylesheets can be a chore. It would be nice if you could tell your stylesheet to trace which tags from the source document are not yet processed by xsl:template elements. And why not make your stylesheet write an xsl:template match skeleton for each unhandled tag? Unfortunately, doing this was too hard with XSLT 1.0. But XSLT 2.0 will change this, and with the help of Saxon 7.5 (or greater) you can try it out now. XSLT gives you two ways of processing XML documents. The first is to directly access parts of the document by XPath expressions. This is what the XSLT 2.0 Working Draft calls pull-processing. The other way is to walk through the document in document order. Letting the document structure drive the processing sequence is called push processing, and this is what the xsl:template match and xsl:apply-templates mechanisms are for. Usually both kinds of processing are mixed in a stylesheet. When one writes a new stylesheet to process an unknown document, coding typically begins with adding xsl:template match rules for the tags... The step-by-step way of writing your templates is not a problem unless you have to work on large or deeply structured documents, containing many different tags... The idea of letting the stylesheet write the names of the unhandled tags into a separate document is an obvious step: we make it write out not just comments, but <xsl:template match...> fragments that match the bypassed tags, and inform us about all of their attributes. And we want to have this code as a stylesheet module that can easily be plugged into any stylesheet we are currently working on. It is very hard to achieve all this with XSLT 1.0. At a minimum you will have to use processor specific extensions. For that reason, the following solution requires an XSLT 2.0 processor... 
[the] solution [developed here] keeps us informed about the tags of an unknown document, especially when there is no DTD available. And it shows some of the promising features of XSLT 2.0..."
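Knobloch implements the idea in XSLT 2.0 itself; the same idea can be sketched in a few lines of Python: compare the set of tags actually present in a document against the set already covered by template rules, and emit an `<xsl:template match="...">` skeleton for each gap. The sample document and handled-tag set below are hypothetical:

```python
import xml.etree.ElementTree as ET

# Hypothetical source document and the set of tags for which template
# rules already exist in the stylesheet under development.
DOC = "<report><title>Q2</title><summary>ok</summary><footnote/></report>"
HANDLED = {"report", "title"}

# Tags seen in the document but not yet matched by any template.
seen = {el.tag for el in ET.fromstring(DOC).iter()}
skeletons = [
    f'<xsl:template match="{tag}">\n'
    f'  <!-- TODO: handle {tag} -->\n'
    f'  <xsl:apply-templates/>\n'
    f'</xsl:template>'
    for tag in sorted(seen - HANDLED)
]
print("\n".join(skeletons))
```

The payoff is the same as in the article: instead of discovering unhandled tags one at a time as output goes missing, you get a ready-to-paste skeleton for every gap at once.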
[July 03, 2003] "BPEL: Make Your Services Flow. Composing Web Services into Business Flow." By Doron Sherman (CTO, Collaxa, Inc). In Web Services Journal Volume 3, Issue 7 (July 2003), pages 16-21. With 5 figures. "The current phase of the evolving Web services stack is orchestration; i.e., coordinating interactions among published Web services and composing them into long-running flows. Orchestration is comprised of three pillars: asynchronous conversations, flow coordination, and exception management. In support of these pillars, and building on its foundation, the Web services stack adds: (1) WS-ReliableMessaging: to guarantee once and only-once delivery of messages; (2) WS-Addressing: to define correlation semantics to properly match requests and replies in the context of asynchronous messaging; compensation semantics for undoing of actions in the case of faults, as commanded by application logic; (3) BPEL4WS: an execution language for defining service composition and coordinating interactions into business flows... Support for asynchrony is essential for enabling 'business quality' Web services that need to take part in integration scenarios. Asynchrony is also mandatory for allowing optimal use of 'business time' (e.g., allowing for user intervention within the course of an executing business flow or deferred batch processing for better distribution of processing load). Asynchrony improves scalability by decoupling requests for service from their corresponding responses, thereby avoiding a cascade of execution bottlenecks from spreading throughout the application. The formalism for asynchronous conversations includes WS-Addressing, WS-ReliableMessaging, and BPEL Service Link. WS-Addressing specifies correlation and callback information. WS-Addressing provides transport-neutral mechanisms for addressing Web services and messages by defining XML elements to identify Web service endpoints and to secure end-to-end endpoint identification in transmitted messages. 
WS-ReliableMessaging allows messages to be delivered reliably between interacting Web services in the presence of software component, system, or network failures. Its primary goal is to create a modular mechanism for reliable message delivery. BPEL Service Link defines the callback interface... Flow coordination is comprised of the WSDL interface, XML variables, partners, flow activities, and compensation handlers. BPEL4WS relies heavily on WSDL descriptions of the services involved in order to refer to exchanged messages, the operations being invoked, and the portTypes they belong to... According to some analysts, nearly 80% of the programming effort in automating business processes is spent in exception management. In BPEL, a local fault handler associated with the Web service is invoked subsequent to the signaled exception. Exception management also includes WS-Coordination and WS-Transaction. These specifications provide cancellation requests across a network of services to ensure coordination of interacting services in case of failures, as well as guaranteeing the integrity of the overall execution... The BPEL server supports orchestration logic provided in BPEL form for execution of business flows. Although not a mandatory requirement, a BPEL server runtime can utilize a J2EE application server for its underlying execution environment (rather than reinventing the wheel with respect to multithreading, database connection pooling, etc.). It also provides native J2EE integration by leveraging the J2EE application server runtime environment. To ensure reliability of long-running business flows involving asynchronous conversations with Web services and loosely-coupled business transactions, it uses context dehydration for executing flows. The dehydration mechanism uses a persistent store, such as a relational database, to safely store, and subsequently retrieve, flow instances... BPEL, a standard process flow grammar, provides a new foundation for integration. 
It empowers developers to tie transactional services, events, and user tasks into easy-to-adapt business flows. This new genre of process-driven applications moves enterprises closer to the realization of the agile real-time enterprise." General references in "Business Process Execution Language for Web Services (BPEL4WS)." [alt URL]
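The asynchronous receive/invoke/callback pattern the article describes has a characteristic shape in BPEL4WS. Below is an illustrative skeleton, parsed to show the activity sequence; the namespace is the published BPEL4WS 1.1 one, but the process, partner link, and operation names are hypothetical:

```python
import xml.etree.ElementTree as ET

# Illustrative BPEL4WS 1.1 process skeleton: receive a request, invoke
# a partner asynchronously, await its callback, then reply. The partner
# link and operation names are invented for this sketch.
NS = "http://schemas.xmlsoap.org/ws/2003/03/business-process/"
BPEL = f"""<process name="loanFlow" xmlns="{NS}">
  <sequence>
    <receive partnerLink="client" operation="initiate" createInstance="yes"/>
    <invoke partnerLink="creditRater" operation="rate"/>
    <receive partnerLink="creditRater" operation="onResult"/>
    <reply partnerLink="client" operation="initiate"/>
  </sequence>
</process>"""

root = ET.fromstring(BPEL)
# List the activities in document order, stripping the namespace prefix.
activities = [el.tag.split("}")[1] for el in root.find(f"{{{NS}}}sequence")]
print(activities)  # ['receive', 'invoke', 'receive', 'reply']
```

The second `<receive>` is the interesting part: between the invoke and the callback the flow may be dehydrated to a persistent store, which is how a BPEL server keeps long-running conversations reliable without holding threads open.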
[July 03, 2003] "Using BPEL: What IT Managers Need to Know." By Derick Townsend (COO, OpenStorm Software), Ryan Cairns (CTO, OpenStorm Software), and Christoph Schittko (Momentum Software). In Web Services Journal Volume 3, Issue 7 (July 2003), pages 26-28. "Web services technology is rapidly evolving to meet the complex needs of the enterprise customer. The ability to integrate and assemble individual Web services into standards-based business processes is an important element of the service-oriented enterprise and the overall Web service technology 'stack.' These loosely coupled business processes, commonly referred to as orchestrated Web services, will be designed, integrated, executed, and managed similar to how proprietary enterprise application integration (EAI) and Business Process Management (BPM) tools operate today. However, business process execution standards and Web services will greatly reduce vendor lock-in to dramatically reduce costs and provide broader interoperability benefits... To address these needs, the Business Process Execution Language for Web services (BPEL4WS or BPEL) has quickly become the dominant specification to standardize integration logic and process automation between Web services... The most important case for BPEL is that proprietary EAI and BPM solutions are just too expensive. They are expensive to develop, maintain, and extend across a diverse, heterogeneous environment. Proprietary integration links are often brittle, and the cost to maintain them as organizations continually evolve is a significant burden. The specialized skills required to support these proprietary solutions often create their own cost and availability concerns. The frequent result is that constrained IT budgets end up shifting the majority of their funds toward maintenance issues, with precious little left over to satisfy the needs for innovation and improved flexibility... 
Within the corporate firewall, BPEL has the potential to standardize application-to-application integration and extend integration to previously isolated systems. As a result of years of proprietary integration efforts, a variety of integration tools and solutions exist in the enterprise today. This remains true in organizations that adopt high-end EAI products, as the cost-benefit analysis of some integration needs cannot justify the use of custom EAI adapters. In contrast, BPEL holds promise as a 'lowest common denominator' integration technology that delivers a ubiquitous, platform-neutral solution for lower cost. Outside the firewall, BPEL can enable a whole new level of corporate agility as it relates to integrating and switching external vendors and services. By using BPEL to define business processes, companies are empowered to select best-of-breed processes and services to incorporate into their operations ... BPEL remains an emerging technology, with challenges awaiting those interested in near-term deployment. Fortunately, the initial vacuum of BPEL-based development tools has been filled. Many software vendors have recognized the considerable market opportunity and responded quickly with solutions. Vendors like IBM, Collaxa, and OpenStorm offer BPEL-compliant orchestration engines, and a variety of design and development tools have been announced by industry leaders such as Microsoft and BEA." General references in "Business Process Execution Language for Web Services (BPEL4WS)." [alt URL]
[July 03, 2003] "Web Services Orchestration and Choreography. A Look at WSCI and BPEL4WS." By Chris Peltz (HP Developer Resources Organization). In Web Services Journal Volume 3, Issue 7 (July 2003), pages 30-35. With 4 figures. "The two standards discussed here -- the Web Service Choreography Interface (WSCI) and Business Process Execution Language for Web Services (BPEL4WS) -- are designed to reduce the inherent complexity of connecting Web services together. Without them, an organization is left to build proprietary business protocols that shortchange true Web services collaboration. Recently, the terms orchestration and choreography have been employed to describe this collaboration: Orchestration refers to an executable business process that may interact with both internal and external Web services. Orchestration describes how Web services can interact at the message level, including the business logic and execution order of the interactions. These interactions may span applications and/or organizations, and result in a long-lived, transactional process. With orchestration, the process is always controlled from the perspective of one of the business parties. Choreography is more collaborative in nature, where each party involved in the process describes the part they play in the interaction. Choreography tracks the sequence of messages that may involve multiple parties and multiple sources. It is associated with the public message exchanges that occur between multiple Web services. Orchestration differs from choreography in that it describes a process flow between services, controlled by a single party. More collaborative in nature, choreography tracks the sequence of messages involving multiple parties, where no one party truly 'owns' the conversation. In this article, I'll highlight key technical requirements for Web services orchestration and choreography, and point out key standards used to meet these needs... 
While BPEL4WS supports the notion of 'abstract processes,' most of its focus is aimed at BPEL4WS executable processes. BPEL4WS takes more of an 'inside-out' perspective, describing an executable process from the perspective of one of the partners. WSCI takes more of a collaborative and choreographed approach, requiring each participant in the message exchange to define a WSCI interface. At the same time, WSCI and BPEL4WS both meet many of the technical requirements outlined earlier. They both provide strong support for persistence and correlation to manage conversations. WSCI and BPEL4WS also describe how exceptions and transactions should be managed. From a usability standpoint, WSCI does have a somewhat 'cleaner' interface than BPEL4WS. Some of the difficulties in using BPEL4WS are attributed to the fact that the language includes artifacts from both XLANG and WSFL, each of which took a different approach to workflow... Orchestration and choreography are terms related to connecting Web services in a collaborative fashion. The capabilities offered by the available standards will be vital for building dynamic, flexible processes. The goal is to provide a set of open, standards-based protocols for designing and executing these interactions involving multiple Web services. Many vendors have announced support for BPEL4WS in their products, and the OASIS technical committee is looking to move this specification forward. WSCI is being considered by the W3C for Web services choreography. While BPEL4WS has defined a notion of choreography through abstract processes, it is still unclear whether this will be accepted over the W3C work. Clearly, market adoption will be driven by the direction taken by vendors and their support of the standards in their product implementations. As these standards take shape, it will be important to pay close attention to the direction taken by standards bodies such as the W3C and OASIS..."
See: (1) "Web Service Choreography Interface (WSCI)"; (2) "Business Process Execution Language for Web Services (BPEL4WS)." [alt URL]
[July 03, 2003] "Twenty-First Century Business Architecture. The Future is Here." By Howard Smith (Europe CTO, Computer Sciences Corporation; Co-chair, Business Process Management Initiative) and Peter Fingar (Executive Partner, the Greystone Group). In Web Services Journal Volume 3, Issue 7 (July 2003), pages 10-14. WSJ Special Issue on BPM; see the issue editorial by Sean Rhody. "While the vision of process management is not new, existing theories and systems have not been able to cope with the reality of business processes -- until now. By placing business processes on center stage, as first class citizens in computing, corporations can gain the capabilities they need to innovate, reenergize performance, and deliver the value today's markets demand. Business process management (BPM) systems discover what you do, and then manage the life cycle of improvement and optimization in a way that translates directly to operation. They see the world in terms of processes using notations and representations business people intuitively understand, and reflect the nature of the way business has to be -- connected, collaborative, asynchronous, coordinated, conversational, and constantly changing... Many in the IT industry perceive BPM only as a better, faster, cheaper way to integrate applications, and this view is exacerbated by the focus on languages used to support Web services orchestration, such as BPEL. For all that is written about such languages you would think that BPM is only about systems interoperability, application integration, and a smart new way to develop more software. This thinking totally misses the point. BPM is about better, faster, cheaper business processes, not better, faster, cheaper technology. BPM technologies provide direct representation of business processes, and then open those processes to complete life-cycle management: from discovery to design, deployment, execution, operations, analysis, and optimization... 
In short, integration technology, however wrapped in process clothing, solves only an integration need. This is not to say that integration products cannot evolve to become BPM products, or that BPM products cannot provide integration, but the distinction needs to be made. What distinguishes BPMS is its central focus on the direct representation and manipulation of business processes, just as RDBMS provides the representation and manipulation of business data and the spreadsheet provides the representation and manipulation of numerical data... Process languages, such as the vendor-sponsored Business Process Execution Language for Web Services (BPEL4WS), will converge and evolve towards the needs of a BPMS with a solid mathematical underpinning. Today, BPEL is primarily advocated for loosely coupled application integration and development, but as the needs for BPM go well beyond Web services and simple workflow requirements, BPEL will require the same theoretical foundation. CIOs will rightly disregard any other simplistic BPM 'layer' as 'yet another point solution' unless BPM systems can be shown to embody a strong formal model of enterprise computing and mobile processes. Only then can BPMS migrate from a niche category to a mainstream platform, similar to what companies already know and understand in other areas of IT support such as relational data management and network management. BPM is far more than another way to develop applications. BPMS is a platform that will support a raft of new processes, tools, and applications. A sales campaign isn't a software application -- it's an application of process management... As CAD/CAM systems enabled computer-integrated and 'just-in-time' manufacturing, BPM can facilitate collaborative 'just-in-time' business processes and a new era of process manufacturing.
Those players in the IT industry that master BPM will share the new wealth with their customers: productivity gains, innovation, and lowered costs like those the industrial design and manufacturing industries have already realized as a result of implementing a direct path from design to execution..." See: (1) "Business Process Modeling Language (BPML)"; (2) "Business Process Execution Language for Web Services (BPEL4WS)." [alt URL]
[July 03, 2003] "Introducing WS-Transaction Part II. Using Business Activities." By Dr. Jim Webber and Dr. Mark Little (Arjuna Technologies Limited). In Web Services Journal Volume 3, Issue 7 (July 2003), pages 6-9. "In July 2002, BEA, IBM, and Microsoft released a trio of specifications designed to support business transactions over Web services. BPEL4WS, WS-Transaction, and WS-Coordination together form the bedrock for reliably choreographing Web services-based applications. In our previous articles we introduced WS-Coordination, a generic coordination framework for Web services, and showed how the WS-Coordination protocol can be augmented to provide atomic transactionality for Web services via the WS-Transaction Atomic Transaction model. This article looks at support for extended transactions across Web services. We also show how these can be used to provide the basis for higher-level business process management and workflow technology... A business activity (BA) is designed specifically for these long-duration interactions, where exclusively locking resources is impossible or impractical. In this model, services are requested to do work, and where those services have the ability to undo any work, they inform the BA so that if the BA later decides to cancel the work (i.e., if the business activity suffers a failure), it can instruct the service to execute its undo behavior. The key point for business activities is that how services do their work and provide compensation mechanisms is not the domain of the WS-Transaction specification, but an implementation decision for the service provider... The business activity model has multiple protocols: BusinessAgreement and BusinessAgreementWithComplete. 
However, unlike the AT protocol, which is driven from the coordinator down to participants, this protocol is driven from the participants upwards...The crux of the BA model, compared to the AT model, is that it allows the participation of services that cannot or will not lock resources for extended periods. While the full ACID semantics are not maintained by a BA, consistency can be maintained through compensation, although writing correct compensating actions (and thus overall system consistency) is delegated to the developers of the services controlled by the BA. Such compensations may use backward error recovery, but typically employ forward recovery... The BPEL4WS specification suggests WS-Transaction Business Activity as the protocol of choice for managing transactions that support the interactions of process instances running within different enterprise systems. A business activity is used both as the means of grouping distributed activities into a single logical unit of work and the dissemination of the outcome of that unit of work -- whether all scopes completed successfully or need to be compensated... The OASIS Business Transactions Protocol (BTP) was developed by a consortium of companies, including Hewlett-Packard, Oracle, and BEA, to tackle a similar problem to WS-Transaction: business-to-business transactions in loosely coupled domains. BTP was designed with loose coupling of services in mind and integration with existing enterprise transaction systems was not a high priority... Although at least in theory WS-Transaction and BTP are intended to address the same problem domain, there are significant differences between them. BTP allows business-level negotiation to occur during many points in the protocol in its Qualifier mechanism; WS-Transaction does not have such a capability. [So] we've seen both the atomic AT protocol and the non-ACID BA designed to support long-running transactions. 
While both the AT and BA models will be available to Web services developers directly through toolkits, it is the BA model that is supported by the BPEL4WS standard to provide distributed transaction support for business processes..." See also: (1) "Introducing WS-Transaction, Part 1. The Basis of the WS-Transaction Protocol."; (2) "Introducing WS-Coordination." [alt URL]
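The compensation idea at the heart of the Business Activity model -- each completed piece of work registers an undo action, and a cancelled activity runs those actions in reverse order -- can be sketched in a few lines. This is an in-process illustration of the concept only (essentially the saga pattern), not the message-based WS-Transaction coordination protocol itself; the class and step names are invented for the example:

```python
# Sketch of compensation-based recovery as used by the Business Activity
# model: completed steps register undo actions, and cancellation runs
# them in reverse order of completion. Conceptual only -- the real
# WS-Transaction protocol coordinates services via messages, not calls.
class BusinessActivity:
    def __init__(self):
        self._compensations = []
        self.log = []

    def do_work(self, name, action, compensate):
        action()                       # perform the forward work
        self.log.append("did:" + name)
        self._compensations.append((name, compensate))

    def cancel(self):
        # Run registered compensations in reverse order of completion
        while self._compensations:
            name, compensate = self._compensations.pop()
            compensate()
            self.log.append("undid:" + name)

ba = BusinessActivity()
ba.do_work("reserve-flight", lambda: None, lambda: None)
ba.do_work("reserve-hotel", lambda: None, lambda: None)
ba.cancel()  # hotel is compensated before flight
```

The key property the article describes survives even in this toy version: how each service performs and undoes its work is opaque to the coordinator, which only tracks what must be compensated and in what order.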
[July 02, 2003] "Microsoft Revamps ID Management Offering. Windows Server Updated." By Brian Fonseca and Paul Roberts. In InfoWorld (July 02, 2003). "Microsoft finally placed a definitive stake in the lucrative identity management market Wednesday by unveiling a revamped version of its Meta Directory product, dubbed Microsoft Identity Integration Server 2003. In conjunction with its new offering, Microsoft also introduced 'Identity and Access Management Solution Accelerator,' a new set of prescriptive guidelines created with PricewaterhouseCoopers LLP to help customers build and test identity management infrastructures. Partnerships with Oblix and OpenNetwork Technologies will help extend Microsoft's reach to cross-platform levels as well. Microsoft Identity Integration Server (MIIS) improves upon the software giant's Meta Directory Server through the addition of new features including automated account provisioning, the synchronization of identity information, and Web-based self-service password management capabilities... For customers desiring a directory service that provides application-specific information to applications developed in-house, Microsoft announced its new Active Directory in Application Mode (ADAM). Stephenson said ADAM will allow customers to deploy Active Directory as an LDAP directory service for application-specific data while using their distributed Active Directory infrastructure for single-sign-on. Also, Windows Server 2003 will include the Identity Integration Feature Pack for Windows Server Directory, and Directory Services Mark-up Language version 2.0 (DSML v.2). The added capabilities will let developers represent directory structural information and directory operations as XML documents..." See details and references in "Microsoft Announces Release of Microsoft Identity Integration Server (MIIS) 2003."
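To illustrate what "directory operations as XML documents" means in practice, here is a sketch of a DSMLv2-style LDAP search request built with Python's standard library. The element names and namespace URI are recalled from the OASIS DSMLv2 specification and should be verified against it; the search base and filter are hypothetical examples:

```python
# Sketch of how DSMLv2 represents an LDAP operation as an XML document:
# a batchRequest wrapping one searchRequest. The namespace URI and
# element names are drawn from memory of the OASIS DSMLv2 spec (verify
# against the spec); the DN and filter shown are invented examples.
import xml.etree.ElementTree as ET

DSML_NS = "urn:oasis:names:tc:DSML:2:0:core"
ET.register_namespace("dsml", DSML_NS)

batch = ET.Element("{%s}batchRequest" % DSML_NS)
search = ET.SubElement(batch, "{%s}searchRequest" % DSML_NS,
                       dn="ou=people,dc=example,dc=com",
                       scope="wholeSubtree",
                       derefAliases="neverDerefAliases")
flt = ET.SubElement(search, "{%s}filter" % DSML_NS)
# (objectclass=*) presence filter, DSML style
ET.SubElement(flt, "{%s}present" % DSML_NS, name="objectclass")

dsml_doc = ET.tostring(batch, encoding="unicode")
```

Because the operation is just XML, it can be validated, transformed, and transported with generic XML tooling rather than an LDAP-specific client library, which is the interoperability point DSML is after.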
[July 02, 2003] "Microsoft Throws Hat Into ID Management Ring." By Dennis Fisher. In eWEEK (July 01, 2003). "Microsoft Corporation is set to announce its new Identity Integration Server 2003, a stand-alone identity management server meant for large enterprise deployments. The new offering is Microsoft's first real jump into the ID management market and comes at a time when a gaggle of vendors are all fighting for sales and name recognition in the emerging space. The key feature of MIIS will be its ability to allow Active Directory to communicate with other LDAP-enabled directories, such as Sun Microsystems Inc.'s Sun ONE Identity Server and Novell Inc.'s eDirectory. MIIS is essentially a major overhaul of what was previously known as Microsoft Metadirectory Services and will use Microsoft's own SQL Server as the database for user information. Until now, Microsoft has relied on the prevalence of Windows to drive support for Active Directory and has shown little interest in tying into other vendors' directories. But customer demand for interoperability between directories has given rise to the need for MIIS. Like most other ID management solutions, MIIS is expected to give administrators a single point from which to manage all of their users' identity information and a way to quickly push changes to a wide variety of applications. Microsoft may also include a version of its TrustBridge technology, which was introduced last year as a way for companies using Active Directory to exchange user identity data... OpenNetwork Technologies will announce that its Universal Identity Platform supports MIIS and will serve as the Web single sign-on solution for the new server. OpenNetwork's platform is built on .Net technology and includes support for Active Directory in Application Mode, a new capability in Windows Server 2003. ADAM, as it's known, allows administrators to store directory information that is only applicable to one application in a local directory store.
This data can be modified without changing the main corporate directory because ADAM runs as an independent service and not as an operating system service... The Universal Identity Platform also includes support for all of the Windows 2003 servers as well as SAML and the new Service Provisioning Markup Language (SPML), company officials said..." See details and references in "Microsoft Announces Release of Microsoft Identity Integration Server (MIIS) 2003."
[July 02, 2003] "Microsoft Overhauls Directories for ID Management." By Kevin Murphy. In Computer Business Review Online (July 02, 2003). "Microsoft Corp is putting a renewed focus on identity management and simplicity in the latest versions of two directory server products, which will be announced today [2003-07-02], according to several Microsoft partners. The company will today start shipping a simplified Active Directory deployment option, as well as a renamed and feature-expanded edition of Microsoft Metadirectory Services (MMS), both designed for Windows Server 2003. Microsoft is expected to change the name of MMS 2003 to Microsoft Identity Integration Server 2003. It will have a broader range of support for non-Microsoft directories, partners said, through a larger collection of connectors... The server will use its own SQL Server as its data store, replacing the Zoomit repository, and will make use of XML standards for the first time. MIIS will support LDAP-compatible directories including those from IBM, Novell, and Sun. MMS, in recognition of the fact that companies have diverse directories, was always designed to synchronize data between these directories. In the new version it will also have what partners characterized as rudimentary user provisioning features. User identity management vendors OpenNetwork, Business Layers Inc and Oblix Inc are among about 10 companies that will express support for MIIS in varying degrees today. These firms have built applications that connect to MIIS for single sign-on and provisioning. Oblix said its software will add onto MIIS to allow SSO and identity management to be scaled to millions of users. The two companies have built such a system to manage seven million users, senior director of technology alliances Beth Dabagian said...
A second announcement from Microsoft today will focus on the delivery of Active Directory Application Mode (AD/AM or ADAM), which is a version of Microsoft's previously problematic directory that has been "decoupled" from Windows. The Active Directory available as part of Windows 2000 came in for criticism as being complex and too restrictive. The software was designed to help companies centralize their user directories, but the work involved in doing so was difficult..." See details and references in "Microsoft Announces Release of Microsoft Identity Integration Server (MIIS) 2003."
[July 01, 2003] "XML Network Management Interface." By Weijing Chen and Keith Allen (SBC Labs). IETF Netconf Working Group, Internet Draft. Reference: 'draft-weijing-netconf-interface-00'. June 2003, expires December 2003. 23 pages. "This document describes an XML network management interface between a network managed system and a network managing system. The XML network management interface is intended for use in diverse network environments where transport and data model requirements vary greatly. It is unlikely that a single transport and data model specification will meet the needs of all anticipated service operators. Therefore, the XML network management interface is partitioned into layered components. The protocol transport is divided into two components: abstract WSDL and concrete WSDL [BEEP, SOAP/HTTP, SSH]. WSDL (Web Service Description Language) is a formal specification language for XML-based Web services. Abstract WSDL describes the message interaction, such as request and response. Concrete WSDL describes the actual transport protocol mappings. The protocol operations are specified in an XML schema, which describes the operations' message constructs and is referenced by the transport WSDL. The capabilities of optional functions are described in a capabilities XML schema, which allows peers to exchange the functionality actually implemented at the other end. The management data models are also described in XML schemas. This partitioning of the interface allows formal verification of interface messages against the interface specification using available general-purpose XML toolkits [see figure]... An interface XML message is validated against the protocol operation XML schema and the specific data model XML schema using a general-purpose XML schema validator. If a system supports XPath and the interface XML message contains an XPath element, the validated XML message is parsed and validated against the live XML instance dataset in the system using an XPath parser.
Thereafter the processed XML message is passed on to the management application for further processing... The protocol operations consist of three basic operations: perform, abort, and notification. Each operation has request and response messages correlated through a unique message id..."
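The request/response correlation the draft describes can be sketched as follows. The element and attribute names ("request", "response", "message-id") are illustrative assumptions rather than names taken verbatim from the draft's schemas; the point is only that every operation request carries a unique id that its response echoes back:

```python
# Sketch of message-id correlation for XML management operations, as
# described (in outline) by the draft. Element/attribute names here
# are illustrative assumptions, not the draft's actual schema names.
import itertools
import xml.etree.ElementTree as ET

_ids = itertools.count(1)  # monotonically increasing message ids

def make_request(operation):
    # Each request carries a unique message id and one operation element
    req = ET.Element("request", {"message-id": str(next(_ids))})
    ET.SubElement(req, operation)  # e.g., perform, abort, notification
    return req

def make_response(request):
    # A response correlates to its request by echoing the message id
    return ET.Element("response",
                      {"message-id": request.get("message-id")})

req = make_request("perform")
resp = make_response(req)
correlated = req.get("message-id") == resp.get("message-id")
```

This correlation is what lets a manager issue several operations over one connection and match asynchronous responses back to the operations that triggered them.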
[July 01, 2003] "Systinet Boosts Secure Web Services. WASP Server 4.6 is Released." By Paul Krill. In InfoWorld (June 26, 2003). "Systinet has announced general availablity of WASP Server for Java 4.6, the company's Web services infrastructure platform, featuring improved security. The product provides a platform for developing, running, and managing Web services applications. The company further describes WASP as a solution for IT professionals and ISVs needing a lightweight, embeddable Web services runtime for Java or J2EE applications. Version 4.6 adds backing for the OASIS (Organization for the Advancement of Structured Information Standards) WS-Security specification and is compatible with all implementations of WS-Security, according to Systinet. WS-Security enables users to protect the contents of a Web services or XML message... Also included in the new version is enhanced support for asynchronous Web services, support for the newly released SOAP (Simple Object Access Protocol) 1.2 specification, and backing for Java Web services APIs, including JAX-RPC (Java API for XML-Based Remote Procedure Calls), JAXM (Java API for XML Messaging), and SAAJ (SOAP with Attachments API for Java)..." See: (1) details in the announcement: "Systinet Announces WASP Server for Java, 4.6 With Full Support For WS-Security. WASP Provides Industry's Best Web Services Security, Performance and Interoperability."; (2) "Web Services Security Specification (WS-Security)."
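Structurally, what WS-Security adds to a SOAP message is a Security header block carrying security tokens. The sketch below shows the shape of a message with a UsernameToken; the SOAP 1.1 envelope namespace is standard, but the wsse namespace URI shown is the 2002 pre-OASIS draft value and the user name is an invented example, so treat both as assumptions:

```python
# Sketch of a SOAP 1.1 message carrying a WS-Security header with a
# UsernameToken. The wsse namespace URI below is the 2002 pre-OASIS
# draft value (an assumption -- check the spec version in use), and
# "alice" is an invented example credential.
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
WSSE_NS = "http://schemas.xmlsoap.org/ws/2002/07/secext"
ET.register_namespace("soap", SOAP_NS)
ET.register_namespace("wsse", WSSE_NS)

envelope = ET.Element("{%s}Envelope" % SOAP_NS)
header = ET.SubElement(envelope, "{%s}Header" % SOAP_NS)
# The Security header block is where WS-Security tokens live
security = ET.SubElement(header, "{%s}Security" % WSSE_NS)
token = ET.SubElement(security, "{%s}UsernameToken" % WSSE_NS)
ET.SubElement(token, "{%s}Username" % WSSE_NS).text = "alice"
ET.SubElement(envelope, "{%s}Body" % SOAP_NS)  # application payload goes here

soap_msg = ET.tostring(envelope, encoding="unicode")
```

Because the security information travels in the message itself rather than in the transport layer, it survives intermediaries and store-and-forward hops, which is the core design point of WS-Security.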
[July 01, 2003] "Software AG Enhances XML Document Storage. Searching Improved in Tamino." By Paul Krill. In InfoWorld (July 01, 2003). "Software AG is announcing Version 4.1.4 of Tamino XML Server, featuring better search tools and an API for the C programming language. Tamino XML Server is a platform for storing, publishing, and exchanging XML documents, Software AG said. Businesses can exchange documents and data among suppliers, customers, and partners. Version 4.1.4 enables metadata searches of non-XML documents via the Tamino Non-XML Indexer, which can search on documents such as those in Microsoft Office or Sun StarOffice... Non-XML Indexer is a plug-in module for Version 4.1.4 that works with Tamino XML Server to extend the set of criteria for searchable metadata, such as author, creation date, date last modified, or file size. This enables faster, more intelligent searches of non-XML documents. Indices can be created for standard document formats such as Microsoft Office. The API for C allows client applications written in C or C++ to access Tamino XML Server without going through a Web server, according to Software AG..." See details in the announcement: "Software AG's Tamino XML Server Offers Enhanced Support for Non-XML Documents. Tamino XML Server Version 4.1.4 Provides Indexer to Better Search Metadata for Microsoft Office and Sun StarOffice Documents."
[July 01, 2003] "XML Query (XQuery) Requirements." Edited by Don Chamberlin (IBM Almaden Research Center), Peter Fankhauser (Infonyte GmbH), Massimo Marchiori (W3C/MIT/University of Venice), and Jonathan Robie (DataDirect). W3C Working Draft 27-June-2003. Latest version URL: http://www.w3.org/TR/xquery-requirements. Produced for the W3C XML Query Working Group. "This document specifies goals, requirements, and usage scenarios for the W3C XML Query (XQuery) data model, algebra, and query language. It also includes, for each requirement, a corresponding status, indicating the current situation of the requirement in the XML Query family of specifications... The usage scenarios describe how XML queries may be used in various environments, and represent a wide range of activities and needs that are representative of the problem space to be addressed. They are intended to be used as design cases during the development of XML Query, and should be reviewed when critical decisions are made..." This version of the Requirements document "fixes a bug in 3.2.1 Query Language Syntax of http://www.w3.org/TR/2003/WD-xquery-requirements-20030502, which repeated one requirement twice and deleted another, and modifies the status section. Otherwise, it is the same as in the previous version: it includes, for each requirement, a corresponding status, indicating the current situation of the requirement in the XML Query family of specifications, at the beginning of the Last Call period for the XQuery 1.0 and XPath 2.0 Data Model and for the XQuery 1.0 and XPath 2.0 Functions and Operators. A future revision will be provided when all remaining open issues have been resolved and when the remaining documents are issued as Last Call working drafts..." General references in "XML and Query Languages."
[July 01, 2003] "Document Object Model (DOM) Level 3 Load and Save Specification Version 1.0." Edited by Johnny Stenback (Netscape) and Andy Heninger (IBM). W3C Working Draft 19-June-2003. Latest version URL: http://www.w3.org/TR/DOM-Level-3-LS. Also available in PDF format. ['The W3C DOM Working Group has released a Last Call Working Draft of the DOM Level 3 Load and Save Specification. Comments are welcome through 31-July-2003'.] "This specification defines the Document Object Model Load and Save Level 3, a platform- and language-neutral interface that allows programs and scripts to dynamically load the content of an XML document into a DOM document and serialize a DOM document into an XML document... W3C's Document Object Model (DOM) is a standard Application Programming Interface (API) to the structure of documents; it aims to make it easy for programmers to access components and to delete, add, or edit their content, attributes and style. In essence, the DOM makes it possible for programmers to write applications which work properly on all browsers and servers and on all platforms. While programmers may need to use different programming languages, they do not need to change their programming model..."
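The load/serialize round trip that the DOM Level 3 Load and Save specification standardizes can be illustrated with Python's xml.dom.minidom, which implements the earlier DOM core levels. Its parseString/toxml pair plays the role of the LS parser and serializer here; it is not an implementation of the Level 3 LS interfaces themselves, and the sample document is invented:

```python
# Illustration of the DOM "load and save" round trip using Python's
# xml.dom.minidom (a DOM core implementation, not DOM Level 3 LS):
# parse serialized XML into a DOM Document, modify it through the DOM
# API, then serialize the tree back to XML.
from xml.dom import minidom

source = '<doc><item id="1">hello</item></doc>'  # example document

# "load": parse serialized XML into a DOM Document object
dom = minidom.parseString(source)
item = dom.getElementsByTagName("item")[0]
text = item.firstChild.data  # read content through the DOM API

# modify the tree, then "save": serialize the DOM back to XML
item.setAttribute("id", "2")
roundtrip = dom.documentElement.toxml()
```

The specification's contribution is to make exactly this pair of operations a standard, language-neutral part of the DOM API, so programs no longer depend on each implementation's ad hoc parsing and serialization entry points.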
Earlier XML Articles
- XML Articles and Papers June 2003
- XML Articles and Papers May 2003
- XML Articles and Papers April 2003
- XML Articles and Papers March 2003
- XML Articles and Papers February 2003
- XML Articles and Papers January 2003
- XML Articles and Papers December 2002
- XML Articles and Papers November 2002
- XML Articles and Papers October 2002
- XML Articles and Papers September 2002
- XML Articles and Papers August 2002
- XML Articles and Papers July 2002
- XML Articles and Papers April - June, 2002
- XML Articles and Papers January - March, 2002
- XML Articles and Papers October - December, 2001
- XML Articles and Papers July - September, 2001
- XML Articles and Papers April - June, 2001
- XML Articles and Papers January - March, 2001
- XML Articles and Papers October - December, 2000
- XML Articles and Papers July - September, 2000
- XML Articles and Papers April - June, 2000
- XML Articles and Papers January - March, 2000
- XML Articles and Papers July-December, 1999
- XML Articles and Papers January-June, 1999
- XML Articles and Papers 1998
- XML Articles and Papers 1996 - 1997
- Introductory and Tutorial Articles on XML
- XML News from the Press