The OASIS Cover Pages: The Online Resource for Markup Language Technologies
Last modified: February 28, 2003
XML Articles and Papers February 2003

XML General Articles and Papers: Surveys, Overviews, Presentations, Introductions, Announcements

February 2003

  • [February 28, 2003] "Requirements for Format for Incident Report Exchange (FINE)." By Yuri Demchenko (NLnet Labs), Hiroyuki Ohno (WIDE Project, Japan), and Glenn Mansfield Keeni (Cyber Solutions Inc.). IETF Network Working Group, Internet Draft. Reference: 'draft-ietf-inch-requirements-00.txt.' Category: Informational. February 2003, expires August 2003. "The purpose of the Format for INcident report Exchange (FINE) is to facilitate the exchange of incident information and statistics among responsible Computer Security Incident Response Teams (CSIRTs) and involved parties for reactionary analysis of current intruder activity and proactive identification of trends that can lead to incident prevention. A common and well-defined format will help in exchanging, retrieving and archiving Incident Reports across organizations, regions and countries... Computer security incidents occur across administrative domains, often spanning different organizations and national borders. Therefore, the exchange of incident information and statistics among involved parties and the responsible Computer Security Incident Response Teams (CSIRTs) is crucial for both reactionary analysis of current intruder activity and proactive identification of trends that can lead to incident prevention. In the following we refer to the information pertaining to an incident as an Incident Report. In practice, an Incident Report created and handled by a CSIRT may have an internal proprietary format that is adapted to the local incident handling procedure and the Incident Handling System (IHS) in use. It is intended that the exchange of Incident information will be conducted in a common format, referred to in this document as the Format for INcident report Exchange (FINE). This document defines the high-level functional requirements for FINE, intended to facilitate collaboration between CSIRTs and parties involved when handling computer security incidents." 
Related references: (1) IETF Incident Object Description and Exchange Format (IODEF); (2) "Incident Object Description and Exchange Format Data Model and Extensible Markup Language (XML) Document Type Definition"; (3) IETF Extended Incident Handling Working Group.
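A common XML format of the kind FINE calls for can be sketched in a few lines. The element and attribute names below are illustrative only, invented for this sketch; they are not the normative IODEF DTD or any wording from the draft.

```python
# Sketch: serializing a minimal incident report as XML, in the spirit of
# the IODEF/FINE work described above. All names here are hypothetical.
import xml.etree.ElementTree as ET

def build_incident_report(incident_id, source_csirt, description):
    """Build a toy incident-report element tree."""
    report = ET.Element("IncidentReport", {"id": incident_id})
    ET.SubElement(report, "ReportingCSIRT").text = source_csirt
    ET.SubElement(report, "Description").text = description
    return report

report = build_incident_report("CSIRT-2003-0042", "Example-CERT",
                               "Port scan followed by SSH brute force")
xml_text = ET.tostring(report, encoding="unicode")
print(xml_text)
```

Because the report is plain XML, any receiving CSIRT can parse it back without knowledge of the sender's internal Incident Handling System.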

  • [February 28, 2003] "Web Services Security, Part 1. Securing Web Services." By Gopalakrishnan U and Rajesh Kumar Ravi (IBM India Software Labs). From IBM developerWorks, Web Services. ['This article introduces various aspects of the Web Services Security Framework and provides step-by-step guidelines on how to write and deploy secure Web services applications using HTTP as the transport vehicle. This article explains three methods to secure Web services applications, namely, HTTP basic authorization, secure socket layer (SSL) over HTTP (HTTPS), and a hybrid of HTTP basic authorization with HTTPS. The second article in this series will provide an introduction to the WS-Security model and how you can use it to secure your Web services applications.'] "e-business relies on the exchange of information between business partners over a network. In such a setup, as the information/data travels from the source to the destination, there is always the risk of the data being stolen or modified. The same security risks are applicable to Web services transactions. In Web services transactions using SOAP, the data are passed between the service invoker and service provider as plain XML, so anyone who intercepts the messages can read the data that are exchanged. In the Web services scenario the security aspect is more complicated because: SOAP messages can be sent using different transport applications or protocols like HTTP, SMTP, etc., which might have anything from a very strong security model to no security model at all. Hence, there is a need for a comprehensive security framework for Web services applications, which is common to all types of transport protocols. There could be legitimate intermediaries that might need to access a part or the whole of the SOAP message, and even modify the message. Thus the security model must be comprehensive enough to allow such intermediaries. 
This article will introduce the various aspects of Web services security and will provide step-by-step guidelines on how to write and deploy secure Web services applications with existing technologies... The security methods discussed provide simple, but effective, solutions to secure your Web services transactions over HTTP. However, a word of caution. As the complexity of Web services transactions increases, these methods may become unsuitable. Consider, for example, that the methods discussed above would be unsuitable if the same SOAP messages were exchanged using SMTP (Simple Mail Transfer Protocol). Similarly, the solutions presented above might not be applicable if there were legitimate intermediaries present. In order to address these issues, a comprehensive WS-Security specification is being developed and standardized. The second article in this series will introduce the WS-Security specification and provide a detailed account of how it can be used to take the security of your Web services applications to the next level..." Related references at "Security, Privacy, and Personalization."
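The "hybrid" method the article describes — HTTP basic authorization carried over HTTPS — is straightforward to prepare in code. The endpoint URL, credentials, and SOAP body below are placeholders invented for this sketch, not a real service contract from the article.

```python
# Sketch: HTTP basic authorization over HTTPS for a SOAP request.
# TLS protects the (merely base64-encoded) credentials in transit.
import base64
import urllib.request

def basic_auth_header(user, password):
    """RFC 2617 Basic credentials: base64 of 'user:password'."""
    token = base64.b64encode(f"{user}:{password}".encode("utf-8")).decode("ascii")
    return "Basic " + token

SOAP_ENVELOPE = """<?xml version="1.0"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body><getQuote symbol="IBM"/></soap:Body>
</soap:Envelope>"""

def make_request(url, user, password):
    """Prepare an HTTPS POST carrying the SOAP envelope and Basic credentials."""
    req = urllib.request.Request(url, data=SOAP_ENVELOPE.encode("utf-8"),
                                 method="POST")
    req.add_header("Content-Type", "text/xml; charset=utf-8")
    req.add_header("Authorization", basic_auth_header(user, password))
    return req  # send with urllib.request.urlopen(req) against a real endpoint

req = make_request("https://example.com/services/StockQuote", "user", "pass")
print(req.get_header("Authorization"))  # Basic dXNlcjpwYXNz
```

As the article cautions, this only secures the transport hop: it says nothing about intermediaries or non-HTTP transports, which is what WS-Security addresses.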

  • [February 28, 2003] "WS-I Announces Board Nominations." By Darryl K. Taft. In eWEEK (February 26, 2003). "The Web Services Interoperability Organization (WS-I) Wednesday announced the nominations for two openings on the organization's board of directors. WS-I officials said representatives from Cape Clear Software Inc., Nokia Corp., SeeBeyond Technology Corp., Sun Microsystems Inc., VeriSign Inc. and webMethods Inc. have been nominated for election to the organization's board of directors. The limited number of nominees means a better possibility for Sun to join the board, where many -- including company insiders -- have, since the inception of WS-I a year ago, said Sun rightfully belongs. Sun cleared the way for joining WS-I in October after much wrangling, finger pointing and murmuring regarding Sun's role as an innovator in the world of Web services. WS-I officials said elections will be held in mid-March, with the results being announced March 28 and the new directors beginning their one- to two-year terms on April 1, 2003. A WS-I spokeswoman explained that the newly elected board members would have tenure of one or two years. The usual term for an elected board member is two years, and founding members such as IBM Corp. and Microsoft Corp. are permanently in position. 'The basic story is that there will be two directors elected,' a WS-I spokeswoman said. 'The one with fewer votes during the March election receives a 12-month term. The one with the most votes receives the standard 24-month term. The usual term for an elected director is 24 months. This start-up aberration is in place so that we can establish a staggered election schedule where one director position is filled each year.' 
The nominated companies and individuals are: Jorgen Thelin, chief scientist at Cape Clear; Juhani Murto, senior manager of Web services architecture at Nokia; Ugo Corda, principal standards analyst at SeeBeyond; Mark Hapner, distinguished engineer and chief Web services strategist at Sun; Sundar Krishnamurthy, a product manager at VeriSign; and Andy Astor, vice president of Enterprise Web Services at webMethods..." "Web Services Interoperability Organization (WS-I)."

  • [February 27, 2003] "Topic Map Constraint Language Requirements." By Graham Moore and Mary Nishikawa. 2003-02-23. ISO/IEC JTC 1/SC34 N0xxx. Work product from ISO/IEC JTC 1/SC34 Information Technology -- Document Description and Processing Languages. Reference posted to the '' mailing list on 2003-02-28 by Mary Nishikawa with Subject "TMCL Requirements Draft for Review." Supersedes the earlier document "Draft requirements, examples, and a 'low bar' proposal for Topic Map Constraint Language (TMCL)" ISO/IEC JTC 1/SC34 N226, 2001-05-24. This is the first public working draft for TMCL Requirements. "The Topic Map Constraint Language (TMCL) will provide a means to express constraints over and above the constraints currently defined by the Standard Application Model (Data Model for Topic Maps). Its goal will be to provide a language such that a topic map can be said to be conforming within a topic map or within a class of topic maps. It may help optimization both of topic map storage and of TMQL queries based on schema information. It may provide more intuitive user interfaces for creating and maintaining topic maps. This document will define the requirements that we expect the language will provide. TMCL is to be developed by ISO/JTC SC34 WG3 and this requirements document is for members of the committee and implementers who have expressed an interest in the development of a Topic Map Constraint Language... TMCL must define a standard way to explicitly indicate how topic map constructs are to be constrained. It should specify the topic characteristics a topic will have and the kinds of structures an association will have. TMCL shall be specified in terms of SAM (Standard Application Model for Topic Maps), a data model, and will not be specified in terms of any serialization format for topic maps. This will automatically allow it to support both XTM & HyTM as well as LTM and AsTMa=, since these all have mappings to SAM..." 
Related references: (1) The Topic Map Constraint Language website; (2) Topic Map Constraint Language mailing list 'tmcl-wg'; (3) "Guide to the Topic Map Standardization"; (4) "ISO SC34 Publishes Draft Reference Model for Topic Maps (RM4TM)"; (5) general references in "(XML) Topic Maps."
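The kind of rule a topic map constraint language states can be made concrete with a toy example: "every topic of type 'person' must carry a 'birthdate' occurrence." The dict-based topic map and the rule encoding below are invented for this sketch; they are not TMCL syntax, which was still being defined at the time.

```python
# Sketch: checking one required-occurrence constraint over a toy topic map.
def check_required_occurrence(topic_map, topic_type, occurrence_type):
    """Return ids of topics of the given type missing the required occurrence."""
    violations = []
    for topic_id, topic in topic_map.items():
        if topic_type in topic.get("types", ()):
            if occurrence_type not in topic.get("occurrences", {}):
                violations.append(topic_id)
    return violations

topic_map = {
    "t1": {"types": ["person"], "occurrences": {"birthdate": "1912-06-23"}},
    "t2": {"types": ["person"], "occurrences": {}},
    "t3": {"types": ["city"], "occurrences": {}},
}
print(check_required_occurrence(topic_map, "person", "birthdate"))  # ['t2']
```

Note that the check operates on a data model (here, dicts standing in for SAM), not on any serialization — mirroring the requirement that TMCL be specified in terms of SAM rather than XTM, HyTM, or another syntax.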

  • [February 27, 2003] "XML Puts Content In Play. Merrill Lynch, Aliant Telecommunications and the U.S. Navy are Among the Innovators Moving Extensible Markup Language from Concept to Reality." By Bill Trippe. In Transform Magazine (March 2003). "XML emerged in the late 1990s as the all-purpose solution to your technology woes. Have a content management problem? XML will solve it. Have an application integration problem? XML to the rescue! Have a legacy system you need to get to the Web? Three guesses as to what will solve your problem, and the first two don't count. In other words, XML has been overhyped. At the same time, it seems to be everywhere. Developers love it, and many now see it as an essential tool in their programming toolkits. Vendors are all over it, and it is core to strategies from the likes of Microsoft, Sun, Oracle and IBM. Moreover, the standards community is replete with XML-based initiatives. Clearly, there is some reality to XML that lies between information nirvana and nothing. Many organizations have actually put XML to work, and have learned valuable lessons on effective approaches and pitfalls to avoid. This article profiles four organizations that are successfully using XML in core business areas. These organizations represent both industry and government, and they're tackling applications ranging from content management to government compliance to billing and reporting. Some of the applications are departmental while others are enterprisewide. While the details of the customer's requirements and solutions differ, they share some characteristics: (1) All of the applications involved customer-facing Web sites or integration with Web-based content and applications; (2) Legacy systems needed to be preserved -- sometimes for the long run and sometimes just so the project could proceed quickly... 
The reality for these organizations was that business problems needed to be solved quickly against a backdrop of heterogeneous platforms, legacy systems and a growing audience of users only a browser away. While XML never solved the whole problem, it always emerged as a key element in the solution -- sometimes as source format, sometimes as a format for interchange. In every case, XML helped solve a practical business problem and wasn't used just because it was the fashionable choice. All the hype around XML is clouding the real success of the technology, which is often practical and powerful, yet, frankly, mundane. As one technologist interviewed for these stories remarked, XML is a means to an end, not the end itself..."

  • [February 27, 2003] "Oracle Adds Standard to Server Software." By Sandeep Junnarkar. In CNET (February 27, 2003). "Oracle on Thursday said its application server software now supports a popular e-business standard, making it easier for companies to conduct business over the Web. The business software company said its Oracle9i Application Server adopts standards defined by RosettaNet, an electronic business standards consortium. The consortium, which has more than 450 members, including Intel, Cisco Systems, Hewlett-Packard, Oracle, Microsoft and IBM, specializes in Web standards for exchanging data over the Internet using Extensible Markup Language (XML). RosettaNet, which was founded in 1998, recently became a subsidiary of the Uniform Code Council, which develops standards for the retail industry. RosettaNet is one of several organizations defining XML and Web-based standards for electronic commerce applications. Others include the Organization for the Advancement of Structured Information Standards, or OASIS, along with the World Wide Web Consortium and the Web Services Interoperability Organization..." See the news story of 2003-02-26: "RosettaNet Software Interoperability Trials Test RNIF Connectivity Software." General references: "RosettaNet."

  • [February 26, 2003] "XML Matters: Kicking back with RELAX NG, Part 1. Doing Better Than the W3C XML Schema." By David Mertz, Ph.D. (Idempotentate, Gnosis Software, Inc). From IBM developerWorks, XML zone. February 2003. ['RELAX NG schemas provide a more powerful, more concise, and semantically more straightforward means of describing classes of valid XML instances than do W3C XML Schemas. The virtue of RELAX NG is that it extends the well-proven semantics of DTDs while allowing orthogonally extensible datatypes and easy composition of related instance models. David Mertz takes a first look at RELAX NG in this, the first installment of a three-part series.'] "I have long been wary of W3C XML Schemas, and to some extent of XML itself. A jumble of companies and groups with divergent interests and backgrounds cobbled together the W3C XML Schema specification by throwing in a little bit of everything each party wanted, creating a typical committee-designed, difficult-to-understand standard. In fact, I have so many reservations that I generally recommend sticking with DTDs for validation needs, and filling any gaps strictly at an application level. About a month ago, however, I started taking a serious look at RELAX NG. Like many readers, I had heard of this alternative schema language previously, but I had assumed that RELAX NG would be pretty much more of the same, with slightly different spellings. How wrong I was. RELAX NG is simply better than either W3C XML Schemas or DTDs in nearly every way! In fact, RELAX NG's ability to support unordered (or semi-ordered) content models answers most of my prior concerns about the mismatch between the semantic models of OOP datatypes and the linearity of XML elements. This article is the first of three XML Matters installments that discuss RELAX NG. This installment will look at the general semantics of RELAX NG, and touch on datatyping. The second installment will look at tools and libraries for working with RELAX NG. 
The final installment will discuss the RELAX NG compact syntax in more detail... The semantics of RELAX NG are enormously straightforward -- in this respect, they are a natural extension of DTD semantics. What a RELAX NG schema describes is patterns that consist of quantifications, orderings, and alternations. In addition, RELAX NG introduces a pattern for unordered collection, which neither DTDs nor W3C XML Schemas support (SGML does, but less flexibly than RELAX NG). Moreover, RELAX NG treats elements and attributes in an almost uniform manner. Element/attribute uniformity corresponds much better with the conceptual space of XML than does the rigid separation in both DTDs and W3C XML Schemas. In actual design, the choice between use of an attribute and an element body is frequently underdetermined by design considerations and/or is contextually sensitive..." Also in PDF format. General references in "RELAX NG."
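The unordered content model the article highlights comes from RELAX NG's interleave pattern, which accepts child elements in any order — something neither DTDs nor W3C XML Schema express directly. The schema fragment below is a plausible illustration of the pattern, not one taken from the article, and the hand-rolled check mimics its semantics for a flat element.

```python
# Sketch: the semantics of RELAX NG <interleave> for a flat content model.
import xml.etree.ElementTree as ET

# Illustrative RELAX NG (XML syntax) fragment: <book> with three children
# allowed in any order. Shown for reference; not executed by this sketch.
INTERLEAVE_PATTERN = """
<element name="book" xmlns="http://relaxng.org/ns/structure/1.0">
  <interleave>
    <element name="title"><text/></element>
    <element name="author"><text/></element>
    <element name="isbn"><text/></element>
  </interleave>
</element>"""

def matches_interleave(xml_text, required):
    """True if the root's children are exactly `required`, in any order."""
    children = [child.tag for child in ET.fromstring(xml_text)]
    return sorted(children) == sorted(required)

required = ["title", "author", "isbn"]
print(matches_interleave(
    "<book><isbn>0596004214</isbn><title>XML</title><author>Ray</author></book>",
    required))  # True
```

A DTD would have to enumerate all orderings of the three children ((title, author, isbn) | (title, isbn, author) | ...), which is exactly the combinatorial awkwardness interleave removes.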

  • [February 26, 2003] "Relax NG." By Eric van der Vlist. Website for the online book. 2002-2003. Work in progress, with substantial content as of 2003-02-26. "Relax NG is a book in progress written by Eric van der Vlist for O'Reilly and submitted to an open review process. The result of this work will be freely available on the World Wide Web under a Free Documentation Licence (FDL). The subject of this book, Relax NG, is an XML schema language developed by the OASIS RELAX NG Technical Committee and recently accepted as Draft International Standard 19757-2 by the Document Description and Processing Languages subcommittee (DSDL) of the ISO/IEC Joint Technical Committee 1 (ISO/IEC JTC 1/SC 34/WG 1)..." See also: (1) "Document Schema Definition Languages (DSDL)"; (2) general references in "XML Schemas." General references in "RELAX NG."

  • [February 26, 2003] "Oracle App Server Certified for RosettaNet. Software Meets Standards for Electronic Sharing of Data." By Joris Evers. In InfoWorld (February 26, 2003). "Oracle's 9i Application Server is the first to meet compatibility standards for electronic sharing of business data as set by the RosettaNet consortium, Oracle said Wednesday. Version 9.0.4 of the 9i Application Server Release 2, scheduled to be out in the second quarter of this year, will offer integrated support for RosettaNet, Oracle said in a statement, confirming what it said at the Oracle World event late last year. Besides Oracle, another ten integration software vendors are testing their products in an interoperability testing program set up by Oracle in cooperation with RosettaNet, Oracle said. The RosettaNet protocol is based on XML (Extensible Markup Language) and is intended to allow companies to electronically link with suppliers and customers without the need to set up EDI (electronic data interchange) links, thereby reducing costs, resources and implementation time..." See the news story of 2003-02-26: "RosettaNet Software Interoperability Trials Test RNIF Connectivity Software." General references: "RosettaNet."

  • [February 26, 2003] "RosettaNet Members Trumpet Interoperability." By Clint Boulton. In (February 26, 2003). "Led by Oracle, members of the RosettaNet consortium have announced the successful completion of software interoperability tests. RosettaNet is a non-profit organization whose goal is to implement standards for supply-chain transactions on the Internet. Supported along with Oracle by such members as Microsoft, IBM and Intel, it hashes out business-to-business (B2B) compatibility standards that many high technology outfits are keen on adopting... Creating interoperability among RosettaNet members helps connect RosettaNet trading partners to reduce costs and deployment time. The news is indicative of the importance software infrastructure concerns are placing on established and emerging standards, which many in the industry believe will help companies get on the same page in terms of interoperability. Oracle scored an industry first Wednesday when its flagship Oracle9i Application Server was certified for implementations... webMethods said it is among the first RosettaNet solution providers and partners to successfully complete the RosettaNet Interoperability and RosettaNet Compliance Badge Programs, the latter of which gives customers an additional layer of confidence that the solution providers are fully compliant according to the published RosettaNet Implementation Framework (RNIF) 2.0 and/or specific Partner Interface Processes. Sterling Commerce has proven its ability to transmit business data and communicate with other companies using the RosettaNet Implementation Framework (RNIF) version 2.0. Tibco completed the RosettaNet Software Interoperability Trials and Vitria's BusinessWare for RosettaNet 3.2 was found to support interoperability requirements as identified in the test plan..." See the news story of 2003-02-26: "RosettaNet Software Interoperability Trials Test RNIF Connectivity Software."

  • [February 26, 2003] "OASIS Takes On Reliability Spec for Web Services." By John Fontana. In Network World (February 26, 2003). "The Organization for the Advancement of Structured Information Standards (OASIS) on Wednesday said it is forming a Web Services Reliable Messaging (WS-RM) technical committee that will develop a specification to guarantee the delivery of messages between applications, especially those executing business transactions. WS-RM will include three types of delivery: guaranteed delivery, which means the message is delivered at least once; duplication elimination, which ensures the message is delivered at most once; and message delivery sequencing, which determines the order in which messages are delivered. Eventually, the specification will be integrated with Web Services Description Language (WSDL), which is used to describe how a Web service operates and would signal that an application has a reliable delivery capability. Besides the original companies that drafted the foundation specification in January, the WS-RM technical committee also includes Commerce One, Cyclone Commerce, IONA, SAP, SeeBeyond, webMethods and WRQ. Not part of the group, however, is BEA Systems, which is preparing to release a proprietary reliable messaging technology for Web services as part of an upgrade to its WebLogic Platform. Also missing initially are powerhouses IBM and Microsoft, which have been major players in crafting Web services standards. IBM has created a similar reliable messaging specification called HTTP-Reliable. HTTP is the transport mechanism for SOAP, and the IBM specification enhances HTTP to ensure that a sender is returned a response of 'undeliverable' if their message did not reach its destination. While the IBM specification is bound to HTTP, the WS-RM proposal allows for the use of any transport mechanism that can be bound to SOAP... 
The WS-RM work will dovetail with other Web services standard work, including OASIS groups working on Web Services-Security and Security Assertion Markup Language, which was recently approved as a standard. The WS-RM committee plans to take input from the World Wide Web Consortium's Web Services Architecture Working Group..." See details in "OASIS Members Form Technical Committee for Web Services Reliable Messaging." General references in "Reliable Messaging."

  • [February 26, 2003] "OASIS Eyes Web Services Messaging. IBM, Microsoft Not Behind Spec Yet." By Paul Krill. In InfoWorld (February 26, 2003). "OASIS will develop a generic model for ensuring reliable message delivery for Web services, but the initiative lacks the support thus far of industry stalwarts IBM and Microsoft... The newly formed OASIS Web Services Reliable Messaging (WS-RM) Technical Committee plans to establish a standard, interoperable way of achieving reliability at the SOAP messaging level and potentially with other messaging protocols. OASIS announced the effort on Wednesday. The WS-Reliability specification, from Sun Microsystems, Oracle, Fujitsu and others, will be submitted as input for the WS-RM Technical Committee and other contributions are welcome, OASIS said. Microsoft and IBM, however, are declining to participate in the OASIS effort for the time being, which raises questions on how successful an industry standards effort can be without their participation. Microsoft and IBM Wednesday released prepared statements about OASIS's announcement. 'Microsoft continues to view the ongoing community work in the area of Web services as important, but has decided not to join this OASIS technical committee,' the company said, declining to elaborate on its reasons. IBM released a prepared statement attributed to Steve Holbrook, program director, IBM emerging e-business standards: 'While IBM is not joining the effort at the beginning, we are confident that the industry will unite around a common standard, and we will be active participants. IBM was influential in many of the constructs of WS-Reliability based on our early work on ebXML. History shows that many companies provide input to important standards, resulting in a functionally-rich specification that satisfies the industry. 
We think that the final standard for message reliability will be based partially on WS-Reliability, and we expect that this standard will evolve to correlate with several other Web services specifications underway.' Sun officials last week challenged IBM and Microsoft to adopt a royalty-free stance on emerging Web services standards. Sun intends to promote this stance as it seeks a seat on the Web Services Interoperability Organization (WS-I) Board of Directors. Sun's Mark Hapner, distinguished engineer and chief Web services strategist, would hold that seat... Interoperability, ease of implementation and ease of use are fundamental goals of the OASIS messaging effort, according to WS-RM Technical Committee chairman Tom Rutt, of Fujitsu, in a prepared statement... The OASIS Reliable Messaging specification will address message persistence, acknowledgement and resending, elimination of duplicate messages, ordered delivery and delivery status awareness for sender and receiver applications. WSDL definitions will be provided for reliable messaging. Message formats will be specified as SOAP headers and/or body content..." See details in "OASIS Members Form Technical Committee for Web Services Reliable Messaging." General references in "Reliable Messaging."
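Two of the delivery guarantees the WS-RM effort targets — duplicate elimination (at-most-once) and ordered delivery — can be sketched at the receiver. In the real specification this state lives in SOAP headers; here a message is reduced to a (sequence number, payload) pair, and the receiver logic is invented for illustration.

```python
# Sketch: a receiver enforcing duplicate elimination and ordered delivery.
class ReliableReceiver:
    def __init__(self):
        self.seen = set()        # sequence numbers already delivered
        self.buffer = {}         # out-of-order messages awaiting delivery
        self.next_expected = 1
        self.delivered = []

    def receive(self, seq, payload):
        if seq in self.seen or seq in self.buffer:
            return  # duplicate: eliminate silently (at-most-once)
        self.buffer[seq] = payload
        # deliver any contiguous run starting at next_expected, in order
        while self.next_expected in self.buffer:
            self.delivered.append(self.buffer.pop(self.next_expected))
            self.seen.add(self.next_expected)
            self.next_expected += 1

rx = ReliableReceiver()
# messages arrive out of order, with one duplicate
for seq, msg in [(2, "b"), (1, "a"), (2, "b"), (4, "d"), (3, "c")]:
    rx.receive(seq, msg)
print(rx.delivered)  # ['a', 'b', 'c', 'd']
```

Guaranteed (at-least-once) delivery, the third guarantee, lives on the sender side: persist each message and resend until the receiver acknowledges it — acknowledgements being exactly the "acknowledgement and resending" behavior the committee's scope statement lists.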

  • [February 26, 2003] "Providing a Value Set For Use in UDDI Version 3." Edited by Claus von Riegen (SAP). With contributions from Tom Bellwood (IBM). TC Technical Note produced by the UDDI Specification Technical Committee. Document identifier: uddi-spec-tc-tn-valuesetprovider-20030212. Abstract: "Through the use of value sets in UDDI registries, businesses are able to find each other and the services that meet their needs. This document provides guidelines for providers of value sets on how to model, register, and validate their value sets for use in UDDI Version 3." Topic: "In UDDI, a value set represents a set of values that can be used to provide meaning or context to a UDDI entity. Category, identifier, and relationship type systems are all value sets. Value sets play an important role within UDDI, because it is through their use that businesses are able to find each other and the services that meet their needs.... This paper guides the providers of value sets in the creation of value set services and in the registration of the value sets and these external value set services, following the recommended policies outlined in Chapter 9 of the UDDI Version 3 Specification..." Note: A 2003-02-24 posting from Tom Bellwood and Luc Clément (Co-chairs, OASIS UDDI Specification TC) reported that this document had been approved as a UDDI Spec TC Technical Note. General references in "Universal Description, Discovery, and Integration (UDDI)."
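In UDDI, a value set is applied to an entity through keyedReference entries in a categoryBag: the tModelKey names the value set and keyValue selects a value within it. The builder below is a toy; the element names follow the UDDI schema, and the tModelKey shown is commonly cited as the well-known key of the NAICS taxonomy in UDDI registries — treat both as illustrative here.

```python
# Sketch: building a UDDI categoryBag that categorizes a business by NAICS code.
import xml.etree.ElementTree as ET

def category_bag(refs):
    """refs: iterable of (tModelKey, keyName, keyValue) tuples."""
    bag = ET.Element("categoryBag")
    for tmodel_key, key_name, key_value in refs:
        ET.SubElement(bag, "keyedReference", {
            "tModelKey": tmodel_key,
            "keyName": key_name,
            "keyValue": key_value,
        })
    return bag

bag = category_bag([
    ("uuid:C0B9FE13-179F-413D-8A5B-5004DB8E5BB2",  # NAICS value set (illustrative)
     "Software Publishers", "511210"),
])
print(ET.tostring(bag, encoding="unicode"))
```

A registry performing checked validation would pass such keyedReference values to the value set provider's validation service — the kind of external value set service the Technical Note describes.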

  • [February 26, 2003] "Rights Management? Or Restriction? Don't Be Fooled. Windows Rights Management Isn't About Safeguarding Your Rights." By Mary Jo Foley. In Ziff Davis Microsoft Watch (February 25, 2003). "If you were to go by the majority of headlines last week, you might be fooled. Press reports crowed: 'Microsoft Boosts Rights Management' - 'Microsoft Adds Rights Management Protection for the Enterprise' - 'Microsoft to Release Document Protection Software.' Windows Rights Management: It's a good thing. Isn't it? If you are a big company or organization with lots of correspondence and documents you want to keep secret, Windows RM is, indeed, a blessing. If you are a whistleblower, a journalist, a lawyer, a cop -- or anyone who has the audacity to want to use software other than Microsoft Windows or Office -- you should be very afraid... To me, RM, first and foremost, is an attempt by Microsoft to further lock customers in by requiring them to use Windows clients, Windows servers, Microsoft Office and Internet Explorer in order to create and consume documents. RM has another benefit, which I am not the first to note: It will eliminate the e-mail and document trails that hurt Microsoft in antitrust court..." See details in the 2003-02-25 news story "Microsoft Announces Windows Rights Management Services (RMS)."

  • [February 25, 2003] "Design XML Schemas Using UML. Translating Business Concepts Into XML Vocabularies." By Ayesha Malik (Senior Consultant, Object Machines). From IBM developerWorks, XML zone. February 2003. ['Unified Modeling Language (UML) is an industry standard that is used in modeling business concepts when building software systems in an object-oriented manner. Recently, XML has gained ground in becoming a key enabler of these systems in terms of transport of information and commands. XML schemas, which are used to define and constrain the nature of XML exchanged, have consequently come into the limelight. This article discusses the use of UML in designing XML schemas and gives a hands-on approach for using the UML framework to create your XML vocabularies.'] "When using the UML framework for constructing XML schemas, you must consider three issues: (1) The complementarities between UML and XML schemas; (2) How to extend UML to capture all the functionalities provided by schemas; (3) The ability to engineer XML schemas from UML diagrams... Many large conglomerates -- such as SWIFT, which provides the electronic infrastructure for trading and settlement for 7,000 financial institutions around the world -- are using UML-to-XML schema conversion to design their XML documents. UML represents the easiest way of modeling business concepts, especially when they are domain-specific. It is natural to want to extrapolate and automate the process so that the transformation is clean and complete. For this purpose, I have discussed the use of XMI and the ability of products such as hyperModel to generate the XML schema from the XMI describing the UML meta model. However, the reader is cautioned to always refine and double-check the validity of the model. Even though the ability to completely map UML to XML schemas has not yet been perfected, UML is a good way to start the modeling of XML schemas in an object-oriented manner. 
If the trend towards creating tools -- both open source and vendor managed -- for automatic generation of XML schemas continues, UML class diagrams might become a standard way of incorporating business concepts into XML vocabulary. As XML becomes intrinsic to all parts of a software system -- from data exchange to Web services messages to description of build scripts -- a clean, concise way of modeling XML schemas becomes imperative. UML is a tried and tested modeling tool for object-oriented systems, and it is attractive for developers, business analysts, and vendors as a medium for designing XML schemas. I believe we will see increasing use of UML as industries and consumers begin to develop their ontologies and services using XML..." Related references: (1) "UML 2.0 Vote Highlights Upcoming OMG Standards Meeting. Orlando, FL, USA, March 24-28, 2003."; (2) "Mapping Between UML and XSD"; (3) "Conceptual Modeling and Markup Languages."
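The core of the UML-to-schema mapping the article discusses can be illustrated in miniature: a UML class becomes a complexType, and its attributes become elements in a sequence. Real tools working from XMI handle far more (associations, inheritance, multiplicities); the mapping, the type table, and the Trade class below are simplifications invented for this sketch.

```python
# Sketch: mapping a UML class (name + typed attributes) to an XSD complexType.
XSD_TYPES = {"String": "xs:string", "Integer": "xs:int", "Date": "xs:date"}

def class_to_complex_type(name, attributes):
    """attributes: list of (attr_name, uml_type) pairs."""
    lines = [f'<xs:complexType name="{name}">', "  <xs:sequence>"]
    for attr_name, uml_type in attributes:
        xsd_type = XSD_TYPES.get(uml_type, "xs:string")  # default fallback
        lines.append(f'    <xs:element name="{attr_name}" type="{xsd_type}"/>')
    lines.extend(["  </xs:sequence>", "</xs:complexType>"])
    return "\n".join(lines)

print(class_to_complex_type("Trade", [("ticker", "String"),
                                      ("quantity", "Integer"),
                                      ("settleDate", "Date")]))
```

Even in this tiny form, the mapping is lossy in both directions (UML has no native notion of attribute-vs-element, XSD has no operations), which is exactly why the article advises double-checking any generated model.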

  • [February 25, 2003] "UDDI Finds a Role After All." By Keith Rodgers. From (February 20, 2003). "Two years after the fanfare of its introduction, UDDI adoption levels remain low. But users are beginning to see UDDI directories filling a practical role: adoption of UDDI is growing among web services pioneers; directory capabilities become more critical as the volume of services increases; but UDDI's origins have left design challenges; most users will acquire UDDI as a component of their web services platform... The latest specification of UDDI, version 3, has moved on from its B2B origins, adding features designed to meet users' needs for private registries. These include, for example, procedures for putting security keys into requests, or for enabling information transfer from one private registry to another. That said, the twists and turns in UDDI's evolution have also influenced its design and left some technical oddities. UDDI defines three registry components, which in layman's terms are akin to the Yellow Pages phone book -- or more precisely, to the trio of white, yellow and green pages. The white pages list companies' contact details and the key services they provide; the yellow pages categorize businesses using agreed taxonomies, including where they operate; and the green pages provide the technical data other companies need in order to take advantage of the services on offer. These three components will become useful to various individuals when organizations start to run between 20 to 50 services, suggests Mukund Balasubramanian, founder and chief technology officer of Infravio. Developers, for example, will require information about services as they're built -- what resources went in, what configurations were used and so forth. System administrators will want to look at the services from the perspective of how they were deployed -- which servers they're running on, for example, and what the loads are. 
The business user taxonomy, meanwhile, will help end users find the services they need... Increasingly, the question of how and when to adopt UDDI will be taken by the vendors rather than their customers. A private UDDI registry is already built into the latest release of IBM's WebSphere web service platform, and other vendors are not far behind. This may save organizations the trouble of getting their heads round what UDDI is or why it's important, but it will leave questions such as managing service quality and the degree of interoperability with other platforms unresolved. Ironically, those were the very same problems that stopped enterprises from eagerly adopting the B2B hubs of the dot-com boom. UDDI may have secured its place in the web services firmament by sidestepping such issues, but customers won't find it so easy to avoid facing up to them..." See "Universal Description, Discovery, and Integration (UDDI)."
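The white/yellow/green pages analogy above can be made concrete with a toy lookup: a taxonomy search against the yellow pages, then a green-pages lookup for the technical binding. This is plain Python data, not the real UDDI SOAP inquiry API, and the business names, taxonomy codes, and URL are invented for illustration.

```python
# Toy model of the three UDDI registry components described above.
registry = {
    "white_pages": {  # contact details and key services offered
        "Acme Corp": {"contact": "info@acme.example", "services": ["quote"]},
    },
    "yellow_pages": {  # businesses categorized by agreed taxonomies
        "finance": ["Acme Corp"],
    },
    "green_pages": {  # technical data needed to invoke a service
        ("Acme Corp", "quote"): {"binding": "http://acme.example/quote?wsdl"},
    },
}

def find_binding(category, service):
    """Yellow-pages lookup by taxonomy, then green-pages lookup for the binding."""
    for business in registry["yellow_pages"].get(category, []):
        entry = registry["green_pages"].get((business, service))
        if entry:
            return entry["binding"]
    return None
```

`find_binding("finance", "quote")` walks exactly the discovery path the article sketches for a consumer that knows only the business category and the service it wants.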

  • [February 25, 2003] "XACML -- A No-Nonsense Developer's Guide." By Vance McCarthy. In Enterprise Developer News (February 24, 2003). "Earlier this month, OASIS adopted XACML 1.0 as its first cut at an open standard to help developers build interoperable access-control security for XML documents and end-to-end transactions. In general, XACML (the Extensible Access Control Markup Language) describes two key areas for security -- an access control policy language and a request/response language for two-way communications. At the root of XACML is a concern with access policies -- what XACML refers to as a Policy or a PolicySet. When XACML refers to 'policy,' it specifically means Authorization (AuthZ) Policy. Each XACML policy document contains exactly one Policy or PolicySet root XML tag. A PolicySet is a container that can hold other Policies or PolicySets, as well as references to policies found in remote locations. A Policy represents a single access-control policy, expressed through a set of Rules. [Said OASIS XACML committee co-chair Hal Lockhart:] XACML defines and describes 'layering' between XML entities to clearly distinguish between security technologies that: (1) Create policy; (2) Collect the data required for policy evaluation; (3) Evaluate policy; and (4) Enforce policy. Why be so granular? One key answer is to enable interoperability for access control approaches, Lockhart said. 'While deployed systems may combine two or more of these into a single entity, the architecture maximizes flexibility. For example, a management tool from one vendor could generate policies that are evaluated by a product from another vendor,' he said. One early reaction from a web services software firm was also bullish on the opportunities for simplicity that XACML might provide. 'The XACML standard specifies how policies for information access can be communicated between systems in an XML format. 
Such a description can allow an application built by a developer to automatically discover the security policies in force to access the resource,' said Mukund Balasubramanian, CTO at Infravio, a provider of web services management and security software in Redwood City, California..." General references in "Extensible Access Control Markup Language (XACML)."
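The document structure described above -- exactly one Policy (or PolicySet) root holding a set of Rules -- can be sketched in a few lines. Element names here follow XACML 1.0 loosely; the namespace declarations and the full Target/Condition machinery are omitted, so this is an illustrative skeleton, not a conformant policy.

```python
import xml.etree.ElementTree as ET

# Skeleton of an XACML-style policy document: one Policy root, one Rule.
# PolicyId/RuleId values are invented; real policies carry namespaces and
# rich Target/Condition elements.
policy = ET.Element("Policy", PolicyId="example:policy:1",
                    RuleCombiningAlgId="first-applicable")
rule = ET.SubElement(policy, "Rule", RuleId="example:rule:1", Effect="Permit")
ET.SubElement(rule, "Target")  # an empty Target means the rule applies broadly

print(ET.tostring(policy, encoding="unicode"))
```

The layering Lockhart describes is visible even in this sketch: the tool that authors this document need not be the engine that later evaluates the Rule's Effect against a request.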

  • [February 25, 2003] "Let the Mobile Games Begin, Part 1. A Comparison of the Philosophies, Approaches, and Features of J2ME and the Upcoming .Net CF. [Wireless Java.]" By Michael Juntao Yuan. In JavaWorld (February 21, 2003). "Java 2 Platform, Micro Edition (J2ME) is by far the most advanced and successful mobile application platform available today. However, with mobile commerce growing into a multibillion-dollar industry, serious competition is on the horizon from Microsoft. Microsoft's latest mobile commerce offering is the .Net Compact Framework (.Net CF). What exactly is .Net CF? How does it measure up to J2ME? As Java developers, what can we learn from it to better compete? In this two-part series, Michael Juntao Yuan presents an objective and comprehensive comparison between the two platforms. If you work in a predominantly Microsoft shop, the .Net CF and Visual Studio .Net tools will definitely help you port enterprise applications to mobile devices. .Net CF leverages the large community of existing Windows developers and helps companies lower development costs. However, if you are in a heterogeneous environment or need a real pervasive solution that works on low-end devices, J2ME is the hands-down winner. In the enterprise world, important J2ME vendors opt for service gateway-based application paradigms, while .Net CF is still too young for any significant third-party mobile middleware to emerge..."

  • [February 25, 2003] "Ixiasoft Boosting XML Content Searching. Link with Microsoft Content Management Server Offered." By Paul Krill. In InfoWorld (February 25, 2003). "Ixiasoft is integrating its XML searching capabilities into Microsoft Content Management Server 2002, Ixiasoft officials are announcing this month. The company also is announcing an upgraded version of its Textml Server content server, featuring WebDAV support. A beta version of the Ixiasoft Integration Kit for Microsoft Content Management Server is available now, with general release anticipated later this month. The integration enables Content Management Server developers and integrators to deploy sites that take advantage of the Ixiasoft Textml Server XML search technology. Textml Server is an XML content server designed to store, index, and retrieve XML content. It is an embeddable XML server for original equipment manufacturers and developers of document-centric XML applications, such as documentation management systems, wireless content publishing, and enterprise portals. The integration kit consists of .Net Composite Controls that can be dragged and dropped from the Visual Studio .Net toolbox into a Content Management Server 2002 site under development. Developers can provide search and sort capabilities on postings based on the content of placeholders, properties, and custom properties, according to Ixiasoft. Ixiasoft this Thursday will ship Version 2.3 of Textml Server, which features XML namespace support. This functionality provides a simple method for qualifying element and attribute names used in XML documents by associating them with namespaces identified by URI references, according to Ixiasoft. Also featured in Version 2.3 are a WebDAV (Web-based Distributed Authoring and Versioning) client and server. WebDAV provides extensions to HTTP to boost the exchange of documents over the Web... 
XMP (Adobe's eXtensible Metadata Platform) also is supported, designed to enable XML metadata to be embedded within application files, such as a PDF file..." See also: (1) "WEBDAV (Extensions for Distributed Authoring and Versioning on the World Wide Web)"; (2) "Extensible Metadata Platform (XMP)."
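The namespace mechanism the Textml Server release note refers to is the standard XML one: an element name is qualified by binding a prefix to a URI reference. A minimal sketch, using Python's standard library (the `http://example.org/doc` URI is invented for illustration):

```python
import xml.etree.ElementTree as ET

# An element name qualified by a namespace: the 'd' prefix is bound to a URI,
# and the URI (not the prefix) is what identifies the name.
xml_text = """<d:article xmlns:d="http://example.org/doc">
  <d:title>Namespaces</d:title>
</d:article>"""

root = ET.fromstring(xml_text)
# ElementTree expands the prefix into Clark notation: {uri}localname
title = root.find("{http://example.org/doc}title")
print(title.text)  # Namespaces
```

Because matching is by URI, two documents can use different prefixes for the same vocabulary and still index and query identically, which is precisely what makes namespace support useful in a content server.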

  • [February 24, 2003] "A Conversation with Adam Bosworth." By Marshall Kirk McKusick (Queue) and Adam Bosworth (Chief Architect and Senior Vice President of Advanced Development, BEA Systems). In ACM Queue Volume 1, Number 1 (March 2003), pages 12-21. ['Bosworth is directly involved in shaping the future of Web Services... to press Bosworth for insights into the possibilities and hazards he sees ahead, Queue asked Marshall Kirk McKusick -- former head of the UC Berkeley Computer Systems Research Group (CSRG) -- to fire off a few questions regarding his greatest Web Services concerns. Besides overseeing the development and release of 4.3BSD and 4.4BSD, McKusick is also renowned for his work on virtual memory systems and fast file system performance.'] Excerpts: "XML gives you a coarse-grain solution that allows for communication efficiency. SOAP and WSDL, meanwhile, give you the loose coupling you need. But that's only true if when you change your implementation, you make sure none of your XML messages are changed, because that's where your public contract is established. Imagine, for example, that I changed my Web site such that it no longer used the HTML format. Let's say it did WML instead. My browser wouldn't end up being such a happy camper if I did that, would it? And in the same way, if I were to change an application such that it didn't do a particular grammar of XML anymore, the other applications I have to communicate with would suddenly get very unhappy. So it's critical that the thing in charge here is the WSDL and not the code... I bring that up simply because a lot of people building so-called Web Service solutions do it just exactly the other way around. They have you build your code and then they auto-generate the XML messages from a description of that code. But, sooner or later, the code is going to change. And when it does, all those auto-generated XML descriptions are going to start breaking things... Another challenge has to do with language. 
We don't have a good language today for dealing with XML and that's a real problem. We have more and more systems that use XML extensively -- as metadata that either describes what to do or simply transmits data from one application to another. The first step in any of these exchanges is called 'binding,' which involves some tricky processing to turn the XML back into data structures the programming languages themselves can understand. So if you send me a purchase order in XML, the first thing I'll have to do is tell you how to turn it into a purchase order object. Now, we've already invested a lot of work into ways of handling that here at BEA and we think we've done a pretty good job. But ideally, you shouldn't have to do any of that at all. What you'd really like is for the language to be able to understand the XML document and extract the necessary information itself. In fact, ideally, the language would do even more than that. Because these messages are self-describing, you should also be able to query your own data structures. If someone sends you an XML document, you may want to query it to find out what things you want. And we don't support that today because languages aren't used to thinking about their own data structures as query-able objects. So I think the changes that are going to be driven by Web Services will result in a major language extension. And that will give us a language that not only understands the idea of self-describing documents but also actually is capable of querying them and treating them as data structures... ideally, from our point of view, we'd like to see this come about as an extension to Java... The biggest change I see is that we're moving away from a data-centric world to a message-centric world. Throughout the 90s, we witnessed the triumph of client/server computing. We also saw a vast number of changes in programming that made it easier to talk to databases. 
So a lot of that was about writing data-centric applications. And that's the classic two-tier model. But now we're moving to an n-tier model. And with an n-tier model, the real problems have to do with exposing too many specifics to systems outside your immediate family. Because when you do that, you break. Over the next 10 years, we're going to move toward communications that are message-oriented, with systems talking to each other through public contracts in asynchronous ways. After that, I think a lot of the changes we'll see will have to do with optimizing that communications scheme. Even today, we have customers asking us to move as many as 500,000 messages a second..."
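The 'binding' step Bosworth describes -- turning an XML purchase order back into a native object -- looks like this in miniature. The `PurchaseOrder` shape and element names are invented for illustration; real binding frameworks generate this plumbing from a schema, which is exactly the boilerplate he argues the language itself should absorb.

```python
import xml.etree.ElementTree as ET
from dataclasses import dataclass

@dataclass
class PurchaseOrder:
    order_id: str
    total: float

def bind(xml_text):
    """Hand-written binding: XML message -> native purchase order object."""
    root = ET.fromstring(xml_text)
    return PurchaseOrder(order_id=root.findtext("orderId"),
                         total=float(root.findtext("total")))

po = bind("<purchaseOrder><orderId>PO-17</orderId><total>99.5</total></purchaseOrder>")
print(po)
```

Note that `bind` is coupled to one grammar of XML: if the sender renames `orderId`, this code breaks, which is why Bosworth insists the XML contract, not the code, must be the stable artifact.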

  • [February 24, 2003] "An Open Web Services Architecture." By Stans Kleijnen (Vice President of Market Development Engineering, Sun Microsystems) and Srikanth Raju (Staff Engineer/Technology Evangelist, Sun Microsystems). In ACM Queue Volume 1, Number 1 (March 2003), pages 38-46. Article subsections: Sun ONE: An Open, Standards-based Web Services Framework; Building Web Services with Sun ONE; Phases in Web Service Adoption; ebXML; Different Approaches to Web Services; Wireless Web Services. "The name of the game is web services -- sophisticated network software designed to bring us what we need, when we need it, through any device we choose. We are getting closer to this ideal, as in recent years the client/server model has evolved into web-based computing, which is now evolving into the web services model... we discuss Sun Microsystems' take on web services, specifically Sun ONE: an open, standards-based web services framework; we share with you Sun's decision-making rationales regarding web services, and discuss directions we are moving in... In phase three, the ultimate phase, customers and business partners will find and conduct business dynamically. To reach this phase, two key pieces of technology are required: federated identity services, such as from the Liberty Alliance, and public service registries designed with standards for real-world B2B, such as ebXML. In this phase, expect the completion of a public version of the UDDI registry server, the specification and availability of federated services, and an ebXML roadmap. The goal of the ebXML standardization effort -- which is driven by the Organization for the Advancement of Structured Information Standards (OASIS) and the United Nations Centre for Trade Facilitation and Electronic Business (UN/CEFACT), as well as over 2,000 members -- is to build an open marketplace framework in which any business regardless of size can participate in a global electronic marketplace. 
Some ask why we need yet more standards when we already have SOAP, WSDL, and UDDI. But these standards do not provide all that's necessary for ad-hoc electronic business transactions. The closest to ebXML from the functionality standpoint is the old Electronic Data Interchange (EDI), which is too heavy for many smaller business organizations. ebXML standardizes business processes such as purchasing, ordering, shipping, and payment, so they can be performed by machines without manual pre-configuration. It also allows business partners to choose quality of service in their message delivery. Secure and highly reliable message delivery is vital in many business transactions, for example in ordering $500 million worth of Korean merchandise, or executing a buy order of 10,000 Sun Microsystems shares. Basic SOAP does not address these security and reliability requirements. The Sun ONE Platform will soon include an ebXML appliance geared toward small businesses..."

  • [February 24, 2003] "Web Services: Promises and Compromises." By Ali Arsanjani, Brent Hailpern, Joanne Martin, and Peri Tarr (IBM). In ACM Queue Volume 1, Number 1 (March 2003), pages 48-58. ['What organizational structures will enterprises need to support web services integration with legacy applications or new business processes that span organizational silos?'] "Web services are the latest software craze: the promise of full-fledged application software that needn't be installed on your local computer, but that allows systems running in different environments to interoperate via XML and other web standards. Much of the hoopla surrounding web services revolves around the nirvana of inter-organizational distributed computing, where supply chains can be integrated across continents with applications built from small parts supplied on demand by various vendors. To get to this place, we need to chisel down current methods and build a component-based architecture of large-grained, message-aware, enterprise-scale, and highly re-configurable enterprise components exposed as web services. The time to start adoption is now, but start within the firewall, inside the enterprise, and work your way outwards. This wisdom will insulate you from the as yet unresolved security, intellectual property exposure, and performance issues associated with exposing web services outside the enterprise. It also gives you ample time to establish standards and best-practices. Over the past two years, we have contributed to several component-based development and integration (CBDi) projects in the telecommunications, mortgage, financial services, government, and banking sectors. 
A key success factor in these projects, one of which we will discuss in this article, involved applying CBDi best-practices across the following five web service domains: (1) Organizational, including project management implications and education programs; (2) Methodology, including extending methods to provide full life cycle support for component-based development; (3) Architectural, including best-practices and issues in creating scalable architectures; (4) Technology implementation, which involves mapping a given design onto a technology standard such as Enterprise JavaBeans or .NET; (5) Infrastructure, including development tools, gateway servers, APIs, middleware, and browsers... The current generation of web service infrastructures and tools has the typical problems of early software. Both XML tagging and text representation cause a data size explosion compared with binary representations. XML data must be parsed when it is read into an application, to create the internal data representations. Further complicating performance is the need to read in and parse the tag definition set. Encryption and decryption also increase overhead. These performance issues will be addressed as the technology matures, but developers today can expect factors of 10 to 100 slowdown compared to conventional distributed computing operations... Enterprise architectures that capitalize on web service capabilities are evolving rapidly to assimilate assets into a dynamic structure of services on demand. New technologies and methods are maturing to achieve acceptable service level characteristics. One of the best ways to implement web services is to start with a component-based architecture of large-grained enterprise components that expose business process level services as web services. Start within the organization rather than exposing them externally. 
As you gain project experience and uncover best practices, get ready to migrate to a full service-oriented architecture that externalizes useful business services..."
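The "data size explosion" the authors mention is easy to demonstrate: the same small record encoded as tagged XML text versus a packed binary struct. The record and field names are invented, and real ratios vary with the data, but the direction of the comparison holds.

```python
import struct
import xml.etree.ElementTree as ET

# One line item encoded two ways: self-describing XML text vs. packed binary.
record = {"id": 1042, "price": 19.99, "qty": 3}

elem = ET.Element("lineItem")
for key, value in record.items():
    ET.SubElement(elem, key).text = str(value)
xml_bytes = ET.tostring(elem)

# little-endian: 4-byte int, 8-byte double, 4-byte int = 16 bytes total
binary_bytes = struct.pack("<idi", record["id"], record["price"], record["qty"])

print(len(xml_bytes), len(binary_bytes))  # the XML form is several times larger
```

On top of the size penalty, the XML form must also be parsed on receipt, which is the second overhead the article calls out.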

  • [February 24, 2003] "The Deliberate Revolution: Transforming Integration With XML Web Services." By Mike Burner (Microsoft). In ACM Queue Volume 1, Number 1 (March 2003), pages 28-37. "The vast investment in Internet infrastructure and telecommunications over the past decade is making the unthinkable eminently achievable. Organizations can now retrieve up-to-the-minute data at run-time from its canonical source, partners and customers. And where applications have traditionally bound functionality together, it now is practical to access application logic at run-time, from hosted services updated dynamically to keep current with evolving business processes. Parties must now agree on how to represent information, on protocols for retrieving and updating data, and on means of demonstrating the privilege to do so. Such are the necessities that gave birth to XML web services. The architecture of these services attempts to bridge myriad Internet systems, to let organizations ranging from small businesses to multinational enterprises to world governments communicate more effectively using programmatic interfaces to data and processes. Web services have generated much excitement, and vendors are scrambling to depict their platforms as the most compliant, mature, secure, or simply the most likely to crank out swell T-shirts. This article attempts to dive beneath the hype, examining how XML web services differ from existing architectures, and how they might help build customer solutions... Over the next few years, we will experience a groundswell of innovation in two key areas of web services technology. The first will be the definition and publication of XML schemas as the lingua franca of inter-component communications across the Internet. XML is the universal grammar upon which this language is being developed. The second innovation will be the definition of SOAP header elements that extend the power of web service messaging... 
As the Internet becomes the backbone for data and application integration, common schema for describing our world and our interactions will unblock the flow of information between organizations, allowing us to communicate with a precision we have never known before. But this shift requires organizations, from small businesses to world governments, to reconsider how data and processes are managed. The software industry, meanwhile, must deliver on technology that allows people to express and manipulate the information that drives our businesses, our societies, and our social interactions. Web services promise to be central to every facet of the transformation..."

  • [February 24, 2003] "Documentum Adds Collaboration Tools." By David Becker. In CNET (February 23, 2003). "Software maker Documentum, a specialist in content management products [announces] the first products to integrate collaboration software the company recently acquired. The new eRoom Enterprise product will combine Documentum's content management tools -- which catalog, manage and reformat business content ranging from text documents to XML (Extensible Markup Language) data -- with the browser-based collaboration tools made by eRoom, a privately held software maker Documentum acquired late last year. The goal is to allow companies to use content management software as more than just a file clerk, keeping track of various types of content and repurposing them for different formats, said Whitney Tidmarch, vice president of product marketing. By providing a browser-based environment to view and manipulate that content, the eRoom component of the new package allows workers inside and outside the company to make better use of the content being managed. 'Our strategy as a whole has been to continue to broaden the types of content we allow companies to store in a universally accessible place,' Tidmarch said. 'Having an online meeting place or online virtual workplace where this kind of casual interaction can take place is something we haven't had, though. The eRoom software really fills in that part of the picture.' Documentum expects the combination of content management and collaboration tools will especially appeal to workers who have to deal with new corporate rules that govern how documents such as print reports and e-mail messages must be preserved, said Naomi Miller, director of product marketing for Documentum. Combining content management and collaboration tools is a growing trend in the industry, said Nick Wilkoff, an analyst with Forrester Research. 
But the combination is most valuable to Documentum as a way to introduce existing eRoom customers to the benefits of content management..." See the announcement: "Documentum Delivers eRoom Enterprise. Best-in-Class Collaboration Technology Fully Integrated with Leading Enterprise Content Management Platform."

  • [February 24, 2003] "Microsoft's Unified App Goal Comes Into View. XML Drives Development of Tools, InfoPath." By Ed Scannell, Mark Jones, and Paul Krill. In InfoWorld (February 24, 2003). "Through its fervent adoption of XML, Microsoft is edging closer to crystallizing its long-held dream of building bridges that foster seamless transport of data between its suite of desktop applications and back-end applications. With the delivery by mid-2003 of its much anticipated and newly named Office 2003 desktop suite, the second beta of which is due in March, the company will have established a vital piece of software that could significantly increase XML adoption across the industry. 'Most vendors are becoming much more XML-friendly and consequently it [XML] is changing the nature of applications vendors' business,' said John Jerome, an analyst with The Yankee Group in Boston. 'If Microsoft continues to lace its Office applications with XML, most users will have a more seamless flow of information between the heavy-duty, back-office financial applications and desktop applications like Word and Excel.' Jerome and other industry observers believe that XML is starting to have a game-changing impact on application development and integration. Many think it will positively influence the fundamental economics of implementing enterprise solutions..."

  • [February 24, 2003] "Standards Emerge From Alphabet Soup." By Renee Boucher Ferguson. In eWEEK (February 24, 2003). Sidebar for Cover Story "Models Link Processes." "An important consideration when deploying BPM software is which standards to follow... There is a growing selection of languages and interfaces out there -- Web Services Choreography Interface, BPML (Business Process Modeling Language), BPEL (Business Process Execution Language), WSFL (Web Services Flow Language) and Xlang -- some of which are backed by competing forces. Once the standards get sorted out, however, these technologies should provide business analysts and software engineers with a view of how business processes perform in business-to-business scenarios. One standard with a lot of momentum is BPEL for Web Services. IBM, Microsoft Corp. and BEA Systems Inc. published BPEL last summer to provide a way to define a process and to define a way a Web service is ordered. IBM, in Armonk, N.Y., has offered two additional standards proposals -- Web Services Transactions and Web Services Coordination. BPEL appears to be the front-runner. The language amounts to the merger of the WSFL and Xlang specifications. Xlang, put forth by Microsoft, reflects the way the Redmond, Wash., company's BizTalk software works, while WSFL focuses on how IBM's MQSeries middleware works. At the same time, two other organizations are putting their combined muscle behind BPML, a metalanguage for the modeling of business processes. They are the Workflow Management Coalition, which counts as members IBM, BEA and Microsoft, as well as Sun Microsystems Inc. and Oracle Corp., and the Business Process Management Initiative..." From "Models Link Processes": "IT managers and line-of-business professionals alike have come to understand that implementing business process management software requires a process modeling step and a process optimization and management step. 
The former consists of breaking down the tasks of a particular business operation -- both conceptually and graphically -- and creating a model that details where processes touch systems, applications and, increasingly, people. BPM is the technical execution of that model. IT organizations have found that you can't have one without the other. While there's been a lot of process modeling going on for years, process management tools that can execute those models have come to light only over the past 18 to 24 months. These new tools are fundamentally different from previous software offerings in that they combine modeling capabilities with real-time management capabilities. This extends the concept of BPM across, and even outside, an enterprise. The latest wrinkle in BPM technology links the execution layer to a process control layer, which gives companies the ability to monitor and measure processes on the fly... According to Gartner, 55 percent of clients polled said using a BPM engine helped them to automate administrative tasks and reduce costs of transactions or a business event. In the same study, 70 percent said BPM improved coordination across departments or geographies, 70 percent said fewer people were needed to perform business tasks, and 85 percent said they reduced the steps in certain processes. Some 85 percent said they experienced quality improvement, fewer errors, higher productivity per person and a reduction in time to market. In setting up a BPM strategy, organizations can choose software from three constituencies -- pure-play BPM vendors, EAI (enterprise application integration) vendors and ERP developers. Although their software runs at the core of many businesses, enterprise software vendors such as Siebel Systems Inc., SAP AG and Oracle Corp. are the last ones to address BPM. Siebel and SAP have each announced integration infrastructures for defining, implementing, managing and monitoring business processes -- and initiating Web services... 
Siebel, of San Mateo, Calif., last fall announced its Universal Application Network initiative, which promises to provide prepackaged, industry-specific business processes. SAP, of Walldorf, Germany, is adding a BPM component to its recently unveiled Enterprise Services Architecture for Web services. Later this year, SAP will ship a product that combines an integration broker for XML-based message exchange and its BPM engine that enables the design, execution and monitoring of business processes..."

  • [February 22, 2003] "Some Rights Reserved: Cyber-Law Activists Devise a Set of Licenses for Sharing Creative Works. [Staking Claims.]" By Gary Stix. In Scientific American Volume 288, Number 3 (March 2003). "On December 16, 2002, the nonprofit Creative Commons opened its digital doors to provide, without charge, a series of licenses that enable a copyrighted work to be shared more easily. The licenses attempt to overcome the inherently restrictive nature of copyright law. Under existing rules, a doodle of a lunchtime companion's face on a paper napkin is copyrighted as soon as the budding artist lifts up the pen. No '©' is needed at the bottom of the napkin. All rights are reserved. The licenses issued through Creative Commons have changed that. They allow the creator of a work to retain the copyright while stipulating merely 'some rights reserved'... A copyright owner can fill out a simple questionnaire posted on the Creative Commons Web site and get an electronic copy of a license. Because a copyright notice (or any modification to one) is optional, no standard method exists for tracking down works to which others can gain access. The Creative Commons license is affixed with electronic tags so that a browser equipped to read a tag -- specified in XML, or Extensible Markup Language -- can find copyrighted items that fall into the various licensing categories. An aspiring photographer who wants her images noticed could permit shots she took of Ground Zero in Manhattan to be used if she is given credit. A graphic artist assembling a digital collage of September 11 pictures could then do a search on both 'Ground Zero' and the Creative Commons tag for an 'attribution only' license, which would let the photographer's images be copied and put up on the Web, as long as her name is mentioned. 
Lessig and the other cyber-activists who started Creative Commons, which operates out of an office on the Stanford campus, found inspiration in the free-software movement and in previous licensing endeavors such as the Electronic Frontier Foundation's open audio license. The organization is receiving $850,000 from the Center for the Public Domain and $1.2 million over three years from the John D. and Catherine T. MacArthur Foundation... Some legal pundits will question whether an idea that downplays the profit motive will ever be widely embraced. Creative Commons, however, could help ensure that the Internet remains more than a shopping mall..." General references in "Creative Commons Project."

  • [February 22, 2003] "Common Alerting Protocol - Alert Message Data Dictionary." Produced by the Common Alerting Protocol (CAP) Technical Working Group. Version 0.7. Draft 2/22/2003. Version 0.7 represents a minor update from version 0.6, incorporating experience from several prototype implementations and field trials, and insights obtained in discussions among the Working Group. "The Common Alerting Protocol is a draft specification of open, nonproprietary, standards-based data formats for the exchange of emergency alerts and related information among emergency agencies and public systems. The CAP will be designed to facilitate the collection and relay of all types of hazard warnings and reports. Development and deployment of a standard such as CAP will yield important benefits for public safety: (1) Warnings to the public will be better coordinated across the wide range of available warning and notification systems. (2) Workload on warning issuers will be reduced, since a single warning message will be compatible with all kinds of warning delivery systems. (3) Overall 'situational awareness' will be enhanced, since CAP will permit the aggregation of all kinds of warning messages from all sources for comparison and pattern recognition. The Common Alerting Protocol has been under development since 2001 through the efforts of an international ad-hoc Working Group of technical and public safety experts." A posting of February 22, 2003 from Art Botterell to the CAP mailing list: "I'm hoping this version will offer a point of departure for the OASIS Emergency Management Technical Committee standards process in the near future..." See also the CAP Working Group working documents. Related references: (1) "Common Alerting Protocol (CAP) Provides XML Interchange Format for Public Safety Reports"; (2) "XML and Emergency Management." [cache]
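The single-format benefit described above is easy to see in code: one parser can feed many delivery systems. The sketch below consumes a minimal CAP-style alert; the element names are illustrative stand-ins only, not the normative v0.7 field set, which is defined by the data dictionary itself.

```python
import xml.etree.ElementTree as ET

# Hypothetical CAP-style alert. Element names (alert, sender, sent, info,
# event, urgency) follow the general shape of CAP drafts but are invented
# for illustration; consult the v0.7 data dictionary for the real fields.
SAMPLE_ALERT = """\
<alert>
  <sender>ksc@example.gov</sender>
  <sent>2003-02-22T16:49:00-05:00</sent>
  <info>
    <event>Severe Thunderstorm</event>
    <urgency>Immediate</urgency>
    <headline>Severe thunderstorm warning for Example County</headline>
  </info>
</alert>
"""

def summarize_alert(xml_text):
    """Extract the fields a warning aggregator might index for
    cross-source comparison and pattern recognition."""
    root = ET.fromstring(xml_text)
    return {
        "sender": root.findtext("sender"),
        "event": root.findtext("info/event"),
        "urgency": root.findtext("info/urgency"),
    }

print(summarize_alert(SAMPLE_ALERT))
```

Because every warning source emits the same structure, the aggregation scenario in benefit (3) reduces to running one such summarizer over a stream of documents.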

  • [February 22, 2003] "Web Services Orchestration: A Review Of Emerging Technologies, Tools, and Standards." By Chris Peltz (Hewlett Packard, Developer Resource Organization). Technical Paper from HP Dev Resource Central (January 2003). 20 pages. "Web services technologies are beginning to emerge as a de facto standard for integrating disparate applications and systems using open, XML-based standards. In addition to building web services interfaces to existing applications, there must also be a standard approach to connecting these web services together to form more meaningful business processes. In 2002, a number of new standards were introduced to address this problem, including BPEL4WS and WSCI. The purpose of this paper is to provide a review of these emerging standards, to help the reader better understand how web services orchestration can be accomplished today. BPEL4WS primarily focuses on the creation of executable business processes, while WSCI is concerned with the public message exchanges between web services. WSCI takes more of a collaborative and choreographed approach, requiring each participant in the message exchange to define a WSCI interface. BPEL takes more of an 'inside-out' perspective, describing an executable process from the perspective of one of the partners. BPML has some complementary components to BPEL4WS, both providing capabilities to define a business process. WSCI is now considered a part of BPML, with WSCI defining the interactions between the services and BPML defining the business processes behind each service. Orchestration, choreography, business process management, and workflow -- these are all terms related to connecting web services together in a collaborative fashion. The capabilities offered by web services orchestration will be vital for building dynamic, flexible processes. The goal is to provide a set of open, standards-based protocols for designing and executing these interactions involving multiple web services. 
To accomplish this, there are some basic requirements that have to be met. There is a need for asynchronous support in order to build reliability into the process. Strong transactional semantics and exception handling are required to manage both internal and external errors. There is also a need for a set of programming constructs to describe workflow. Finally, there has to be a way to link or correlate requests together to build higher-level conversations... It is unclear where the industry is going with the various standards. There is a fair amount of traction behind BPEL4WS from major players in the industry, and WSCI and BPML have already converged with each other... The ability to support a conversational model between web services is an important area that must be addressed by the web services standards. A conversational model for web services provides a more loosely coupled, peer-to-peer interaction model. These conversations involve multiple steps between parties, often involving negotiation between the parties. A peer-to-peer conversational model takes more of a third-person perspective, quite different from the standards presented in this paper. Even WSCI, which offers a somewhat collaborative model between web services, still takes a first-person perspective for any given WSCI document. [Hanson, Nandi, and Levine: 'Conversation-enabled Web Services for Agents and e-Business'] offers a good analogy to illustrate the difference between the business process standards and the conversational model for web services. The current web services model is analogous to a vending machine. There are a set number of buttons that can be pressed in a pre-defined order. A conversational model is more analogous to a telephone call, involving a series of exchanges between the parties at each end in a more flexible, dynamic fashion. At this time, IBM's Conversation Support for Web Services (CS-WS) is the only standard that claims to support this capability...

  • [February 22, 2003] "Exploring XML in Office 11. XML Capabilities in Store for Word and Excel Pack a Learning Curve." By Jon Udell. In InfoWorld (February 21, 2003). "Most business information lives in documents, not in databases. With the new XML features in Office 11, IT can start to bring database-like discipline to the creation and querying of those documents. For developers, schematization of business documents, such as resumes and expense reports, will be a long and gradual process. But Excel's new ability to read in and analyze XML data -- from XML-aware databases, Web services, and other sources -- will be immediately useful... This year's upcoming debut of Microsoft Office 11 will mark the start of a long process of education and adaptation... Here we explore how existing Office documents can benefit from the new features, how developers will prepare XML-aware Office templates, and how users will apply them to create and analyze XML data... After Office 11 ships, we face a classic chicken-and-egg scenario. Developers can't really learn the art of modeling data in business documents without user feedback. But users can't provide that feedback until they start actually working with XML-enriched documents. Office 11's XML support isn't a final solution. Rather, it allows for a long, difficult, and absolutely vital bootstrapping process. ... Nothing else in the Office suite will have anything like Excel's analytic prowess. Excel 11's newfound ability to absorb arbitrary schema-governed XML data, coupled with the explosion of XML data coming from everywhere -- Web services, XML-aware databases, the rest of the Office suite, and other emerging XML applications -- makes it more valuable. If you start with a raw XML file -- just data, no schema -- Excel will read the data and make a best-effort map to the grid. In the resulting worksheet, that data is immediately available for editing, sorting, charting, pivot-table analysis, and more. 
Of course when the data comes from a Web service, as it increasingly will, it is likely to be schematized. In that case, your options multiply. Once you associate a schema with the XML data, you can select elements shown in the XML Structure task pane. Under the covers, Excel creates the XPath queries that address those elements within the nested structure of the document. By dragging a set of selected elements to the worksheet, you create an XML data range that can absorb data from one or more XML files conforming to the schema..."
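Excel's mapping behavior can be approximated outside Excel, since the XPath queries it builds under the covers are ordinary path expressions over the instance document. A minimal sketch, using invented expense-report data and element names:

```python
import xml.etree.ElementTree as ET

# Illustrative schematized data; the vocabulary here is invented.
DATA = """\
<expenses>
  <item><date>2003-02-10</date><amount>42.50</amount></item>
  <item><date>2003-02-11</date><amount>17.00</amount></item>
</expenses>
"""

# Dragging <amount> from the XML Structure pane onto the grid causes Excel
# to generate a path query much like this one; here we evaluate an
# equivalent expression directly to produce the mapped column of values.
root = ET.fromstring(DATA)
amounts = [float(e.text) for e in root.findall("./item/amount")]

print(amounts)       # the XML data range, as a column
print(sum(amounts))  # the kind of analysis the worksheet then enables
```

The same data range can then absorb additional XML files conforming to the schema, which is what makes the mapping reusable rather than a one-off import.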

  • [February 22, 2003] "The .NET Classes StringInfo and TextElementEnumerator. These Two Members of the System.Globalization Namespace Handle Details of Unicode String Traversal." By Bill Hall. In MultiLingual Computing and Technology #54, Volume 14, Issue 2 (March 2003), pages 52-56. "Programmers tend to become nervous when dealing with character encodings where a grapheme does not correspond exactly to a fixed width data element of storage... Fortunately, in today's object-oriented programming languages, it is possible to encapsulate methods and internal information into an object that handles Unicode string traversal. The programmer simply sets the starting position. From there, methods are available to move to the next character boundary and collect the one or more Unicode elements into an object representing the grapheme. No a priori knowledge about the nature of the elements and their relationship to other elements is required by the programmer. The class takes care of such details. In Microsoft .NET, the intricacies of string traversal are handled by the StringInfo and TextElementEnumerator classes. Both are members of the System.Globalization namespace. Although not yet implemented in Java, work is ongoing to develop comparable support since both .NET and Java have to support Unicode 3.2 for aesthetic as well as practical reasons. For example, the People's Republic of China now requires that software be able to manage all characters found in GB 18030-2000, of which some 40,000+ have Unicode equivalents that are surrogate pairs and thus require two Unicode elements per grapheme to encode... So, what are the issues involved in encoding graphemes in Unicode? The main ones are composite representations and surrogate pairs... .NET programmers should note that processing a Unicode string by moving from one Char element to the next may not be a suitable way of examining text if grapheme boundaries must be respected. 
For this purpose, use a combination of methods from the StringInfo and TextElementEnumerator classes..."
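The grapheme-versus-code-element distinction the article describes can be sketched in a few lines. This rough analogue of TextElementEnumerator groups each base character with any trailing combining marks; a real implementation would also handle surrogate pairs (which Python 3 strings hide by storing whole code points) and the other cases covered by Unicode's grapheme-cluster rules.

```python
import unicodedata

def text_elements(s):
    """Group each base character with the combining marks that follow
    it -- a simplified sketch of .NET's TextElementEnumerator."""
    elements = []
    for ch in s:
        # combining() is nonzero for combining marks, which attach to
        # the preceding base character rather than starting a new element.
        if elements and unicodedata.combining(ch):
            elements[-1] += ch
        else:
            elements.append(ch)
    return elements

# 'e' followed by U+0301 COMBINING ACUTE ACCENT: two code points, one grapheme.
s = "cafe\u0301"
print(len(s))                 # 5 code points
print(len(text_elements(s)))  # 4 graphemes
```

Iterating per code point, as the article warns, would split the accented letter into two pieces; iterating per text element keeps it whole.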

  • [February 22, 2003] "XACML Standard Controls Web Services Access." By Patricia Daukantas. In Government Computer News (February 21, 2003). "The Organization for the Advancement of Structured Information Standards this week approved the Extensible Access Control Markup Language (XACML) specification for Web services and documents. All previous policy languages were in proprietary formats, said Carlisle Adams, co-chairman of the OASIS XACML technical committee. XACML will be transportable between systems. Adams, principal security architect for Entrust Inc. of Addison, Texas, said the committee started working on the new language specification nearly two years ago. 'XACML cooperates with another recently approved OASIS standard, Security Assertion Markup Language, to create an authentication architecture for Web services. SAML defines a syntax for expressing assertions, such as a computer user's job title or security clearance, Adams said. A rules engine with policy statements written in XACML could compare the SAML assertions with its policies to determine whether the user should see confidential information'... See the announcement and related news item "Sun Microsystems Releases Open Source XACML Implementation for Access Control and Security."
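The policy-evaluation pattern Adams describes can be reduced to a toy sketch: attribute assertions (the kind SAML conveys) checked against a policy's requirements. Real XACML policies are XML documents with targets, rules, and combining algorithms; the dictionaries here are illustrative stand-ins.

```python
# Hypothetical policy protecting one resource: every required attribute
# must be asserted with the required value for access to be permitted.
POLICY = {
    "resource": "quarterly-financials",
    "require": {"clearance": "confidential", "department": "finance"},
}

def permit(assertions, policy):
    """Deny-unless-match evaluation: permit only when all required
    attributes are present and equal."""
    return all(assertions.get(k) == v for k, v in policy["require"].items())

alice = {"clearance": "confidential", "department": "finance"}
bob = {"clearance": "public", "department": "finance"}

print(permit(alice, POLICY))  # True
print(permit(bob, POLICY))    # False
```

What XACML standardizes is precisely this decision step in a transportable, non-proprietary form, so the policy and the assertions can come from different systems.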

  • [February 22, 2003] "Altio Makes Front-End Integration Smarter." By James R. Borck (Infoworld Test Center). In InfoWorld (February 21, 2003). ['AltioLive 3.0 hits new heights for Web-based applications built on Web services and data-centric resources.'] "EAI has a reputation as a necessary evil for the costly task of integrating years of legacy applications and valuable data. As enterprises seek to expose these old resources via new technology, including Web services, the rift between functionality and the limitations imposed by current browser-based technologies is ever more obvious. Facing the challenge head-on is Altio with its release of the AltioLive 3.0 platform... AltioLive delivers a means of integrating enterprise resources into a single interface that can be flexibly customized and extended. The solution's cost is well below that of traditional EAI, and it's more interactive than straight portal solutions. Easily implemented, the platform and IDE reduce the technical expertise required to extend Web services and backend resources and reduce deployment concerns... This release of AltioLive includes an overhauled IDE, adds offline resynchronization and real-time services management, and shores up compatibility with Web services and XML standards. Although we would prefer to see client availability for mobile devices and better integration for XML-based transactional security, Altio hits the mark, scoring our highest rating of Deploy. The AltioLive production environment comprises the Presentation Server, which is a middleware servlet platform that manages XML-based communication streams, and a fat client applet running in a Web browser on the end-user's system. The fat client is an interface-rendering engine. It uses XSL to format raw XML data from the Presentation Server for use in the browser application. 
Altio provides an in-memory database, using a customized DOM and XPath interpreter, that allows data to be shared and reused locally among applications without constantly repolling the server... Altio received a profile boost in January thanks to an alliance formed with Sun Microsystems. The Sun ONE Portlet Server will begin bundling the AltioLivePortlet Edition development environment, boosting Sun's platform with added real-time, cross-application functionality and adding the benefit of bandwidth conservation. In the end, we found AltioLive 3.0 to represent the strongest overall offering available in the rich Internet application space today. It offers great opportunity for bridging user interfaces with the worlds of EAI and Web services that should not be overlooked..."

  • [February 21, 2003] "XML Advanced Electronic Signatures (XAdES)." Edited by Juan Carlos Cruellas (UPC), Gregor Karlinger (IAIK), and Krishna Sankar (Cisco). W3C Note 20-February-2003. "XAdES extends the IETF/W3C XML-Signature Syntax and Processing specification into the domain of non-repudiation by defining XML formats for advanced electronic signatures that remain valid over long periods and are compliant with the European 'Directive 1999/93/EC of the European Parliament and of the Council of 13 December 1999 on a Community framework for electronic signatures' [EU-DIR-ESIG] (also denoted as 'the Directive' or the 'European Directive' in the rest of the present document) and incorporate additional useful information in common use cases. This includes evidence as to its validity even if the signer or verifying party later attempts to deny (repudiates) the validity of the signature. An advanced electronic signature aligned with the present document can, in consequence, be used for arbitration in case of a dispute between the signer and verifier, which may occur at some later time, even years later. This note adds six additional forms to XMLDSIG: (1) XML Advanced Electronic Signature (XAdES): Provides basic authentication and integrity protection and satisfies the legal requirements for advanced electronic signatures as defined in the European Directive but does not provide non-repudiation of its existence; (2) XML Advanced Electronic Signature with Time-Stamp (XAdES-T): Includes time-stamp to provide protection against repudiation; (3) XML Advanced Electronic Signature with complete validation data (XAdES-C): Includes references to the set of data supporting the validation of the electronic signature (i.e., the references to the certification path and its associated revocation status information). This form is useful for those situations where such information is archived by an external source, like a trusted service provider. 
(4) XML Advanced Electronic Signature with eXtended validation data (XAdES-X): Includes time-stamp on the references to the validation data or on the ds:Signature element and the aforementioned validation data. This time-stamp counters the risk that any keys used in the certificate chain or in the revocation status information may be compromised. (5) XML Advanced Electronic Signature with eXtended validation data incorporated for the long term (XAdES-X-L): Includes the validation data for those situations where the validation data are not stored elsewhere for the long term. (6) XML Advanced Electronic Signature with archiving validation data (XAdES-A): It includes additional time-stamps for archiving signatures in a way that they are protected if the cryptographic data become weak..." See "XML Digital Signature (Signed XML - IETF/W3C)."

  • [February 21, 2003] "Microsoft Rolls Out Rights Management Software." By Peter Galli and Mary Jo Foley. In eWEEK (February 21, 2003). "Microsoft Corp. on Friday finally unveiled its plans to integrate digital rights management technology across its entire product line. The Redmond, Wash., company announced Windows Rights Management Services (WRMS), a new technology for Windows Server 2003 that will help secure sensitive internal business information including financial reports and confidential planning documents. An early alpha version of WRMS will be available to select testers next week, with a broad beta of the product, formerly code-named Tungsten, being released next quarter. The product will work with applications to provide a platform-based approach to providing persistent policy rights for Web content and sensitive corporate documents of all types... DRM technology enables content creators, such as record companies, to encrypt content and define who can decrypt it and how they can use it. Microsoft is counting on increasing adoption of the technology to help drive demand for many of its current and future products... Microsoft currently offers a DRM system, Microsoft Windows Media Rights Manager, which is being used by seven music and video subscription services. There has been much speculation about the future of that technology when the updated DRM server was announced. Mike Nash, who is the corporate vice president of Microsoft's security business unit and is spearheading the DRM strategy for all of its product lines, said Friday that WRMS does not share any code in common with the DRM platform that Microsoft currently includes in its Windows Media Series products. Instead, WRMS will rely on XrML (Extensible Rights Markup Language), an emerging standard for the expression of rights on digital content... 
When the second beta of Office 2003 ships in early March, Microsoft will also make available to testers new APIs in the Office suite that will turn on the Information Rights Management's DRM features that are being included in that product. Finally, later this spring, Microsoft is planning to make its WRMS software development kit available to third-party software vendors, corporate developers and systems integrators. Nash noted that Microsoft's existing Windows Media DRM and its newly introduced WRMS are just two components of this strategy. Microsoft and third-party software vendors who sign on to use WRMS are likely to add the technology to other enterprise and consumer products in the future..." See details in the 2003-02-25 news story "Microsoft Announces Windows Rights Management Services (RMS)." Related references: (1) "XML and Digital Rights Management (DRM)"; (2) "OASIS Rights Language"; (3) "Extensible Rights Markup Language (XrML)".

  • [February 21, 2003] "Microsoft Details Rights Management Tech. Server Add-On Enforces Protection Policies." By Stacy Cowley and Joris Evers. In InfoWorld (February 21, 2003). "Microsoft said Friday it is developing add-on security technology for its forthcoming Windows Server 2003 operating system software that will allow organizations to implement rights-management protections on corporate documents such as e-mail messages and data files. The Windows Rights Management Services (RMS) will be able to enforce protection policies by controlling which users can access specific content and what access rights they are granted. Companies will, for example, be able to restrict content copying, forwarding and printing in applications such as portal, e-mail and word-processing software. However, only users of Microsoft's most recent products will be able to fully take advantage of the technology. RMS relies on the proposed XrML (Extensible Rights Markup Language) standard, an XML-based (Extensible Markup Language) language that is heavily backed by Microsoft but has yet to attract broad industry support. While Office 11, Microsoft's Office update scheduled for mid-2003, supports XrML and will work with RMS, older versions of Microsoft Office won't work with the technology, including the currently available Office XP..." See references in previous bibliographic entry and in the 2003-02-25 news story "Microsoft Announces Windows Rights Management Services (RMS)."

  • [February 21, 2003] "Mapping Between UML and XSD." By David Carlson (Ontogenics Corp). From XMLmodeling News Volume One, Issue Two (January 28, 2003). "One of the principal advantages of using UML when designing XML vocabularies is that the model can serve as a specification which is independent of a particular schema language implementation. W3C XML Schema is the most common choice right now, but we hope that business vocabularies (and other non-business technical markup languages) have a long life and will be implemented using alternative new schema languages. To achieve this goal, we need to define a complete and flexible mapping between UML and each implementation language. Given that UML was originally intended for object-oriented analysis and design, the mapping is most straightforward for languages that have an object-oriented flavor... A bi-directional mapping between UML and schemas is specified in the form of a UML Profile. The purpose of a UML profile for this or any other use is to extend the UML modeling language with constructs unique to an implementation language, analysis method, or application domain. The profile extension mechanism is part of the UML standard; it was expanded in the recent UML version 1.4 and will be further expanded when UML 2.0 is adopted this year. A UML profile (pre version 2.0) is composed of three constructs: stereotypes, tagged value properties, and constraints. A stereotype defines a specialized kind of UML element; for example, the XSDcomplexType stereotype defines a specialized kind of UML Class, and XSDschema defines a specialized kind of UML package. Tagged values define properties of these stereotyped elements. So the XSDschema stereotype includes a targetNamespace property. By assigning this stereotype to a UML package and setting a value for this property, we have augmented the UML modeling language with information used to generate a complete XML Schema document from an abstract vocabulary model. 
Similar stereotypes and properties are defined for all XML Schema constructs. A profile constraint specifies rules about how and where stereotypes and their tagged values can be used in a model. These rules should include what are often called co-constraints: how the value of one property constrains the values of other properties..." Related references in: (1) "XML Schemas"; (2) "Conceptual Modeling and Markup Languages."
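The stereotype-to-schema mapping just described can be sketched as a small generator: a package carrying the XSDschema stereotype's targetNamespace tagged value, containing classes stereotyped XSDcomplexType, produces a schema document. All model and namespace names below are invented for illustration.

```python
def generate_schema(package):
    """Emit a W3C XML Schema document from a dictionary standing in for
    a UML package stereotyped <<XSDschema>>. Each entry in 'classes'
    stands in for a class stereotyped <<XSDcomplexType>>."""
    lines = [
        '<xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema"',
        '            targetNamespace="%s">' % package["targetNamespace"],
    ]
    for cls in package["classes"]:
        lines.append('  <xsd:complexType name="%s">' % cls["name"])
        lines.append('    <xsd:sequence>')
        for attr, xsd_type in cls["attributes"]:
            lines.append('      <xsd:element name="%s" type="xsd:%s"/>'
                         % (attr, xsd_type))
        lines.append('    </xsd:sequence>')
        lines.append('  </xsd:complexType>')
    lines.append('</xsd:schema>')
    return "\n".join(lines)

model = {
    "targetNamespace": "http://example.com/po",
    "classes": [{"name": "PurchaseOrder",
                 "attributes": [("orderDate", "date"), ("total", "decimal")]}],
}
print(generate_schema(model))
```

A bi-directional mapping, as the profile specifies, would pair this with the reverse step, reading a schema back into stereotyped model elements.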

  • [February 21, 2003] "WS-Security: New Technologies Help You Make Your Web Services More Secure." By David Chappell. In Microsoft MSDN Magazine (April 2003). ['Without good security, Web Services will never reach their potential. WS-Security and its associated technologies, the focus of this article, represent the future of security for Web Services. Provided here is an overview of these emerging security standards that explains what they do, how they work, and how they get along together. Topics discussed include integrity and confidentiality and how these are provided by public key cryptography, WS-Security, and more. Some of the key components of WS-Security, such as the wsu namespace, are also covered.'] "Web Services without effective security aren't very useful. Yet the original creators of SOAP chose to put off defining how this problem should be solved. This was a defensible decision, since getting Web Services off the ground meant keeping them simple -- and providing security is seldom simple. The problem, however, can't be put off forever so Microsoft and IBM, among others, are working together to address this issue. Their efforts have resulted in a group of specs for providing Web Services security, the most important of which is WS-Security. With this article I'll provide a big-picture view of how these technologies work. From one perspective, the task facing the creators of Web Services security looks simple. After all, effective mechanisms already exist for distributed security, including Kerberos, public key technologies, and others, so the task these creators faced wasn't inventing new security mechanisms. Instead, their goal was to define ways to use what already existed in a Web Services world, a world built on XML and SOAP... The fundamental technology for adding security to SOAP is defined by WS-Security. 
Its ambitious goal is to provide end-to-end message-level security for SOAP messages, and yet the WS-Security spec isn't especially hefty, weighing in at just over 20 pages. This is because WS-Security defines very little new technology but instead defines a way to use existing security technology with SOAP... Effective security for Web Services is essential. Given that most of what's needed is already in place (and the problem is simply a matter of mapping this existing security technology to XML and SOAP), you might think that WS-Security and its associated specs would be very simple (and this article would be very short). Yet accommodating the diverse security mechanisms in use today, along with allowing for those that will appear tomorrow, requires a nontrivial set of technology. Once this technology is in place, the world of secure and interoperable Web Services that we'd all like to see can become a reality. I don't know about you, but I can't wait for this day to arrive..." General references in "Web Services Security Specification (WS-Security)."
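The "existing technology mapped onto SOAP" idea shows up directly in message structure: the security information travels in a Security header block alongside the body. A sketch of that shape follows; the wsse namespace URI is a draft-era placeholder, and the version of the spec being targeted should be consulted for the normative URIs and token types.

```python
import xml.etree.ElementTree as ET

SOAP = "http://schemas.xmlsoap.org/soap/envelope/"
# Placeholder wsse namespace; WS-Security drafts and the final standard
# use different URIs, so treat this as illustrative only.
WSSE = "http://schemas.xmlsoap.org/ws/2002/07/secext"

# Build an envelope whose header carries a Security block with a simple
# username token, next to an ordinary application payload in the body.
envelope = ET.Element("{%s}Envelope" % SOAP)
header = ET.SubElement(envelope, "{%s}Header" % SOAP)
security = ET.SubElement(header, "{%s}Security" % WSSE)
token = ET.SubElement(security, "{%s}UsernameToken" % WSSE)
ET.SubElement(token, "{%s}Username" % WSSE).text = "alice"
body = ET.SubElement(envelope, "{%s}Body" % SOAP)
ET.SubElement(body, "{http://example.com/stock}GetQuote").text = "MSFT"

message = ET.tostring(envelope, encoding="unicode")
print(message)
```

Because the claims ride inside the message itself rather than the transport, the protection is end-to-end: intermediaries can relay the envelope without stripping the security context, which transport-level SSL cannot offer.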

  • [February 21, 2003] "Web Map Context Documents." Edited by Jean-Philippe Humblet (IONIC Software sa). Request for Comment, OpenGIS Implementation Specification. From Open GIS Consortium Inc. Version 0.1.7, 2003-01-21. Reference number: OGC 03-036. 25 pages. Submitted to the Open GIS Consortium Inc. as a Request For Comment (RFC) by: Ionic Software (Belgium); GeoConnections / Natural Resources Canada; US National Aeronautics and Space Administration; DM Solutions; Social Change Online; Syncline. Annex A.1: Web Map Context Document XML Schema; A.2: Web Map Context XML Example. "This specification applies to the creation and use of documents which unambiguously describe the state, or 'Context,' of a WMS Client application in a manner that is independent of a particular client and that might be utilized by different clients to recreate the application state. This specification defines an encoding for the Context using Extensible Markup Language. This specification is relevant to Clients of the OGC Web Map Service (WMS 1.0, WMS 1.1.0, WMS 1.1.1)... This document is a companion specification to the OpenGIS Web Map Service Interface Implementation Specification version 1.1.1, [which] specifies how individual map servers describe and provide their map content. The present Context specification states how a specific grouping of one or more maps from one or more map servers can be described in a portable, platform-independent format for storage in a repository or for transmission between clients. This description is known as a 'Web Map Context Document,' or simply a 'Context.' A Context document includes information about the server(s) providing layer(s) in the overall map, the bounding box and map projection shared by all the maps, sufficient operational metadata for Client software to reproduce the map, and ancillary metadata used to annotate or describe the maps and their provenance for the benefit of human viewers. 
A Context document is structured using eXtensible Markup Language (XML). Annex A of this specification contains the XML Schema against which Context XML can be validated. There are several possible uses for Context documents: (1) The Context document can provide default startup views for particular classes of user. Such a document would have a long lifetime and public accessibility. (2) The Context document can save the state of a viewer client as the user navigates and modifies map layers. (3) The Context document can store not only the current settings but also additional information about each layer (e.g., available styles, formats, SRS, etc.) to avoid having to query the map server again once the user has selected a layer. (4) The Context document could be saved from one client session and transferred to a different client application to start up with the same context. Contexts could be cataloged and discovered, thus providing a level of granularity broader than individual layers... This document directly supports only persistence of portrayed maps created by one or more Web Map Server bindings, but provides an extensibility mechanism for Web Feature Server and other types of service and persisted interface object states. It is expected that these will be developed and added to the formal schema in a backward compatible fashion. Filename extensions such as .cml and .ccml have been proposed for Contexts and Context Collections. This may be added to the section dealing with MIME type..." See the announcement: "OGC Seeks Comment on Proposed Web Map Context Specification." [cache]
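Use case (4), transferring saved state between clients, amounts to parsing the Context back out and re-issuing the map requests. A sketch against a skeletal Context document follows; the element names here are illustrative stand-ins, the normative structure being the schema in Annex A.

```python
import xml.etree.ElementTree as ET

# Skeletal Context document reduced to the essentials described above:
# a shared bounding box plus the layers and the servers providing them.
CONTEXT = """\
<ViewContext>
  <General>
    <BoundingBox SRS="EPSG:4326" minx="-180" miny="-90" maxx="180" maxy="90"/>
  </General>
  <LayerList>
    <Layer><Server>http://wms.example.org/a</Server><Name>coastlines</Name></Layer>
    <Layer><Server>http://wms.example.org/b</Server><Name>elevation</Name></Layer>
  </LayerList>
</ViewContext>
"""

def layers(context_xml):
    """Return the (server, layer-name) pairs a client needs in order
    to reproduce the saved map state."""
    root = ET.fromstring(context_xml)
    return [(layer.findtext("Server"), layer.findtext("Name"))
            for layer in root.findall("LayerList/Layer")]

print(layers(CONTEXT))
```

A receiving client would iterate over these pairs, issuing a GetMap request per layer against the shared bounding box to recreate the view.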

  • [February 21, 2003] "Sun Trumpets Royalty-Free Web Services Specs. Company Challenges Microsoft, IBM." By Paul Krill. In InfoWorld (February 20, 2003). "Royalty-free industry specifications are needed to enable Web services to fulfill their potential as a mechanism for business process integration on a massive scale, Sun officials stressed during a Sun 'Chalk Talk' session in San Francisco on Thursday. Any requirement that specific vendors be paid royalties for use of their technologies in standardized Web services specifications could stifle the growth of Web services, said Mark Bauhaus, Sun vice president of Java Web services. Sun wants its royalty-free position to be accepted by other members of the Web Services Interoperability Organization (WS-I) and is running for election to a seat on the WS-I governing board in March. Specifically, Microsoft and IBM need to embrace royalty-free Web services specifications, according to Bauhaus. With the vast increase in devices accessing the Internet, which could eventually number into the billions, and the low cost of Internet access, Web services are poised for dramatic growth as a business process integration mechanism for a variety of applications, Bauhaus said, but Web services must be royalty-free and based on open standards, and specifications must be converged. 'The headlines that we're writing now are about Web services. Is it going to be royalty-free or is someone going to hijack it?' Bauhaus said. He noted that IBM and Microsoft have produced a proposed specification for automating interaction between Web services, called Business Process Execution Language for Web Services (BPEL4WS). This proposal has not yet been submitted to a standards organization. Sun has a competing proposal, Web Services Choreography Interface (WSCI), being examined by the World Wide Web Consortium (W3C)... 
An IBM representative, in response to an inquiry Thursday about the company's stance on royalty-free Web services specifications, released this statement: 'IBM has already committed to royalty-free licensing on Web services specifications like SOAP, WSDL, and BPEL4WS, and we participate very actively in open-source implementations. We explore everything on a case-by-case basis.' A Microsoft representative released this statement Thursday: 'Microsoft's overarching goal is broad adoption of advanced Web services specifications. [Microsoft officials] can't make a blanket statement about licensing provisions as different specifications have different underlying technologies and different standards bodies have different licensing policies. Microsoft has made major technologies such as SOAP and WS-Security available without royalties and will continue to comply with the intellectual property licensing policies of the various standards bodies with whom we work.' Sun announced intentions to join WS-I last October after initially being shut out of the organization's formation a year ago. Microsoft and IBM were the major founders. WS-I is intended to be an open industry effort to promote Web services interoperability across platforms, applications, and programming languages..." See: (1) "Business Process Execution Language for Web Services (BPEL4WS)"; (2) "Web Services Interoperability Organization (WS-I)"; (3) "Patents and Open Standards."

  • [February 20, 2003] "Ten Things to Know About XDocs. InfoPath a Huge Step in the Right Direction. [Strategic Developer.]" By Jon Udell (InfoWorld Test Center). In InfoWorld (February 19, 2003). ['Jean Paoli, the architect of Microsoft Office's XML capabilities, recently spent several hours showing me Microsoft's newest Office family member, InfoPath (formerly XDocs, originally NetDocs). Here are 10 things you should know about this revolutionary piece of software.'] Excerpts: (1) You use it to gather and view semi-structured information: The most obvious example of such data-gathering is the business form. While acknowledging the marketing need to brand InfoPath as a forms application, Paoli insists -- rightly -- that there is more to the story. To the user, InfoPath is a general-purpose viewer and editor of business information. To the developer, it's a power tool for building applications that view, edit, and transform XML data. (2) Users create and maintain high-quality data: Like Word 11 and Excel 11, InfoPath can bind an XML Schema to a document, can interactively validate the document against the schema, and can prevent the user from saving the document in an invalid state. In addition to schema constraints, you can attach extra validation rules. For this purpose, the InfoPath design mode includes an XPath-aware expression builder. (3) It is aggressively standards-based: Word 11 can save formatted documents in an XML format called WordML, or it can save schematized data without formatting as generic XML. Although these two modes are both standard in their use of XML, they are nevertheless quite distinct from one another. In the latter case you use XSLT to apply the WordML styling to a core of pure structured data, but it's optional. With InfoPath, XSLT isn't an option. The document's core of structured data is always expressed through one or more views, and those views are XSLT transformations. 
InfoPath's more unified model does not derive from Word, but rather from Paoli's former project, Internet Explorer. In an InfoPath document, formatted text is expressed as XHTML (the schema for which must be bound to the document), and all styling is accomplished by means of standard CSS (Cascading Style Sheets). InfoPath also provides a DOM (Document Object Model) accessible to scripting languages such as VBScript and JavaScript... (8) It breaks the XSLT bottleneck: Even if you've drunk the XSLT Kool-Aid and know how powerful a language it is, you'll probably admit that XSLT programming is no walk in the park. The InfoPath designer can, crucially, generate the XSLT code needed to map between complex XML data and useful views of that data. Like all visual tools it has limits, which you can escape from by defining regions within the generated code for handwritten extensions. That said, the designer works very hard to make intelligent mappings between XML structures and user-interface controls. Such automation should help prevent XML transformation from becoming the IT bottleneck of the Web services era..." See the full article for context. General references: "Microsoft Office 11 and InfoPath [XDocs]."
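The "extra validation rules" in point (2) are, at bottom, XPath expressions evaluated against the form's XML. A rough sketch of the idea, using Python's xml.etree in place of InfoPath's XPath engine and an invented purchase-order form:

```python
import xml.etree.ElementTree as ET

# A hypothetical purchase-order form; the element names are invented.
FORM = """
<order>
  <item><qty>2</qty><price>10.00</price></item>
  <item><qty>1</qty><price>5.50</price></item>
  <total>25.50</total>
</order>
"""

def check_total(doc):
    # The kind of rule an XPath-aware expression builder might emit:
    # the stated total must equal the sum of qty * price over all items.
    root = ET.fromstring(doc)
    computed = sum(float(item.findtext("qty")) * float(item.findtext("price"))
                   for item in root.findall("item"))
    return abs(computed - float(root.findtext("total"))) < 0.005
```

A schema alone can only say that total holds a decimal; a rule like this is what catches a total that disagrees with the line items.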

  • [February 20, 2003] "XP and XML." By Eric van der Vlist. From XML.com (February 19, 2003). ['Eric van der Vlist on the benefits of combining XML and Extreme Programming.'] "I am convinced that both XP and XML could benefit from working more closely together. And there may even be some hope for remote pair programming. I can't pretend to have real experience with XP but only with some of its practices, which I have been able to follow despite my remoteness. Therefore, most of this article is theoretical, but I hope that these ideas will still be useful... The XP practices rely heavily on communication between team members and a fluidity of the code that is continuously refactored. Communication and fluidity are where XML excels. One might think that implementers of XP tools and even XP users would have been eager to use XML pretty much everywhere. This is not really the case and even though a few XML applications have been developed for some practices, I haven't found any cross-practice effort to define an XML framework that could facilitate XP as a whole... Being feebly tooled appears to be a more general characteristic of XP. During my research I was surprised to see that there doesn't seem to be an 'XP IDE' taking care of the twelve practices; on the contrary, XP developers rely on conventional tools: text editors, browsers, CVS repositories, Wiki-wiki webs, instant messengers or IRC. The two practices which are the most advanced in terms of tools are probably testing, with the development of the 'xUnit' test frameworks, and refactoring, traditionally strong in the Smalltalk world, which has greatly influenced XP... XP can be used right now to implement XML applications and it appears that this is a matter of mindset and tool availability more than any real incompatibility between XML and XP. It would definitely be nice if XML developers, designers and architects had more of an 'XP mindset,' and were eager to follow the 'Simple Design' practice... 
Although the reluctance of the XP community toward XML noted last year by Leigh Dodds is persisting, there is a huge opportunity to make XML a technology of choice for implementing better tools for XP, and the XML community still has lots to learn from the XP community..."

  • [February 20, 2003] "An Introduction to the Relaxer Schema Compiler." By Michael Fitzgerald and Tomoharu ASAMI. From XML.com (February 19, 2003). ['An introduction to using the Relaxer schema compiler.'] "Relaxer is a Java schema compiler for the XML schema languages RELAX NG and Relax Core. Using the Document Object Model, among other APIs, Relaxer generates Java classes based on schemas. It can also create classes based on XML document instances. The classes that Relaxer generates provide methods that allow you to access instances that are valid with regard to the compiled schemas, for use in your own programs that rely on the generated classes... In addition to compiling schemas, Relaxer can also generate DTDs, RELAX NG schemas, Relax Core schemas, W3C XML Schema (WXS) schemas, and XSLT stylesheets. You can also create Java classes that support, among other things, SAX, Java Database Connectivity (JDBC), classic design patterns, such as composite and visitor, factories, and components for Enterprise JavaBeans, Remote Method Invocation (RMI), and the Java API for XML Messaging (JAXM). The generation of schemas, stylesheets, and Java classes will be covered in this article... We will present a series of brief examples. In order to run these, you will need the SDK for Java 2 version 1.4 or higher (the latest SDKs are available for download from Sun's Java web site). Relaxer requires JAXP (Java API for XML Processing), which comes with version 1.4. You can run Relaxer with earlier versions of Java, such as 1.3.1, but it requires a separate download and installation of JAXP. You will also need a copy of the latest version of Relaxer... Relaxer has many other features which we cannot cover here. This article only deals with a few of its fundamental features, including schema, stylesheet, and Java generation. To continue exploring Relaxer, visit the Relaxer site, where you will (soon) find a tutorial and reference manual..."

  • [February 20, 2003] "The Pace of Innovation." By Kendall Grant Clark. From XML.com (February 19, 2003). ['Kendall Clark says that despite XML's success, it still has problems.'] "For all of XML's undeniable success, there remain significant problems. The most conspicuous of these, if we take an historical view, is surely the state of tools meant to aid in the creation of XML by humans. Every programming language has one, if not many ways to create XML programmatically, many of them clever indeed. But the 'XML editor for humans' area remains underdeveloped, particularly if we judge maturity by reference to more than five years of complaints about and claims for more tool support, by both vendors and advocates alike. Have we all simply misjudged how hard it is to build XML tools, including editors, with which ordinary people can create XML content naturally and simply? Rick Jelliffe suggested that the state of tool support has left at least one class of XML users, those in publishing, unsatisfied: 'By making it easy for developers, we have a lot of software which is good for data transmission but still very little that is an advance for data capture'. To which Jonathan Robie responded by offering a rundown of available, relevant tools, including 'XML editors for writing structured documents, XForms for entering data in web browsers, and software for publishing XML...from databases'. Robie also mentioned the XML facilities in the forthcoming major release of Microsoft Office. So why are publishing users unhappy? Is their unhappiness, Robie asked, a 'matter of getting the tools written, as opposed to issues with XML per se?' There may be something wrong with XML per se in this regard. Or, more accurately, there may be something wrong with the way XML developers tend to think about XML and about creating it which doesn't mesh well with the ways in which ordinary people think about content and its creation. 
Simon St.Laurent expanded on this theme: 'A lot of developers see XML editing as filling structured containers with appropriate content, and the containers should more or less guide you as to the content. This can mean that a huge amount of detail needs to be dealt with at one pass, and it often has meant that developers create interfaces which are actually more difficult to use than paper forms. Leaving markup for later lets people focus on the information as they see it rather than forcing them from the outset into someone else's preferred boxes'... The economic perspective also implies a rather unsettling fact -- unsettling if you are, like me, a long time critic of Microsoft. It implies that the XML facilities in the next major release of Office are the very best, realistic hope for the future of the documents side of XML, at least in terms of mass market success. No other entity in the industry (in any industry, for that matter) is as able to swing mass numbers of computer users toward or away from specific technological solutions. That Microsoft has gained such an ability by virtue of its position as an adjudicated monopolist is in some very real sense beside the point. If you keep a flame burning for the XML-as-document position, outside of Microsoft and a few other notables like Topologi, it is and will likely remain slim pickings for some time..."

  • [February 20, 2003] "Trusted Archive Protocol (TAP)." By Carl Wallace (Cygnacom Solutions) and Santosh Chokhani (Orion Security Solutions). IETF Internet Draft. Reference: 'draft-ietf-pkix-tap-00.txt'. February 2003, expires August 2003. "A Trusted Archive Authority (TAA) is a service that supports long-term non-repudiation by maintaining secure storage of cryptographically refreshed information. This document defines a set of transactions for interacting with a Trusted Archive Authority (TAA) and establishes a means of representing archived information... An archive token is an object generated by the TAA when data is submitted and accepted for archiving. The archive token is returned to the submitter and may be used to request retrieval or deletion of the archived data and associated cryptographic information. For purposes of future retrieval or deletion, applications may treat the archive token as an opaque blob. The archive token includes: submitter DN, timestamp token, TAA date and time upon submission and, optionally, tracking information. To verify the accuracy of information archived by the TAA, submitters MUST verify the contents of the archive token. In some cases it may be desirable to accept archive submissions from clients that are not TAP-aware... Retrieval and deletion requests are likely to be relatively rare compared to submission requests. In the interest of supporting a broad range of submission clients, it may be desirable to support alternative archive submission formats, for example, an XML submission request. Non-TAP-compliant submission formats must not use TAP-defined transport layer type information..." Related, on timestamps: OASIS Digital Signature Services Technical Committee.
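The archive token's listed contents can be sketched as a plain record. The field names below paraphrase the draft's prose; the draft defines its own encoding, so this Python rendering is an illustration only:

```python
from dataclasses import dataclass
from typing import Optional

# Field names paraphrase the draft's prose description of the archive
# token; the draft defines its own wire encoding, so this is illustrative.
@dataclass(frozen=True)
class ArchiveToken:
    submitter_dn: str                    # submitter's distinguished name
    timestamp_token: bytes               # cryptographic timestamp token
    submission_time: str                 # TAA date and time upon submission
    tracking_info: Optional[str] = None  # optional tracking information

# Applications may treat the token as an opaque blob, returning it as-is
# when requesting retrieval or deletion of the archived data.
token = ArchiveToken("CN=Alice,O=Example", b"\x30\x82", "2003-02-20T12:00:00Z")
```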

  • [February 20, 2003] "Identity Systems and Liberty Specification Version 1.1 Interoperability." [Edited by Paul Madsen.] A Liberty Alliance Technical Whitepaper. February 14, 2003. 15 pages. Document Description: Liberty and 3rd Party Identity Systems White Paper-07.doc. "Today, most enterprises, government entities and non-profit organizations have substantial investments in processes and infrastructures to maintain the integrity of their business systems. Much as the Internet has provided access to sources of information and the need to track in more detail the activities of members of these organizations, sharing electronic information about users of information is rising in the minds of the management ranks of these organizations. This has spawned the need to create circles of membership in groups that can validate identities of the consumers of information. As a result, new organizations are being formed by various profit, non-profit and governmental groups to address this need. The solutions that are being put forward by these groups provide opportunities to choose or integrate with a new class of service provider called the Identity Manager. This white paper seeks to address some of the emerging Identity Management technical approaches and how the latest version of Liberty Alliance Project specifications can co-exist with these other technical approaches. It is targeted to technical architects, project managers and other evaluators who are involved in building and maintaining identity applications and infrastructures... Network identity refers to the global set of attributes that are contained in an individual's various accounts with different service providers. These attributes include information such as names, phone numbers, social security numbers, addresses, credit records and payment information. For individuals, network identity is the sum of their financial, medical and personal data -- all of which must be carefully protected. 
For businesses, network identity represents their ability to know their customers and constituents and reach them in ways that bring value to both parties... Federated network identity and the infrastructures are driven by more than specifications alone. Liberty understands that all organizations will have multiple identity managers -- public, private or proprietary -- with whom it will have to coexist. Liberty Alliance is working to ensure that its specifications and deliverables will work with other existing and emerging organizations that will certify or authenticate network identity, most specifically in federated circles of trust..." See also the earlier 2003-02-07 reference "Liberty Alliance Releases ID Management Specification. White Paper Explains Possible Interoperability." General references in "Liberty Alliance Specifications for Federated Network Identification and Authorization." [cache]

  • [February 19, 2003] "Fear Not -- Web Services Demand Changes." By Eric Newcomer (Iona Technologies). In ZDNet News (February 19, 2003). "Open. Interoperable. Reusable. Flexible. These are just some of the words that most software vendors have applied to describe the importance, appeal and future of Web services. Similarly, corporations have applauded the attempt by the software industry to finally standardize its offerings, thereby eliminating the islands of information and systems that exist within nearly every Global 2000 organization... However, today we sit at a fork in the road of Web services evolution. There are two paths that the industry can take, with each path leading to a distinct and different future. One road leads to a truly standardized world where corporations fully reap the benefits of Web services by untangling the 'spaghetti mess' of IT systems. The second road leads back to yesteryear, where proprietary systems ruled the day, maximizing vendor service and maintenance revenue, and killing end-user flexibility and return on investment. To accomplish software-industry standardization, vendors have to shift focus from selling proprietary products with a 'standards compliant' label on them to focus instead on cooperating to create a larger market based on truly standardized products... The path we take to the future may well depend upon the outcome of the current standoff around intellectual property rights in two key areas: orchestration and reliable messaging. Some of the vendors developing specifications for these areas are raising the question of possibly charging patent or royalty payments for the rights to implement the specifications. The leading standards bodies, and traditional industry practice around software standards to date, tend to favor royalty-free implementations. What's needed is agreement that the potential Web services market is bigger than all of our current markets put together. 
Some vendors prefer to draw the standardization line where it currently sits, with SOAP, WSDL and UDDI. But customers correctly say that this isn't enough to realize the promise of Web services technology, and widely adopted, open specifications at the next level are necessary..."

  • [February 19, 2003] "Thinking XML: Universal Business Language (UBL). Examining What May Be The Crown Jewel of XML Business Formats." By Uche Ogbuji (Principal Consultant, Fourthought, Inc). From IBM developerWorks, XML zone. February 2003. ['Universal Business Language (UBL) is an ambitious effort to unify the chaotic world of XML formats for business. Recently, the group behind UBL released the first work products for public review. In this article Uche Ogbuji takes a first in-depth look at UBL.'] "The UBL TC recently produced the first major product for public review: UBL Library Content 0p70. You can download a review package from the home page. I encourage anyone who is interested to have a close look at UBL and send any comments to the UBL TC. This work is royalty-free and all attempts have been made to avoid intellectual property encumbrances on it. It is a rare public benefit to have such an important work so freely available, and we all have much to gain by advancing it. The materials in this release are not in final form: They are expected to be finalized around the middle of this year, and even then only a fraction of the eventual UBL material will have been completed, so there is plenty of time to comment and contribute. The first thing to note about UBL is that it is very large and very comprehensive. The initial release comes as a 5.6 MB ZIP file, and covers what are likely the most common business forms, and perhaps the most often rendered as XML documents: the trading cycle from order through invoice between buyer and seller. Specifically, it includes specifications for the following transactions: Order, Simple and complex order responses, Order cancelation, Despatch advice (often known as a shipping notice), Receipt advice, Invoice. A set of basic business concepts provides the building blocks for the above specifications. 
These are called the Basic Business Information Entities (BBIEs) and are expressed as Core Component Types (CCTs) in a common UBL schema. In addition to the BBIEs, different specifications define their own specialized Business Information Entities (BIEs), which make up the UBL conceptual model, organizing business concepts into classes and associations. The UBL conceptual model is based on other modeling systems such as Unified Modeling Language (UML) and Entity/Relational modeling. In fact, UBL uses UML to provide a high-level view of the conceptual models... The last time I discussed UBL in this column, I wondered whether it could be The One that helps unify the chaotic world of XML vocabularies for business. UBL certainly has a lot of potential to be the dominant format. It is very rigorously defined, takes advantage of other work where appropriate, and encourages broad adoption and contribution by remaining royalty-free. It is also in very active development at a time when work on so many other such systems seems to have stagnated... [although] work is needed to extend UBL to all the many different types of transactions used in business, UBL offers many useful products that will be available for immediate use once it is finalized later this year: (1) A core set of transaction formats for an initial set of common business forms; (2) A reusable concept model and vocabulary; (3) Some design practices that borrow the best ideas from a variety of disciplines; (4) A good amount of code to bootstrap applications for processing UBL..." See other details in "UBL Technical Committee Releases First Draft of XML Schemas for Electronic Trade."

  • [February 19, 2003] "Another Way to Web-Enable Your Legacy Apps." By David Berlind. In ZDNet Tech Update (February 18, 2003). "Just when you think you've seen every possible approach for marrying legacy applications to thin clients, another vendor comes along with yet another way to skin that cat. Tarantella Enterprise 3 (TE3) may use techniques already found in other products, but it combines those techniques into a low-cost, three-tier architecture for Web-enabling just about any legacy app. UK-based Tarantella's TE3 is a master translator of the myriad protocols and platforms used to deliver many of today's legacy applications to their clients. TE3 can take most legacy terminal emulation and display protocols, Remote Desktop Protocol (RDP) for Windows Terminal Server, and any Java-wrapped application, and resolve them to one display protocol for shipment to a Java-enabled Web browser or Tarantella's own proprietary thin client. Tarantella has a thin client for just about any client platform, but company officials say that it's most useful for clients like PDAs that don't already have a Java-enabled browser... When considering something like TE3 to thin client-enable your Windows, Unix, mainframe, or Java applications, you face this question: Should you apply an easy-to-deploy stop-gap measure that wraps the native presentation layers of your existing applications into a secondary presentation layer common to all clients? The pursuit of thin-client commonality has led others down the XML path, where the native presentation layer is re-architected in a way that fully insulates the clients from the servers and, in true Web services style, often eliminates the need to distinguish between the two. With this approach, the economics of developing the various discrete objects can be spread equally across the clients, portals and other XML-enabled transaction-oriented servers. 
For example: Air2Web's Mobile Internet Platform asks you to recode your business logic into XML and to let them worry about the presentation layer regardless of the client... [Tarantella's Guy] Churchward admits that rebuilding from the ground up using XML is the trendy thing to do, but he argues that in the case of many legacy applications, re-architecting would be too costly or simply not possible. Could be. You decide..."

  • [February 18, 2003] "Extensible Access Control Markup Language (XACML) Version 1.0." Edited by Simon Godik (Overxeer) and Tim Moses (Entrust). Official (normative) OASIS Open Standard, 1.0. 18-February-2003. Produced by the OASIS Extensible Access Control Markup Language TC. 132 pages. Associated XML Schema Definitions: Policy Schema and Context Schema. "This specification defines an XML schema for an extensible access-control policy language." Description and overview is supplied in "XACML 1.0 Specification Set Approved as an OASIS Standard." See also the Errata document. Other references: (1) the announcement of 2003-02-18, "XACML Access Control Markup Language Ratified as OASIS Open Standard. Universal Language for Authorization Policy Enables Secure Web Services."; (2) "Sun Microsystems Releases Open Source XACML Implementation for Access Control and Security."; (3) general references in "Extensible Access Control Markup Language (XACML)." [cache]

  • [February 18, 2003] "A Revolution in Business Process Management?" By Dan Farber. In ZDNet Tech Update (February 17, 2003). "Intalio, a nearly four-year-old company specializing in BPM, may have cracked the code for ridding BPM of its rough edges and steep costs. The company's Intalio n-3 2.0 software can reduce the development cost of designing and implementing business processes by up to 75 percent, according to Ismael Ghalimi, Intalio co-founder and chief strategy officer... Intalio's visual development environment and ease of use are rooted in its support of emerging process modeling languages. Visual models are converted into BPML (Business Process Modeling Language) or BPEL4WS (Business Process Execution Language for Web Services) code. Ghalimi estimates that 50,000 to 100,000 lines of J2EE code could be replaced by 100 lines of BPML code. Intalio was a co-founder of the organization that created BPML, hoping to establish a standard, platform-independent meta-language as a foundation for its products and the market. Subsequently, a consortium of IBM, Microsoft, and BEA Systems developed BPEL4WS in competition with BPML. Intalio was smart to support both in its product. At some point the two modeling languages could merge into a single standard; with IBM, Microsoft and BEA driving BPEL4WS, it's not difficult to determine which faction is in the driver's seat. WSCI (Web Services Choreography Interface) is also in the picture. WSCI defines the interaction between services deployed across multiple systems, and facilitates interoperability between BPML and BPEL4WS, which define the business processes behind each service. Intalio, along with BEA, SAP and Sun, co-developed the WSCI 1.0 specification... Intalio appears to have made significant progress in delivering plug and play BPM. Now it's up to the market to decide if it can be the next Oracle..." See the announcement: "Intalio Ships Intalio|n³ 2.0. 
New End-user Interactivity Combines with Productivity, Performance, Standards Enhancements to Reinforce Dominance of Company's BPMS."

  • [February 18, 2003] "EXSLT By Example. How to Put the Community Standard XSLT Extensions to Work." By Uche Ogbuji (Principal Consultant, Fourthought, Inc). From IBM developerWorks, XML zone. February 2003. ['Community standards have had a very important role in XML technology, from SAX to RDDL. The most important community standard for XSLT is the EXSLT initiative for standard extension functions and elements. In this article, Uche Ogbuji uses practical examples to introduce and demonstrate some useful EXSLT functions.'] "... the EXSLT extensions make XSLT far more useful in general-purpose data-manipulation tasks. Just to select a few random examples: (1) The EXSLT Math module contains trigonometric mathematical functions, which makes it feasible to generate pie charts in SVG. (2) The EXSLT Regular Expressions module makes it easier to parse and process user input and other variable data sources. (3) The EXSLT Dates and Times module makes it easier to render Web pages with date-sensitive content, or to process data containing date fields... In these and many other real-life tasks, EXSLT makes using XSLT feasible in a way that is portable across the many processors that support the standard. In prior articles, I have already shown how EXSLT functions such as exsl:node-set and exsl:object-type are useful for even the most basic processing tasks. In this article I try to cover a cross-section of EXSLT's capabilities by solving a couple of simple and practical problems using EXSLT. I also try to avoid ground I've already covered to introduce the many useful EXSLT facilities that have not received the attention they deserve. If you are completely unfamiliar with EXSLT, please read the Kevin Williams article first. The resources in that article include pointers to XSLT processors that support EXSLT, which you should install so you can play with the examples and use EXSLT yourself... 
EXSLT includes tools for making things easier and others for making things possible in the first place. For example, the date:date-time function is impossible to replace in XSLT, but dyn:evaluate() can usually be replaced by a system whereby you use XSLT to generate another XSLT script, which is what you actually run to get the desired result. EXSLT includes much more than I can hope to cover in future articles, but one nice thing about EXSLT is that it is very well documented. You'll find a world of riches to improve your XSLT development by checking out the Web site and taking advantage of this community standard..." For related resources, see "Extensible Stylesheet Language (XSL/XSLT)."
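The pie-chart example illustrates why the Math module matters: computing where each slice's arc begins and ends requires exactly the sine and cosine functions EXSLT supplies. A sketch, in Python rather than XSLT, of the arithmetic such a stylesheet would perform:

```python
import math

def pie_slices(values, cx=100.0, cy=100.0, r=90.0):
    """Arc endpoints for an SVG pie chart: the same math:cos/math:sin
    arithmetic an EXSLT-capable stylesheet would perform."""
    total = sum(values)
    slices, angle = [], 0.0
    for v in values:
        sweep = 2 * math.pi * v / total
        start = (cx + r * math.cos(angle), cy + r * math.sin(angle))
        angle += sweep
        end = (cx + r * math.cos(angle), cy + r * math.sin(angle))
        # SVG's large-arc flag is 1 when a slice spans more than half the pie
        slices.append((start, end, int(sweep > math.pi)))
    return slices

slices = pie_slices([1, 1, 2])
```

Without trigonometric extension functions, none of this is expressible in plain XSLT 1.0, which is the "making things possible" category described above.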

  • [February 18, 2003] "Old School. 18th Century Theory is New Force in Computing." By Michael Kanellos. In CNET (February 18, 2003). "Thomas Bayes, one of the leading mathematical lights in computing today, differs from most of his colleagues: He has argued that the existence of God can be derived from equations. His most important paper was published by someone else. And he's been dead for 241 years. Yet the 18th-century clergyman's theories on probability have become a major part of the mathematical foundations of application development. Search giant Google and Autonomy, a company that sells information retrieval tools, both employ Bayesian principles to provide likely (but technically never exact) results to data searches. Researchers are also using Bayesian models to determine correlations between specific symptoms and diseases, create personal robots, and develop artificially intelligent devices that 'think' by doing what data and experience tell them to do... One of the more vocal Bayesian advocates is Microsoft. The company is employing ideas based on probability -- or 'probabilistic' principles -- in its Notification Platform. The technology will be embedded in future Microsoft software and is intended to let computers and cell phones automatically filter messages, schedule meetings without their owners' help and derive strategies for getting in touch with other people... Toward the end of the year, Intel will also come out with a toolkit for constructing Bayesian applications. One experiment deals with cameras that can warn doctors that patients may soon suffer strokes. The company will discuss these developments later this week at its Developer Forum... Probabilistic thinking changes the way people interact with computers. 'The idea is that the computer seems more like an aid rather than a final device,' said Peter Norvig, director of search quality at Google. 'What you are looking for is some guidance, not a model answer'..."
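The probabilistic principle behind all of these applications is Bayes' theorem, which updates a prior probability in light of evidence. A minimal sketch of the symptom-and-disease case mentioned above, with all numbers invented for illustration:

```python
def posterior(prior, likelihood, false_positive_rate):
    # Bayes' theorem with the evidence term expanded over H and not-H:
    # P(H|E) = P(E|H)P(H) / (P(E|H)P(H) + P(E|~H)P(~H))
    evidence = likelihood * prior + false_positive_rate * (1 - prior)
    return likelihood * prior / evidence

# Invented numbers: a condition with 1% prevalence, a test that detects it
# 90% of the time and false-alarms 5% of the time. The posterior stays far
# below the test's 90% sensitivity, the "likely but technically never
# exact" flavor of Bayesian results the article describes.
p = posterior(prior=0.01, likelihood=0.90, false_positive_rate=0.05)
```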

  • [February 18, 2003] "CSS3 Module: Color." Edited by Tantek Çelik (Microsoft Corporation) and Chris Lilley (W3C). With contributions by Steven Pemberton (CWI) and Brad Pettit (Microsoft Corporation). W3C Working Draft 14-February-2003. "CSS (Cascading Style Sheets) is a language for describing the rendering of HTML and XML documents on screen, on paper, in speech, etc. To color elements in a document, it uses color related properties and respective values. This draft describes the properties and values that are proposed for CSS level 3. It includes and extends them from properties and values of CSS level 2... This WD document is a draft of one of the 'modules' for the upcoming CSS3 specification. It not only describes the color related properties and values that already exist in CSS1 and CSS2, but also proposes new properties and values for CSS3 as well. The Working Group does not expect that all implementations of CSS3 will implement all properties or values. Instead, there will probably be a small number of variants of CSS3, so-called 'profiles'. For example, it may be that only the profile for 32-bit color user agents will include all of the proposed color related properties and values... This is the second last call for this document, because, in response to comments on the previous draft, sufficient changes were made as to justify reissuing a last call before advancing the document to candidate recommendation. The three most significant changes: (1) Color names/values harmonized with SVG 1.0. There was substantial feedback on the previous draft that requested that CSS3 Color normatively list all the normative color names/values which are accepted by SVG 1.0. This document now contains the following additional named colors: darkgrey, darkslategrey, dimgrey, lightgray, lightslategrey, slategrey. 
Note that this change was also made in accordance with the TAG finding regarding Consistency of Formatting Property Names, Values, and Semantics. (2) Addition of the currentColor keyword, following SVG 1.0; (3) Optional <priority-index> has been dropped from opacity property and thus harmonized with SVG 1.0..."

  • [February 18, 2003] "Getting XML Into, and Out of, Quark XPress. Interactive Approach, Clean User Interface Distinguish Easypress RoundTrip. [Content Management.]" By George Alexander and Mark Walter. In The Seybold Report Volume 2, Number 21 (February 17, 2003). ISSN: 1533-9211. ['U.K. supplier Easypress has just introduced RoundTrip, which uses Quark's hidden-text feature to store XML structure within XPress documents. It lets you make (and preserve) changes in either environment, although there are a few limitations that future versions should address. Overall, we like it.'] "RoundTrip is designed for situations where there is already an XML source file (produced, perhaps, by writers working with an XML editor) and where one of the target outputs is Quark XPress. RoundTrip preserves the XML structure within XPress (using XPress' 'hidden text' feature to do so) so that the file can be modified in XPress and still be exported as valid XML. When XML files are brought into XPress, they can be parsed to be sure they fit the specified DTD. Placement in the layout can be done interactively (drag and drop) or by the use of templates. The operator can also work with a 'placeholder' document that has been created to match a specific DTD. This allows the use of repeating templates, such as might be used for a directory or catalog. RoundTrip uses styling rules to spell out how an XML file will be formatted within XPress. For each XML tag, the rules specify the paragraph and character styles to be used in XPress. They also permit arbitrary text to be placed before or after the element... The rules are themselves XML files, so they can be modified using an XML editor. Similarly, RoundTrip's 'preferences' file is an XML file. While working in XPress, operators can view the structure of an XML instance in the RoundTrip window at the same time that they see the placement of elements on the page. 
The XML structure cannot be changed within RoundTrip, but content edits made in XPress are preserved when the file is exported as XML. From the vantage of what an end user sees, RoundTrip is the most attractive, interactive solution we've seen to the problem of integrating bidirectional XML support into XPress. The ability to make (and preserve) changes in either environment is a key benefit, and we like the way Easypress presents the XML structure to the RoundTrip operator. The pricing may make it unaffordable for some small organizations, but we expect medium and large groups that want to shift to an XML-upfront workflow, yet still edit content within XPress, will find RoundTrip very appealing..." [Some limitations: works only with DTDs, not XML schemas...] See other details in the announcement: "Roundtrip 1.0 Ships for Mac and Windows. Now XML Can Be Beautiful -- Single Source XML Publishing is Practical for the First Time in QuarkXPress."
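The styling rules the review describes map each XML tag to XPress paragraph and character styles, and are themselves XML files. A rough sketch of that idea in Python; the rule vocabulary below is invented, since the article does not show RoundTrip's actual format:

```python
# Hypothetical styling-rules file in the spirit of the article: each rule
# maps an XML element to a layout style, with optional prefix text.
import xml.etree.ElementTree as ET

rules_xml = """
<styling-rules>
  <rule element="title" paragraph-style="Headline"/>
  <rule element="para" paragraph-style="BodyText" prefix="&#x2022; "/>
</styling-rules>"""

# Index the rules by the element name they govern:
rules = {r.get("element"): r.attrib for r in ET.fromstring(rules_xml)}
print(rules["title"]["paragraph-style"])  # prints Headline
```

Because the rules are plain XML, they can be edited with any XML editor, which is the design point the reviewers single out.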

  • [February 18, 2003] "XML Standard Set for Secure Web Services." By Michael Hardy. In Federal Computer Week (February 18, 2003). "The Organization for the Advancement of Structured Information Standards (OASIS) today announced that its interoperability consortium has approved the Extensible Access Control Markup Language (XACML) as an OASIS open standard. XACML, a variant of Extensible Markup Language, allows Web developers to enforce policies for information access over the Internet. Its adoption as an OASIS Open standard means that agencies can implement it with the confidence that it will become widely used. As an open standard, no single vendor owns it and all developers can use it. The standard is designed for use in authorizing which individuals should have access to information, said Carlisle Adams of Entrust Inc., co-chairman of the OASIS XACML Technical Committee, in a statement. Authorization procedures developed based on XACML can be applied to all products that support the standard, regardless of which vendor makes them, allowing for organizationwide uniform enforcement. Agencies are interested in such standards because the needs of homeland security, electronic government and other initiatives are pushing agencies to share information while keeping it secure, said Jim Flyzik, a consultant and the newly appointed chairman of the Information Technology Association of America's Homeland Security Task Group. 'It is something we've talked about for quite some time,' he said. 'There's always an interest in standardization, and XML is going to be a key technology for making systems interoperate'..." See: (1) "XACML 1.0 Specification Set Approved as an OASIS Standard"; (2) the announcement of 2003-02-18, "XACML Access Control Markup Language Ratified as OASIS Open Standard. Universal Language for Authorization Policy Enables Secure Web Services."; (3) general references in "Extensible Access Control Markup Language (XACML)."

  • [February 18, 2003] "Sun Hails XACML." By Paul Krill. In InfoWorld (February 18, 2003). "Sun Microsystems on Tuesday [announced] the release of an open-source implementation of XACML (eXtensible Access Control Markup Language) 1.0, billed as a standard mechanism for setting access controls in Web services and other applications... XACML is an OASIS specification for expressing policies in XML for information access over the Internet. Sun's implementation, developed within the company's Internet Security Research Group, is intended to enable use of the language in applications ranging from file servers to Web services and directories, according to Steve Hanna, senior staff engineer in Sun Labs, in Burlington, Mass. 'Anything that needs access control can adopt XACML as [its] access control policy language,' Hanna said. XACML is intended to replace proprietary access control mechanisms, Hanna said. The problem has been that every vendor has had its own custom way to specify access control, he said. 'That's a nightmare from an administrative standpoint,' said Hanna. Sun's XACML code can be integrated into products such as a file server or a Web services toolkit free of charge, he said. The company hopes to generate revenue off of XACML by including it in policy-driven computing initiatives such as Sun's N1 plan, Hanna said..." See details in the 2003-02-18 news story "Sun Microsystems Releases Open Source XACML Implementation for Access Control and Security." Related: (1) "XACML 1.0 Specification Set Approved as an OASIS Standard"; (2) the announcement of 2003-02-18, "XACML Access Control Markup Language Ratified as OASIS Open Standard. Universal Language for Authorization Policy Enables Secure Web Services."; (3) general references in "Extensible Access Control Markup Language (XACML)."

  • [February 17, 2003] "XMLCONF Configuration Protocol." Edited by Rob Enns (Juniper Networks). With contributions by Andy Bierman (Cisco Systems), Ken Crozier (Cisco Systems), Ted Goddard (Wind River), Eliot Lear (Cisco Systems), David Perkins (Riverstone Networks), Phil Shafer (Juniper Networks), Steve Waldbusser, and Margaret Wasserman (Wind River). IETF Network Working Group, Internet-Draft. Reference: 'draft-enns-xmlconf-spec-00'. February 12, 2003, expires August 13, 2003. 78 pages. "The XMLCONF protocol defines a simple mechanism through which a network device can be managed. Configuration data, state information, and system notifications can be retrieved. New configuration data can be uploaded and manipulated. The protocol allows the device to expose a full, formal, application programming interface (API). Applications can use this simple interface to send and receive full and partial configuration data sets. This mechanism uses a remote procedure call (RPC) paradigm to define a formal API for the device. A client encodes an RPC in XML and sends it to a server using a secure, connection-oriented session. The server responds with a reply encoded in XML. The contents of both the request and the response are fully described in XML DTDs and/or XML Schemas, allowing both parties to be cognizant of the syntax constraints imposed on the exchange. One of the key aspects of XMLCONF is an attempt to allow the functionality of the API to closely mirror the native functionality of the device. This will reduce implementation costs and allow timely access to new features. In addition, applications will be able to access the syntactic and semantic content of the device's native user interface. XMLCONF allows a client to discover the set of protocol extensions supported by the server. These "capabilities" permit the client to adjust its behavior to take advantage of the features exposed by the device. 
The capability definitions can be easily extended in a non-centralized manner. Standard and vendor-specific capabilities can be defined with semantic and syntactic rigor. The XMLCONF protocol is a building block in a system of automated configuration. XML is the lingua franca of interchange, providing a flexible but fully specified encoding mechanism for hierarchical content. XMLCONF can be used in concert with XML-based transformation technologies like XSLT to provide a system for automated generation of full and partial configurations. The system can query one or more databases for data on networking topologies, links, policies, customers, and services. This data can be transformed from a vendor independent data schema into a form that is specific to the vendor, product, operating system, and software release using one or more XSLT scripts. The resulting data can be passed to the device using the XMLCONF protocol..." [cache]
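The RPC exchange the draft describes is simple to picture: the client encodes an operation as XML, and the server returns an XML reply carrying a matching message ID. A sketch using Python's standard ElementTree; the element names are illustrative, not taken from the draft's DTDs:

```python
# Sketch of the XMLCONF RPC pattern. Element and attribute names here are
# invented for illustration; the draft defines the real request/reply schema.
import xml.etree.ElementTree as ET

def make_rpc(operation, message_id):
    """Encode a device operation as an XML RPC request."""
    rpc = ET.Element("rpc", {"message-id": str(message_id)})
    ET.SubElement(rpc, operation)
    return ET.tostring(rpc, encoding="unicode")

request = make_rpc("get-config", 101)
print(request)  # prints <rpc message-id="101"><get-config /></rpc>

# A hypothetical server reply carrying the requested configuration data:
reply = '<rpc-reply message-id="101"><config><hostname>r1</hostname></config></rpc-reply>'
doc = ET.fromstring(reply)
print(doc.get("message-id"), doc.findtext("config/hostname"))  # prints 101 r1
```

Because both sides of the exchange are schema-described XML, the reply can be fed straight into XSLT-based transformation pipelines of the kind the draft envisions.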

  • [February 17, 2003] "Web Services Reliability Specification Moves to OASIS. Oracle, Sun, Others Back Effort." By Paul Krill. In InfoWorld (February 14, 2003). "The Web Services Reliability specification, announced by Sun Microsystems, Oracle, and Fujitsu in January, is being placed under the jurisdiction of OASIS, which will consider the proposal for promotion as an industry standard, according to Oracle and Sun officials on Friday. The specification is to be considered by the OASIS Web Services Reliable Messaging Technical Committee, said Jeff Mischansky, director of Web services standards at Oracle, in Redwood Shores, Calif. Submitting the proposal to OASIS for standardization 'seemed like the quickest way to get the work started,' he said. A final version of the specification is expected to be completed by September or October, Mischansky said. The WS-Reliability specification is intended to help accelerate adoption of Web services through linking of applications via standard interfaces. The specification features extensions of SOAP intended to guarantee Web services delivery, eliminate message duplication, and provide for message ordering. The specification is royalty-free, meaning no vendor can collect fees for use of its technology in the proposal..." See details in "OASIS Members Form Technical Committee for Web Services Reliable Messaging." General references in "Reliable Messaging."
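Two of the three SOAP-header features the specification promises, duplicate elimination and message ordering, can be modeled as a receiver that tracks sequence numbers. A hedged sketch, with invented field names rather than the spec's actual headers:

```python
# Sketch of reliable-messaging receiver behavior: eliminate duplicates and
# release messages to the application strictly in order. The sequence-number
# scheme here is illustrative, not the WS-Reliability header layout.

class ReliableReceiver:
    def __init__(self):
        self.delivered = set()   # sequence numbers already handed to the app
        self.buffer = {}         # out-of-order messages awaiting delivery
        self.next_expected = 1

    def receive(self, seq, payload):
        """Return the payloads released, in order, by this message."""
        if seq in self.delivered or seq in self.buffer:
            return []            # duplicate: drop silently
        self.buffer[seq] = payload
        released = []
        while self.next_expected in self.buffer:
            released.append(self.buffer.pop(self.next_expected))
            self.delivered.add(self.next_expected)
            self.next_expected += 1
        return released

r = ReliableReceiver()
print(r.receive(2, "b"))   # held: message 1 has not arrived yet -> []
print(r.receive(1, "a"))   # releases 1 and the buffered 2 -> ['a', 'b']
print(r.receive(1, "a"))   # duplicate, eliminated -> []
```

Guaranteed delivery, the third feature, would sit on the sending side as acknowledgements plus retransmission, which this receiver's duplicate elimination makes safe.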

  • [February 17, 2003] "E-learning Hits Web Services Books. E-learning Standards Take Shape." By [InfoWorld Staff]. In InfoWorld (February 15, 2003). "As e-learning platforms and content evolve toward open standards, the capability to surface learning seamlessly within the context of enterprise applications and business processes is almost within reach. This idea of just-in-time, contextual learning is driven by pure-play e-learning technology vendors, such as Saba, Docent, and Plateau Systems, as well as platform vendors IBM and Oracle. IBM last month unveiled a new LMS (learning management system) consisting of e-learning components designed to connect to other applications and learning products. The J2EE-based IBM Lotus Learning Management System runs on WebSphere and DB2 and supports Web services standards SOAP and XML. The intent is to leverage Web services to embed e-learning functionality into business applications such as CRM and ERP, according to Andy Sadler, director of e-learning at IBM Lotus Software in Cambridge, Mass. Integration flexibility is gathering steam as most e-learning technology vendors move away from proprietary systems toward open standards, either by architecting new systems from scratch or adding support for J2EE or Microsoft .Net... With the foundation of an open architecture in place, the door is opening to Web services -- and the related capability to surface e-learning as events within other applications. Oracle recently introduced a Web services version of its Oracle iLearning platform. Version 4.2 leverages Web services for integration and bolsters analytics capabilities between systems. A forthcoming update to the product due mid-year will increase linkages between the LMS and the Oracle E-Business Suite, specifically weaving learning management into customer service and project management applications, according to Bill Dwight, vice president of learning management applications at Oracle, in Redwood Shores, CA... 
In addition to platform standards, e-learning content standards are also being hashed out. SCORM (Sharable Content Object Reference Model), first developed by the Department of Defense, is a set of specifications that aim to enable interoperability and reusability of Web-based e-learning content. A draft of SCORM Version 1.3 is under development. Oracle is spearheading an effort to standardize Web services for e-learning. The IMS Global Learning Consortium aims to provide standard Web service APIs to learning management functions, according to Oracle's Dwight. The charter for this new standards effort will be up for ratification vote later this month..." See: "Shareable Content Object Reference Model Initiative (SCORM)." See general references in "XML and Educational Technologies."

  • [February 17, 2003] "Business Process Integration with IBM CrossWorlds. Part 3: Automatically Externalize Web Services with WebSphere Business Connection." By Rob Cutlip (Software and Solutions Architect, IBM). From IBM developerWorks, IBM developer solutions. January 2003. ['In this third and final installment in the series, Rob Cutlip describes how to enable Web services for the IBM CrossWorlds environment by using WebSphere Business Connection offerings, which allow business-to-business process integration and data sharing among trading partners. This article extends the Web services architecture described in Part 2 of the series, which in turn built upon the IBM CrossWorlds Business process integration infrastructure presented in Part 1. WebSphere Business Connection offerings provide scalable, B2B gateway function, and they exploit cutting-edge technologies, including WSIF, AXIS, and HTTPR; they support 'reliable messaging' over the Internet, which is the delivery of messages in their exact form to a destination, once and only once. Messaging agents, persistent storage, and HTTPR are provided to support reliable messaging.'] "WebSphere Business Connection offerings let an organization execute complex business process flows and connect to trading partners using a variety of standard protocols, including RosettaNet, EDI, EDI-INT (AS1/AS2), and ebXML, as well as Web services. Companies can start with the simple, low-cost Web services connection capabilities provided by the entry level Business Connection Express, and scale to support additional partners and more complex business processes using the WebSphere Business Connection standard and enterprise editions. Beyond the Web services gateway functions provided by all of the WebSphere Business Connection offerings, there are many mission-critical functions, including trading partner management and 'large file transfer' document exchange with guaranteed delivery using HTTPR. 
HTTPR is an extension of HTTP that supports reliable message exchange. The WebSphere Business Connection offerings are part of a larger set of business process integration offerings that comprise the WebSphere Business Integration (WBI) family..." See details in "Reliable HTTP (HTTPR)." General references in "Reliable Messaging."

  • [February 17, 2003] "SOAP Extension Soup." By Roger Jennings. In XML & Web Services Magazine (January 31, 2003). ['The race to establish new WS-Whatever standards and intellectual property issues threaten Web services interoperability.'] "Too many cooks spoil the broth is the aphorism that best describes the current state of proposed Web service extension 'standards.' Rival vendor groups compete for trade press coverage and Web service developer mind share. IBM, Microsoft, and their allies hold the high ground with thirteen (13) published specifications, while Sun, Oracle, and their contingents wage a guerilla battle with two proposed SOAP-related specs. W3C working groups and OASIS technical committees contend for the preeminent Web service standards body role. The Web Services Interoperability (WS-I) Organization intends to resolve conflicting standards implementations. Unresolved intellectual property (IP) issues pose the threat of multiple standards for related extensions and interoperability breakdown. This article examines current Web service extension proliferation issues from a .NET developer's standpoint... Latecomers to the Web services standards table hope to overcome the IBM-Microsoft SOAP extensions hegemony by publishing specs to fulfill real or imagined Web services needs. As an example, Fujitsu, Hitachi, NEC, Oracle, Sonic Software, and Sun released what appears to me to be a very early and incomplete working draft of their proposed Web Services Reliability (WS-Reliability) standard on January 9, 2003. WS-Reliability -- based on OASIS's ebXML Message Service (ebXMS) standard -- doesn't deal with Quality of Service (QoS) issues, so a more accurate title would be 'WS-Reliable Messaging.' WS-Reliability defines SOAP header extensions for guaranteed delivery, duplicate elimination, and message ordering in a non-routed (endpoint-to-endpoint) environment... 
Clemens Vasters and Werner Vogels have analyzed the many deficiencies and ambiguities of the WS-Reliability proposal. Gartner's January 13, 2003 note on WS-Reliability deals primarily with the politics of current Web services extension proposals. IBM's Bob Sutor damned WS-Reliability with faint praise in a statement: 'We hope that the WS-Reliability spec represents an attempt to further converge ebXML into the increasingly dominant industry web services effort.' My conclusion is that WS-Reliability's authors, especially Sun and Oracle, opted for press coverage of a premature 'standard' to combat the December 19, 2002 barrage from Microsoft and IBM. Perhaps, as Werner Vogels suggests, IBM and Microsoft can get together and integrate reliable messaging features within a variation on the current WS-Routing theme. Microsoft's disdain for ebXML and IBM's for WS-Routing are likely to be the primary impediments to a joint WS-Routing/Reliability proposal. In the meantime, WS-Routing remains a .NET-only technology. Microsoft's initial WS-Routing/WS-Referral implementation in the Web Services Enhancement (WSE) 1.0 update to the Web Services Development Kit (WSDK) might deliver the incentive for a cooperative standards effort..." On WS-Reliability work at OASIS, see OASIS Web Services Reliable Messaging TC (WS-RM). General references in "Reliable Messaging."

  • [February 17, 2003] "Get Small with Wireless Messaging and Mobile Media. Two New J2ME APIs Intend to Deliver on The Promise of Easier Software Development for Small Mobile Devices." By Daniel F. Savarese. In JavaPro Magazine (March 2003). "All the pieces of J2ME have yet to come together, but some of its latest developments promise to simplify wireless application development. Let's examine two of the latest J2ME APIs of interest to wireless Java developers: the Wireless Messaging API (WMA) and the Mobile Media API (MMA)... An attempt is being made through the Java Community Process (JCP) to clarify the relationship between optional packages, configurations, and profiles and how they relate to different classes of devices and market segments. In the area of wireless devices, the JSR-185 ('Java Technology for the Wireless Industry') expert group was formed in May 2002 to develop a clear architecture for wireless J2ME environments. Another expert group, JSR-68 ('J2ME Platform Specification'), is addressing a general problem that has emerged with J2ME APIs. J2ME Profiles often duplicate functionality already present in Java 2 Platform, Standard Edition (J2SE). Rather than reinvent this functionality, it is better to use existing J2SE APIs. As a result of JSR-68, the next version of the J2ME platform will include the concept of a 'J2ME Building Block.' Building blocks define APIs derived from J2SE or Java 2 Platform, Enterprise Edition (J2EE) for use in J2ME and make them available to be incorporated into J2ME Profiles in a reusable fashion. JSR-68 should keep J2ME from becoming an indecipherable mess of overlapping, but incompatible, APIs. We'll know more when the expert group publishes a public review draft of their specification. ... The benefits provided by the Wireless Messaging API and the Mobile Media API lie in the simplicity of their abstractions. 
If the JCP can deliver on JSR-185 and JSR-68 to impose more structure on the many elements of J2ME before things get out of hand, writing and maintaining software for a wide range of small mobile devices will be a breeze... "

  • [February 14, 2003] "Exposing Information Resources for E-Learning. Harvesting and Searching IMS Metadata Using the OAI Protocol for Metadata Harvesting and Z39.50." By Andy Powell (UKOLN, University of Bath) and Steven Richardson (UMIST). In Ariadne Issue 34 (January 14, 2003). "IMS is a global consortium that develops open specifications to support the delivery of e-learning through Learning Management Systems (LMS) -- or 'Virtual Learning Environment (VLE)' as used in the UK. IMS activities cover a broad range of areas including accessibility, competency definitions, content packaging, digital repositories, integration with 'enterprise' systems, learner information, metadata, question & test and simple sequencing. Of particular relevance to this article is the work of the IMS Digital Repositories Working Group (DRWG). The DRWG is working to define a set of interfaces to repositories (databases) of learning objects and/or information resources in order to support resource discovery from within an LMS. In particular, the specifications currently define mechanisms that support distributed searching of remote repositories, harvesting metadata from repositories, depositing content with repositories and delivery of content from the repository to remote systems. Future versions of the specifications will also consider alerting mechanisms, for discovering new resources that have been added to repositories. Note that, at the time of writing, the DRWG specifications are in draft form. Two broad classes of repository are considered: (1) Native learning object repositories containing learning objects; (2) Information repositories containing information resources (documents, images, videos, sounds, datasets, etc.). In the former, it is assumed that, typically, the learning objects are described using the IMS metadata specification and packaged using the IMS content packaging specification. 
The latter includes many existing sources of information including library OPACs, bibliographic databases and museum catalogues where metadata schemas other than IMS are in use. In both cases it is assumed that the repository may hold both assets and metadata, or metadata only. Both the example implementations described below fall into the second category of repository. The DRWG specifications describe the use of XQuery over SOAP to query 'native' repositories of learning objects. This usage is not discussed any further in this article. The specifications also describe how to search and harvest IMS metadata from 'information' repositories using the OAI Protocol for Metadata Harvesting (OAI-PMH) and Z39.50..." See: (1) "IMS Metadata Specification"; (2) "Open Archives Metadata Set (OAMS)."

  • [February 14, 2003] "XML Database Indexes Unstructured Data." By Lisa Vaas. In eWEEK (February 10, 2003). "XML database maker Ixiasoft Inc. is coming out with a new version of its TextML Server, an XML content server whose purpose is to store, index and retrieve XML content... TextML Server 2.3, to be released this month, adds easier access to XML Namespaces, which serve as indexes to XML indexes. Essentially, the support provides a simple method of qualifying element and attribute names used in XML documents by associating them with namespaces identified by URL references. The update adds support for WebDAV (Web-based Distributed Authoring and Versioning), a set of extensions to the HTTP protocol that facilitates exchange of documents over the Web and that allows for collaborative authoring. The upgrade brings support for Adobe Systems Inc.'s XMP (Extensible Metadata Platform), which enables XML metadata to be embedded within otherwise unsearchable application files, such as PDFs, scanned images or handwritten forms. All this XML is good for the U.S. Air Force, which recently extended its license of TextML Server, thereby relieving some members of the armed forces of the need to lug around maintenance manuals weighing hundreds of pounds. Such manuals are the kind of unstructured data that is more easily searched in an XML database than in standard relational databases, said Ixiasoft officials, in Montréal..."
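The namespace support described here is what lets an indexer keep same-named elements from different vocabularies apart: each name is qualified by the namespace it is associated with. A small illustration with Python's ElementTree (the namespace URIs are made up):

```python
# Two vocabularies use the same element name ("part") without colliding,
# because each is qualified by its own namespace URI.
import xml.etree.ElementTree as ET

doc = ET.fromstring(
    '<manual xmlns:af="urn:example:airframe" xmlns:av="urn:example:avionics">'
    '<af:part>wing bolt</af:part><av:part>radar unit</av:part></manual>'
)
# ElementTree expands each prefix to its {uri}localname form:
print([child.tag for child in doc])
```

An index keyed on the expanded names can then answer "find avionics parts" without confusing them with airframe parts, which is the kind of query precision the upgrade targets.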

  • [February 14, 2003] "Cost Would Mute the Effect of XML Standards for Emergency Workers." By Rich Mogull (Gartner Research). Gartner FirstTake. 12-February-2003. Reference: FT-19-4330. 2 pages. Comments on the OASIS Emergency Management Technical Committee. "Although establishing XML standards would facilitate the exchange of information between many disparate agencies, the wide scope of the effort probably will make it impractical. Even if the committee succeeds in establishing standards, it will address only part of the problem. The real hindrance to communication between government agencies is economic pressure. Cash-strapped local governments run on very limited IT budgets. The financial crisis among state governments has exacerbated the problem. Federal homeland security funding may spur some interoperability projects, but large IT projects at the local level will be limited. For now, operations have become more of an immediate problem than interoperability. In addition, complete interoperability is somewhat unworkable. Emergency systems do not interoperate for a number of reasons -- primarily, limited network connectivity. Such systems frequently carry information of a sensitive or private nature, and therefore must run on secure private networks. These networks should not depend on, or even frequently connect with, the Internet. If they did, they could face malicious cyberattacks or inadvertent risk through the negligence of users. With vendors struggling with the same tight IT market, they may use the ideal of interoperability as a way to penetrate the emergency services market. Many hope to tap into well-funded homeland security initiatives. However, the OASIS initiative will not afford them any short-term gains and may have only a limited, long-term impact. Enterprises and vendors should monitor the Emergency Management Technical Committee's efforts and evaluate the specifications that it develops -- if any. 
Homeland security technology managers working on interoperability projects should not delay work while waiting for the development of specifications..." [HTML version]

  • [February 13, 2003] "Content Assembly Mechanism (CAM) Specifications and Description Document." Edited by David Webber. Draft Version 1.0. Produced by the OASIS Content Assembly Mechanism Technical Committee. 69 pages. The source ZIP archive provides a draft XML DTD and XML Schema, as well as a sample instance. From the Introduction: "Content assembly has been solved in a variety of ways in the past. In particular, the traditional electronic data interchange (EDI) approach is to rigorously restrict content variance so as to avoid the need for dynamic definitions in software. This proved to be both the strength and weakness of EDI, and therefore for specific business scenarios EDI itself resorted to the use of written implementation guidelines to formalize the interchange details. With the advent of XML-based transaction content, implementers have learned that, while constructing schema structure definitions provides a higher degree of flexibility for business scenarios than EDI, the same limitations on interoperability recur while there is no robust means to specify business scenario details for actual schema use. OASIS itself has found that its technical teams developing industry vocabularies cannot fully derive the needed depth of detail on the use of such vocabularies while making use of schema alone. Furthermore, the notion of producing business re-usable information components (BRICs) both within and across OASIS industry vocabularies has been problematic in XML (especially without robust inclusion and versioning mechanisms). 
Clearly the urgent business need is to move beyond this and provide a machine-readable format in XML that can then allow business application software to automatically configure the interchanges according to the business rules. The core role of the OASIS CAM specifications is therefore to provide a generic standalone content assembly mechanism that extends beyond the basic structural definition features in XML and schema to provide a comprehensive system with which to define dynamic e-business interoperability. In addition, the CAM specifications are providing support and collaboration tools to existing OASIS technical work by linking together key components of the overall e-business systems architecture. In the context of e-business collaboration the problem fundamentally stems from the need for each partner to be able to both quickly adopt and start using standard industry building blocks and interchanges, while at the same time being able to overlay onto this their own local business context and special needs (such as product specific information or country or locale specific information)..."
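The assembly idea at the heart of CAM -- a standard structure plus machine-readable context rules that tailor it to a locale or partner -- can be sketched very roughly as follows; the rule format is invented for illustration and is not the CAM vocabulary:

```python
# Hedged sketch of context-driven content assembly: a generic interchange
# template, trimmed per business context by machine-readable rules.

def assemble(template, rules, context):
    """Keep each optional field only if its rule admits this context."""
    return {field: value for field, value in template.items()
            if field not in rules or context in rules[field]}

template = {"product": "widget", "vat-number": "GB123", "state-tax": "0.00"}
rules = {"vat-number": {"UK", "EU"}, "state-tax": {"US"}}

print(assemble(template, rules, "UK"))  # keeps vat-number, drops state-tax
print(assemble(template, rules, "US"))  # keeps state-tax, drops vat-number
```

CAM's contribution is to make rules like these a standard, exchangeable XML artifact rather than logic buried in each partner's application code.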

  • [February 12, 2003] "XML at Five." By Edd Dumbill. From XML.com (February 12, 2003). "The XML 1.0 Recommendation turned five years old this Monday, 10 February. Since February 1998 XML has grown and developed, vastly exceeding all initial expectations. During those five years, XML.com has followed the XML world tirelessly, publishing nearly a thousand articles and becoming a forum for the heart of the XML community itself. To celebrate this auspicious anniversary, I asked some XML old-hands and friends of XML.com to comment on their experience with XML over the last five years... As XML marches on, what would you most like to see? Ken Holman wants us to be one happy family, wishing for 'community harmony.' Michael Sperberg-McQueen also sees this as important. 'I'd like the community of users to stay together and avoid forking. That means the minimum defined standard has to be powerful enough to support what most people need to get done, even if it means having some things there that seem unnecessary for some specialized applications. It also means those with more stringent demands being willing for some functionality to be supplied by standardized applications, rather than by the core language. For a language intended for open data exchange and interoperability, it is more important that there be one language with as few optional features as possible than that the details of the language turn out in any particular way.' Rick Jelliffe had some markup-oriented wishes to share. 'I'd also like to see so-called microparsing capabilities increase, so that structures can be written using their natural notation where they have one. W3C XSLT2 and ISO DSDL look like they will improve life for this. Oh, and I would like a thorough review of XML for robustness and reliability, and soon. The idea that the class of parser should determine the information to be presented to an application is incoherent. The question should not be 'how do we simplify XML?' 
but 'how do we make it more robust and reliable?' Ends before means; Viagra not circumcision!' Henry Thompson would like to give XML users better access to their data, wanting to see 'declarative data-binding, so that the choice between manipulating XML as XML and working directly with the data it (sometimes) encodes is not constrained by irrelevant overheads.' Ever optimistic, Simon St. Laurent wishes for a period of calm reflection. 'I'd actually like to see a period of quiet -- less development -- on the specification side, and more effort put into figuring out what it is we already have. Computing isn't exactly a contemplative field, but too many people seem to be racing forward painting 'XML' on as much as possible without even wondering what that might mean'..." See similarly: (1) the reflections in W3C's "Happy Birthday, XML!", (2) Jon Bosak's "Happy Birthday, XML!" Message, and (3) a local collection of retrospects, "Happy Birthday XML."

  • [February 12, 2003] "Building Metadata Applications with RDF." By Bob DuCharme. From XML.com (February 12, 2003). ['The Python RDFLib library gives Bob DuCharme a "lightbulb moment" with RDF.'] "The real test of any technology's value is what kinds of tasks are easier with it than without it. If I hear about some new technology, I'm not going to learn it and use it unless it saves me some trouble. Well, being a bit of a geek, I might play with it a bit, but I'm going to lose interest if I don't eventually see tangible proof that it either makes new things possible or old things easier. I've played with RDF for a while and found some parts interesting and other parts a mess. During this time, I continued to wonder what tasks would be easier with RDF than without it. I came across some answers (for example, various xml-dev postings or some documentation on the Mozilla project), but they usually addressed the issue in fairly abstract terms. The first time I tried the RDFLib Python libraries, the lightbulb finally flashed on. RDFLib lets you generate, store, and query RDF triples without requiring you to ever deal directly with the dreaded RDF/XML syntax. And you can do all this with a minimal knowledge of Python... RDF's ability to assign attribute name-value pairs to anything that can be identified with a URI always appealed to me, but the mechanics of generating, reading, and using the subject-predicate-object triples seemed like too much trouble. Now that I've found a tool that makes it easy, I have a new perspective on RDF. And RDFLib isn't the only tool that lets you do this. A variety of tools are available for processing RDF using different languages: The Repat C library, Hewlett-Packard's Jena and David Megginson's RDF Filter for Java, and 4Suite, in addition to RDFLib, for Python. And many people don't know about the exposed RDF support built right into Mozilla! 
Conversations about the value of RDF often veer off on two distracting detours: debates about the architecture and syntax of RDF/XML and debates about the potential value of the Semantic Web. If you ignore the latter and let tools like RDFLib shield you from the less appealing details of the former, RDF's value becomes much more readily apparent, and its increasing success in the metadata community starts to make more sense. I look forward to more work with it..." See: (1) RDF at W3C; (2) local references in "Resource Description Framework (RDF)."
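
DuCharme's subject-predicate-object triples can be made concrete without any RDF toolkit at all. The sketch below is a toy Python triple store -- invented names throughout, not the RDFLib API -- showing how triples attach name-value pairs to a URI-identified resource, and how wildcard queries retrieve them.

```python
# Toy in-memory triple store illustrating the subject-predicate-object model.
# This is a pedagogical sketch, not the RDFLib API; all names are invented.

def add(store, s, p, o):
    """Record one (subject, predicate, object) triple."""
    store.add((s, p, o))

def query(store, s=None, p=None, o=None):
    """Return triples matching a pattern; None acts as a wildcard."""
    return [t for t in store
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

store = set()
add(store, "http://example.org/doc", "dc:creator", "Bob DuCharme")
add(store, "http://example.org/doc", "dc:title",
    "Building Metadata Applications with RDF")

# Everything known about the resource, regardless of predicate:
for triple in query(store, s="http://example.org/doc"):
    print(triple)
```

A real library such as RDFLib layers parsing, serialization, and persistent storage over roughly this pattern, which is why no RDF/XML ever needs to be touched by hand.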

  • [February 12, 2003] "Simple XML Processing With elementtree." By Uche Ogbuji. From XML.com (February 12, 2003). ['Uche Ogbuji introduces elementtree, a pythonic way of processing XML.'] "Fredrik Lundh, well known in Python circles as 'the effbot', has been an important contributor to Python and to PyXML. He has also developed a variety of useful tools, many of which involve Python and XML. One of these is elementtree, a collection of lightweight utilities for XML processing. elementtree is centered around a data structure for representing XML. As its name implies, this data structure is a hierarchy of objects, each of which represents an XML element. The focus is squarely on elements: there is no zoo of node types. Element objects themselves act as Python dictionaries of the XML attributes and Python lists of the element children. Text content is represented as simple data members on element instances. elementtree is about as pythonic as it gets, offering a fresh perspective on Python-XML processing, especially after the DOM explorations of my previous columns... elementtree is very easy to set up. I downloaded version 1.1b3 and you can always find the latest version on the effbot download page. You need Python 2.1 or newer; I used 2.2.1... elementtree is fast, pythonic and very simple to use. It is very handy when all you want to do is get in, do some rapid and simple XML processing, and get out. It also includes some handy tools for HTML processing. The module elementtree.TidyTools provides a wrapper for the popular HTML Tidy utility, which, among other things, can take all sorts of poorly structured HTML and convert it into valid XHTML. This makes possible the elementtree.TidyXMLTreeBuilder module, which can parse HTML and return an elementtree instance of the resulting XHTML..." See also: (1) Python and XML, by Christopher A. Jones, Fred L. Drake, Jr.; (2) general references in "XML and Python."
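
elementtree was later folded into Python's standard library as xml.etree.ElementTree, so the style Ogbuji describes can be sketched today with no separate install. The catalog document below is invented for illustration.

```python
# elementtree later entered the standard library under this name:
import xml.etree.ElementTree as ET

doc = ET.fromstring("""
<catalog>
  <book id="b1"><title>Python and XML</title></book>
  <book id="b2"><title>XML in a Nutshell</title></book>
</catalog>
""")

# An Element acts as a Python list of its child elements...
for book in doc:
    # ...and as a dictionary of its attributes; text is a plain member.
    print(book.get("id"), book.find("title").text)
```

Note the absence of any "zoo of node types": iteration yields only child elements, and attributes and text are reached through ordinary Python idioms.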

  • [February 12, 2003] "Is There a Consensus Web Services Stack?" By Kendall Grant Clark. From XML.com (February 12, 2003). ['Kendall Clark examines recent debate as to whether the "web services stack" is a thing of fact or fiction, and also muses on the latest news in relation to web services patents.'] "The web services 'stack' is a semimythical creature, composed in equal parts of technical specification, running code, and marketing campaign. Since the global IT budgetary funk is unlikely to brighten during 2003 (or even 2004), the proliferation of web services specifications and technologies is as much wishful thinking as it is anything else. It is fairly uncontroversial that the core of the web services stack rests on the Internet infrastructure (TCP/IP, HTTP, and, arguably, URIs), followed by the XML layer, the SOAP layer, and the WSDL layer. Even though the composition of this segment of the web services stack reflects a rough industry consensus, that doesn't mean it necessarily works. Several people have reported interoperability problems in the SOAP-WSDL layers. As Don Box said, 'many interop problems are a result of various stacks...trying to mask the existence of XML altogether. That combined with a WSDL specification (1.1) that invented far more than it needed to has made life harder than would otherwise be necessary.' In short, even with a rough consensus, interoperation can be messy across different implementations of the web services stack; this is the case, in part, because people still aren't sure whether XML is a data model, just a syntax, or both. And so various groups try to bury XML beneath WSDL layers of, as Don Box said, 'goop'. After that, things get even trickier. Where does UDDI go? And, more to the point, what are its chances of success? Does the world really need UDDI -- in Don Box's pithy description, an 'RDF-esque version of LDAP over SOAP'? As Paul Brown put it, ''free love' on the business internet is a long way off'. 
Microsoft, for one, has a rather wild and woolly welter of high-level elements in the web services stack: coordination, inspection, referral, routing, policy, transaction. What about the various, competing service coordination specifications: BPEL, BTP, WSCI, and on it goes. Again, Paul Brown said it best, 'The answer to the question 'Is there a royalty-free specification of web services choreography that legitimately combines and acknowledges the established body of knowledge on long-running transaction models, business process semantics, and cross-domain implementation realities?' is 'No.'..."

  • [February 12, 2003] "XML-Based Programming Systems." By Gregory V. Wilson (Baltimore Technologies). In Dr. Dobb's Journal #346 Volume 28, Issue 3 (March 2003), pages 16-24. Special Issue on XML Development, edited by Jonathan Erickson. ['Will mixing XML and source code revolutionize programming in the coming years?'] "JSP, Ant, and JavaDoc share two basic weaknesses: they are hard to read, and there is a representation gap between what you type in and what you have to debug... Sooner or later, programmers will solve both problems by abandoning flat text and storing programs as XML documents... Do you really believe that ours [the programmers'] documents will be the only documents that aren't marked up and can't be manipulated using generic tools? ..." With program listings. See also "XML Markup Languages for User Interface Definition."

  • [February 12, 2003] "XML Data Binding." By Eldon Metz and Allen Brookes. In Dr. Dobb's Journal #346 Volume 28, Issue 3 (March 2003), pages 26-36. Special Issue on XML Development, edited by Jonathan Erickson. ['XML data binding utilities dramatically simplify the task of writing XML-enabled applications by automatically creating a data binding for you.'] "An XML binding is programming language code that represents XML data, thereby ensuring that the documents conform to their schema. The generated code enables the transfer of XML data to/from instances of the generated classes. While XML data-binding tools may not always be useful when writing code to process XML, they usually do save time in coding, testing, and maintenance..." See also Ronald Bourret's "XML Data Binding Resources" and the program listings. General references in "XML and Databases."
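
Generation aside, the binding idea itself -- a class per element type, with marshalling code keeping instances and XML in sync -- can be sketched by hand in a few lines. The element and class names below are illustrative; a real binding tool would generate equivalent code from a schema.

```python
# Hand-rolled sketch of what a data-binding tool generates: a typed class
# mirroring an element, plus unmarshal (from_xml) and marshal (to_xml) code.
import xml.etree.ElementTree as ET
from dataclasses import dataclass

@dataclass
class Account:
    number: str
    currency: str
    balance: float          # typed: the binding converts text to float

    @classmethod
    def from_xml(cls, text):
        el = ET.fromstring(text)
        return cls(number=el.findtext("number"),
                   currency=el.findtext("currency"),
                   balance=float(el.findtext("balance")))

    def to_xml(self):
        el = ET.Element("account")
        for tag, value in (("number", self.number),
                           ("currency", self.currency),
                           ("balance", str(self.balance))):
            ET.SubElement(el, tag).text = value
        return ET.tostring(el, encoding="unicode")

acct = Account.from_xml(
    "<account><number>12345</number><currency>USD</currency>"
    "<balance>99.5</balance></account>")
print(acct.balance)
```

The application code then works with ordinary typed objects; the XML round trip stays hidden inside the generated class.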

  • [February 12, 2003] "SVG and Smart Maps." By Keith Bugg (Knoxville Utilities Board Geographic Information System - KGIS). In Dr. Dobb's Journal #346 Volume 28, Issue 3 (March 2003), pages 38-41. Special Issue on XML Development, edited by Jonathan Erickson. ['Scalable Vector Graphics is a plain-text format that can make graphics look "flashy." Keith shows how you can use SVG to create smart maps that can be dynamically updated, animated, and more.'] "In this article I examine how SVG can be used to create 'smart maps' that, unlike static JPEG images, can be dynamically updated, animated, and even searched... Wouldn't it be convenient to know about road delays that were due to construction, accidents, inclement weather, and the like [projected onto a map while planning a trip or driving...] In many cases, the data necessary for creating SVG maps is readily available. For instance, raw geographic data is available at [USGS Geographic Data Download]... You can also work with aerial photographs..." See also the listings and source code. General references in "W3C Scalable Vector Graphics (SVG)."
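
Because SVG is plain XML text, a map feature can carry a searchable label and an in-place animation, which is exactly what a static JPEG cannot do. Below is a minimal sketch generated with Python's standard xml.etree module; the road name, coordinates, and styling are invented for illustration.

```python
# Build a tiny SVG "smart map" layer: a road segment with a searchable
# text label and a blinking animation flagging a construction delay.
import xml.etree.ElementTree as ET

svg = ET.Element("svg", xmlns="http://www.w3.org/2000/svg",
                 width="200", height="100")
road = ET.SubElement(svg, "polyline",
                     points="10,80 80,40 190,30",
                     stroke="red", fill="none")
label = ET.SubElement(svg, "text", x="80", y="35")
label.text = "I-40: construction delay"   # plain text, so it is searchable
# Blink the road segment to draw attention to the delay:
ET.SubElement(road, "animate", attributeName="stroke-opacity",
              values="1;0.2;1", dur="1s", repeatCount="indefinite")
print(ET.tostring(svg, encoding="unicode"))
```

Updating the map dynamically is then a matter of regenerating or editing this text, rather than re-rendering a raster image.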

  • [February 12, 2003] "J2ME and Embedded Systems." By William Wright (BBN Technologies). In Dr. Dobb's Journal #346 Volume 28, Issue 3 (March 2003), pages 54-58. Special Issue on XML Development, edited by Jonathan Erickson. ['William uses J2ME with aJile Systems' aJ-100 processor to build a network security device.'] "The Java 2 Micro Edition (J2ME) is a flexible tool for development of environments where resources don't support the full Java 2 Standard Edition. In this article I use J2ME, some common security sensors, and a small single board computer that runs J2ME to implement a network appliance that monitors and reports on activity in an area..." See also listings and source code.

  • [February 12, 2003] "WS-I Sets Basic Profile for Interoperable Web Services." By Vance McCarthy and Rob Cheng. In Integration Developer News (February 10, 2003). "The Web Services Interoperability Organization's 48-page Basic Profile draft document, edited by WS-I members from Microsoft, IBM, BEA Systems, and webMethods, is designed to present an exhaustive list of implementation areas. Specifically, the WS-I Basic Profile aims to spell out a set of open (non-proprietary) web services specifications for how developers and vendors should implement SOAP, WSDL and other key first-generation web services technologies to ensure interoperability between different technologies (such as Java and .Net, Java and Windows). The Basic Profile's implementation guidelines recommend how SOAP 1.1, WSDL 1.1, UDDI 2.0, XML 1.0 and XML Schema should be used together to develop interoperable Web services, drilling down into the following implementation areas: (1) Messaging -- The exchange of key SOAP elements, including the SOAP processing model and XML representations of SOAP messages; (2) Service Description -- How web services and/or objects are prepared and transmitted (Document Structure, Types, Messages, Port Types, Bindings, etc.); (3) Discovery -- Metadata that enables the advertisement of a Web service's capabilities; (4) Security -- The provision of integrity, privacy, authentication and authorization (HTTPS, Certificate Authorities, Encryption Algorithms)... 'WS-I is looking to provide a single point of aggregation or contact for all interoperability activity,' said Rob Cheng, chairman of WS-I's marketing communications committee. 'Without this organization, you would actually have to have a large amount of your staff and resources focused on all the different [standards] activities, and then try to figure out how they all work together.' 
To get more insight on the role WS-I hopes to play for developers and implementers of web services, Integration Developer News spoke in-depth with Cheng..." [Q: 'What's next up for WS-I? Any further add-ons to the current Basic Profile in the works?'] A: (Cheng) "We will have a relatively short turnaround step that will handle SOAP with attachments. This is something that has been driven by the membership, and something that has been discussed by the end user companies and the smaller vendors to have attachment support. SOAP with attachments will be handled in an updated Basic Profile (v1.1), and implementation issues for SOAP 1.2 will come later this year..." See also: (1) "WS-I Publishes Supply Chain Management Candidate Review Drafts"; (2) general references in "Web Services Interoperability Organization (WS-I)."

  • [February 12, 2003] "IRS Regroups On Business Filing." By Diane Frank. In Federal Computer Week (February 12, 2003). From the Web-Enabling Government Conference. "The Internal Revenue Service's e-government initiative to provide online tax filing for businesses has become so much more complex than leaders expected that the agency is writing a new business case to support the initiative's needs. The Expanding Electronic Tax Products for Businesses initiative's biggest project is enabling companies to file their corporate income tax form 1120 electronically. While the initiative has progressed on many fronts, it will have to take a step back as it overlaps with and moves into the overall IRS business systems modernization effort, said Mary Ellen Corridore, project manager for the initiative... For the 1120s, the IRS has focused on using Extensible Markup Language to make it easier to file multiple returns and the many attachments that are often included in the returns. The tax-specific XML forms developed also will help to validate a lot of the information automatically, saving time for IRS employees, Corridore said. However, the IRS' electronic filing infrastructure cannot yet support these large returns, she said. Millions of individuals are already filing their income tax forms online, but the corporate forms can be up to 36,000 pages just for the form response, she said... The initiative also expects to launch another project within the next month. The Internet Employer Identification Number (EIN) project will allow companies to apply for and receive their EINs online, cutting down on time and work at both ends of the process, Corridore said. Right now, companies must apply by fax or mail. On average, the fax process will take four days, and mail will take 10. By moving the process to the Web, it will take only five seconds, she said..." 
See: (1) "US Internal Revenue Service and SGML/XML for Tax Filing"; (2) general references in "XML Markup Languages for Tax Information."

  • [February 11, 2003] "BEA Adding Web Services to Tuxedo. Transaction Processing Software Being Upgraded." By Paul Krill. In InfoWorld (February 10, 2003). "BEA Systems [has announced] a version of the Tuxedo transaction processing software that leverages Web services and is linked more closely to other BEA products... Version 8.1 of Tuxedo is a Web services-enabled release of the software, which began as an AT&T product before being passed to Novell and finally to BEA in 1996. With Version 8.1, services or functionality within Tuxedo can be extended as Web services, said George Gould, director of Tuxedo product marketing for BEA, in San Francisco. 'In fact, Tuxedo was a services-oriented architecture well before the Web,' Gould said. To enable Web services deployment, BEA is building connectors between Tuxedo and the WebLogic Server application server and other BEA products, including the WebLogic Workshop development environment. The connector technology is featured in Version 8.1 of Tuxedo, to ship next week, and will be featured in upcoming WebLogic product releases due during the next six months. When the products are upgraded with the connector technology, developers will be able to deploy existing Tuxedo applications as Web services using the Tuxedo control for Workshop, according to BEA. The company is calling its connectivity technology WebLogic Tuxedo Connector..." See details in the announcement: "BEA Announces Web Services Release of BEA Tuxedo. Integration with BEA Platform Extends Tuxedo's Service-Oriented Architecture to the Web and Leverages Best of J2EE."

  • [February 11, 2003] "HTML to Formatting Objects (FO) Conversion Guide. Use These XSLT Templates to Speed Your Conversions of HTML Elements to FO and Thence to PDF." By Doug Tidwell (developerWorks' Cyber Evangelist, IBM developerWorks). From IBM developerWorks, XML zone. February 2003. ['Need help converting HTML documents to PDF? This reference guide shows by example how to use XSLT templates to convert 45 commonly used HTML elements to formatting objects from the XSL-FO vocabulary for easy transformation to PDF using XSLT. The examples assume that you're using the Java-based XSLT processor Xalan and the Apache XML Project's FOP tool, but most of the methods would work just as well with other tools.'] "We all design our HTML pages to look good on the screen, but printing those Web pages is usually an afterthought. To create printable versions of Web pages, the best approach is to use XSLT and XSL-FO to generate a PDF file. You can do the job with an open-source XSLT processor, the XSL Formatting Objects (XSL-FO) vocabulary, and a formatting-object engine. If you already know how to work with XSL-FO and XSLT, this guide provides a valuable resource: It goes through the most common HTML tags and defines how to convert each of them into formatting objects... This guide includes dozens of examples that illustrate how to write XSLT style sheets to do the conversion from HTML element to the corresponding formatting object, the basic building block of documents rendered with XSL-FO..." See related resources in "Extensible Stylesheet Language (XSL/XSLT)."
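
The guide's pattern is uniform: one XSLT template per HTML element, each emitting the corresponding formatting object. A representative sketch follows; the property values are typical choices, not necessarily the guide's exact ones.

```xml
<!-- Illustrative sketch of the guide's approach: match an HTML element,
     emit the equivalent XSL-FO object, and recurse into the children. -->
<xsl:template match="b">
  <fo:inline font-weight="bold">
    <xsl:apply-templates/>
  </fo:inline>
</xsl:template>

<xsl:template match="p">
  <fo:block space-after="12pt">
    <xsl:apply-templates/>
  </fo:block>
</xsl:template>
```

Running such a stylesheet through an XSLT processor yields an XSL-FO document, which a formatting engine such as FOP then renders to PDF.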

  • [February 11, 2003] "Business Layers, Netegrity Spur SPML. SPML Demo Targeted for RSA Conference." By Brian Fonseca. In InfoWorld (February 11, 2003). "In an effort to allow disparate provisioning systems to integrate and communicate with each other through an XML interface as service-oriented architectures become more prevalent, Business Layers and Netegrity on Tuesday announced the creation of an identity management solution to support the Service Provisioning Markup Language (SPML) standard. SPML is designed to offer organizations a common XML-based framework to exchange encrypted and authenticated requests and information between 'Provisioning Service Points' of access areas such as corporate systems and data. Coincidentally, the OASIS Provisioning Services Technical Committee, the standards body group responsible for defining SPML, is scheduled to meet on Tuesday to finalize the standard, sources say..." See: (1) the announcement "Netegrity and Business Layers to Demonstrate Support for Service Provisioning Markup Language (SPML). Vendors Are First to Exhibit XML Based Solution for Identity Management."; (2) general references in "XML-Based Provisioning Services."

  • [February 10, 2003] "Netegrity Releases SAML Agent. Software Secures User ID." By Paul Roberts. In InfoWorld (February 10, 2003). "A new software application by Netegrity is intended to make it easier for organizations to securely exchange user-identity and sign-on information using SAML (Security Assertion Markup Language)... Netegrity's SiteMinder version 5.5 access management product already supports the SAML specification adopted by the Organization for the Advancement of Structured Information Standards (OASIS) in November 2002, enabling SiteMinder customers to create an SAML-based identity and share it with a partner Web site, according to Netegrity. With the new agent, partner sites that are not using SiteMinder will more easily be able to recognize and authenticate that SAML identity from sites that are, Netegrity said... The SAML Affiliate Agent also manages critical sign-off as users move from partner site to partner site, terminating sessions that might otherwise be left open after a user clicks away from a Web site or logs off... Also on Monday, Oblix said that it had integrated the SAML standard in its just-released NetPoint version 6.1 product. That product will allow companies using NetPoint to exchange SAML assertions with security systems at other partner sites, providing a single sign-on to resources on those sites..." See details in the Netegrity and Oblix announcements: (1) "Netegrity Delivers SAML Affiliate Agent to Lower Cost and Complexity of Federated Security Across Partner Sites. Enables Companies to Securely Partner in Order to Provide Enhanced Services to Users." (2) "Oblix Delivers Comprehensive SAML Integration to Provide Advanced Identity Management and Web Access Control. Oblix NetPoint 6.1 Enables Seamless User Authentication and Authorization Across Corporate Extranets Via Secure XML-Based Web Services Standard." General references in "Security Assertion Markup Language (SAML)."

  • [February 10, 2003] ".Net Patent Could Stifle Standards Effort." By Lisa M. Bowman. In CNET (February 10, 2003). "Microsoft is in the process of applying for a wide-ranging patent that covers a variety of functions related to its .Net initiative. If approved as is, the patent would cover application programming interfaces (APIs) that allow actions related to accessing the network, handling Extensible Markup Language (XML), and managing data from multiple sources. APIs are hooks in software that allow applications to work with another system. Microsoft declined to elaborate on its plans for the patent, but intellectual property attorneys said that if it's granted, the company could dictate how, or whether, developers of software and devices can link to the .Net initiative... The patent is one of several that Microsoft is applying for related to .Net, the company's Web services initiative..." See US patent application #20030028685, 'Application program interface for network software platform', filed February 28, 2002. Its abstract: "An application program interface (API) provides a set of functions for application developers who build Web applications on Microsoft Corporation's .NET(TM) platform."

  • [February 10, 2003] "Making Way for 3G Offerings." By Carmen Nobel. In eWEEK (February 10, 2003). "As U.S. wireless users continue their slow but steady march toward 3G wireless acceptance, several companies are working on making such services easier to deploy in terms of billing and applications. Companies such as Hewlett-Packard Co., Openwave Systems Inc., IPWireless Inc. and Emblaze Semiconductor Ltd. are trying to take advantage of third-generation network expansion, which has taken place over the past year, by simplifying deployment of complicated applications on the client and the server. HP this week will announce HP MSDP (Mobile Service Delivery Platform), a set of new and existing software, hardware and integration services designed to let carriers try out new services without having to devote massive management resources to each one -- thus keeping customer costs down... On the carrier side, MSDP will eliminate the need to customize a network for each service it deploys, officials said. 'People have been setting up whole subsystems to try something out, and if it doesn't work, they have to tear it down and try something else,' said Maurice Marks, chief technology officer of HP's network and service provider business, in Plano, Texas. 'We've pulled together a number of technologies to provide a framework so they can mix and match without having to take big risks.' The MSDP platform is based on standard Web services protocols such as Simple Object Access Protocol, Universal Description, Discovery and Integration, and Web Services Description Language and is designed to work with Java 2 Enterprise Edition and .Net environments, officials said. It incorporates HP's OpenCall Service Controller, OpenView Web Services Management Engine, and HP Mobile Portal Solution, in addition to new software components. It also includes an application server from BEA Systems Inc. 
HP officials said several major carriers are testing MSDP; the company will show the platform at 3GSM World Congress, a key wireless industry trade show in Cannes, France, next week..."

  • [February 10, 2003] "Putting Liberty to Work. Sun ONE Identity Server 6.0 Builds Powerful Cross-Domain Authentication on Liberty Alliance Specification." Review from InfoWorld Test Center (February 7, 2003). ['This identity management platform enables businesses to build commercial alliances with complementary partners that require customers to authenticate only once, without requiring that all identity management information be stored in a single data repository. IT administrators can configure and manage user and application authentication across domains and operating platforms from a central console that includes flexible policy-based management tools and monitoring services.'] "Last month Sun released Sun ONE Identity Server 6.0, a system that manages not only the identity and authentication mechanism of users on large, disparate enterprise networks, but also addresses end-user log-in headaches via federated identity management based on a specification released by the Liberty Alliance Project last July. This means that Identity Server can be configured in two basic ways: first as an authentication service for use on large, heterogeneous corporate networks, and second as a Liberty-enabled federation management service. In corporate mode, users or applications attempting to access resources anywhere on the network must first pass through Identity Server's Authentication Service, typically via a log-in Web page, although this can be routed towards a custom GUI interface via additional programming tools. Once the user has provided the required information, the Authentication Service either grants or denies access. Although this sounds similar to what we already have, Identity Server can manage access across domains and operating systems, as well as many existing authentication systems and directory services. 
The Liberty-enabled configuration is intended to allow Web users to sign in to a Web site or Internet resource that is part of a Liberty authentication domain -- basically a conglomerate of resources operating in a trusted environment, all managed by the Liberty Alliance's federated authentication service. Thereafter, that user can roam to any Web site within that authentication domain and access resources without having to be re-authenticated. What's nice about the Liberty implementation is that it's cross-platform via Java and XML, and it doesn't require user authentication information to be stored in a central repository. Thus, a user could have basic username and password information stored on one server while having credit card information stored on another, yet still allow another application within the authentication domain to access both sets of data when needed. This means no single entity will have control over all user information, and no impediments to businesses retaining the information they need for effective customer relationship management..." See other details in "Sun ONE Identity Server 6.0 Supports Liberty Alliance and SAML Specifications."

  • [February 10, 2003] "Microsoft Christens XDocs Application 'InfoPath'. Application Aims to Help End-Users Edit Forms Using XML." By James Niccolai. In InfoWorld (February 10, 2003). "Microsoft on Monday announced the official name for the next member of its Office family, formerly code-named XDocs, and said a standards body for the healthcare industry is advocating use of the application as a way to help streamline medical records systems. Now officially called InfoPath, the application aims to make it easy for end-users to edit forms using the XML markup language. The forms can be used to extract and send business data to and from business applications running on back-end systems, and can help cut down on paperwork and reduce errors associated with manual data entry, according to Microsoft... According to Bobby Moore, a product manager for InfoPath at Microsoft, an InfoPath form tailored to the needs of a doctor might include fields with a patient's name, address, and medical history. When the doctor writes the patient's name in the form, other fields can be populated automatically using information pulled from back-end systems and delivered to the application in XML, Moore said. The idea is to cut down on the time it takes to fill out such forms and reduce the likelihood of error when information is entered manually. The doctor can save the form in the XML format automatically, and clicking a button sends the information back out to medical records systems, where it updates those systems and makes the information available for use across the organization. Different vertical markets, such as healthcare, manufacturing, and finance, have developed versions of XML, known as XML "schema," that are specific to their industries. 
At Monday's conference -- the Healthcare Information and Management Systems Society's 2003 Conference and Exhibition -- officials from Health Level 7 will show how InfoPath can be used with the Clinical Document Architecture (CDA), an XML standard used by their industry, Moore said..." See: (1) the announcement "Microsoft InfoPath (Formerly "XDocs") Supports Healthcare XML Data Standard. Microsoft's Information-Gathering Application to Be Featured in Upcoming Solution by Amicore."; (2) Microsoft InfoPath website; (3) "Microsoft Office 11 and InfoPath."
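The auto-population workflow described above -- type one field, pull the rest from back-end XML -- can be sketched in a few lines. This is an illustration of the idea only, not InfoPath itself; the record layout, field names, and data are invented for the example.

```python
import xml.etree.ElementTree as ET

# Hypothetical back-end record delivered to the form as XML (the document
# shape and patient data are invented; InfoPath forms use their own schemas).
BACKEND_XML = """
<patients>
  <patient>
    <name>Jane Doe</name>
    <address>12 Elm St</address>
    <history>penicillin allergy</history>
  </patient>
</patients>
"""

def populate_form(patient_name: str) -> dict:
    """Given the name typed into one form field, fill the remaining
    fields from the XML returned by a back-end system."""
    root = ET.fromstring(BACKEND_XML)
    for p in root.findall("patient"):
        if p.findtext("name") == patient_name:
            return {child.tag: child.text for child in p}
    return {"name": patient_name}  # no match: keep only the typed field

form = populate_form("Jane Doe")
print(form["address"])  # -> 12 Elm St
```

The point of the pattern is that the form never talks to the records system's internals directly; it only consumes the XML the back end hands it.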

  • [February 07, 2003] "Managing Enumerations in W3C XML Schemas." By Anthony Coates. From (February 05, 2003). ['As experience with W3C XML Schema continues to increase, so does the body of "best practice" knowledge. Tony Coates explains the best way to handle enumerations in schemas. Enumerations are defined lists of terms used in a schema -- currency symbols, for example -- and they may or may not be under the schema author's control. Tony devises a strategy to ensure that such dependencies don't adversely affect your schemas.'] "When working with data-oriented XML, there is often a requirement to handle 'controlled vocabularies', otherwise known as enumerated values. Consider the following example of a bank account summary. [...] The problem in designing this schema is that the ISO 3-letter currency codes are externally controlled. They can change at any time. If you embed them in your schema, you need to reissue the schema every time ISO makes a change, which can be expensive. This is especially true in enterprise situations where any schema change, no matter how small, can require full retesting of any applications that use the schema. This needs to be avoided whenever possible... In this article, we will discuss how controlled vocabularies can be managed when using W3C XML Schemas, since this is the dominant XML schema format for data-oriented XML. Note that the 'vocabularies' we refer to are enumerated lists of element-attribute values. This differs from other contexts where 'vocabularies' are sets of XML element names... When dealing with controlled vocabularies (enumerations) in schemas, it is a good idea to rate the volatility of each vocabulary. A volatile vocabulary is one which is expected to change independently of the normal release cycle of schema versions. A stable vocabulary is one which is expected to change (if at all) only as new schema versions are released. 
Volatile vocabularies are a problem if embedded in a schema because they impose extra releases on all dependent applications... WXS schemas can be made more manageable by separating volatile controlled vocabularies (enumerations) into their own vocabulary schemas. In this article, we have seen how to identify volatile controlled vocabularies, how to separate them from the main schema, how to decouple the versions, and how to validate vocabulary schemas. There is no absolute rule for when a controlled vocabulary should have its own schema. Use the guidelines here, but always use your own judgment and your knowledge of your problem domain..."

  • [February 07, 2003] "BrownSauce: An RDF Browser." By Damian Steer. From (February 05, 2003). ['One of RDF's problems, as Damian Steer points out, is that it's rather easier to produce than it is to consume. So Damian set out to produce the nearest thing he could to a generalized RDF browser that would make a decent job of displaying arbitrary RDF information.'] "BrownSauce [a SourceForge Project] is an RDF browser. It attempts, armed with no more than a knowledge of RDF and RDF Schema, to present all RDF data as intelligibly as possible. RDF is biased in favor of the data producer. Consumers may have to deal with all, some, or none of the expected properties or classes, and they may have to be aware that entirely unknown properties and classes are possible and legitimate. BrownSauce is an attempt to deal with all that is thrown at it... BrownSauce attempts to improve on the triples approach. The problem is that such a display is too fine-grained, but it has advantages: it will work with large documents or even sources where no single document is available (e.g., databases). So how can an application find the obvious patterns in RDF data? RDF, unlike XML, has no mechanisms for expressing data structure; indeed, it is a semi-structured data format, so such information would only be a partial help. Having said that, the reader may be aware of RDF Schema. Don't be fooled by the name: RDF Schemas describe properties and classes, but cannot state that 'Houses have addresses'... When one looks at RDF data it is apparent that there are regular patterns. These are captured in BrownSauce using a simple rule: start at a node and work outward, passing over blank nodes... Currently BrownSauce can only browse documents. From the outset, however, the plan was to extend browsing to other sources such as databases with web interfaces. The code is in place and may well be added in the near future..." 
On RDF, see: (1) W3C RDF website; local references in "Resource Description Framework (RDF)."
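The grouping rule quoted above -- start at a node, work outward, passing over blank nodes -- can be sketched over a plain in-memory triple set. This is a toy illustration of the rule, not BrownSauce's implementation; the triples and the `_:` blank-node convention here are invented for the example.

```python
# Toy triple store: (subject, predicate, object), with blank nodes
# written "_:name" as in common RDF serializations. Data is invented.
TRIPLES = [
    ("", "dc:title", "Home Page"),
    ("", "dc:creator", "_:b1"),
    ("_:b1", "foaf:name", "Damian Steer"),
]

def describe(node, triples):
    """Collect a node's properties, inlining (passing over) blank-node
    objects so they display as part of the parent's description."""
    out = {}
    for s, p, o in triples:
        if s == node:
            out[p] = describe(o, triples) if isinstance(o, str) and o.startswith("_:") else o
    return out

desc = describe("", TRIPLES)
print(desc["dc:creator"]["foaf:name"])  # -> Damian Steer
```

The effect is the coarser-than-triples display the article describes: a page's creator shows up as a nested name, not as a separate anonymous node.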

  • [February 07, 2003] "XSLT, Browsers, and JavaScript." By Bob DuCharme. From (February 05, 2003). ['The regular monthly dose of XSLT in Bob DuCharme's "Transforming XML" column. Bob shows us how to integrate that most tricky of beasts, JavaScript, into the result of XSLT transformations.'] "Most XSLT processors offer some way to tell them: 'here is the source document and here is the stylesheet to use when processing it.' For a command-line XSLT processor, the document and stylesheet are usually two different parameters specified at the command line. Web browsers, however, usually read a document from a web server and have no way to separately be told about the stylesheet to apply. To remedy this, the W3C Recommendation Associating Style Sheets with XML Documents describes a processing instruction to include at the beginning of a document to name a stylesheet to apply to that document... The best thing about specifying a document's stylesheet with this xml-stylesheet processing instruction is that it lets you use the document and designated stylesheet with a web browser. With the numbers.xml document and stylesheet shown above both sitting in the same directory of your hard disk, you can tell the current versions of Mozilla and Internet Explorer to open up numbers.xml and you'll see the result of the transformation. The rendered result should look the same as if you had opened an HTML file created by a command-line XSLT processor using the same input and stylesheet files... Keeping your JavaScript code in a separate file is cleaner, but putting it between the <script></script> tags gives you a powerful opportunity: you don't have to hardcode your JavaScript code in your XSLT stylesheet; you can write XSLT logic to generate JavaScript code based on dynamic conditions such as values in your input. 
Next month, we'll see how this can let you turn nested a elements in an XHTML source document into one-to-many links implemented as pop-up menus in the rendered result..."
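The stylesheet association DuCharme describes is just a processing instruction in the document prolog. The sketch below shows the PI form from the W3C Recommendation and confirms that a document carrying it still parses normally (file names follow the article's `numbers.xml` example; the element content is invented, and a browser rather than this script would actually run the transformation).

```python
import xml.etree.ElementTree as ET

# The xml-stylesheet processing instruction named in the W3C Recommendation
# "Associating Style Sheets with XML Documents": it sits in the prolog and
# tells a browser which stylesheet to apply to the document.
PI = '<?xml-stylesheet type="text/xsl" href="numbers.xsl"?>'
doc = PI + "\n<numbers><n>1</n><n>2</n></numbers>"

root = ET.fromstring(doc)  # the PI is a legal part of the XML prolog
print(root.tag, root.findtext("n"))  # -> numbers 1
```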

  • [February 07, 2003] "Is GML only for Internet GIS?" By Mark Prins (CARIS Geographic Information Systems BV). In Directions Magazine (February 07, 2003). "Geography Markup Language (GML) is a standardized means of storing geographic information in eXtensible Markup Language (XML) encoded files specified by the OpenGIS Consortium. XML, an open, ASCII-based format, uses descriptive tags to store data, doing away with any proprietary vendor-specific formats. Tags may be nested within each other and may be extended in an object-oriented-like manner to suit your own data model, while maintaining compatibility with the standard... The OpenGIS Consortium has defined a software interface called the Web Feature Server (WFS) that will allow you to offer an online GML service. Most vendors have a Web Feature Server implementation available as an off-the-shelf package; usually this supports vendor-specific data formats only. To generate maps without advanced clients you can use another OpenGIS-specified software interface: the Web Map Server (WMS). The WMS will render a map as an image, or return attribute data in one of several predefined formats, among them GML... All OpenGIS Web Feature Server implementations, such as the CARIS Spatial Fusion transactional WFS, support GML; most Web Map Servers have some level of GML support (for attribute data and bounding box definition). As far as spatial databases (like Oracle Spatial and PostGIS) are concerned, most have support for GML loading and export. Making a converter for reading GML files into your desktop GIS is almost trivial because of the large number of XML parsers and libraries available and wide support for scripting of applications through use of languages such as Visual Basic for Applications. An example: GeoPortal and the CARIS Spatial Fusion services. CARIS has implementations of both the OpenGIS-specified Web Feature Server (WFS) and Web Map Server (WMS) as part of the CARIS Spatial Fusion suite of products. 
Both a standalone as well as a cascading Web Map Server (cWMS) are available... As the GML standard matures to a more robust spatial format with support for topology, versioning and indexing we will see more and more applications supporting GML as a 'native' format. Geographic Markup Language will enable the step from Geographic Information System to Geographic Information Infrastructure..." For details on GML V3.0, recently released, see: "OGC Announces OpenGIS Geography Markup Language Implementation Specification (GML 3)." General references in "Geography Markup Language (GML)."

  • [February 07, 2003] "XML Serialization in the .NET Framework." By Dare Obasanjo (Microsoft Corporation). In Microsoft MSDN Library (January 23, 2003). ['Dare Obasanjo discusses how XML serialization lets you process strongly typed XML within the .NET Framework while supporting W3C standards and improving interoperability. A FAQ is also included in this article.'] "The primary purpose of XML serialization in the .NET Framework is to enable the conversion of XML documents and streams to common language runtime objects and vice versa. Serialization of XML to common language runtime objects enables one to convert XML documents into a form where they are easier to process using conventional programming languages. On the other hand, serialization of objects to XML facilitates persisting or transporting the state of such objects in an open, standards compliant and platform agnostic manner. XML serialization in the .NET Framework supports serializing objects as either XML that conforms to a specified W3C XML Schema Definition (XSD) schema or that is conformant to the serialization format defined in section five of the SOAP specification. During XML serialization, only the public properties and fields of an object are serialized. Also, type fidelity is not always preserved during XML serialization. This means that if, for instance, you have a Book object that exists in the Library namespace, there is no guarantee that it will be deserialized into an object of the same type. However, this means that objects serialized using the XML serialization in the .NET Framework can be shipped from one machine to the other without requiring that the original type be present on the target machine or that the XML is even processed using the .NET Framework. XML serialization of objects is a useful mechanism for those who want to provide or consume data using platform agnostic technologies such as XML and SOAP. 
XML documents converted to objects by the XML serialization process are strongly typed. Data type information is associated with the elements and attributes in an XML document through a schema written in the W3C XML Schema Definition (XSD) Language. The data type information in the schema allows the XmlSerializer to convert XML documents to strongly typed classes... The features delivered by XML serialization in the .NET Framework provide the ability to process strongly typed XML seamlessly within the .NET Framework. Strongly typed XML documents can leverage the benefits of the .NET Framework, such as data binding to Windows and ASP.NET Forms, a rich collections framework, and a choice of programming languages. All of this is done while taking care to support W3C standards like XML 1.0 and XML Schema, meaning that there is no cost to the interoperability of XML when attempting to transfer or persist such strongly typed documents..."
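The round-trip Obasanjo describes -- public fields out as XML, XML back in as a typed object -- is easy to illustrate outside .NET. The sketch below is a Python analogue of the idea, not the XmlSerializer API itself; the `Book` type name comes from the article's example, but its fields and the element mapping are invented.

```python
import xml.etree.ElementTree as ET
from dataclasses import dataclass, fields

# Analogue of the article's Book example: only declared (public) fields
# are serialized, each as a child element named after the field.
@dataclass
class Book:
    title: str = ""
    isbn: str = ""

def to_xml(obj) -> str:
    """Serialize an object's fields to an XML element per field."""
    root = ET.Element(type(obj).__name__)
    for f in fields(obj):
        ET.SubElement(root, f.name).text = getattr(obj, f.name)
    return ET.tostring(root, encoding="unicode")

def from_xml(cls, text: str):
    """Deserialize XML back into a strongly typed object of cls."""
    root = ET.fromstring(text)
    return cls(**{child.tag: child.text for child in root})

xml_text = to_xml(Book(title="XML in a Nutshell", isbn="0596002920"))
book = from_xml(Book, xml_text)
print(book.title)  # -> XML in a Nutshell
```

As in the article's description, type fidelity is loose: any class with matching field names could consume the same XML, which is exactly what makes the document portable across machines and platforms.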

  • [February 07, 2003] "Executable UML: Diagrams for the Future." By Gerry Boyd. In WebBuilder (February 05, 2003). ['Executable UML marries an abstract program model with a platform-specific model compiler that outputs an executable application. If successful, this combination holds the potential for a major shift in development tasks.'] "Executable UML is the next logical, and perhaps inevitable, evolutionary step in the ever-rising level of abstraction at which programmers express software solutions. Rather than elaborate an analysis product into a design product and then write code, application developers of the future will use tools to translate abstract application constructs into executable entities. Someday soon, the idea of writing an application in Java or C++ will seem as absurd as writing an application in assembler does today. And the code generated from an Executable UML model will be as uninteresting and typically unexamined as the assembler pass of a third generation language compile is today. This shift is made possible by the confluence of four factors: (1) The development of the Model Driven Architecture (MDA) standards by the Object Management Group (OMG); (2) The adoption of the Precise Action Semantics for the Unified Modeling Language specification by the OMG in November of 2001; (3) A proposed profile of UML 'Executable UML' supports creating a complete and implementation-neutral self-contained expression of application functionality; (4) The availability of high-quality Model Compilers and Virtual Execution Environments (VEEs) that provide "out of the box" platforms upon which Executable UML models can execute... This article reviews these four factors in greater detail and discusses the implications of them relative to the future of software development..."

  • [February 07, 2003] "E-Authentication Waits For Rest Of The Pack." By Dipka Bhambhani. In (February 05, 2003). "Technically, e-Authentication is ready and waiting. The [US] government has had a working prototype of the online certificate authentication gateway for several months... Now the initiative's leaders at the General Services Administration await the completion of the other 24 Quicksilver e-government initiatives and the Office of Management and Budget's security policies. 'The gateway has to come up with some assurance levels based on risks of e-gov initiatives,' said Steve Timchak, program executive for e-Authentication in GSA's Office of Governmentwide Policy. OMB is supposed to develop those policies, he said. As agencies work on their e-government projects this year, OGP is developing a written taxonomy that will establish a process by which credential providers such as smart-card developers, biometric systems providers and certificate authorities can link to the gateway... OMB will create the common policies, protocols and rules for all initiatives that connect to the gateway. For agencies securing transactions with digital certificates, the Federal Bridge Certification Authority will link PKI trusted domains. 'The gateway will interface with the bridge and validate PKI credentials,' Timchak said. A visitor goes through or to an application via an agency's URL, presents some form of credential to use an agency initiative, and that information would pass back to the gateway for validation, he said... Digital certificates would go through the Federal Bridge to a certification authority for validation..."

  • [February 07, 2003] "Web Services Market Up For Grabs." By Dawn Kawamoto and Mike Ricciuti. In CNET (February 07, 2003). "The battle to control the Web services software market is still up for grabs, according to conflicting results from two recent surveys. Fifty-eight percent of system integrators--those companies that assemble working business applications from off-the-shelf software -- listed Microsoft .Net-based products as their favorite for building Web services programs, according to a survey of 44 companies conducted by Gartner Dataquest. But another Gartner survey of 138 corporate customers who have hired or plan to hire systems integrators showed that 39 percent favored products based on the Java 2 Enterprise Edition (J2EE) specification, the rival to .Net in the Web services world. Such products are sold by IBM, BEA Systems and Oracle, among other companies... Gartner analysts said the apparent contradiction is a reflection of the Web services market's immaturity. When looking for Web services technology, companies tend to go with what they already know... According to market projections from IDC, the total value of the Web services market will reach US$21 billion by 2007. Although only 5 percent of companies completed Web services projects in 2002, more than 80 percent are expected to have some type of Web services project under way by 2008, IDC predicted..."

  • [February 07, 2003] "Liberty Alliance Releases ID Management Specification. White Paper Explains Possible Interoperability." By Paul Roberts. In InfoWorld (February 07, 2003). "Amid growing concern that it is being overshadowed by Microsoft's .Net Passport technology, The Liberty Alliance Project released a new specification Thursday to explain how the organization's federated identity model might one day coexist with Passport and other identity management systems. The technical white paper, entitled 'Identity Systems and Liberty Specification version 1.1 Interoperability,' compares and contrasts the consortium's federated identity model against .Net Passport, Verified by Visa, and other third-party authentication systems. The paper was produced to address questions and misconceptions about the Liberty Alliance model, said Paul Madsen, the paper's author and a consultant in the Advanced Security Technologies group at Entrust. 'The paper was motivated less to define a framework for Liberty working together with other systems than to address confusion in the marketplace about what Liberty was and how it would work with other systems, and sometimes compete with those other systems,' Madsen said. In particular, the paper was written to address the misconception that Liberty was a service akin to Microsoft's .Net Passport. Unlike .Net Passport, Liberty is a set of specifications for protocols that can be implemented by different organizations which become Passport-like user authentication services... While it may be fair to compare Passport to a particular implementation of the Liberty specifications, comparing the consortium's specifications to Microsoft's service is not particularly useful, Madsen said. The white paper also points out fundamental technical differences between .Net Passport and the Liberty specifications. 
For example, The Liberty Alliance specifications back the use of Security Assertion Markup Language (SAML) for exchanging authentication tokens as compared with Passport's proprietary schema, and the two authentication systems differ in the way they communicate tokens from one site to the next. 'There were a lot of misconceptions about how Liberty compares to Passport. We wanted to set out the differences and, recognizing those, set out some scenarios where Liberty and Passport can exist,' Madsen said. On that score, the new white paper proposes a number of scenarios in which .Net Passport and Liberty might work together. In one scenario, a third-party Web site might act as an identity provider in a Liberty 'circle of trust' (COT), creating SAML assertions for other service providers while also existing as a Passport member site, processing tokens issued by the Passport service. In this scenario, that site would then act as a 'mediator' between the Liberty-governed domain and the Passport domain, converting Passport tickets into SAML assertions and vice versa. In a second scenario, a service provider could exist in a Liberty COT and as a Passport member. Either authentication system could be used, depending on the nature of the service being requested, with Passport used for lower-security consumer transactions and Liberty for transactions that require stronger authentication..." Document reference: 2003-02-20 bibliographic entry. See general references in "Liberty Alliance Specifications for Federated Network Identification and Authorization."
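The mediator role in the first scenario boils down to converting one token format into the other. The sketch below is a deliberately toy version of that conversion: the "ticket" is a made-up dict, and the output element names are only loosely modeled on SAML 1.x assertions, not a conformant implementation of either Passport or SAML.

```python
import xml.etree.ElementTree as ET

def ticket_to_assertion(ticket: dict) -> str:
    """Toy mediator: wrap a (made-up) Passport-style ticket's user id in
    a minimal SAML-flavored assertion for Liberty-side consumers."""
    assertion = ET.Element("Assertion", Issuer="mediator.example")  # issuer name invented
    stmt = ET.SubElement(assertion, "AuthenticationStatement")
    subject = ET.SubElement(stmt, "Subject")
    ET.SubElement(subject, "NameIdentifier").text = ticket["puid"]
    return ET.tostring(assertion, encoding="unicode")

xml_text = ticket_to_assertion({"puid": "0123456789abcdef"})
print("NameIdentifier" in xml_text)  # -> True
```

A real mediator would also have to verify the inbound ticket and sign the outbound assertion; the structural translation shown here is only the easy middle step.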

  • [February 05, 2003] "An Introduction to Using XLIFF. Technical Aspects and Implementation of XML Localisation Interchange File Format." By Yves Savourel (RWS Group). In MultiLingual Computing and Technology #54, Volume 14, Issue 2 (March 2003), pages 28-34. "XLIFF Version 1.0 was published at the beginning of 2002 and is now used in production. This article will give you a technical overview of XLIFF, describe the different parts, and explain how they fit together." The OASIS XML Localization Interchange File Format Technical Committee has been chartered to "define, through XML vocabularies, an extensible specification for the interchange of localization information. The specification will provide the ability to mark up and capture localizable data and interoperate with different processes or phases without loss of information. The vocabularies will be tool-neutral, support the localization-related aspects of internationalization and the entire localization process. The vocabularies will support common software and content data formats. The specification will provide an extensibility mechanism to allow the development of tools compatible with an implementer's own proprietary data formats and workflow requirements." See: (1) "XLIFF 1.0 Specification" (OASIS Committee Specification, 15-April-2002); (2) XLIFF references in "XML Localization Interchange File Format (XLIFF)"; (3) general references in "Markup and Multilingualism."
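The interchange XLIFF enables is easy to see in miniature: each `trans-unit` pairs a source string with its translation, and any tool can read those pairs back out. The fragment below is abridged from the XLIFF 1.0 structure (the strings and file name are invented), and the helper is an illustrative sketch, not a full XLIFF processor.

```python
import xml.etree.ElementTree as ET

# A minimal XLIFF 1.0-style document (structure abridged; strings invented).
XLIFF = """
<xliff version="1.0">
  <file source-language="en" target-language="fr" datatype="plaintext" original="ui.txt">
    <body>
      <trans-unit id="1">
        <source>Open</source>
        <target>Ouvrir</target>
      </trans-unit>
    </body>
  </file>
</xliff>
"""

def translation_pairs(xliff_text: str):
    """Yield (source, target) string pairs from every trans-unit."""
    root = ET.fromstring(xliff_text)
    return [(tu.findtext("source"), tu.findtext("target"))
            for tu in root.iter("trans-unit")]

print(translation_pairs(XLIFF))  # -> [('Open', 'Ouvrir')]
```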

  • [February 05, 2003] "Another Step Closer to B2B." By John K. Waters. In Application Development Trends (February 05, 2003). "Talking about business forms and purchase orders isn't all that sexy, Jon Bosak admits, but establishing standard schemas for the electronic versions of such documents is a critical business issue. 'The global interoperability of business processes simply cannot occur without the semantic standardization of the messages exchanged in business transactions,' Bosak told eADT. And he ought to know. Bosak, who championed the development of XML 1.0 through the World Wide Web Consortium (W3C), has been spearheading efforts to develop a royalty-free data representation standard for electronic commerce, currently undertaken by the Organization for the Advancement of Structured Information Standards (OASIS). Last week, the OASIS Universal Business Language (UBL) Technical Committee announced that it was releasing the first draft of the UBL schema. The purpose of UBL is to unify the various business 'dialects' of XML currently used in electronic commerce. Its backers hope that it will become an international standard for electronic trade. The UBL schemas contained in the review package specify machine-readable XML representations of seven basic business forms, including Order, Order Response, Simple Order Response, Order Cancellation, Dispatch Advice, Receipt Advice and Invoice. Together, they can be used to implement a generic buy/sell relationship or supply chain whose components fit existing trade agreements and are immediately understandable by workers in business, supply-chain management (SCM), Electronic Data Interchange (EDI), accounting, customs, taxation and shipping..." See details in "UBL Technical Committee Releases First Draft of XML Schemas for Electronic Trade"; general references in "Universal Business Language (UBL)."

  • [February 05, 2003] "Python: Language of Choice for EAI." By Aron Trauring (CEO, Zoteca). In EAI Journal Volume 5, Number 1 (January 2003), pages 43-46. ['If you think that the language for Web services is a straight fight between Java and C#, think again. Python is supported by the big Web services vendors, including Microsoft. This object-oriented, high-level interpreted language may be ideal for EAI.'] "... Python plays well with programming standards. Many Python extensions are available that support almost all Internet standards, including Common Object Request Broker Architecture (CORBA), Component Object Model (COM), Simple Object Access Protocol (SOAP), eXtensible Markup Language (XML), and others... Twisted is an open-source, Python-based, event-driven networking framework that provides powerful, scalable, and flexible EAI capabilities. At the core of Twisted is its network layer, which can be used to rapidly integrate any existing protocol and model new ones. Whenever the need arises to develop a new protocol, the asynchronous, multiplexed, and two-way Remote Object Protocol (ROP) can be used to quickly implement it. Because the ROP is used with object-level abstractions, changes can be made easily, and new features added, without having to deal with the design restrictions and application development complexities of a custom protocol. Out of the box, Twisted supports several service protocols. These include but are not limited to: HyperText Transfer Protocol (HTTP), File Transfer Protocol (FTP), Simple Mail Transfer Protocol (SMTP), Lightweight Directory Access Protocol (LDAP), Domain Name System (DNS), Sockets Server Version 4 (SOCKSv4), Secure Shell (SSH), Internet Relay Chat (IRC), Telnet, Post Office Protocol 3 (POP3), and America Online's instant messaging... Developers can immediately use these protocols without having to spend time reimplementing them. 
In addition, Twisted can talk to multiple, industry-standard Database Management Systems (DBMSes). It also can be used to communicate with COM servers and to control and integrate with standard Windows applications. Unlike other frameworks designed to address a specific domain, Twisted is designed to simultaneously support both multiple frameworks and multiple protocols. So it's useful for implementing Websites, Web services, e-mail servers, or instant messaging servers. Moreover, these services can all run in the same process..." See: "XML and Python."
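The protocol style the article attributes to Twisted can be shown without the framework itself: the event loop delivers bytes to a protocol's `dataReceived` method, and the protocol replies through its transport. The sketch below mirrors Twisted's naming conventions but is a self-contained, dependency-free illustration (the `FakeTransport` class is invented to stand in for a real network connection).

```python
class FakeTransport:
    """Stand-in for a network connection: collects written bytes."""
    def __init__(self):
        self.written = b""
    def write(self, data: bytes):
        self.written += data

class Echo:
    """Echo protocol in the Twisted style: the framework calls
    dataReceived() with each chunk; we write it straight back."""
    def makeConnection(self, transport):
        self.transport = transport
    def dataReceived(self, data: bytes):
        self.transport.write(data)

t = FakeTransport()
proto = Echo()
proto.makeConnection(t)
proto.dataReceived(b"hello EAI")
print(t.written)  # -> b'hello EAI'
```

Because the protocol object never touches sockets directly, the same class can be driven by a test harness (as here) or by a real event-driven server, which is what makes the pattern attractive for integration work.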

  • [February 05, 2003] "Standards Muddy 'Open' Waters." By Ed Scannell, Heather Havenstein, and Tom Sullivan. In InfoWorld (February 05, 2003). "Amid the convergence of technologies across the IT stack, a host of products emerging from the world of open source, XML, and Web services are challenging traditional notions of open solutions. IBM, Apple, Microsoft, Sun Microsystems, and Nokia are passionate about their "open" mantra in the quest to address interoperability concerns. But the strategy itself is under threat of collapsing under its own weight. While many vendors subscribe to all the right Internet-based open standards, layers of proprietary technology that limit a company's ability to build a genuinely open infrastructure can lurk beneath the surface..."

  • [February 05, 2003] "Interwoven Unties MetaTagger From TeamSite." By [Seybold Bulletin Staff, edited by Patricia Evans]. In The Bulletin: Seybold News and Views On Electronic Publishing Volume 8, Number 18 (February 05, 2003). "Seeking to broaden the appeal of its MetaTagger technology, Interwoven has decoupled the classification/metatagging tool from TeamSite in the 3.5 release of the product announced Tuesday [2003-02-04]... over the past two years, Interwoven has expanded MetaTagger to include automatic summarization, keyword generation and pattern extraction. The product now includes seven engines, arguably more metadata-adding tools than any other content-management vendor offers directly. Though MetaTagger continues to bolster TeamSite's stature, 'the product is large enough that it can stand on its own,' asserts Mike Svate, senior marketing product manager at Interwoven. In that context, its ties to TeamSite were hampering Interwoven's ability to sell MetaTagger as a separate product. In version 3.5, the TeamSite-specific functions are either rolled into the product or left behind... New in 3.5 is MetaTagger Studio, a taxonomy-development environment. It makes the process of refining MetaTagger taxonomies more interactive, even allowing the classification expert to tweak the settings by testing individual documents offline, in addition to the standard procedure of feeding the system a group of sample documents. The Studio also adds new ways of building taxonomies, such as importing a directory structure from a file system or Web site. Interwoven's focus for MetaTagger is currently on classifying Web content and business documents to improve the quality and effectiveness of corporate portals and intranet knowledge bases... For Interwoven to have any chance of selling MetaTagger as a stand-alone product, it had to cut the TeamSite umbilical cord. 
But if it wants to give the market an even clearer sense of its commitment, the company should go ahead and develop the interfaces to competing content-management systems..." ['MetaTagger Extractor looks inside XML and HTML content to identify specific values like titles, author names, and other attributes to assign as metadata.'] See details in the announcement: "Interwoven MetaTagger 3.5 Dramatically Enhances Effectiveness of Enterprise Portals with Highly Relevant Content. Metadata Platform Now Integrates with Any Content Source to Enhance Key Portal Components Like Search and Personalization, Boosting Productivity and Lowering Support Costs."
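The kind of value extraction attributed to MetaTagger Extractor -- looking inside XML content for titles, author names, and other attributes to assign as metadata -- can be sketched simply. The example below is an illustration of the idea only; MetaTagger's actual engines and rules are proprietary, and the document shape here is invented.

```python
import xml.etree.ElementTree as ET

# Invented sample content; any XML document with named fields would do.
DOC = """
<article>
  <title>Quarterly Results</title>
  <author>A. Writer</author>
  <body>Revenue was flat.</body>
</article>
"""

def extract_metadata(xml_text: str, wanted=("title", "author")) -> dict:
    """Pull specific field values out of XML content as metadata."""
    root = ET.fromstring(xml_text)
    return {tag: root.findtext(tag) for tag in wanted if root.findtext(tag)}

print(extract_metadata(DOC))  # -> {'title': 'Quarterly Results', 'author': 'A. Writer'}
```

A production classifier layers summarization, keyword generation, and taxonomy matching on top of this; the structural extraction step, though, is this direct.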

  • [February 05, 2003] "Transactions and Web Services." By Boris Lublinsky (CNA Insurance). In EAI Journal Volume 5, Number 1 (January 2003), pages 35-42. ['Transaction processing is the cornerstone of enterprise computing. Large enterprises in transportation, finance, insurance, telecommunications, manufacturing, and other industries depend on transactional processing for funds transfer, payments processing, electronic communications, inventory control, and more. Yet Web services don't provide execution capability. What's missing? Two new standards aim to introduce transactions to Web services implementations.'] "The new transaction processing standards attempt to bring the best features of transaction processing to Web services. These include: (1) Web Services Transaction (WS-Transaction), which is being developed by IBM, Microsoft, and BEA. (2) Business Transaction Protocol (BTP), which is being developed by the Organization for the Advancement of Structured Information Standards (OASIS), with help from BEA, Bowstreet, Choreology, Entrust, Hewlett-Packard, Interwoven, IONA, SeeBeyond, Sun Microsystems, and Talking Blocks... Both transaction protocols described here are designed to extend the existing X/Open transactional model to support Web services. To evaluate the most applicable approach to transactions in the Web services paradigm, we need to answer two basic questions: What are Web services? What are the most appropriate uses for Web services? Because Web services invocations are expensive, more people agree today that Web services should provide access to coarse-grain executables. This makes coordination of Web services based on the true XA transactions unrealistic. This would lead to the lock starvation of the underlying transactional resources Web services use. Business activities or cohesions based on the compensating transactions seem to be the most appropriate approach for transaction implementations in Web services. 
Web services rarely execute by themselves; the glue used to execute multiple Web services is either a business process or a high-level service orchestrating the execution of multiple services to achieve a particular goal. Every service is designed as a generic implementation with no knowledge about the process or high-level service that's going to use it. On the other hand, a business process or high-level service knows the exact functionality of participating services and the context of their execution. The semantics of compensating transactions differ greatly from those of transaction rollback. Unlike the case of ACID transactions, where every action is rolled back as if it never happened, the functionality of a compensating transaction depends strongly on the point in the process at which compensation is invoked. The compensating transaction is defined not by the service itself, but rather by the business process or high-level service (the context in which the service is executed). This leads to the conclusion that compensating transactions aren't defined by a particular service, but rather by a business process or high-level service using particular services. This means that association of the compensating transactions with the particular service (the way it is defined in both standards) is not really the way compensating transactions are likely to be used. A better solution for introducing transactions to Web services is a proper design of the business process, supporting both business processing and compensating activities. A business process is better aware of the 'big picture' and, as such, is in a better position to decide which particular compensating activity is appropriate at each point in the process..." See: (1) Web Services Transaction (WS-Transaction); (2) Business Transaction Protocol.
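Lublinsky's conclusion -- the business process, not each service, should own the compensating actions -- is the pattern now commonly called a saga, and it can be sketched in a few lines: run the steps in order, and on failure run the compensations for the completed steps in reverse. The step and compensation names below are invented for illustration.

```python
log = []  # records the order in which actions and compensations run

def reserve_stock():  log.append("reserve_stock")
def release_stock():  log.append("release_stock")
def charge_card():    log.append("charge_card"); raise RuntimeError("declined")
def refund_card():    log.append("refund_card")

def run_process(steps):
    """steps: (action, compensation) pairs owned by the business process,
    not by the individual services. On failure, compensate completed
    steps in reverse order; this is compensation, not ACID rollback."""
    done = []
    for action, compensate in steps:
        try:
            action()
            done.append(compensate)
        except Exception:
            for compensate_done in reversed(done):
                compensate_done()
            return False
    return True

ok = run_process([(reserve_stock, release_stock), (charge_card, refund_card)])
print(ok, log)  # -> False ['reserve_stock', 'charge_card', 'release_stock']
```

Note how the pairing of each action with its compensation lives in the `steps` list the process supplies, matching the article's point that the service itself stays generic.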

  • [February 05, 2003] "What the Hell is IBM Information Integrator?" By Phil Howard. In The Register [IT Analysis]. (February 05, 2003). "Yesterday IBM announced its new Information Integrator family of products... ultimately this will consist of three offerings (although in the longer term the three products will probably converge), based on SQL, an object oriented API and an XML API respectively. However, the last of these, which will use XQuery, has not been announced yet, as it is awaiting the final definition of the XQuery standard. The two products that are announced (in beta) are Information Integrator, previously known by the code name of Xperanto; and Information Integrator for Content, where the former is the relational product and the latter is designed to provide integration in content management environments. In practice, the latter represents a re-positioning of IBM's Enterprise Information Portal (a subset of WebSphere Portal, which is really the company's enterprise information portal) for accessing mainly IBM content repositories together with other content and data sources... In today's article I will discuss the details of Information Integrator and tomorrow I will consider the circumstances under which it will be most appropriate to look at Information Integrator as opposed to alternative technologies such as ETL (extract, transform and load) tools. I will also consider some of the environments in which use of Information Integrator may be most beneficial... it is also important to realise that Information Integrator is not limited to accessing relational data sources - it can also access XML, flat files, Microsoft Excel, ODBC, Web and other content stores and so on, although updates and replication are limited to relational sources in the first release. Thus (for those of you who know the product) the full capabilities of DataJoiner have not been implemented in this release. 
There are some key features of Information Integrator that should be mentioned. In particular, you can query data wherever it resides, as if it was at a single location, with a single view across all the relevant data sources. The product supports queries by caching query tables across federated sources, while the optimiser will validate the SQL used against the source database and will automatically compensate if the relevant syntax is not supported on the remote database. Other features of the federation capabilities of the product include the ability to publish the results of a query to a message queue and to compose, transform and validate XML documents..." See the announcement: "IBM Simplifies Management and Integration of Business Information With New Software. First-of-Its-Kind Data Management Product Enables Businesses to Improve Productivity and Reduce Costs."

  • [February 05, 2003] "XML Matters: reStructuredText. A Light, Powerful Document Markup." By David Mertz, Ph.D. (Floating Signifier, Gnosis Software, Inc). From IBM developerWorks, XML zone. February 2003. ['The document format called reStructuredText has been adopted as one of the official source formats for Python documentation, but is also useful for other types of documentation. reStructuredText is an interesting hybrid of technologies -- in syntax and appearance it is similar to other "almost-plaintext" formats, but in semantics and API it is very close to XML. David takes a look at this format and shows you how existing tools can transform reStructuredText into several XML dialects (docutils, DocBook, OpenOffice), along with other useful formats like LaTeX, HTML, and PDF.'] "In the past, this column has looked at alternatives to XML -- document formats that satisfy many of the same purposes for which you might use XML. reStructuredText continues this tradition. In contrast to YAML, which is good for data formats, reStructuredText is designed for documentation; in contrast to smart ASCII, reStructuredText is heavier, more powerful, and more formally specified. All of these formats, in contrast to XML, are easy and natural to read and edit with standard text editors. Working with XML more-or-less requires specialized XML editors, such as those I have reviewed previously. reStructuredText (frequently abbreviated as reST) is part of the Python Docutils project. The goal of this project is to create a set of tools for manipulating plaintext documents, including exporting them to structured formats like HTML, XML, and TeX. While this project comes from the Python community, the needs it addresses extend beyond Python. Programmers and writers of all types frequently create documents such as READMEs, HOWTOs, FAQs, application manuals, and, in Python's case, PEPs (Python Enhancement Proposals). 
For these types of documents, requiring users to deal with verbose and difficult formats like XML or LaTeX is not generally reasonable, even if those users are programmers. But it is still often desirable to utilize these types of documents for purposes beyond simple viewing (such as indexing, compilation, pretty-printing, filtering, etc.). The Docutils tools can serve the needs of Python programmers in much the same way that JavaDoc helps Java programmers, or POD helps Perl programmers. The documentation within Python modules can be converted to Docutils document trees, and in turn to various output formats, usually within a single script. But for this article, the more interesting use is for general documentation. For articles like this, and even for my forthcoming book, I write using smart ASCII; but I am coming to feel that I would be better off with the formality of reStructuredText... As of this writing, the Docutils project is under development, and has not released a stable version. The tools that exist are good, but the overall project is a mixture of promises, good intentions, partial documentation, and some actual working tools..."
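As a flavor of why reST is "close to XML" in semantics while remaining plain text, here is a toy stdlib-only scanner (not the Docutils API) that recovers section titles from reST-style underlines:

```python
# Toy illustration: reStructuredText marks a section title by underlining it,
# e.g.
#
#     My Title
#     ========
#
# A few lines of code can recover that structure from the plain text, which
# is the sense in which reST carries XML-like semantics.

def rest_titles(text):
    """Yield (title, underline_char) for reST-style underlined titles."""
    lines = text.splitlines()
    for a, b in zip(lines, lines[1:]):
        # An underline is at least as long as the title and uses one
        # repeated adornment character.
        if b and len(b) >= len(a.rstrip()) > 0 and set(b) <= {b[0]} and b[0] in "=-~^":
            yield a.rstrip(), b[0]
```

Real Docutils parsing is far richer (any adornment character, overlines, nested sections); this only shows the underlying idea.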

  • [February 04, 2003] "UBL Set to Shake Up Electronic Commerce." By John Taschek (eWEEK Labs Director) and Jon Bosak (Sun Microsystems). In eWEEK (February 04, 2003). "Since the advent of electronic commerce, standards bodies have been in search of a lingua franca for business communication. Unfortunately, early efforts such as EDI (electronic data interchange) had a high barrier to entry in terms of cost and complexity. However, a new specification -- Universal Business Language (UBL) -- may actually deliver on the common-language promise. More recently, XML and the public Internet were thought to be likely candidates to succeed EDI. XML, however, was so open that it fragmented into various schemas. Although these schemas are technically compatible with each other, they are next to useless for organizations trying to conduct e-commerce -- just because the alphabets of various languages share the same characters does not mean that everyone is speaking the same language. Still, four of the major XML schemas are widely used: cXML (Commerce XML) is used for automated order receipt and fulfillment, and was spearheaded by Ariba and Sterling Commerce, among others. xCBL, which was developed by Veo Systems (purchased by Commerce One) and was funded in part by the Department of Commerce, is widely used by Commerce One customers and their suppliers. RosettaNet and OAGIS (Open Application Group Integration specification), meanwhile, are two of the more mature standards for business-to-business interoperability. UBL, built on xCBL and governed by OASIS, is meant to be exactly what its name implies: a universal standard for business-to-business communication. The UBL technical committee released its first draft of the specification last week. eWEEK Labs Director John Taschek recently spoke with Sun Microsystems Inc. 
XML Architect Jon Bosak -- UBL's leading proponent, a founding member of OASIS and the former chair of the W3C XML Coordination Group -- about how UBL can play a part in enterprises right now..." Excerpts: [Jon Bosak:] "... The [UBL XML] schemas we've just released for review are sufficient to implement the basic buy/sell relationship that accounts for most actual trade. They will need further customization for the specialized versions used in certain industries, but I believe that those specialized versions will be based on the generic documents and data components instantiated in this release. The UBL component library has been developed in close coordination with [UN/EDIFACT] guidelines and is slated for contribution to the UN business semantic registry. So what you're seeing now is quite likely very close to what the world will use for XML-based B2B for the next few years. If I were in a business organization considering a move to XML, I'd want to take a good hard look at this review package to make sure that it meets my business requirements before it moves on to standardization... I think it's time for us to face the fact that electronic commerce is a form of commerce, not a form of electronics. People seem to forget that we're not inventing a worldwide system of trade; we've already got one of those, and it's taken about 4,000 years to put it in place. It's got its own methods, its own laws, its own systems of customs and taxation. It's unrealistic to think that we'll transform traditional business and legal systems overnight. What we need are technologies like UBL and ebXML that allow businesses of every size to make the transition to electronic commerce incrementally, to upgrade pieces of their infrastructure in place so that they can achieve maximum ROI within their particular context and with a minimum of disruption to their existing business. 
By providing standard XML versions of EDI messages and paper documents, UBL is designed to enable this incremental transition to electronic commerce..." With respect to the recently released UBL schemas, see "UBL Technical Committee Releases First Draft of XML Schemas for Electronic Trade." General references in "Universal Business Language (UBL)."

  • [February 04, 2003] "What's New in UDDI 3.0 - Part 2." By Frank Sommers. From WebServices.Org. February 03, 2003. ['The second in a series of articles that provide an in-depth review of the most recent UDDI (Universal Description, Discovery, and Integration) Web service registry standard.'] "While the description of data structures takes up about half of Version 1 and 2 of the UDDI specifications, the other half defines the manner in which you can interact with those data structures. That interaction occurs via UDDI's public, or programmer's, API. That API is grouped into API sets according to functionality. The UDDI specifications define those API sets via the XML messages that must be exchanged between a UDDI registry and a registry client. Those XML messages are transmitted via the SOAP protocol... The most important API sets define the publishing and querying operations for a UDDI registry... Version 3's most significant API changes concern the addition of four new API sets, and modifications to the semantics of existing API functions. The latest changes result from feedback during UDDI's two years of history, and aim to ease the development of business-to-business e-commerce applications... The UDDI 3 subscription API allows WidgetsRUs to be automatically notified when a new supplier publishes its Web service, or when an existing supplier's Web service registration changes. That eliminates the need for periodic administrative querying of UDDI, and makes incorporating new suppliers in the internal MIS system more timely... Note that the UDDI 3 subscription API is an optional API set: A UDDI registry is not required to support it. While it is a very useful feature in support of business-to-business e-commerce, it remains to be seen how many UDDI 3-compliant registries will offer it..." See also: (1) "What's New in UDDI 3.0 - Part 1"; (2) "UDDI Version 3 Features List". General references in "Universal Description, Discovery, and Integration (UDDI)."


  • [February 04, 2003] "Comparing Style Sheet Languages." By Håkon Wium Lie. With contributions from Bert Bos, Christopher R. Maden, Paul Grosso, Arved Sandstrom, and Vincent Quint. Posted 2003-02-04 to the DSSSL List. "The table below compares how various style sheet languages express common (are they?) tasks. Text in red indicates that I'm not quite sure if the code is correct. Help with proofreading the examples, as well as filling in the open cells, is appreciated... Christopher R. Maden has helped out with DSSSL examples in the past, and I've also found inspiration in the DSSSL Cookbook. Feel free to suggest text for the blank areas... The challenges are meant to reflect needs of authors, but are also designed to show off crucial differences and similarities in the various style sheet languages..." The document identifies a number of tasks ("challenges") for style languages and provides sample code if the style language is up to the challenge. Style languages include CSS, FOSI, DSSSL, XSL, P93 [Amaya/Thotlib 1993], and PSL. Example challenges: (1) 'Set the font size of all "H1" elements to 20pt'; (2) 'Set H1's font size to the double of the parent'; (3) 'Set H1's font size to the double of the root element's font size'; (4) 'Select any P element that is the first P child element of its parent'; (5) 'Specify that content should flow into two columns'; (6) 'In an HTML table, select all odd-numbered rows'.

  • [February 04, 2003] "Business Processes and Workflow in the Web Services World. A Workflow is Only As Good As the Business Process Underneath It." By Margie Virdell (e-business Architect, IBM Developer Relations). From IBM developerWorks, Web services. January 2003. "This article looks at business processes, their relationship to workflow and Web services today, and the challenges that lie ahead... A business process can be defined as a set of interrelated tasks linked to an activity that spans functional boundaries. Business processes have starting points and ending points, and they are repeatable... The standardization of workflow behavior and interoperability is late to this game, trying to standardize everything all at once. Perhaps the standards bodies and their members should focus their efforts on finishing the basic, core workflow standard and not be so distracted by the outer layers..." References several workflow specifications, including: '(1) Wf-XML and Workflow Reference Model from the Workflow Management Coalition (WfMC): Wf-XML is an XML-based encoding of workflow interoperability messages. The Workflow Reference Model is a description of the underlying workflow system architecture. Wf-XML has no binding to SOAP and WSDL at this time. (2) WSFL, IBM's Web Services Flow Language: Specifies two types of Web services composition: A) an executable business process known as a flowModel, and B) a business collaboration known as a globalModel. Compatible with SOAP, UDDI, and WSDL. (3) XLANG, Microsoft's XLANG: Business modeling language for BizTalk, which is a component of .NET that enables EAI. BizTalk Orchestration is the workflow engine and BizTalk Orchestration Designer is the visual business process modeling tool based on XLANG. (4) BPEL4WS, Business Process Execution Language for Web Services, is the cooperative merging of WSFL and XLANG for Web services orchestration, workflow, and composition. 
It has not yet been submitted to an IT standards organization. (5) ebXML BPSS The eBusiness Transition Working Group carries forward the definition of workflow conversation and orchestration in the Business Process Specification Schema (BPSS) layer of ebXML, which defines many protocols and layers for XML-based e-business. (6) WSCI Sun/BEA/Intalio/SAP consortium's Web Services Choreography Interface 'is an XML-based interface description language that describes the flow of messages exchanged by a Web Service participating in choreographed interactions with other services.' (7) WSCL W3C's Web Services Conversation Language: A submission by Hewlett-Packard to the W3C, it allows defining the abstract interfaces of Web services (that is, the business level conversations or public processes supported by a Web service), the XML documents being exchanged, and the sequencing of those documents. (8) PIPs, RosettaNet's Partner Interface Process: defines business processes between trading partners via specialized system-to-system XML-based dialogs. Many PIPs have been defined for various partner scenarios. (9) JDF CIP4's Job Definition Format is an upcoming workflow industry standard for the Graphics Arts industry designed to simplify information exchange among different applications and systems..."

  • [February 04, 2003] "IBM Unveils Xperanto Data Integration Technology. Two Products Set for Beta Release." By Paul Krill. In InfoWorld (February 04, 2003). "IBM on Tuesday will unveil beta releases of the first two products from its Xperanto federated data integration project, DB2 Information Integrator and DB2 Information Integrator for Content... Although users will not need IBM's DB2 database to make the products function, the offerings are based on IBM's DB2 database optimizer technology for optimizing data integration, and also on IBM's federated database technology, which has been code-named Garlic and enables joining of data from multiple sources... IBM is positioning the products as useful for applications such as customer relationship management, where call center operators need to pull together information about customers that resides in multiple databases as well as in unstructured sources such as e-mail messages. XML documents also can be integrated. The multiplatform technology will run on systems ranging from Hewlett-Packard and Sun hardware to Linux and Windows and IBM's own AIX and mainframes. Software from multiple vendors can be integrated via the products, including relational and non-relational data sources such as documents and images. Web services-based data, such as a real-time stock feed, also could be integrated via Xperanto products. Of particular interest in IBM's new products are an XML interface and support for Hidden Markov Model Regression, for linking databases to calculations, [Craig] Stewart said. IBM's offerings will radically alter the EII (enterprise information integration) marketplace by adding write-back capabilities to distributed data, rather than just enabling the reading of this data, said analyst Philip Russom, research director at Giga Information Group, in Cambridge, Mass. 'In a nutshell, it's going to change the face of the EII marketplace,' Russom said. 
'IBM closes the loop there with data integration; with the others it's one way out'... IBM beta cycles typically take six months, Perna said. Thus, DB2 Information Integrator and DB2 Information Integrator for Content will be generally available for shipment approximately by August. A subsequent Xperanto product will add support for XQuery, the XML-based query language. IBM has no date for when that product will be ready..." See the announcement: "IBM Simplifies Management and Integration of Business Information With New Software. First-of-Its-Kind Data Management Product Enables Businesses to Improve Productivity and Reduce Costs."

  • [February 04, 2003] "IBM Releases Xperanto Beta." By Lisa Vaas. In eWEEK (February 04, 2003). "IBM Tuesday rolled out a beta version of the first product to come out of its years-long Xperanto data integration research project. Consisting of two products, the DB2 Information Integrator and the DB2 Information Integrator for Content, the software encapsulates IBM's goal to help customers access, integrate and analyze all forms of data within and beyond the enterprise. The offerings fall into the Armonk, N.Y., company's 'On-Demand' initiative. Nelson Mattos, IBM's director of information integration, said that the software's ability to integrate both structured and unstructured data -- including XML, e-mail, multimedia files, Web services, and even data from competitive sources such as Oracle Corp. and Microsoft Corp. databases -- will enable customers to make the right business decisions in real time, as opposed to having to think in advance about how to structure data queries... The two products support two separate development platforms: DB2 Information Integrator is tailored to the SQL-based, or structured data, developer community, whereas DB2 Information Integrator for Content supports the content management programming model, which is primarily geared to unstructured data. DB2 Information Integrator provides XML support for accessing and integrating XML documents as data sources and for generating XML documents as query results, but as of this beta release, support for XQuery -- the XML query language -- is still missing. According to an IBM spokeswoman, IBM is basing this initial offering on SQL, where there is both ample skill and a broad tool set that customers can exploit immediately. IBM plans to add XQuery support when it becomes a standard, likely next year, she said..." See details in "DB2 Data Management" and in the announcement: "IBM Simplifies Management and Integration of Business Information With New Software. 
First-of-Its-Kind Data Management Product Enables Businesses to Improve Productivity and Reduce Costs."

  • [February 04, 2003] "Internaut: How Agencies Could Use XML to Ensure Integrity of Data." By Shawn P. McCarthy. In Government Computer News Volume 22, Number 2 (January 27, 2003). "Who is the ultimate authority for government data quality? Chances are you trust the data you collect within your own agency. You 'sort of' trust other agencies' data, and you cross your fingers and hope for the best with everything else. Growth of Web services will aggravate this predicament. When data is assembled from multiple databases, Web sites and documents, it's tough to ensure an even level of quality, or to guarantee authoritative sources. So what exactly makes a piece of government data authoritative? [...] Extensible Markup Language can make it possible to establish a specific authority for all data in a collection -- not just Social Security numbers but literally every bit of data the government gathers. It's a matter of setting up the proper hierarchies. Government offices are unlikely to validate their local records directly with SSA every day. But they could pull their data from locations that do conduct daily validations with the ultimate authority... One possibility is ebXML, which stands for electronic business using XML. This suite of specifications is designed for conducting commercial business over the Internet in a services-based architecture, so it's not a complete answer for government data sharing. But ebXML does let users set up complicated mappings and hooks to pull data from appropriate Internet sources..." [Hmmm.]

  • [February 04, 2003] "Sun to Standardize Web Services in J2EE 1.4. Next Version Supports WS-I's Basic Profile Specification." By James Niccolai. In InfoWorld (February 04, 2003). "Sun Microsystems will incorporate an important specification with the next version of its enterprise Java platform that is designed to ensure interoperability among Web services applications. Version 1.4 of Sun's Java 2 Enterprise Edition (J2EE), which is due for release mid-year, will incorporate the Basic Profile specification developed by the Web Services Interoperability Organization. The WS-I is a multi-vendor group founded by IBM, BEA Systems, Microsoft and others to help define standards for the emerging Web services model. The Basic Profile defines a standard method for employing a handful of technologies that have become central to Web services. They include XML (Extensible Markup Language) and SOAP (Simple Object Access Protocol) for messaging, WSDL (Web Services Description Language) for describing services, and UDDI (Universal Description, Discovery and Integration) for looking them up on a network. Some developers have used those technologies already, but without the programming and data models laid out in the Basic Profile they have had no assurance that their applications will interoperate with those of other developers. Adding the Basic Profile to the next version of J2EE is intended to provide that assurance, said Ralph Galantine, a group marketing manager at Sun. Java licensees including Sun, Oracle, IBM and BEA are expected to release certified J2EE 1.4 products soon after the standard is finalized. As a member of the WS-I, Microsoft is also expected to back the Basic Profile, in a rare example of cooperation between Microsoft and its rivals in the Java camp. A Microsoft spokeswoman noted that the specification has yet to be finalized, but said Microsoft will support it in software products when it's completed. 
A draft of the WS-I Basic Profile was released in October, and at that time the group was shooting for completion early this year... The Web services model provides a way for linking different types of business applications together, either within an organization or, it's hoped, among partners, suppliers and customers for streamlining commerce. More ambitiously, proponents say, Web services can be used to "expose" business programs, such as a retirement plan application, as services that can be used by other companies. After a year of steady hype, however, the model has taken off only gradually and in a limited way, analysts have said. Concerns have been raised about security, a lack of clearly defined standards and the sheer complexity of the development work involved. Adding the WS-I Basic Profile is intended to go some way towards meeting some of those concerns..." See details in the announcement: "Sun Microsystems Drives Industry Towards Web Services Interoperability With Major Enhancements to Java Platform. Drives Innovation with Implementation of WS-I Specification Support in Java 2 Platform, Enterprise Edition (J2EE) Version 1.4." General references in "Web Services Interoperability Organization (WS-I)."

  • [February 04, 2003] "Java Specification Waits for Web Services." By Martin LaMonica. In CNET (February 04, 2003). "A revision to a much-anticipated Java standard will be delayed about three months in order to comply with guidelines designed to keep Web services interoperable... The J2EE 1.4 specification, which gives Java licensees the blueprint for building Java programming tools and server software, was set to debut in the first quarter of 2003. The forthcoming J2EE specification incorporates Web services protocols, a set of standards and a programming method for connecting disparate computing systems. Adoption of Web services is accelerating as companies look for ways to lower the cost of sharing information. Sun representatives said the company chose to push back the finalized J2EE 1.4 specification in order to comply with interoperability guidelines set forth by the Web Services Interoperability organization (WS-I). The WS-I is a consortium of about 160 companies that examine existing Web services standards and provide guidelines and testing tools to assure compatibility of Web services products. The majority of the WS-I's members are IT providers, including industry heavyweights IBM, Microsoft, Sun and Oracle... Sun has already introduced Web services add-ons and developers are already writing Web services with Java. The J2EE 1.4 specification codifies the Web services support in an effort to ensure compatibility across purveyors of Java tools and application servers. Although IT providers and businesses can build customized applications using Web services, minor differences in Web services products can cause glitches. The WS-I's basic profile, set to be finalized in the second quarter of this year, is intended to clarify ambiguities in the different Web services standards to assure that products work together. The organization is now expanding its focus to Web services security..." 
See: (1) the text of the announcement; (2) J2EE website; (3) "Web Services Interoperability Organization (WS-I)."

  • [February 03, 2003] "XForms for Managing Forms-Based Data." By Anthony Tomasic. In XML & Web Services Magazine Volume 3, Number 7 (December 2002/January 2003), pages 24-27. ['XForms defines a mini XML-based programming language that simplifies data-entry implementations dramatically.'] "The W3C's draft XForms standard defines a set of XML elements that expand vastly the power of data-entry devices, such as browsers, to capture and validate forms-based data. The XForms Working Group has focused on device independence, data validation, and improved internationalization support to give you a new, elegant, and powerful way to code data-entry systems. By centralizing form behavior and data validation into a single location, XForms eliminate the hassle of browser scripting, the associated quagmire of multiple browser-language versions, and the mind-numbing coding of data-validation checks... You can leverage your knowledge of the existing W3C standards XForms builds on -- XML, XHTML, Cascading Style Sheets (CSS), XML Schema, XPath, and XML events -- for a lower XForms learning curve. XForms' features help you spend more time on creative, application-specific work and less time on routine infrastructure... A traditional implementation of this system would consist of four pieces. First, Java Server Pages (JSP) present a form, and, second, JavaScript client code validates some of the data entry. Data that users enter on a form goes to a server by HTTP POST, and the result is in Java. If data-entry errors are detected by the server-side validation code -- the third component -- the system delivers another form (containing the partially valid data the user entered on the previous form) to the user. Otherwise, the result of data entry is transformed into the fourth element -- an XML document -- and sent on for the next processing step. Web designers and developers might write a thousand lines of JSP, JavaScript, and Java code for even a simple form. 
An XForms-based implementation of the same system consists of three parts. The first is a set of XForms documents the Web designers and developers write. The second is a shared XML schema data model a developer or XML administrator defines. The third part -- an XForms standard engine -- interprets the XForms documents and XML schema. Each XForms document handles all four aspects of the traditional implementation. The documents reference the shared XML schema, designed explicitly for data entry, that provides explicit support for data validation. A user browser generates an HTTP request for an XForms document; the XForms engine manages the interaction with the user according to the rules defined in the XForms document; the XForms engine validates data entry; and valid data-entry results are delivered to the server as XML documents, ready for the next step in processing. This system requires far less time and labor to construct than the traditional implementation, and it's easier to modify..." See: "XML and Forms."
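The model/instance/bind separation described above might look like the following sketch, loosely based on the W3C XForms 1.0 working draft; element names, attributes, and namespaces should be verified against the current draft before use:

```xml
<!-- Sketch only: a model holding the instance data, a bind rule that
     replaces hand-written validation script, and a form control that
     references the instance by XPath. -->
<xforms:model xmlns:xforms="http://www.w3.org/2002/xforms"
              xmlns:xsd="http://www.w3.org/2001/XMLSchema">
  <xforms:instance>
    <order xmlns=""><quantity>1</quantity></order>
  </xforms:instance>
  <xforms:bind nodeset="/order/quantity" type="xsd:positiveInteger"/>
  <xforms:submission id="submit-order" method="post"
                     action="http://example.com/orders"/>
</xforms:model>

<xforms:input ref="/order/quantity"
              xmlns:xforms="http://www.w3.org/2002/xforms">
  <xforms:label>Quantity</xforms:label>
</xforms:input>
```

The bind rule centralizes the validation that the traditional implementation scatters across JavaScript and server-side code, which is the labor saving the article claims.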

  • [February 03, 2003] "Localization Made Easy." By Claude Duguay. In XML & Web Services Magazine Volume 3, Number 7 (December 2002/January 2003), pages 35-38. ['Learn a powerful and easy approach to building a localization infrastructure for Java, based on XML.'] "XML is particularly well suited for handling multiple language problems, such as internationalization and localization, because XML is great at identifying the encoding scheme as part of its document header information. Java provides a powerful localization mechanism called resource bundles, which must either be coded into Java classes or handled by the default Java Properties object and its associated file format. XML seems better suited for this mechanism than property files; therefore, we'll build a localization infrastructure for Java that's based on XML. One task that the standard Java ResourceBundle class does well is handling the management of property filenames using a hierarchical naming convention based on Locale objects. A Locale is built up from three elements: Language, Country, and Variant. In practice, the Language is important and the Country is sometimes important, but the Variant is rarely used. Each element can be represented by two-letter symbols. Filenames can be built easily by using an underscore delimiter and a base filename... This approach to localization in Java is both powerful and easy to apply. The localization process is nontrivial on larger projects, but using XML may enable you to apply more flexible tools. Authoring tools can simplify the process, or you can employ a database to store records that are exported to XML during the build process. In either case, this approach can yield considerable savings in development time and effort, thanks to using XML resource bundles that provide all the benefits of the standard Java localization model, along with all the benefits of XML content management..."
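The hierarchical file-naming scheme the article describes (base name plus Language, Country, and Variant, tried most-specific first) can be sketched as follows; the lookup order mirrors Java's ResourceBundle convention, and the `.xml` extension is this sketch's assumption:

```python
# Sketch of resource-bundle candidate names built from a Locale's
# Language / Country / Variant elements, joined with underscores and
# ordered from most to least specific, ending with the base file.

def bundle_candidates(base, language, country=None, variant=None, ext="xml"):
    """Return candidate bundle file names, most specific first."""
    parts = [language]
    if country:
        parts.append(country)
        if variant:
            parts.append(variant)
    names = []
    for i in range(len(parts), 0, -1):
        names.append("_".join([base] + parts[:i]) + "." + ext)
    names.append(base + "." + ext)  # final fallback
    return names
```

A loader would try each name in turn and use the first file that exists, giving the graceful fallback from, say, `messages_fr_CA.xml` to `messages_fr.xml` to `messages.xml`.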

  • [February 03, 2003] "Speaking XML." By Adam Bosworth (Vice President, Engineering, BEA Systems Inc). In XML & Web Services Magazine Volume 3, Number 7 (December 2002/January 2003), page 40. "XML poses some interesting challenges for programmers. This is the first of a series of columns in which I will look at XML's interaction with programming languages. XML's schema model is not as hardened as are types in a programming language, but in some ways it is richer. Language has nothing even remotely equivalent to mixed content, for example. Mapping XML into program data structures inherently risks losing semantics and even data because any unexpected annotations may be stripped out or the schema may be simply too flexible for the language... Today's programmer has two tools available to parse and manipulate XML files: the Document Object Model (DOM) and Simple API for XML (SAX). Both, as we shall see, are infinitely more painful and infinitely more prolix than the previous code example... While the DOM can be used to access elements, the language doesn't know how to navigate through the XML's structure or understand its schema and node types. Methods must be used to find elements by name... In short, the current situation is unacceptable. With the increasing ubiquity of XML both as a way to describe metadata and exchange information between programs and applications, and with the rocketing acceptance of XML Web services, it is becoming increasingly necessary for developers to directly access and manipulate XML documents. It should not require that they be rocket scientists to do so. In the next issue I'll discuss how work that's brewing in the developer community to address these matters holds extraordinary promise for developers everywhere..."
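The method-based navigation Bosworth objects to is visible in any DOM binding. A minimal sketch using Python's `xml.dom.minidom` (standing in for the Java DOM the column discusses): every step through the tree is an explicit API call rather than language-level syntax.

```python
from xml.dom.minidom import parseString

doc = parseString("<order><item sku='A1'>Widget</item></order>")
# The language cannot navigate the tree directly; each step is a method call.
items = doc.getElementsByTagName("item")   # find elements by name
name = items[0].firstChild.data            # text node under <item>
sku = items[0].getAttribute("sku")         # attribute access, again by method
```

Compare this with a hypothetical native syntax such as `order.item.sku`; the gap between the two is precisely the column's complaint.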

  • [February 03, 2003] "Web Service Versioning and Deprecation. An Easy-to-implement Strategy for Success." By Jeff Kenyon (Qwest). In WebServices Journal Volume 3, Issue 2 (February 2003), pages 18-24. "Current standards for SOAP, WSDL, and UDDI have no explicit support for the versioning and deprecation of Web services. This article introduces a means for Web service versioning and deprecation that is lightweight and flexible, and requires minimal development effort. This approach is intended primarily for use within a corporation using one or more internal UDDI registries. It may be applied in the UDDI public registry "cloud," but only for controlling versioning and deprecation for Web services controlled by a single provider... One approach to [versioning] requirements is a Web service versioning and deprecation strategy centered on applications searching for and binding to Web services by searching for services that support one or more required interfaces (tModels). The following scenarios illustrate the strategy, first from the Web service developer side, and then from the application developer side... to implement versioning and deprecation, the following roadmap is recommended: (1) Web service developer builds support for getVersion message into his or her service; (2) Web service developer registers Web service in UDDI; (3) Web service consumer programs to interface, rather than implementation; (4) Web service consumer checks for deprecation of interface on a regular basis. The process outlined in this document is reasonably lightweight. By limiting the requirements to placing versioning and deprecation data in standard locations, the developers are free to determine the best approach for implementation within their own application context. For an application that carries out a startup procedure at regular intervals, it may mean retrieving the appropriate binding upon startup and performing a deprecation check as a maintenance process once a month. 
For an application intended to run continuously, it may mean checking for more recent versions and deprecations once a week... One very important principle that Web service developers must follow is that any time they modify the WSDL interface, they must create a new tModel. A given tModel represents a version of a specification; just as you cannot go back and modify an existing specification and keep the original version number, you cannot modify a tModel (or the underlying versions of the documents it refers to) once released. Developers will code their applications to the interface described in the WSDL file attached to the tModel but if the Web service developer changes that interface without issuing a new tModel, application Web service calls will suddenly start breaking... This approach to versioning takes advantage of the features of UDDI, but relies upon the participation of both application and Web service developers in equal measure (sort of the Web services version of Mutual Assured Destruction). Application developers rely on Web service developers to implement the getVersion message, to maintain proper UDDI registry entries, and to continue supporting specific interfaces until they are officially deprecated. Web service developers rely on application developers to bind to a service based on the tModel required, and to use UDDI to find the binding for the latest version of a service supporting that tModel. Programming to interfaces, rather than to specific implementations of Web services, ensures that application developers will always use the most up-to-date Web service, and will be notified in a timely manner of all (and only) relevant interface deprecations..." [alt URL]
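The consumer side of the roadmap above can be illustrated with a small sketch. Everything here is hypothetical: the dictionary stands in for a UDDI inquiry, and the tModel keys and URLs are invented for illustration. What it shows is the principle of binding to an interface (tModel) rather than an implementation, and checking deprecation before use.

```python
# Hypothetical stand-in for a UDDI registry: each tModel key (one interface
# version) maps to a binding plus a deprecation flag, per the article's roadmap.
REGISTRY = {
    "uuid:quote-v1": {"deprecated": True,  "binding": "http://example.com/quote/v1"},
    "uuid:quote-v2": {"deprecated": False, "binding": "http://example.com/quote/v2"},
}

def resolve_binding(tmodel_key, registry=REGISTRY):
    """Bind to the interface (tModel), not a specific implementation,
    and surface deprecation so the caller can migrate in time."""
    entry = registry[tmodel_key]
    if entry["deprecated"]:
        print(f"warning: {tmodel_key} is deprecated; look for a newer tModel")
    return entry["binding"]
```

A real client would run this lookup at startup or on a maintenance schedule, as the article suggests, rather than hard-coding the binding.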

  • [February 03, 2003] "On the Road to Web Service-Level Management. Critical Components in the Creation of Self-Managing Computer Resources." By Mark Potts (Talking Blocks), Kevin Ruthen (IBM), and Heather Kreger (IBM). In WebServices Journal Volume 3, Issue 2 (February 2003), pages 26-31. "Web services is now delivering on the promise of interconnecting systems, within and between organizational boundaries. But the benefits of open interoperability of such distributed resources only increase the complexity of the computing environment that has to be managed... In this article we'll look at the role traditional systems management has played in the enterprise, requirements specific to managing a Web services environment, and how Web services in a managed environment can improve service levels while reducing the overhead and costs associated with managing complex distributed environments. In order for Web services to be manageable in an interoperable manner, three building blocks must be standardized: the basic manageability information that the components of the Web services architecture must support, how that information is accessed, and how the manageable components are discovered. The first of the building blocks, the minimum, basic information required for managing Web services and their environment, is being developed and standardized in the W3C's Web Services Architecture Working Group's Management Task Force. The Web Services Architecture includes a requirement that implementations of the architecture must be manageable. A task force was initiated to satisfy this requirement and is working to publish a manageability model for each of the components of the Web services architecture, i.e., services, hosting environment, and discovery agency. The manageability model must include identification, configuration, metrics, and events for the components. 

The second and third of the building blocks, access to and discovery of the management information, are being developed and standardized in the OASIS Management Protocol Technical Committee (MPTC). The MPTC is defining how to access manageability information for any managed resource using Web services. The same specification should also work for Web services as a specific type of managed resource. These building blocks have been discussed in terms of how to manage Web services in particular, but you can see that the same principles can be applied to managing IT resources in general. The use of Web services to expose IT resources on Grid systems is being developed and standardized by GGF at Globus. They are also defining how to manage these IT resources using Web services and Grid services. It is logical that they should be able to leverage the same foundation building blocks being developed by the W3C and the OASIS MPTC. Management of distributed computing environments has always been difficult, but the dynamic nature of Web services and the more loosely coupled nature of interactions make it even more difficult to control and administer. However, Web services, coupled with service-level management in a WSMP, offers opportunities for organizations to better leverage their existing IT investment, better manage IT assets and resources in alignment with business objectives, and introduce autonomic concepts in managing their IT infrastructure that effectively reduce the overhead and cost of managing and maintaining their operational systems. Web services management should not be seen as a requirement once services are deployed and proliferating throughout the enterprise; it is a requirement for day-one deployment of your first Web service..."

  • [February 03, 2003] "A Publish/Subscribe Mechanism for Web Services. Extending an Existing Event Broker." By Dmitri Tcherevik (Office of the CTO, Computer Associates). In WebServices Journal Volume 3, Issue 2 (February 2003), pages 10-15. "In this article I present the design and implementation of a publish/subscribe Web service that can be used by applications deployed on the Internet to subscribe for, send, and receive events using ubiquitous Internet protocols. In addition, the service can work as a gateway between SOAP/HTTP and traditional messaging protocols such as JMS, MSMQ, MQSeries and others. We implement the service as an extension of an existing event broker with the help of an off-the-shelf Web services development kit. The publish/subscribe service can be used in a large number of applications that require asynchronous one-to-many messaging: system monitoring and management, information replication, instant messaging, peer-to-peer computing, and others. [Let's] examine the types of communication taking place between the agents and the management applications. First, there is a stream of system status messages flowing from the agents to the management applications. Second, there is a stream of control messages flowing back from the management applications to the agents. The case of control messages is relatively straightforward. A control message has a well-defined source and a well-defined destination. Therefore, it can be delivered easily using a traditional, one-to-one request/reply protocol. All of the available RPC mechanisms, such as IIOP, Java RMI, COM, or SOAP-RPC, satisfy this requirement. The case of status messages is much more interesting. Messages sent by agents are caused by changes in the environment and are therefore asynchronous by nature. In addition, an agent sending a message is decoupled from the management application and rarely knows who will be receiving the message on the other end. 
Finally, there can be many management applications receiving messages from a single agent. Therefore, what we have here is an asynchronous one-to-many type of messaging that cannot be carried over a traditional RPC link. Instead, a publish/subscribe mechanism is required. This problem has been successfully solved in many products and middleware frameworks. Typically, it is addressed by introducing a stand-alone publish/subscribe service, sometimes called an event broker or an event intermediary. From the point of view of the event broker, the world is divided into two types of entities: event producers and event consumers. Event producers advertise event types and raise events by sending messages to the event broker. Event consumers express interest in events by registering event subscriptions with the event broker. The event broker matches the two parties by forwarding events sent by the producers to endpoints registered by the consumers... In addition to forwarding events, the event broker can implement a slew of useful functions such as event filtering, event logging, guaranteed event delivery, event correlation, protocol translation, and others. In this article, however, I'll focus not on the functionality of the event broker but on the communication mechanism used by consumers and producers to access its services. In particular, we want to understand what it would take to deploy an existing event broker on the Internet as a Web service that could be easily accessed by other Web services and applications..." [alt URL]
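The broker's core matching logic, forwarding producer events to every consumer registered for a topic, fits in a few lines. A minimal in-process sketch (plain callbacks stand in for the SOAP/HTTP or JMS endpoints a real broker would invoke):

```python
from collections import defaultdict

class EventBroker:
    """Minimal one-to-many broker: producers publish to a topic and the
    broker forwards the event to every consumer registered for it."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, event):
        for callback in self._subscribers[topic]:
            callback(event)

received = []
broker = EventBroker()
broker.subscribe("status", received.append)
broker.subscribe("status", lambda e: received.append(e.upper()))
broker.publish("status", "disk full")   # one event, two deliveries
```

A deployable broker would layer on the event filtering, logging, and guaranteed delivery the article mentions; the forwarding loop above is only the matching kernel.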

  • [February 03, 2003] "Actional SOAPstation. Finding Order and Scale in Complexity." By Joe Mitchko. In WebServices Journal Volume 3, Issue 2 (February 2003), page 16. "Following SOAPswitch, its successful initial Web services product, Actional has announced a new product aimed at bringing order to the complex back-end configurations required to service a wide variety of specialized business partner-based Web services. SOAPstation is an entirely configurable, rule-based engine designed to address many corporate IS concerns when a SOAP request traverses the network. To begin, SOAPstation is a Web service proxy, which means that it will handle a SOAP request on behalf of the actual service point, do some processing, and then forward the request to the real service. To better understand SOAPstation's capabilities, let's follow the path of a Web service SOAP request as SOAPstation processes it through a series of message-processing blocks called a service group. SOAPstation starts by authenticating the SOAP request, using a variety of configurable security methods such as HTTPS certificates and SAML assertions... In addition to being able to monitor Web service traffic, the SOAPstation Admin Console provides you with the capability to set up and configure a wide variety of rules using simple wizard-based configuration screens. In fact, the entire process of setting up a managed Web service is very straightforward and can be achieved in most cases in a few minutes. The exception will be for some of the more complex rules or transformations where you may need to throw in some custom Java code or XSL transformations. SOAPstation also provides built-in facilities for adding instrumentation and logging to your managed services. For instance, you can easily set up rules for a service that allow you to e-mail an administrator when certain events take place in the message flows, such as increases in error counts or network timeouts. 
SOAPstation is designed to bring order to the complex decisions that are required when processing Web service requests, and provides the ability for corporate IS to scale Web service offerings across the enterprise in an organized and controlled manner. It was generally available in late 2002..." See the datasheet from Actional.

  • [February 03, 2003] "Web Ontology Language (OWL) Abstract Syntax and Semantics." Edited by Peter F. Patel-Schneider (Bell Labs Research, Lucent Technologies), Patrick Hayes (IHMC, University of West Florida), and Ian Horrocks (Department of Computer Science, University of Manchester). W3C Working Draft 3-February-2003. "This description of OWL, the Web Ontology Language being designed by the W3C Web Ontology Working Group, contains a high-level abstract syntax for both OWL DL and OWL Lite, sublanguages of OWL. A model-theoretic semantics is given to provide a formal meaning for OWL ontologies written in this abstract syntax. A model-theoretic semantics in the form of an extension to the RDF model theory is also given to provide a formal meaning for OWL ontologies as RDF graphs (OWL Full). A mapping from the abstract syntax to RDF graphs is given and the two model theories are shown to have the same consequences on OWL ontologies that can be written in the abstract syntax... This document contains two formal semantics for OWL. One of these semantics, defined in Section 3, is a direct, standard model-theoretic semantics for OWL ontologies written in the abstract syntax. The other, defined in Section 5, is a vocabulary extension of the RDF model-theoretic semantics that provides semantics for OWL ontologies in the form of RDF graphs. Two versions of this second semantics are provided, one that corresponds more closely to the direct semantics (and is thus a semantics for OWL DL) and one that can be used in cases where classes need to be treated as individuals or other situations that cannot be handled in the abstract syntax (and is thus a semantics for OWL Full). These two versions are actually very close, only differing in how they divide up the domain of discourse. 
Appendix A contains a proof that the direct and RDFS-compatible semantics have the same consequences on OWL ontologies that correspond to abstract OWL ontologies that separate OWL individuals, OWL classes, OWL properties, and the RDF, RDFS, and OWL structural vocabulary. For such OWL ontologies the direct model theory is authoritative and the RDFS-compatible model theory is secondary. Appendix A also contains the sketch of a proof that the entailments in the RDFS-compatible semantics for OWL Full include all the entailments in the RDFS-compatible semantics for OWL DL..." This document was produced by the W3C Web Ontology Working Group as part of the W3C Semantic Web Activity. See: "XML and 'The Semantic Web'."

  • [February 03, 2003] "Let the Building Begin." By Patricia Daukantas. In Government Computer News (GCN) Volume 22, Number 2 (January 27, 2003). "Ready or not, Web services are coming to your agency. Small interagency groups, directed by the Federal Enterprise Architecture Program Management Office, are working feverishly on reference models to define how agencies will link and reuse applications this year and beyond. Experts in the public and private sectors say that the entire IT industry is moving toward a Web services model for on-the-fly delivery of content and services. They argue that agencies should get in on the ground floor by making their unique needs known to commercial developers of the services. Currently, more than 135 specifications for Extensible Markup Language and Web services stand at varying stages of approval by standards bodies. Learning the details of those languages and technologies 'was an enormous challenge but also an enormous delight,' said Robert Haycock, FEAPMO's acting program manager. Now on detail to the Office of Management and Budget from the Interior Department, Haycock is one of the government's principal evangelists for enterprise architecture. The CIO Council's Architecture and Infrastructure Committee is creating a subcommittee to recommend new technologies... 'Both XML and Web services are ready for prime time, but it depends what the prime-time application is,' said Brand L. Niemann, an Environmental Protection Agency computer scientist and chairman of the CIO Council's XML Web Services Working Group... A key element of the enterprise architecture effort, Haycock said, is the use of software components. A component is a reusable chunk of code. Software developers can combine components to build apps without writing code from scratch -- a long-time goal that supports OMB's drive for agencies to share apps and the funds for them. 
Enterprise architecture proponents want a library of interoperable components that can be reused anywhere within government. XML, the core of Web services, is a markup language that tags content to separate it from its presentation. That separation, which is missing from the HTML that powered the rise of the Web, means multiple applications can use the same data. Emerging standards include: (1) Electronic Business XML (ebXML) is for e-commerce apps; (2) Voice Extensible Markup Language (VoiceXML) makes Internet content available to devices with voice interfaces; (3) Universal Description, Discovery and Integration (UDDI) helps organizations running Web services find one another and conduct transactions; (4) Simple Object Access Protocol lets Web services developed under one operating system run under another; (5) Web Services Description Language (WSDL) is a variant of XML for describing and finding Web services. FEAPMO this year will release draft data, technical and service-component reference models, which will describe how to wed the architectural standards with the components. The data reference model will contain metadata and an XML schema for shared data across government. 'Our intent is to have a central repository for XML schemas and metadata,' Robert Haycock said at last month's XML 2002 conference in Baltimore. The technical reference model will focus on standards and technologies -- down to specific products in some cases -- that facilitate component reuse, Haycock said. The service component reference model will identify components for building apps..." See: (1) FEAPMO website; (2) "US Federal CIO Council XML Working Group."

  • [February 03, 2003] "IBM Retools Software for Utility Push." By Martin LaMonica. In CNET (February 03, 2003). "IBM will fill in key pieces of its 'on-demand' computing initiative -- in which it will sell computing resources as if they were utilities like electricity or telephone service -- with upgraded server software due to ship later this year. The computing giant said an update to its WebSphere application server software will include features to make the software more compatible with its e-business on-demand initiative, announced last year, which will let companies purchase computing power, storage or applications on an as-needed basis and receive monthly statements... Specifically, Big Blue will update its WebSphere application server with tools designed to help companies better automate complicated business processes and more easily modify existing applications. IBM will add a Web services-based business process work flow, or choreography, tool to the program, as well as a 'business rules engine' intended to have the ability to quickly reconfigure applications to meet changing needs... With growing interest among customers in flexible and cost-effective IT systems, Big Blue will face mounting competition for on-demand products and services. Hewlett-Packard has a set of utility computing products. Database software maker Oracle claims that its existing clustered database technology achieves the same goal of efficient computing resource use. Sun purchased utility computing start-ups Terraspring and Pirus Networks last year and is expected to roll out revamped software, tied to its grid initiative, during the first quarter this year. Not surprisingly, Sun takes issue with IBM's contention that layering an 'XML veneer' on the Java programming model is insufficient for a services-oriented architecture. 
The forthcoming J2EE 1.4 standard, due to be finalized in the first half of this year, will incorporate Web services protocols, and many Web services-based applications are written with Java, noted Ralph Galantine, Sun's group marketing manager for Java Web services..."

Robin Cover, Editor