The OASIS Cover Pages: The Online Resource for Markup Language Technologies
Last modified: July 30, 2002
XML Articles and Papers July 2002

XML General Articles and Papers: Surveys, Overviews, Presentations, Introductions, Announcements

July 2002

  • [July 31, 2002] "VoiceXML Making Web Heard In Call Centers." By Ann Bednarz and Phil Hochmuth. In Network World (July 29, 2002). "Aspect Communications this week will announce call center software that essentially will enable users to navigate Web content via voice commands. The Aspect news comes on the heels of Avaya's announcement last week of interactive voice response (IVR) software that will make data contained in corporate directories and databases available to callers via spoken commands. At the heart of both efforts is support for the latest release of VoiceXML (VXML), Version 2.0. An extension to the XML document formatting standard, VXML streamlines development of voice-driven applications for retrieving Web content. While using voice commands to retrieve information is a routine IVR task, emerging tools support more complex, speech-driven activities, such as filling out forms or retrieving product information, all in a standards-compliant rather than proprietary environment. In Aspect's case, customers will be able to use the same databases, application servers and business rules to process voice self-service interactions as they do to process Web self-service transactions. The firm is building the voice-activated service features into its existing software suite, Aspect IP Contact Suite. Avaya is adding VXML capabilities to Version 9.0 of its Avaya IVR server. Previous versions offered speech-recognition features, but 9.0 is the first to embed VXML support. Adoption of standards such as VXML is just one contributor to an overall trend to increase the sophistication of IVR products, making them less dependent on menus that bury information several layers deep and better able to handle queries phrased in natural language, says Martin Prunty, president of consulting firm Contact Center Professionals..." See details in the announcements from Aspect Communications and Avaya. On VXML, see "VoiceXML Forum."
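The form-filling dialogs described above are expressed in VXML as <form> elements containing <field> prompts. A minimal, illustrative sketch of such a dialog (the field name and submit URL are hypothetical, not taken from either vendor's product):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<vxml version="2.0" xmlns="">
  <form id="order-status">
    <field name="order_number" type="digits">
      <prompt>Please say your order number.</prompt>
      <filled>
        <!-- Hand the recognized digits to a server-side script
             (URL is illustrative only) -->
        <submit next="" namelist="order_number"/>
      </filled>
    </field>
  </form>
</vxml>
```

The built-in "digits" grammar type and the prompt/filled structure are what let the same back-end logic serve both voice and Web self-service channels.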

  • [July 31, 2002] "Briefing Book: Inside UDDI." By Richard Karpinski. In InternetWeek (July 31, 2002). "The original vision for the [UDDI] registry and services discovery specification was of automating global e-business by letting companies register themselves, their products and their Web services in a central directory. Trading partners and customers then would dynamically "discover" that information and cut e-business deals on the fly. The big, replicated, public registry services to drive this vision are up and running today. E-commerce may be frictionless, but it's not that frictionless. And so those public registries go mainly unused, a compelling proof of concept on the one hand but a testament to an overambitious vision on the other. That said, UDDI has emerged as a key piece of the Web services software stack, sitting above the core elements such as XML Schemas, SOAP for remote procedure calls and WSDL for defining service interfaces. Tool vendors like Microsoft, Sun, IBM, BEA Systems and others are building the ability to publish and find UDDI-registered Web services into their development tools. They've also begun shipping behind-the-firewall UDDI servers. Developers can use -- and benefit from -- these techniques today. As for the ultimate vision of UDDI, its backers remain hopeful. As for us, we'll see. Better to get started with what UDDI is good at today and worry about the grand e-business prospects later. If it's compelling enough, that day will come..." Among the 'Top Features' of the UDDI Briefing Book reference list: "Step By Step: Getting Started With UDDI." References: "Universal Description, Discovery, and Integration (UDDI)."
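The publish-and-find pattern the article describes reduces, on the wire, to a handful of SOAP messages. A find_business inquiry against a UDDI Version 2 registry looks roughly like this (the business name is hypothetical):

```xml
<!-- SOAP body of a UDDI v2 inquiry message -->
<find_business generic="2.0" xmlns="urn:uddi-org:api_v2">
  <name>Acme Shipping</name>
</find_business>
```

The registry answers with a businessList of matching businessInfo entries, each of which can be drilled into for service bindings and their WSDL descriptions.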

  • [July 31, 2002] "SIL Three-letter Codes for Identifying Languages: Migrating from in-house standard to community standard." By Gary F. Simons (SIL International). ISO Reference: ISO/TC 37/SC 2/WG 1 N 94. [A paper presented at the International Workshop on Resources and Tools in Field Linguistics, LREC 2002 (26-27 May 2002, Las Palmas, Canary Islands).] "The International Organization for Standardization has published a standard for three-letter codes to identify languages (ISO 1998). Known as ISO 639-2, it provides codes for fewer than 400 languages. Thus language documentation efforts (such as ISLE, E-MELD, OLAC, and Rosetta Project) that embrace endangered languages have had to look elsewhere for language identifiers. They have turned to the most widely known and accessed reference work on language identification, the Ethnologue, now in its 14th edition. With listings for over 7,000 languages, the Ethnologue seeks to give a comprehensive accounting of all known living and recently extinct languages in the world. Other languages, such as ancient and constructed languages, are specifically outside the scope of the Ethnologue; SIL International is pleased to cooperate with the Linguist List initiative to develop standardized codes for these languages that fall outside the scope of the Ethnologue... A foundational aspect of documenting an endangered language and preserving that documentation for long-term access is identifying the language itself. The web version of the Ethnologue has become the de facto standard for identifying the more than 6,800 languages spoken in the world today. The system of three-letter codes that uniquely identify each language has been used within SIL for nearly three decades as an in-house standard, but now there is increasing demand for these codes to be used by other organizations and projects. 
This paper describes four changes that SIL International is implementing in order to make its set of language identification codes better meet the needs of the wider community. The changes seek to strike a balance between becoming more open while at the same time becoming more disciplined..." See: "Language Identifiers in the Markup Context." [cache]

  • [July 31, 2002] "OASIS Steps Up Web-Services Work." By [Seybold Staff]. In The Bulletin: Seybold News and Views On Electronic Publishing Volume 7, Number 44 (July 31, 2002). "OASIS, the vendor-dominated consortium that focuses on XML-related standards, has undertaken several new efforts related to XML-based Web services. First, OASIS will serve as the steward for the Universal Description, Discovery and Integration (UDDI) specification, which identifies how to catalog Web services so that they can be found online by other software programs... Second, the OASIS standards consortium recently organized a new technical committee to improve the Web-Services Security specification, which was originally published by IBM, Microsoft and Verisign. Representatives from IBM and Microsoft are co-chairs of the committee, but some two-dozen other vendors plan to participate, including Sun Microsystems, which had previously balked at joining the effort... For those who like to get their information firsthand, a forum on Web-services security sponsored by the W3C and OASIS will be held August 26 in Boston at the start of the XML-Web Services One Conference and Expo... If these standards efforts can prove to be more than vendor public-relations efforts, they will be very worthwhile. On the UDDI front, the specification is mature, but the world needs a neutral, trusted party to run the directory. On the security front, the WS-Security effort should be coordinated with the SAML and XACML efforts already well under way at OASIS, rather than duplicating the work in authentication and policy frameworks. Now that all three of the security committees are under the auspices of one consortium, we hope a unified set of security standards for Web services will emerge..." See in depth from Seybold "Web Services Start to Appreciate Security," by Hans Hartman, in Seybold Report: Analyzing Publishing Technology [ISSN: 1533-9211] Volume 2, Number 5 (June 3, 2002), pages 11-16.
References: (1) "Web Services Security Specification (WS-Security)"; (2) "OASIS to Host UDDI Project Technical Work"; (3) "W3C/OASIS Forum on Security Standards for Web Services."

  • [July 30, 2002] "Adventures in High-Performance XML Persistence, Part 1. A High-Performance TCL-scripted XSLT Engine." By Cameron Laird (Vice president, Phaseit, Inc.). From IBM developerWorks, XML Zone. July 2002. ['XML storage is too sprawling a topic to offer easy answers. There's no one fastest XML database, nor fastest XML processing language. Still, it's helpful to understand the basic concepts of XML persistence so you can apply them to your specific situation. This article begins a new developerWorks series on high-performance XML by offering an explanation of common industry practices in XML persistence -- that is, storage of data beyond the lifetime of a single process.'] "You're responsible for large, mission-critical XML programs. You have dozens, or maybe thousands, of simultaneous users. Your XML pilot programs have gone well, and you've deployed more and more features. Your systems are in constant use, and response time is starting to stall. You start to wonder, 'What does it take to maximize XML performance?' The answer: You don't want to maximize your XML performance. You need to meet engineering requirements. Perhaps you need to manage scalability, or boost the responsiveness of specific applications. Don't hunt for the fastest XML storage. In that direction lie $700 hammers and the other symptoms of counter-productive obsession. Instead, learn to apply the basic concepts of XML so you can engineer the persistence needed for your own situation... The first principle of designing XML persistence is that any solution must make for a comfortable organizational fit. If your company requires use of Java technology, and a particular XML database has a poor Java binding, don't choose it. No matter how high its performance on standard benchmarks, it's likely that your co-workers will not make good use of it. Working with an unfamiliar technology will annoy them, and they're unlikely to achieve favorable results. 
On the other hand, suppose you work in an environment that provides a great deal of support for a database such as DB2. However well or poorly your XML content fits the DB2 persistence model, you should seriously consider DB2 storage. Sufficiently enthusiastic, well-equipped, and motivated expertise is likely to overcome modest mismatches on the technical level, as this article will show you. The principal categories of XML persistence center on these technologies: (1) Native file system, (2) Relational database management systems (RDBMS), (3) Special-purpose XML database managers, (4) Other data managers. The easiest XML storage is native: Keep XML document instances as named files in a file system. This is the most transparent and flexible persistence method, and should be your default starting point for new designs. ... No one XML persistence method is right for all scales of problem. Start with familiar technologies for your needs to store XML data. Make a clear distinction between policy requirements for transacting or storing data formatted as XML, and application-specific design requirements for data security and performance. Choose persistence methods compatible with the technologies your organization uses..." See: "XML and Databases."
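The "native file system" option Laird recommends as the default starting point is easy to see in miniature. A minimal Python sketch of filesystem persistence for XML document instances (the directory layout, function names, and sample document are our own illustration, not from the article):

```python
import os
import xml.etree.ElementTree as ET

def save_document(directory, name, root):
    """Persist an XML document instance as a named file in a directory."""
    os.makedirs(directory, exist_ok=True)
    path = os.path.join(directory, name + ".xml")
    ET.ElementTree(root).write(path, encoding="utf-8")
    return path

def load_document(directory, name):
    """Load a previously stored document back into an element tree."""
    path = os.path.join(directory, name + ".xml")
    return ET.parse(path).getroot()

# Round-trip a trivial invoice document through the file store.
doc = ET.Element("invoice", id="34843")
ET.SubElement(doc, "total").text = "100.00"
save_document("store", "invoice-34843", doc)
loaded = load_document("store", "invoice-34843")
print(loaded.get("id"))  # -> 34843
```

Transparency is the virtue here: every stored document can be inspected, grepped, and versioned with ordinary tools, which is exactly the "comfortable organizational fit" the article argues for.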

  • [July 30, 2002] "Implementing the ebXML Registry/Repository. The Role of the ebXML RegRep (Registry/Repository) in an E-Business Framework." By Chaemee Kim. In XML Journal Volume 3, Issue 7 (July 2002). ['Chaemee Kim is a solution manager for KTNET, an electronic trade service provider.'] "The ebXML initiative has defined a RegRep (Registry/Repository) as providing a shared space for one or more B2B communities. With an ebXML RegRep, companies can submit, update, deprecate, or otherwise manage the parameters required to conduct electronic business. The RegRep also defines standardized APIs to access or otherwise share these parameters across trading communities. You may have heard about the benefits of a B2B business model, a concept that grew out of the electronic trading communities that were defined using EDI (via the X12 and EDIFACT standards). While B2B grew out of the traditional EDI space, many implementation requirements are missing or poorly understood. What are these missing elements? The traditional B2B business model (again, based on the EDI model) assumes that trading partners have advance knowledge of each other's e-business environments, trading protocols, and procedures. The "discovery" phase is traditionally done offline via a manual process (phone calls, legal contracts, etc.). This approach limits companies to conducting business with a relatively small community of well-known trading partners. A well-defined discovery process would enable most companies to significantly expand the size of their trading communities. The current B2B model, however, doesn't support such a process, forcing trading partner configuration to be accomplished offline. The ebXML RegRep is designed to close some of the gaps in traditional B2B business models, as B2B alone isn't enough to establish true collaborative commerce. B2B with an ebXML RegRep provides a more advanced B2B model that we call Business-RegRep-Business, or BRB. 
ebXML RegRep enables trade parameters to be shared among business peers, and helps to build more dynamic B2B environments based on the discovery and execution of trade agreements with ebXML-enabled trading partners... What distinguishes ebXML RegRep from UDDI in Web services? This is one of the questions most frequently asked by companies interested in both ebXML and Web services. ebXML RegRep and UDDI repository each have a different scope and purpose. An ebXML RegRep, which is more likely to be focused on content management, was designed to store and manage a wide range of electronic trading parameters. UDDI on the other hand was designed to manage the metadata associated with a Web service. In short, ebXML RegRep is the registry for B2B, while UDDI is the registry for Web services. From a UDDI perspective, the UDDI initiative can provide a loosely coupled connection model with an ebXML RegRep. An ebXML RegRep's registry service specification can be published as a SOAP interface, enabling a traditional Web service that describes how to access the ebXML RegRep using a SOAP client. From an ebXML RegRep perspective, the RegRep can include a Web services registry within an ebXML Registry Information Model (RIM). A Web services registry is realized in RIM version 2.0. The ebXML RegRep Technical Committee expanded its RIM to categorize Web services as RegRep-compatible metadata. The ebXML RegRep treats Web services as a set of metadata that B2B community members want to share with their trading partners. It is designed to support a wide variety of objects and metadata, while UDDI focuses exclusively on Web services..." See: "Electronic Business XML Initiative (ebXML)." [alt URL]

  • [July 29, 2002] "XML-Signature XPath Filter 2.0." By John Boyer (PureEdge Solutions Inc.), Merlin Hughes (Baltimore Technologies Ltd.), and Joseph Reagle (W3C). W3C Candidate Recommendation 18-July-2002. The W3C XML-Signature XPath Filter 2.0 was advanced to Candidate Recommendation status on 18-July-2002. Produced by the IETF/W3C XML Signature Working Group, this Candidate Recommendation "defines an alternative to the XPath transform of the XML Signature Recommendation [XML-DSig]. The goal is to: (1) more easily specify XPath transforms and (2) more efficiently process those transforms." From the abstract: "XML Signature recommends a standard means for specifying information content to be digitally signed and for representing the resulting digital signatures in XML. Some applications require the ability to specify a subset of a given XML document as the information content to be signed. The XML Signature specification meets this requirement with the XPath transform. However, this transform can be difficult to implement efficiently with existing technologies. This specification defines a new XML Signature transform to facilitate the development of efficient document subsetting implementations that interoperate under similar performance profiles.... The specification incorporates the resolution of all last call issues. The WG considers the specification to be very stable and invites implementation feedback during this period. The specification presently has three interoperable implementations as shown in the Interoperability Report; [the WG will] try to obtain one more, but otherwise will advance after the Candidate Recommendation period after three weeks (closing 08-August-2002)." See "XML Digital Signature (Signed XML - IETF/W3C)."
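Concretely, the new transform appears inside a signature Reference as a set of intersect/subtract/union filters rather than a single node-set-producing XPath expression. A representative fragment, hedged as a sketch (the filter expression shown is illustrative):

```xml
<Transforms>
  <Transform Algorithm="">
    <!-- Sign the whole document except the signature itself -->
    <XPath xmlns=""
           xmlns:ds=""
           Filter="subtract">//ds:Signature</XPath>
  </Transform>
</Transforms>
```

Because each filter is evaluated as a subtree intersection or subtraction rather than a node-by-node XPath test, implementations can subset large documents far more efficiently than with the original XPath transform.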

  • [July 29, 2002] "Vignette Joins WS-I, Adds Web Services To Content, Commerce Mix." By Richard Karpinski. In InternetWeek (July 29, 2002). "Vignette said Monday it has joined the Web Services Interoperability Organization (WS-I), hoping to lend its expertise in content management and e-commerce to the effort to standardize an approach to emerging XML-based applications. The WS-I organization is large and growing; the group counts more than 125 companies among its members. WS-I is aiming to develop implementation tools, guidelines, best practices, and online resources to help enterprises deploy Web services in the real world... Vignette is increasingly baking Web services support into its V6 content management and commerce platform. The product currently includes support for the XML Schema, Web Services Definition Language (WSDL), Universal Description Discovery and Integration (UDDI), and Simple Object Access Protocol (SOAP) standards. V6 also spans both J2EE and .Net platforms. Vignette said it plans to join the WS-I Basic Profile working group, which is defining how to use standards like SOAP, WSDL, and UDDI as a foundation for Web services..." See details in the 2002-07-29 announcement "Vignette Joins Web Services Interoperability Organization. Vignette Supports Effort to Establish Greater Compatibility of Web Services Across Technologies and Applications." Other references: "Web Services Interoperability Organization (WS-I)."

  • [July 29, 2002] "Controlled Trade Markup Language (CTML)." Edited by Kathleen Yoshida (FGM Inc.). Working Draft v0.1.0.0. 25-July-2002. CTML Draft Specification v0.1.0.0. 34 pages. Prepared by members of the OASIS Controlled Trade Markup Language (CTML) Technical Committee. ['This specification defines an XML vocabulary for controlled trade activities. This version of the specification is a working draft of the committee, and as such, it is expected to change prior to adoption as an OASIS standard.'] "The purpose of the CTML TC is to develop a unified trade control vocabulary that supports an international collection of business documents (e.g., trade applications, cases, licenses, delivery verification certificates, etc.) through the extension and expansion of an existing XML vocabulary. A goal of our work will be to incorporate the best features of other XML business vocabularies and provide a clearly articulated interface to other mutually supporting specifications. The CTML specification is intended to become an international standard for controlled trade activities, and together with other XML specifications, allow industry, nongovernmental organizations, and governments to unambiguously identify the essential business and legal documents to be exchanged in particular business contexts and geographic locales (i.e., country). Furthermore, the CTML will align its vocabulary and structures with the vocabulary and structures of other OASIS libraries (like Unified Business Language, Business Transactions, and Customer Information Quality) and implement a mechanism for the generation of context-specific schemas for basic business documents and their components through the application of transformation rules to a common XML source library. The specification will be open to everyone without licensing or other fees..." Also available in Word .DOC format. Other references: "Controlled Trade Markup Language (CTML)." [cache]

  • [July 29, 2002] "Building XML Portals with Cocoon." By Carsten Ziegeler and Matthew Langham. From July 24, 2002. ['Matthew Langham, along with Cocoon expert Carsten Ziegeler, explain how to build portals using the Apache Cocoon XML framework. Langham and Ziegeler cover the basics of portal functionality, including authentication and embedding components, using the open source portal component they developed for Cocoon.'] "Cocoon is an Apache open source project originally started by Stefano Mazzocchi in 1998 because he was frustrated by the limitations HTML poses when it comes to separating content from design. He described the current Cocoon version in detail in an article for in February. Although Cocoon was originally designed as a framework for XML-XSL publishing, we felt that due to the extensibility of the architecture, it would be possible to add components that extended Cocoon so that it could be used in a wider variety of applications. Around the same time we decided to use Cocoon, we were also evaluating available commercial portal solutions for several customers. Portals used in a more commercial environment have varied requirements. Therefore it seemed ideal to extend Cocoon with components for authentication and portals -- and at the same time to retain the original strengths of the platform. The new components were originally developed as part of a commercial offering. In the middle of 2001 we installed the first Cocoon-powered portal at a financial institution in Germany. At the beginning of 2002 we then donated the components to the Cocoon project. The donation consisted of components and tools for authentication (originally called 'sunRise') and portals (originally called 'sunSpot'). Each part can be used without the other; the authentication can be used to protect certain resources on the server. In this article we will look at the Cocoon portal and authentication frameworks...
The portal and authentication components were donated to the Cocoon project at the beginning of 2002, and they are already being used by several large companies as the base for Intranet based offerings. We are particularly proud of the fact that NASA and large IT companies such as BASF IT Services have seen the advantages of using an open source, XML-based solution to provide a powerful enterprise portal. Because the portal is tightly integrated into the XML publishing platform, it does not break Cocoon concepts and allows the flexibility of XML-XSL publishing to be utilized. The current portal can be thought of as version 1.0 and has been in production use for over a year. The next version of the portal aims to increase the flexibility of the system and to introduce conformance to existing and emerging portal standards..."
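In Cocoon, such applications are assembled declaratively in the sitemap: requests are matched to pipelines that generate XML, transform it with XSL, and serialize the result, with authentication wrapped around protected pipelines as an action. A hedged sketch of what a protected portal pipeline might look like (the pattern, handler name, and file paths are invented for illustration):

```xml
<map:pipeline xmlns:map="">
  <map:match pattern="portal/*">
    <!-- Require a valid authenticated session before serving content -->
    <map:act type="auth-protect">
      <map:parameter name="handler" value="portal-handler"/>
      <map:generate src="content/{1}.xml"/>
      <map:transform src="styles/portal2html.xsl"/>
      <map:serialize type="html"/>
    </map:act>
  </map:match>
</map:pipeline>
```

This is why the authors can claim the portal "does not break Cocoon concepts": portal pages are ordinary pipelines, so the full flexibility of XML-XSL publishing remains available inside them.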

  • [July 29, 2002] "XML Data-Binding: Comparing Castor to .NET." By Niel Bornstein. From July 24, 2002. ['Niel Bornstein continues his comparative study of XML programming techniques in Java and .NET. As a prelude to next month's comparison of XML to database mappings, Niel compares the way that Java (in the form of Castor) and .NET handle XML data binding.'] "After the second article in this series was published, several readers said that they would like to learn the .NET way to map data from XML to a relational database management system. I'd like to show you that, but first I've got to lay some groundwork. In this article, I will show how .NET XML data binding works, while investigating the equivalent Java functionality. Java and .NET both have excellent support for data binding, and although they work in slightly different ways, each is just as valid and useful as the other. In my next article, I'll complete the exercise by mapping XML files to an RDBMS... The first side of Castor is its interface for binding XML documents to Java objects... (1) Define the mappings from XML to Java, either procedurally or through configuration files, which may themselves be either W3C XML Schema documents or in Castor's own format; (2) Either generate Marshaller and Unmarshaller classes specific to our data (for best performance) or allow Castor to manage the marshaling and unmarshaling at runtime (for most flexibility). There are many options to customize your Castor project; for the most balanced comparison to .NET, we'll stick to W3C XML Schema and Castor's built-in runtime marshaling. Unlike many of the other Java databinding frameworks, Castor includes excellent W3C XML Schema support... So, what have we learned this time? First, that, given a W3C XML Schema, we can easily create classes that create XML and load existing XML, for both Java and C#. And, given those classes, it's relatively easy to write code that uses them to write portable data files, using XML.
The fact that we used the same schema and data files in Java and C# proves once and for all that XML is a true interoperability language. The xsd tool can do some other things, too. It can generate source code in a variety of languages (Visual Basic .NET and JScript.NET, in addition to C#). It can generate a new W3C XML Schema Description for any .NET source file. Finally, it can generate a DataSet subclass, suitable for use in XML-to-RDBMS mapping. And that's where we'll pick up next time, with a comparison of JDO to ADO.NET..." See also Java and XML Data Binding, by Brett McLaughlin.

  • [July 29, 2002] "Look Ma, No Tags." By Kendall Grant Clark. From July 24, 2002. ['Kendall Clark's XML-Deviant column this week looks at YAML. Over the last few years XML has spawned several competitors: covered the SML (Simplified Markup Language) debate in some detail. Recently, another approach to marking up data has appeared in the shape of YAML, and appears to be gaining some software support.'] "YAML -- short for: YAML Ain't Markup Language; rhymes with 'camel' -- popped up on my radar screen last week as a result of an interesting thread on comp.lang.python about the uses and abuses of XML, a conversation which I commend to your attention on its own merits. In rummaging around for a plain, concise description of YAML, I kept stubbing my toe on a felt need to define it by referring to XML in some way. That was a mistake. YAML stands on its own very nicely, even if its most immediate point of contrast is XML. In other words, if there were no XML, there could still be a YAML, but it would have a different public face. If the XML world tends to get divided into data and documents, a distinction which is probably more pedagogically useful than it is necessarily true, YAML corresponds more to the data part of XML than to the document part. As the YAML specification puts it, 'YAML is more closely targeted at messaging and native data structures' than at structured documents. Accordingly, my plain, concise description of YAML is that it's a processing model and a wiki-markup-esque way to represent relatively arbitrary, high-level language data structures... the two leading selling points of YAML over XML are that it's more lightweight, and that it uses native processing models and data structures. The most serious YAML detractions are that it isn't XML, and it isn't nearly as ubiquitous as XML; though YAML is very well supported in Perl, the support in Python, Java, and Ruby is maturing, and there are rumors of a forthcoming libyaml in C, too.
It bears repeating that ubiquity of tool support is not an absolute value; it is context-dependent and goal-specific. You may be able to sacrifice it for the sake of using YAML and securing its virtues, depending on what you need to do and where you need to do it... There is a lot to YAML. The specification fits in one HTML document, but it is neither short nor simplistic. For example, if you're interested in YAML but circumstances prevent you from moving away from XML all at once or altogether, you might want to look at YAXML ['XML Binding for YAML'], which is the YAML conceptual model with XML's familiar syntax bolted on..."
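Clark's point about native data structures is easiest to see in a sample. The same record that would take nested, explicitly tagged elements in XML maps directly onto the scalars, sequences, and mappings of a scripting language (the field names and values below are invented for illustration):

```yaml
invoice: 34843
date: 2001-01-23
bill-to:
  given: Chris
  family: Dumars
lines:
  - part: A4786
    quantity: 4
```

Indentation carries the structure, and a YAML loader hands this back as a ready-made dictionary-of-lists rather than a tree to be walked, which is precisely the "lightweight" appeal the column describes.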

  • [July 29, 2002] "OASIS Swallows" By Tom Sullivan. In InfoWorld (July 29, 2002). "Along with a new address, the UDDI specification is undergoing something of a facelift. Standards body OASIS on Tuesday [2002-07-30] will subsume, the consortium driving the UDDI (Universal Description, Discovery, and Integration) technology that forms the foundation of Web services directories. OASIS on Tuesday also will formally introduce the third iteration of UDDI, with a new focus on internal corporate deployments. When it was first introduced, UDDI was hailed as an online global yellow pages that would indiscriminately connect businesses, regardless of infrastructure... in the nearly two years since it was first announced, only a handful of public UDDI-based directories have been born. What's more, those are hosted by Microsoft and IBM -- two of UDDI's three founders -- as well as SAP, and NTT Communications in Japan. Ariba, the third founding member of UDDI, opted not to follow through with its plans to host a public directory... Analysts have been saying nearly since its inception that UDDI would first gain traction inside the firewall and, eventually, as Web services more commonly cross firewalls and span multiple organizations, UDDI would follow suit. 'In the past two years, there has really been a shift in the way we think about UDDI,' said Chris Kurt, program manager and a group product manager for Web services standards at Redmond, Wash.-based Microsoft. 'We're seeing more and more internal adoptions of UDDI.' To that end, UDDI 3 includes several improvements that tune it for internal deployments. Version 3 makes it easier for applications to exchange information in clustered configurations, increases security, improves searching capabilities, and includes a set of APIs that enable notifications and updates to be sent out when changes occur.
Kurt said that OASIS, and the UDDI folks within, have not given up on the idea of UDDI for dynamically discovering and connecting Web services to each other. Instead, they have come to believe that the original vision will be realized as consumer Web services emerge..." On UDDI version 3, see the announcement of July 03, 2002: "UDDI Working Group Publishes UDDI Version 3.0 Specification." General references: "Universal Description, Discovery, and Integration (UDDI)."

  • [July 29, 2002] "UDDI Gets Needed Enterprise Upgrade." By Richard Karpinski. In InternetWeek (July 29, 2002). "UDDI, long thought to be one of the main building blocks of Web services, got some much-needed improvements that could help drive its use among enterprise developers. The Universal Description, Discovery and Integration project aims to create a registry in which developers can sign in their Web services and have customers or trading partners discover and consume them. UDDI has gotten off to a slow start. That's at least in part because Web services are still in their infancy, so a registry that tracks them may be a bit premature. It's also because early versions of the UDDI specs were more focused on supporting large, replicated public versions of a UDDI registry. To right that course, the group on Monday published version 3 of the UDDI specs. It also moved its work into the OASIS standards body, a group that is emerging as a key holding place for an array of Web services standards. UDDI v3 includes a variety of new features that should prove appealing to enterprises, said George Zigelow of IBM, head of the operational group that manages the public UDDI registries. For instance, version 3 includes improved security, including support for XML-based digital signatures. That should make it more appealing for companies to publish interfaces to mission-critical Web services. The new spec also makes it easier to find services described using the Web Services Description Language (WSDL)... The public UDDI nodes -- run by IBM, Microsoft, SAP, and soon NTT -- just went to version 2 of UDDI last week. IBM's Zigelow said those registries would begin to implement version 3 within 90 days..." On UDDI version 3, see the announcement of July 03, 2002: "UDDI Working Group Publishes UDDI Version 3.0 Specification." General references: "Universal Description, Discovery, and Integration (UDDI)."

  • [July 29, 2002] "Web Services Specification Gets Makeover." By Wylie Wong. In ZDNet News (July 29, 2002). "A Web services directory effort spawned by Microsoft, IBM and Ariba has been updated before its submission to an industry standards body. The specification, called Universal Description, Discovery and Integration (UDDI), identifies and catalogs Web services so they can be easily found online. A consortium of 220 companies is releasing a third version of the specification on Tuesday and submitting the technology to a standards body known as the Organization for the Advancement of Structured Information Standards (OASIS). Microsoft, IBM and Ariba proposed the directory technology nearly two years ago as sort of a Yahoo for business services. The idea was to build an online database using UDDI that would help companies find Web-based software services that could be used as part of their own business systems. For example, an e-commerce site could use the directory to search for a business that handles credit card transactions. If a match were found, all the elements of the transaction -- even the price and payment -- could be handled electronically. But UDDI-based public Web-services directories created by IBM, Microsoft, SAP and Hewlett-Packard have been slow to catch on. Instead, the specification is beginning to find a home in big companies as a way to build directories for internal Web services projects, allowing the companies to better catalog services and communicate across departments. Recognizing that trend, Microsoft, Sun Microsystems, Oracle, IBM and others have been building private UDDI directory capabilities into their software products... OASIS CEO Patrick Gannon said the standards body had previously built directory technology similar to UDDI as part of its ebXML work. But Gannon said UDDI and ebXML do not compete and can co-exist and work together. 'They solve different problems,' Gannon said. 'We can help jointly promote and position them.' 
Steve Holbrook, IBM's program director for emerging e-business standards, said UDDI was built for businesses that want to use Web services, while the ebXML registry technology was built for computer-science types, for example, as a place to store XML vocabularies used by various industries. XML (Extensible Markup Language) is a standard for exchanging data. The latest version of UDDI will offer several new features, including improved security and the ability for businesses to receive automatic notifications when UDDI registries are changed or updated, Holbrook said. The biggest improvement, he said, is a technology called 'multi-registry topologies,' which allow businesses using an internal UDDI directory to migrate their directory to a public or industry-specific UDDI directory..." On UDDI version 3, see the announcement of July 03, 2002: "UDDI Working Group Publishes UDDI Version 3.0 Specification." General references: "Universal Description, Discovery, and Integration (UDDI)."

  • [July 29, 2002] "Closing In On 3D Web Standards." By Paul Festa. In CNET (July 23, 2002). "A group pushing for industry standards for 3D on the Web released its final working draft of a key specification, bringing the technology one step closer to international standardization. The Web3D Consortium (W3DC) made its draft of Extensible 3D (X3D) and an accompanying software development kit available for download and solicited comment on the specification. The technology is a descendant, expressed in XML (Extensible Markup Language), of the pioneering but ultimately unsuccessful Virtual Reality Modeling Language (VRML). The consortium plans to submit the specification to the International Organization for Standardization (ISO) in October [2002]. X3D could follow VRML 97 in becoming an ISO standard by 2004. The release of the final X3D draft, designed for entertainment, educational and e-commerce uses, comes as the consortium launches its second major initiative, the more industrially focused Computer Aided Design (CAD) 3D working group..." See also the news item of 2002-07-29: "Web3D Consortium Publishes X3D Final Working Draft."

  • [July 29, 2002] "Introduction to RELAX NG." By Michael Classen. In (July 23, 2002). ['Whether you prefer compact or full size definitions, one recent schema specification has you covered. Michael Classen introduces you to both the short and long forms of RELAX NG syntax.'] "In the last installment we discussed the different approaches to schema definition put forward by the W3C and OASIS. More specifically, we followed the criticism surrounding XML Schema, and looked at some improvements offered in the alternative, RELAX NG. Today we'll explain the basics of RELAX NG by example..." See earlier "Schema Wars: XML Schema vs. RELAX NG." References: (1) "RELAX NG"; (2) "XML Schemas."
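To make the contrast between the two forms concrete, here is a small invented pattern in both syntaxes (the element names are illustrative only, not taken from the article). Since RELAX NG's full syntax is itself XML, a stock parser can read it:

```python
# A minimal RELAX NG illustration: the same "card" pattern in the full XML
# syntax and the compact syntax. The pattern itself is an invented example.
import xml.etree.ElementTree as ET

# Full XML syntax: a <card> element containing a name and an email.
xml_syntax = """
<element name="card" xmlns="http://relaxng.org/ns/structure/1.0">
  <element name="name"><text/></element>
  <element name="email"><text/></element>
</element>
"""

# The equivalent compact syntax is far terser:
compact_syntax = "element card { element name { text }, element email { text } }"

# The XML form is ordinary XML, so any XML toolchain can process it.
root = ET.fromstring(xml_syntax)
names = [e.get("name") for e in root.iter("{http://relaxng.org/ns/structure/1.0}element")]
print(names)  # ['card', 'name', 'email']
```

Both forms define the identical pattern; the compact syntax is purely a notation for human authors.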

  • [July 27, 2002] "Data Modeling using XML Schemas." By Murali Mani (Computer Science Dept, UCLA). Presentation to be given Wednesday, August 7, 2002 as a Nocturne at the Extreme Markup Languages Conference 2002. "XML appears to have the potential to make significant impact on database applications, and XML is already being used in several database applications. One of the main reasons for this is the 'superiority' of XML schemas for data modeling - recursion and union types are easily specified using XML schemas. In order to do data modeling effectively, we should study it systematically. A data model has three constituents to it - structural specification, constraint specification, and operators for manipulating and retrieving the data. Regular tree grammar theory has established itself as the basis for structural specification for XML schemas. Constraint specification is still being studied, and we have approaches such as 'path-based constraint specification' and 'type-based constraint specification', with strong indications of type-based constraint specification as a very suitable candidate. Operators are available as part of XPath, XSLT, XQuery etc. In this talk, we would like to mention the two ways of specifying constraints - path-based and type-based. Then we would like to describe how we can specify entities and relationships using regular tree grammar theory, and with type-based constraint specification. Furthermore, we would like to talk about an issue which is attracting attention of late -- subtyping required for XML processing. There are two techniques for subtyping in XML Schemas -- explicit as in W3C XML Schema or implicit as in XDuce. The main results here are: (a) The two subtyping schemes are incompatible with each other, and (b) There are open and interesting issues in doing implicit subtyping..." The Extreme 2002 Conference will be held August 4 - 9, 2002 in Montréal. For general references on XML schema languages, see "XML Schemas."

  • [July 26, 2002] "UDDI.org To Become OASIS Member." By Elizabeth Montalbano. In Computer Reseller News (CRN) (July 26, 2002). "UDDI.org will become a member of the OASIS technology standards consortium next week, officials for both organizations confirmed Friday. Officials said OASIS will publish the third version of the UDDI specification next week and that all further developments of the specifications will fall under OASIS's jurisdiction. Currently, Microsoft, IBM and Hewlett-Packard offer UDDI registries, and SAP and NTT Docomo have registries in development. It has not yet been decided whether the registries also will fall under OASIS control, said Chris Kurt, program manager for UDDI.org and a Microsoft official. Kurt said that OASIS taking the reins of the UDDI technology should give the industry assurance that UDDI is here to stay... The next step for UDDI 3.0 will be for OASIS members to form a technical committee charter around the technology, said Patrick Gannon, CEO and president of OASIS. Once that is passed, the first committee meeting will happen within 45 days, he said. Another Web services standard in OASIS, ebXML, also features a standard registry for Web services. Gannon said UDDI's inclusion in OASIS will not affect that work... UDDI.org was first launched as a joint initiative by Microsoft, IBM and Ariba in September 2000 to provide a standard way to identify trading partners' Web services. The group recently submitted its third version for review. Since UDDI's inception, a wide array of companies, among them Microsoft, IBM, Sun Microsystems and BEA Systems, have built and sold products that support UDDI. It is one of several XML-based standards for Web services, including WSDL and SOAP, that solution providers say will be key for the broad adoption of Web services in the industry..." See: "Universal Description, Discovery, and Integration (UDDI)."

  • [July 26, 2002] "UDDI V.3 Coming Soon (To a Web Service Near You)." By [ComputerWire Staff]. In The Register (July 26, 2002). "... UDDI has been adopted by the Organization for the Advancement of Structured Information Standards (OASIS) for ratification as an independent standard. That announcement will also be made Tuesday [2002-07-30]. Backing from OASIS should ensure broader industry input into future versions of UDDI and potentially encourage broader up-take. UDDI has been the mandate of UDDI.org, whose members include American Express, BEA Systems Inc, Boeing, Cisco Systems Inc, Ford Motor Company and Fujitsu. UDDI was launched by Microsoft, IBM and Ariba Inc to describe and register businesses using XML, in an online database. Since its launch in September 2000 - the apex of the b2b, b2c and online market places boom - UDDI has seen mixed fortunes... This week public UDDI Business Registries from IBM, Microsoft and SAP AG adopted UDDI 2.0. NTT Communications is expected to launch its own registry this fall. Doubts remain, though. Hewlett-Packard Co signed an agreement to operate a registry, but that registry's future is in doubt after HP pulled out of Java middleware and web services. HP failed to return Computerwire calls for comment. Meanwhile, some early adopters are bypassing UDDI entirely. Deloitte Consulting principal and e-business chief technology officer Michael DeBellis reports early adopters are hard-coding together web services, bypassing UDDI and defeating the vision. UDDI faces an additional threat from Lightweight Directory Access Protocol (LDAP). Anecdotal evidence says some early adopters are putting web services data into directories based on LDAP, a well-established directory technology, instead of UDDI. Observers feel UDDI is in danger of being sidelined by events. 'We are seeing very few public UDDI registries. Where it is being used is behind the firewall,' DeBellis said. UDDI.org's answer is UDDI 3.0.
The specification improves interoperability and replication between registries behind those firewalls. Siva Darivemula, strategic initiatives marketing manager for IBM's WebSphere marketing, said 3.0 is richer and contains more detailed descriptions than previous specifications..." See "UDDI Working Group Publishes UDDI Version 3.0 Specification" and the reference document "Universal Description, Discovery, and Integration (UDDI)."

  • [July 26, 2002] "The Evolution of UDDI." By The Stencil Group, Inc. White Paper. July 19, 2002. 15 pages. "Although many aspects throughout the UDDI specification have matured in the version 3.0 release, the chief architectural change is the concept of 'registry interaction.' This shift reflects the increasing recognition that UDDI is one element of a larger set of web services technologies that support the design and operations of myriad software applications within and among business organizations. In short, just as each enterprise application embodies the specific characteristics of the business process it supports, so should the enabling technologies like UDDI support a variety of infrastructural permutations. For UDDI, this business requirement dictated an increased emphasis on providing a means to define the relationships among a variety of UDDI registries, not simply access to one, public registry of business services, the UBR. Although the UDDI specification included from the start concepts like replication and distribution among server peers, earlier definitions of the standard did not fully address the nuts-and-bolts required for the more sophisticated, hierarchical model now dictated... The Version 3 specification addresses several features that support an emphasis on registry interaction. While relatively few of the existing features have changed, a handful of key functional concepts have been added or expanded to accommodate the variety of new taxonomies. Some of the most important issues addressed in the Version 3 specification include: (1) Registration key generation and management; (2) Registration subscription API set; (3) XML digital signatures... Most elements of a registry record optionally may be signed using the DSIG specification maintained by the W3C. Thus, while the specification does not define specific policies around security and authorization, it does provide the means for specific implementations to provide for these needs.
The primary benefit of digital signatures is to ensure that data has not been altered since it was signed and published, that ownership of a particular registry entity can be validated, and confidence that data transferred among registries can be assured..." See "UDDI Working Group Publishes UDDI Version 3.0 Specification" and the reference document "Universal Description, Discovery, and Integration (UDDI)." [cache]

  • [July 26, 2002] "XML No Magic Bullet." By Dave Kearns. In Network World (July 24, 2002). "If there's one thing attendees should have taken away from last week's Catalyst conference, it's the notion that XML is not the magic bullet for identity management, Web services or any other technology initiative. Burton Group CEO Jamie Lewis tried to emphasize this point in his closing remarks, and I hope everyone was listening when he said, 'Products that support XML won't interoperate automagically.' XML is, at its base, a listing of ordered pairs. One of each pair is a descriptor or identifier, while the second element is a value. The identifiers are defined in the syntax, and there's an infinitely extensible list of them available to the person coding the application... Long ago I was involved with instituting Electronic Data Interchange (EDI) in my company... EDI was not an extensible language the way XML is. Each ordered pair, each new document had to be approved by the governing standards body, and this was only done after long deliberation to remove any ambiguity from the definition. The deliberation was also needed to determine which elements were required to be used in the document and which were optional... before two enterprises (called 'trading partners') could begin to exchange EDI documents, such as a purchase order, a document called a Trading Partner Agreement (TPA) had to be drawn up. In it, each side specified the documents and versions they would use, as well as which of the optional elements. But even further, each party would specify in excruciating detail exactly what sort of values would show up in each ordered pair - even though the standard supposedly defined them. Negotiating the TPA could easily take many months because the lawyers had to get involved to determine what would happen in each instance that the rules laid out in the TPA weren't followed... XML was supposed to free us from all of that negotiation.
Instead, though, it requires a whole new layer of negotiation as documents are constructed in an ad hoc manner between trading partners..."
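Kearns's "ordered pairs" characterization can be seen in a few lines of Python over an invented document: each leaf element pairs an identifier with a value, and nothing in XML itself forces two trading partners to pick the same identifiers, which is exactly why the negotiation he describes persists.

```python
# Sketch of the "ordered pairs" view of XML: each leaf element is an
# (identifier, value) pair. Element names here are invented for illustration.
import xml.etree.ElementTree as ET

doc = "<order><partNo>A-100</partNo><qty>12</qty><unit>case</unit></order>"
pairs = [(el.tag, el.text) for el in ET.fromstring(doc)]
print(pairs)  # [('partNo', 'A-100'), ('qty', '12'), ('unit', 'case')]
```

A second party is free to call the same field `part_number` instead of `partNo`, and XML will validate both happily; agreeing on one identifier is the trading partners' job, not the parser's.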

  • [July 24, 2002] "Networking With Coworkers. Turn your office phone network into an application platform with VoIP and XML." By Veronika Megler (Consulting IT Architect, IBM eServer pSeries Solutions Development). From IBM developerWorks, Wireless. July 2002. ['Wireless developers have long been using XML-derived technologies to build applications for handheld devices with limited interface and storage capabilities. In this article, Veronika Megler shows you a new arena for those skills: the office telephone handset. With improvements in Voice over IP (VoIP) technologies, many enterprises will soon be using the phone that sits on an employee's desk as an application platform. Here, you'll walk through an example and see how you can use XML to build simple and useful small-footprint applications.'] "The Cisco 7940 and 7960 IP phones include a minimal XML browser. This browser understands a subset of the HTTP protocol and a very small number of predefined XML tags. These tags can, with a little ingenuity, be used to build new applications or new interfaces to existing applications. The phone's user can interact with the display using the phone keypad and the additional buttons provided. Here are the functions that can be performed by the set of XML tags that the phones understand: [1] Show a menu and select an item from it (<CiscoIPPhoneMenu>); [2] Show some text (<CiscoIPPhoneText>); [3] Accept input (<CiscoIPPhoneInput>); [4] Show phone directory entries, and allow the user to call a number associated with one of those entries (<CiscoIPPhoneDirectory>); [5] Show a picture (<CiscoIPPhoneImage>); [6] Show a menu, built using graphics (<CiscoIPPhoneGraphicMenu>). Cisco provides a document, 'Cisco IP Phone Services Application Development Notes,' that describes these XML tags in detail... Note that the browser provides no scripting language and does not maintain state. It does not recognize <META> tags, and so cannot apply stylesheets to create the required XML.
Therefore, the server must provide the appropriate XML tags. Only the HTTP GET method can be used; POST is not available..." See also the DTMF FAQ - Telephone Tone Dialing chips V1.20.
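A server-side sketch of the kind of document such a phone browser expects follows. The top-level tag <CiscoIPPhoneMenu> comes from the article's list above; the child tags (Title, MenuItem, Name, URL) follow Cisco's development notes as commonly described, but treat the exact structure as an assumption here:

```python
# Build a minimal CiscoIPPhoneMenu document of the sort a phone-services
# server would return to the handset's XML browser over a plain HTTP GET.
# Child tag names (Title, MenuItem, Name, URL) are assumed, not quoted
# from the article.
import xml.etree.ElementTree as ET

menu = ET.Element("CiscoIPPhoneMenu")
ET.SubElement(menu, "Title").text = "Cafeteria"

item = ET.SubElement(menu, "MenuItem")
ET.SubElement(item, "Name").text = "Today's specials"
ET.SubElement(item, "URL").text = "http://server.example/specials.xml"

# No scripting, no state: the handset simply renders this tree and
# issues another GET when the user picks the item.
print(ET.tostring(menu, encoding="unicode"))
```

Because the browser keeps no state, each user action is just another GET for another small document like this one.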

  • [July 24, 2002] "Locally Linked Infosets." By Francis Norton. Original Posting: 2002-07-04. Referenced on W3C's list ''. "The aim of this note is to propose a minimally invasive and maximally flexible mechanism for accessing Post Schema Validation Information. It was partly inspired by proposals from Rick Jelliffe (Schemachine) and Eric van der Vlist (xvif)... The basic idea is to make Post Validation information available through a small API rather than through structural changes to the infoset. In order to avoid schema engine bias, and to keep the specification as simple as possible, LLI does not include a general specification of Post Validation Information structure; it simply requires validation results to be returned as a node-set, in the same way as the XSLT document() function. The implementors or designers of individual schema engines would be responsible for specifying or agreeing how their result data should be represented. Using Locally Linked Infosets provides a framework for validation processing which permits sequential or parallel application of relevant schemas using suitable schema engines to appropriate source nodes. It offers flexibility and simplicity by providing a loosely coupled link between data and metadata. Adopting this proposal would make it possible to access post validation metadata from within tools such as XSLT or the DOM at - I believe - a reasonably low cost of implementation, would not discriminate against non-XSDL schema engines, and would make the resulting metadata simple to comprehend, use, and serialise..."
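The core idea, validation results returned as an ordinary node-set rather than grafted onto the input infoset, can be sketched in a few lines. All names below are invented; the note deliberately leaves the result structure to each schema engine:

```python
# Toy sketch of the LLI idea: a "schema engine" returns its findings as a
# plain node-set (here, an ElementTree tree), leaving the input untouched.
# The API and result vocabulary are invented for illustration.
import xml.etree.ElementTree as ET

def validate(doc: ET.Element) -> ET.Element:
    """Stand-in engine: report every empty leaf element as a node-set."""
    results = ET.Element("validation-results")
    for el in doc.iter():
        if len(el) == 0 and not (el.text or "").strip():
            ET.SubElement(results, "error", path=el.tag).text = "empty element"
    # Just another tree: consumable from XSLT or the DOM like document()
    return results

doc = ET.fromstring("<po><item>widget</item><qty/></po>")
report = validate(doc)
print([e.get("path") for e in report])  # ['qty']
```

The source document is never modified, so any engine, XSDL-based or not, can plug in as long as it hands back a node-set.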

  • [July 24, 2002] "Schemachine. A Framework For Modular Validation of XML Documents." By Rick Jelliffe. June 21, 2002. [See previous entry. "This note specifies a possible framework for supporting modular XML validation. It has no official status whatsoever. It is for discussion purposes only. Review comments are welcome... This has been developed as a strawman for the ISO DSDL effort. For another strawman using a different basis, see Eric van der Vlist's Xml Validation Interoperability Framework (xvif)..."] "The strawman has the following features: (1) based on XML Pipeline structures, but with rearrangement and renaming, (2) embedded in Schematron-like superstructure with titles and phases, (3) a minimal implementation is possible, where all validators and translators are command-line executable programs, and the framework document is translated into BAT files or Bourne shell scripts (i.e., validators etc. are treated as black boxes), (4) the purpose is validation rather than declarative description per se. (In particular, the further down a transformation chain that data gets, the more difficult it will be to tie the effect of a schema to the original document.) (5) this framework supports both validation of explicit structure and validation of complex data values. It leaves issues of simple datatyping to particular validators, (6) validation is a tree of processes, (7) supports inband signalling (@exclude) and out-of-band signalling (@haltOnFail)..." See: "Document Schema Definition Language (DSDL)."
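The pipeline notion, black-box validators chained together, with a halt-on-fail switch, can be sketched with a toy runner. The API is invented; the strawman itself targets command-line validators wired together via BAT or Bourne scripts:

```python
# Toy sketch of a Schemachine-style pipeline: run a chain of black-box
# validators over a document, optionally halting at the first failure
# (compare the strawman's @haltOnFail out-of-band signalling).
def run_pipeline(doc, validators, halt_on_fail=True):
    results = []
    for name, check in validators:
        ok = check(doc)
        results.append((name, ok))
        if halt_on_fail and not ok:
            break  # stop the chain, like @haltOnFail
    return results

doc = {"root": "invoice", "has_total": False}
validators = [
    ("structure", lambda d: d["root"] == "invoice"),
    ("totals", lambda d: d["has_total"]),
    ("datatypes", lambda d: True),  # skipped when an earlier stage fails
]
print(run_pipeline(doc, validators))
# [('structure', True), ('totals', False)]
```

In a minimal real implementation, each `check` would be an external program and the runner a generated shell script; the tree-of-processes shape stays the same.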

  • [July 24, 2002] "Public Nodes Implement UDDI V2 Specification." By [InternetWeek Staff]. In InternetWeek (July 24, 2002). "The Universal Description, Discovery, and Integration (UDDI) project said Wednesday [2002-07-24] that the public nodes making up the UDDI Business Registry are now compliant with Version 2, the latest, of the UDDI specification. UDDI defines a uniform way for businesses to describe their services, find other companies' services, and understand how to conduct e-commerce with another business. With the added support for V2, UBR is now a multi-version UDDI registry with support for V1 and V2. More than 10,000 businesses have registered with the three public UBR nodes, as have 4,000 individual Web services, with all registered data being replicated between the UBR nodes. This means services registered in one node can be discovered in others. The number of public node operators is now at three with a fourth to make its node publicly available this fall..." See (1) the announcement, " Announces an Enhanced UDDI Business Registry. Multi-version UDDI Business Registry Goes Live."; (2) "Universal Description, Discovery, and Integration (UDDI)."

  • [July 23, 2002] "OASIS Forms WS-Security Committee." By Brian Fonseca. In InfoWorld (July 23, 2002). "Microsoft and IBM moved one step closer to turning their security specification into a standard on Tuesday. Clearing a significant hurdle for the WS-Security standard to gain recognition as a trusted means for applying security to Web services, standards body OASIS (Organization for the Advancement of Structured Information Standards) formed a technical committee to give vendors a crack at the immature specification. First published in April as part of a working partnership between Microsoft, IBM, and VeriSign, the WS-Security specification defines a standard set of SOAP extensions, or message headers, which can be used to set and unify multiple security models, mechanisms, and technology -- such as encryption and digital signatures for instance -- onto Web services applications which traverse the Internet. Aside from an initial WS-Security road map, the trio also proposed specifications yet to come that address a variety of other security, policy, messaging, and trust issues associated with Web services security. They include WS-Policy, WS-Trust, WS-Privacy, WS-Secure Conversation, WS-Federation, and WS-Authorization..." See the news item of 2002-07-23.
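The "SOAP extensions, or message headers" idea is concrete enough to sketch: security claims ride in a header block inside the SOAP envelope. The namespace URIs below match the era's drafts but should be treated as assumptions, and a real exchange would carry XML Signature or encryption structures rather than a bare token:

```python
# Sketch of a WS-Security-style SOAP message: a Security header entry
# carrying a UsernameToken, alongside the ordinary Body. Namespace URIs
# are assumptions based on the 2002-era drafts, not quoted from the article.
import xml.etree.ElementTree as ET

SOAP = "http://schemas.xmlsoap.org/soap/envelope/"
WSSE = "http://schemas.xmlsoap.org/ws/2002/04/secext"

env = ET.Element(f"{{{SOAP}}}Envelope")
header = ET.SubElement(env, f"{{{SOAP}}}Header")
sec = ET.SubElement(header, f"{{{WSSE}}}Security")
tok = ET.SubElement(sec, f"{{{WSSE}}}UsernameToken")
ET.SubElement(tok, f"{{{WSSE}}}Username").text = "zoe"

body = ET.SubElement(env, f"{{{SOAP}}}Body")
ET.SubElement(body, "getQuote").text = "IBM"

print(ET.tostring(env, encoding="unicode"))
```

Because the security data is just another header block, intermediaries that do not understand it can pass the message along untouched, which is the point of doing security as SOAP extensions.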

  • [July 23, 2002] "XML and Bibliographic Data: the TVS (Transport, Validation and Services) Model." By Joaquim Ramos de Carvalho (IHTI Faculdade de Letras, Universidade de Coimbra, Portugal) and Maria Inês Cordeiro (Art Library, Calouste Gulbenkian Foundation, Portugal). Paper prepared for the 68th IFLA General Conference and Council 'Libraries for Life: Democracy, Diversity, Delivery', August 18-24, 2002, Glasgow, Scotland. 13 pages, with 44 references. "This paper discusses the role of XML in library information systems at three major levels: as a representation language that enables the transport of bibliographic data in a way that is technologically independent and universally understood across systems and domains; as a language that enables the specification of complex validation rules according to a particular data format such as MARC; and, finally, as a language that enables the description of services through which such data can be exploited in alternative modes that overcome the limitations of the classical client-server database services. The key point of this paper is that by specifying requirements for XML usage at these three levels, in an articulated but distinct way, a much needed clarification of this area can be achieved. The authors conclude by stressing the importance of advancing the use of XML in the real practice of bibliographic services, in order to improve the interoperable capabilities of existing bibliographic data assets and to advance the WWW integration of bibliographic systems on a sound basis... By 'transport format' we mean an XML format designed to take a role similar to that of ISO 2709. Its purpose is to allow the efficient transport of bibliographic data. 
Like ISO 2709 such a format contains the necessary information for representing the morphological structure of the MARC record, i.e., without aiming at validation of the complete syntax/semantics of the MARC format, but rather mapping directly to the MARC record main structural levels. ISO 2709 was modelled to the needs of the technological environment of its time; an XML equivalent for the current technological context must target the interoperability paradigms of today: Web services. So far, most of the approaches to encoding bibliographic records in XML have been based on the assumption that the use of XML would imply the expression of the whole syntax/semantics of MARC, enforcing validation. Not only has this approach proved difficult (very complex and long DTDs or schemas), but it also does not facilitate the transport and reuse of data in practical applications. This is because in such a way only valid records can be represented, and also because it is difficult to rebuild MARC data from the very complex MARC XML records generated in such a model..." From a posting: "A web page with various examples and sample code is available; the page proposes a mapping that is targeted to Web Services development. Sample web services are available at the National Library of Portugal, allowing experimental access to over one million records..." See "MARC (MAchine Readable Cataloging) and SGML/XML." [cache]
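A transport format in the authors' sense maps only MARC's structural levels (leader, fields with tag and indicators, subfields) into XML, deliberately ignoring MARC semantics. A minimal sketch, with invented element names since the paper does not fix a vocabulary here:

```python
# Sketch of a MARC 'transport format': represent the record's structural
# levels only (leader, datafields, subfields), with no semantic validation.
# Element and attribute names are invented for illustration.
import xml.etree.ElementTree as ET

def marc_to_xml(leader, fields):
    rec = ET.Element("record")
    ET.SubElement(rec, "leader").text = leader
    for tag, ind1, ind2, subfields in fields:
        f = ET.SubElement(rec, "datafield", tag=tag, ind1=ind1, ind2=ind2)
        for code, value in subfields:
            ET.SubElement(f, "subfield", code=code).text = value
    return rec

rec = marc_to_xml("00000nam", [
    ("245", "1", "0", [("a", "XML and bibliographic data")]),
])
print(ET.tostring(rec, encoding="unicode"))
```

Any record, valid MARC or not, survives the round trip, which is exactly the transport property the paper argues full-semantics DTDs and schemas sacrifice.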

  • [July 23, 2002] "IBM Details Next Version of DB2 Database, Ships Beta." By Richard Karpinski. In InternetWeek (July 23, 2002). "With increased Web services support and improved management tools, IBM on Monday released a general beta of its DB2 database it hopes represents the next weapon in its battle with database rivals Oracle and Microsoft... Standards support for Web services is building in DB2. The database supports the basics -- such as XML schemas and Simple Object Access Protocol (SOAP) -- as well as SQL-X, its own effort to link SQL and XML in what Jones calls a 'semi-standard' way. Overall, IBM has extended SQL in more than 100 ways to better support XML-style content. DB2 8.0 also includes automated, built-in support for XML transformations that programmers typically are required to write, enabling XML documents to be viewed on a Web browser. The ultimate way for dealing with XML in databases, however, is via proposed standard X-Query, which will not be supported in DB2 8.0. IBM has demonstrated support for X-Query in a prototype of a future version of DB2. V8.0 features new 'multi-dimensional clustering' capabilities that enable customers to reorganize information in one easy step, so that it can be retrieved quickly, from any view the customer may require. Also on tap are new online utilities to perform tasks such as table reorganization, index maintenance, and database loads online. DB2 8.0 is available now as a free beta..."

  • [July 22, 2002] "Keeping Web Services Royalty-Free." By Anne Chen. In eWEEK (July 22, 2002). "Tim Berners-Lee, director of the World Wide Web Consortium, opened The Open Group Conference ['Boundaryless Information Flow: The Role of Web Services'] on Monday by asking his colleagues to preserve the universality and openness of the Web as they build Web services as the foundation for the future of the Internet. And, said Berners-Lee, a key to preserving openness on the Web is to keep standard Web protocols patent- and royalty-free. 'Remember that the ways to use Web services will not only be wide, but will be the base for all kinds of things people will build for the Web in the future,' Berners-Lee said. 'We have to be generic in design because Web services, if done well, will be flexible and apply to everything from mainframes to cell phones.' [...] Berners-Lee said maintaining an open, royalty-free approach will be important for both Web services and semantic Web standards. (Web services standards like SOAP and UDDI focus on how online services are found and how they interact, while semantic Web standards include industry-specific markup languages that help define the nature and behavior of objects online.) Although the two sets of standards are currently being pursued in separate, parallel tracks, Berners-Lee predicted that the semantic Web will become increasingly important as Web services mature. As such, the future of the Internet, he said, will revolve around Web services, voice routing, and the semantic Web. Berners-Lee said that, in order for all or any of those capabilities to see market growth, the base standards need to be patent-free. How that will happen is an issue the W3C has been grappling with over the past year... 'The W3C is working on a process so people used to using patents to protect themselves can sit around a table and come to an agreement where no one will charge royalties regardless of whether they have a patent...'
For the conference The Open Group partnered with organizations including the W3C, OASIS (Organization for the Advancement of Structured Information Standards), OMG (Object Management Group), OGC (Open GIS Consortium), OAG (Open Applications Group) and DMTF (Distributed Management Task Force) to address the future and the development of Web services and associated standards." For background on W3C's RF Patent policy, see the document on Patents and Open Standards.

  • [July 20, 2002] "XMLTangle - Literate Programming in XML." By Jonathan Bartlett. July 2002. "Literate Programming is a style of programming in which the programmer writes an essay instead of a program. The essay's code fragments are then merged together to form a full program which can be compiled or interpreted. This article is a literate program designed to perform this task with XML documents. Donald Knuth's Literate Programming is a wonderful system for writing programs which are understandable and maintainable. It allows the programmer to not just communicate to the computer, but also communicate the ideas behind the program to current and future programmers. This idea has not caught on, but I believe it is still a worthy goal. The current literate programming tools are problematic, however. They are still too wedded to individual programming languages and document formats. This program is a version of the tangle program which has the following features: (1) Works with any programming language (2) Uses XML as the documentation language (3) Is not tied to any specific DTD - instead it relies on processing instructions to perform its tasks. It does not include every feature of literate programming - specifically it does not include any macro facility. Originally this program was written in C, only worked with the DocBook DTD, and only had a very primitive subset of the literate programming paradigm. Specifically, the code could only be broken up into files - it was not possible to include named code fragments which would be defined elsewhere - you could only append to files. This version is written in Python and captures much more of the literate paradigm..." See: (1) the SourceForge XMLTangle project page; (2) "SGML/XML and Literate Programming."
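The processing-instruction approach can be sketched in a few lines. The PI name below is invented (the article does not specify XMLTangle's actual PI vocabulary here); note that `xml.dom.minidom`, unlike ElementTree's default parser, keeps processing instructions in the tree:

```python
# Toy tangle in the spirit described: a <?code-file ...?> processing
# instruction names the output file for the code in the following element.
# The PI name and document shape are invented for illustration.
from xml.dom import minidom

doc = minidom.parseString("""<article>
  <para>The program starts with a greeting.</para>
  <?code-file hello.py?>
  <code>print("hello")</code>
</article>""")

fragments = {}
for node in doc.documentElement.childNodes:
    if node.nodeType == node.PROCESSING_INSTRUCTION_NODE and node.target == "code-file":
        el = node.nextSibling
        while el and el.nodeType != el.ELEMENT_NODE:
            el = el.nextSibling  # skip whitespace text nodes
        fragments[node.data] = el.firstChild.data

print(fragments)  # {'hello.py': 'print("hello")'}
```

Because the filename lives in a PI rather than an element, the surrounding document can use any DTD at all, which is the DTD-independence the article claims.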

  • [July 20, 2002] "Financial Products Markup Language (FPML) Messaging Structure." FPML Technical Note. Version 1-0. April 09, 2002. Version URI: "The FpML Task Force, a working group with members from all other working groups, was formed in January 2002 to address, and propose recommendations on, issues that cut across all working groups. This work was concluded in April 2002 and the recommendations incorporated into the FpML 3.0 Working Draft. One of the items to address was the messaging structure required within FpML... This technical note documents the work done on the messaging framework by the Task Force and is intended to provide the starting point for the work of a future Messaging Working Group... The workflow model should ideally be documented in the form of UML sequence diagrams, with accompanying text. The Task Force felt that this was the best vehicle for analysing, agreeing and publicising the overall workflow structure. Detailed definition of the document schema would follow from the workflow model. The workflow model and the resultant schema structure should take into account the requirements of both business-to-business (B2B) and internal application-to-application (A2A) usage... As well as covering the pre- and post-trade workflow, the model should address modification and cancellation events at each stage of the life cycle. Some form of message header will be required. This will pertain to the FpML document itself, and will be independent of any trade-specific header information. ... The following three diagrams illustrate the exchanges involved in three standard workflow scenarios: requesting a price, making an order, and seeking a valuation of a position or portfolio..." Section 4 provides the reworked experimental DTDs. See "Financial Products Markup Language (FpML)." [Source .DOC]

  • [July 20, 2002] "Keeping Pace With James Clark. An interview (and analysis) with the leading authority on markup languages." By Uche Ogbuji (Principal Consultant, Fourthought, Inc.). From IBM developerWorks, XML Zone. July 2002. Note: a comment was heard from Uche Ogbuji on XML-DEV to the effect that "The interview [with James Clark] was actually conducted several months ago. There were some editorial delays. I think it's still timely..." ['James Clark is arguably the most accomplished developer in the world of markup languages. In his distinguished career of contributing to both SGML and XML, he has served on standards bodies, provided important practical perspectives on where markup meets traditional code, and most importantly, written many of the programs that have moved XML (and SGML before it) from the world of abstract speculation into hard practicality. In this article, Uche Ogbuji interviews James Clark, concentrating on a discussion of practical developments, current and future, in the world of XML. The author also provides his own analysis of the issues raised.'] "In December of 2001, James Clark was awarded the inaugural XML Cup by the XML 2001 conference committee. This award was in honor of his many accomplishments for the XML community, and no one was in doubt of his deserving it. If you have done anything at all with markup languages, you have probably used code written by James Clark... Notably, all this work is open source, and has been since before open source came into high fashion. Clark received the award for his recent contributions as well. In 2001, he pioneered TREX, an alternative XML schema language which has since merged with the RELAX language, to form RELAX NG. The latter is proving extremely popular with developers despite competition from the W3C-sanctioned schema definition language. And to prove his grasp of XML's future, Clark followed his award with a speech on the five technical challenges currently facing XML. 
In this speech, he highlighted many technical issues that have caused much grumbling among developers, and he struck a cautionary note about some XML developments that might cause technical difficulties down the road... This article is based on an interview with James Clark, in which I followed up on these technical challenges. The questions and answers are presented, and in some cases, they are followed by my own explanation and analysis. Readers should be very familiar with XML and reasonably familiar with the various XML processing technologies..." See "James Clark First Recipient of the IDEAlliance XML Cup Award." [Article also in PDF format.]

  • [July 20, 2002] "The Essence of XML." By Jérôme Siméon (Bell Laboratories) and Philip Wadler (Avaya Labs). Invited paper prepared for presentation at the Sixth International Symposium on Functional and Logic Programming (FLOPS 2002), University of Aizu, Aizu, Japan, September 15-17, 2002. 14 pages, with 22 references. Referenced on Philip Wadler's XML page. "The World-Wide Web Consortium (W3C) promotes XML and related standards, including XML Schema, XQuery, and XPath. This paper describes a formalization of XML Schema. A formal semantics based on these ideas is part of the official XQuery and XPath specification, one of the first uses of formal methods by a standards body. XML Schema features both named and structural types, with structure based on tree grammars. While structural types and matching have been studied in other work (notably XDuce, Relax NG, and a previous formalization of XML Schema), this is the first work to study the relation between named types and structural types, and the relation between matching and validation. The dichotomy between names and structures is not quite so stark as at first it might appear. Many languages use combinations of named and structural typing. For instance, in ML record types are purely structural, but two types declared with 'datatype' are distinct, even if they have the same structure. Further, relations between names always imply corresponding relations between structures. For instance, in Java if one class is declared to extend another then the first class always has a structure that extends the second. Conversely, structural relations depend upon names. For instance, names are used to identify the fields of a record... Our aim is to model XML and Schema as they exist -- we do not claim that these are the best possible designs. Indeed, we would argue that XML and Schema have several shortcomings. 
First, we would argue that a data representation should explicitly distinguish, say, integers from strings, rather than to infer which is which by validation against a Schema. (This is one of the many ways in which Lisp S-expressions are superior to XML.) Second, while derivation by extension in Schema superficially resembles subclassing in object-oriented programming, in fact there are profound differences. In languages such as Java, one can typecheck code for a class without knowing all subclasses of that class (this supports separate compilation). But in XML Schema, one cannot validate against a type without knowing all types that derive by extension from that type (and hence separate compilation is problematic). Nonetheless, XML and Schema are widely used standards, and there is value in modeling these standards. In particular, such models may: (i) improve our understanding of exactly what is mandated by the standard, (ii) help implementors create conforming implementations, and (iii) suggest how to improve the standards..." See also the previous/preliminary version. General references: "XML Schemas." [cache]
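The named-versus-structural distinction the paper studies can be illustrated with a small Python analogy (Python is not used in the paper; the Meters/Feet types below are invented for illustration): two types with identical structure are distinct under a nominal check but indistinguishable under a purely structural one.

```python
# Illustrative analogy: Meters and Feet have the same structure (one float
# field named "value"), but as *named* types they are distinct, while a
# *structural* check that only compares shape treats them as equivalent.
from dataclasses import dataclass, fields

@dataclass
class Meters:
    value: float

@dataclass
class Feet:
    value: float

def same_structure(a, b):
    """Structural comparison: same field names and declared types."""
    return [(f.name, f.type) for f in fields(a)] == \
           [(f.name, f.type) for f in fields(b)]

m, f = Meters(3.0), Feet(3.0)
print(isinstance(m, Feet))      # nominal check: False, the names differ
print(same_structure(m, f))     # structural check: True, the shapes match
```

This mirrors the ML example in the excerpt: record types compare structurally, while two 'datatype' declarations are distinct even when structurally identical.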

  • [July 20, 2002] "OMG Models for Web Services." By Carolyn A. April. In InfoWorld (July 17, 2002). "Standards stalwart Object Management Group (OMG) is accelerating its drive for a modeling-based approach to enterprise collaboration with a recent overture aimed at Web services. The OMG is currently developing a standard way to use mapping to connect Web services to its ECA (Enterprise Collaboration Architecture), a framework for modeling complex business processes that tie together systems, customers, and partners electronically. ECA is based on UML (Unified Modeling Language) and is tools-agnostic in terms of development. Essentially, the ECA gives developers a way to create meta-models that define common ways for representing shared business processes, such as agreeing on a sales contract, as well as the multi-formatted data behind those processes, according to OMG officials. It also defines a common way to store this information, including within a UDDI (Universal Description, Discovery, and Integration) directory or a meta-object repository. By defining and storing business collaborations as models, companies are provided immunity to the ongoing evolutions of underlying middleware, data formats, and other infrastructure, said Jon Siegel, vice president of technology transfer at OMG, based in Needham, Mass... Web services standards are just one set of middleware technologies that the OMG intends to link to ECA through mapping, according to Siegel. Currently, OMG is working to extend its mapping to ebXML (e-business XML), J2EE (Java 2 Enterprise Edition), .Net, and other environments..." See also "Object Management Group Issues Web Services for Enterprise Collaboration (WSEC) RFP."

  • [July 20, 2002] "Enterprise Collaboration Architecture (ECA). MDA for Enterprise Collaboration and Integration." By Cory Casanave (President, Data Access Technologies). Presented at the Interoperability Summit Series meeting June 26-27, 2002 in Orlando, Florida. 42 pages. ['The OMG has recently adopted the Enterprise Collaboration Architecture (ECA) as part of the UML for EDOC set of specifications. ECA describes how to model enterprise collaborations with UML and use model driven development to implement collaborative business processes using a variety of middleware technologies.'] "ECA is a profile of UML, providing a way to use UML for the specific purpose of Internal and B2B collaboration and integration. ECA is an OMG standard. You can also think of this as a modeling framework for enterprise computing. ECA is part of the OMG's Model Driven Architecture (MDA). It uses precise modeling techniques as part of the development lifecycle to speed development and provide technology independence. ECA can represent and map to the semantics for multiple technologies, integrating the relevant technologies and standards; the mappings can include Corba, .NET, J2EE, WSDL, WSFL, WS-I, etc. Much of the ECA conceptual foundation was worked out by Trygve Reenskaug (Oslo), particularly OORAM (Object Oriented Role Analysis)..." Adapted from the text of the presentation. [cache]

  • [July 20, 2002] "TransactionMinder v5.5. Securing Web Services and Business-to-Business Integration Environments." A Netegrity White Paper. July 12, 2002. 17 pages. [TransactionMinder is one of the products announced at the SAML Interop Summit] "... For all their benefits, Web services are not without challenges. One of them is security. Today, Web services platforms are limited to basic transport-level security. The XML messages that are exchanged in Web services requests and responses must include security information that goes beyond the transport layer. Transport-level security enables point-to-point sessions. However, in Web services environments, many intermediaries can be involved in a transaction. For example, XML documents may be routed through various servers and they can be processed by multiple backend business applications necessary to complete a transaction. Each intermediary involved in a transaction can be a Web service in its own right. Intermediaries require security information from incoming requests and may need to provide additional security information to the next intermediary involved in the transaction process. Transport-level security by itself falls short of such requirements. What is needed in addition to transport-level security is a way to provide security information for the XML messages exchanged in Web services and other business-to-business transactions. In other words, in addition to securing the communication of messages (which transport-level security does well), we need to be able to selectively secure the content of the documents used in that communication, thus providing support for fine-grained access-control and dynamic authorization decisions... TransactionMinder provides a policy-based platform for securing the content of XML documents and messages used in Web services and market-leading, XML-based, business-to-business integration (B2Bi) infrastructures. 
TransactionMinder builds upon Netegrity's shared-services vision embodied in SiteMinder, Netegrity's flagship product for securing access to Web-based documents and business applications within a single enterprise and across multiple partners. [It provides] support for user credentials embedded in XML documents, XML Digital Signature, and the Security Assertion Markup Language (SAML); for user and trading-partner directories stored in Lightweight Directory Access Protocol (LDAP) environments or relational databases... It supports fine-grained access control, allowing for authorization policies based on information provided at any layer of the XML message (transport, envelope, or business payload)..." See: (1) "Security Assertion Markup Language (SAML)"; (2) the announcement: "Netegrity Announces New TransactionMinder Product. Provides Missing Link for Authentication and Authorization of Web Services."
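The content-level authorization idea described above (policies keyed on information inside the XML message rather than on the transport session) can be sketched as follows. The element names, the role vocabulary, and the policy are invented for illustration and are not TransactionMinder's actual model:

```python
# Sketch of message-content authorization: the decision is driven by
# credentials embedded in the XML message itself, not by the transport
# session, so each intermediary can re-evaluate it. All names are
# illustrative assumptions.
import xml.etree.ElementTree as ET

POLICY = {"submitPO": {"purchasing-manager"}}   # action -> roles allowed

def authorize(xml_message):
    root = ET.fromstring(xml_message)
    action = root.findtext("Header/Action")           # envelope layer
    role = root.findtext("Header/Credentials/Role")   # embedded credential
    return role in POLICY.get(action, set())

msg = """<Message>
  <Header>
    <Action>submitPO</Action>
    <Credentials><Role>purchasing-manager</Role></Credentials>
  </Header>
  <Body><PurchaseOrder total="9200"/></Body>
</Message>"""
print(authorize(msg))
```

A real deployment would verify the credential cryptographically (XML Digital Signature, SAML assertions) before trusting it; the sketch shows only where in the message the decision inputs live.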

  • [July 19, 2002] "A Realist's SMIL Manifesto, Part II." By Fabio Arciniegas. July 17, 2002. ['Using SMIL 2.0 to implement narrative strategies in multimedia. In this two-part series Fabio has invited us to take another look at SMIL, the Synchronized Multimedia Integration Language. The trouble with most of the current SMIL literature, maintains Fabio, is that it fails to cater properly to either of its two potential audiences: Web developers and multimedia creatives. In the first half of this article, Fabio introduces the basics of the SMIL technology. In part 2, he brings the focus onto the creative techniques, and shows, with the aid of examples, how SMIL can be used to achieve common narrative devices in multimedia and movies.'] "In the first part of this article series, I mentioned two big problems -- and addressed the first -- obstructing widespread adoption of SMIL: (1) confusion about terminology, versioning, and structure; (2) lack of business and artistic orientation from current literature. The second problem has been key in making SMIL a tough sale because -- just like Flash and SVG -- it is a creative-oriented technology. It lives in the middle of the programmer-designer spectrum, where technocrat literature fails to attract many people from either side. On one hand, web designers seeing a bouncing ball on the screen tend to react with a simple 'I can easily do that in Flash', which is true. On the other hand, programmers, who appreciate the tech-appeal of the way in which the ball is made to bounce, are not being educated about the possibilities SMIL offers for expression. As a result, people on both ends tend to dismiss the whole technology as a nice toy. The goal of this article is to show SMIL's potential as a technology in service of narrative strategies, adding something extra to the media-rich Web... In what follows I will explore three narrative strategies and how to implement them using three important features of SMIL 2.0. 
The features explained are transitions, declarative animation, and SMIL 2 events. The narrative strategies are condensation, synecdoche, and spatial montage..." See: "Synchronized Multimedia Integration Language (SMIL)."

  • [July 19, 2002] "Processing SOAP Headers." By Rich Salz. July 17, 2002. ['Rich Salz examines the how and why of SOAP header processing. Rich takes last month's Google Web service example further and introduces an intermediary into the SOAP message chain.'] "Last month we built a simple client for the Google API. In this month's column we'll look at how SOAP headers can be used to talk to an intermediate server that adds value to the basic search service. The value-add is actually pretty silly: we'll send the query, pick one of the results at random to return, and send it back as an HTML page in Pig Latin. Our goal, however, is to understand how to process SOAP headers, and why you'd want to do so. But first I want to thank Google for providing a wonderful Web API, which it is, modulo the concerns I addressed in my first column. SOAP structures a message into two main parts: the headers and the body. I'll go out on a limb and say that almost all SOAP messages so far use the body. Very few put anything in the SOAP headers. I think the recent flurry of activity in SOAP security standards means that this will soon change, however, so it's worth understanding when and how to use SOAP header elements. SOAP is more than just a sender-receiver protocol, although that, too, is certainly the dominant use today. SOAP supports the concept of a message passing from a sender, possibly through one or more intermediaries, and ending up at its destination, more precisely known as the ultimate receiver... Along the way, the intermediaries may perform processing on the message or its side-effects. For example, a message may pass through a transaction service, providing a client with guaranteed invocation in the presence of network failures; a security service may sit at an enterprise portal, providing authentication information; and so on. An important aspect of these examples is that the basic operation is unchanged. 
While this isn't made explicit in the SOAP specifications, it's commonly accepted that intermediaries are intended to work primarily on the metadata of the SOAP message. SOAP headers are the ideal place for such data. SOAP headers are also a good place to put optional information, and a good means to supporting evolving interfaces..." See "Simple Object Access Protocol (SOAP)."
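The header mechanics discussed here can be sketched by building a SOAP 1.1 envelope whose header block is addressed to an intermediary with the actor attribute and flagged mustUnderstand, so the targeted node must either process it or fault. The PigLatin header, its namespace, and the actor URI are invented for illustration, echoing the column's example:

```python
# Sketch of a SOAP 1.1 envelope with a header block for an intermediary.
# The PigLatin block and its namespace are illustrative assumptions.
import xml.etree.ElementTree as ET

SOAP = "http://schemas.xmlsoap.org/soap/envelope/"
DEMO = "http://example.org/piglatin"            # assumed namespace

def make_envelope():
    env = ET.Element(f"{{{SOAP}}}Envelope")
    header = ET.SubElement(env, f"{{{SOAP}}}Header")
    # Header block targeted at one intermediary in the message path.
    block = ET.SubElement(header, f"{{{DEMO}}}PigLatin")
    block.set(f"{{{SOAP}}}actor", "http://example.org/translator")
    block.set(f"{{{SOAP}}}mustUnderstand", "1")  # process it or fault
    body = ET.SubElement(env, f"{{{SOAP}}}Body")
    ET.SubElement(body, f"{{{DEMO}}}doGoogleSearch").text = "xml"
    return env

env = make_envelope()
print(ET.tostring(env, encoding="unicode"))
```

The point the column makes is visible in the structure: the body carries the unchanged basic operation, while the header carries metadata that only the named intermediary consumes.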

  • [July 19, 2002] "The True Meaning of Service." By Kendall Grant Clark. July 17, 2002. ['One oft-discussed topic is whether the future of the Web belongs to Web services or to the Semantic Web. Kendall Clark discovers that a new project, DAML-S, sets out to unify the two, and indeed could reach further into distributed computing in general.'] "At some point over the past 18 months the future direction of the Web began to be seen widely as a struggle between 'Web Services' and the 'Semantic Web'. The former was thought to be rooted in IBM-Microsoft-Sun and industry, the latter in the W3C and academia... Part of the debate between services and semantics is a replay of the debate about what makes the Web an interesting place: commerce or content? In the conventional wisdom, services represent the commerce part of the Web, while semantics represent its content... DAML-S is much more than a thought experiment... DAML-S, presently at version 0.6, is an upper ontology of web services being built on top of DAML+OIL (which is morphing into WebOnt, a W3C project). The idea is that high-level ontologies of web resources can be very useful things and, here's the kicker, web services are just a kind of web resource. Web services are resources that, as The DAML Services Coalition puts it, 'allow one to effect some action or change in the world, such as the sale of a product or the control of a physical device'... The first bit about DAML-S to grasp is that it's a high-level ontology, one that sits at the application level and is meant to answer the what- and why-questions about a web service, as opposed to the how-questions, which are the domain of WSDL... The motivations for creating DAML-S include discovery, invocation, interoperation, composition, verification, and monitoring. 
A part of the practical cash value of any web technology labeled 'semantic' is that the Web ought to provide something relatively useful in response to vague or even cryptic input from the end-user. If the end-user inputs "airline reservation" to her autonomous web agent, among the outputs ought to be, for example, the starting point of a web service which will guide her into making a flight reservation..." [DAML-S from the website description: "DAML-S supplies Web service providers with a core set of markup language constructs for describing the properties and capabilities of their Web services in unambiguous, computer-interpretable form. DAML-S markup of Web services will facilitate the automation of Web service tasks including automated Web service discovery, execution, interoperation, composition and execution monitoring. Following the layered approach to markup language development, the current version of DAML-S builds on top of DAML+OIL."] References: "DARPA Agent Mark Up Language (DAML)."

  • [July 19, 2002] "Implementing XPath for Wireless Devices, Part II." By Bilal Siddiqui. July 17, 2002. ['In the second of a two-part series, we explore the implementation of XPath on wireless devices using the WAP family of standards. Bilal introduces more XPath functionality and provides the rest of the pseudo-code for its implementation.'] "In the first part of this article, we introduced XPath and discussed various XPath queries ranging from simple to complex. By applying XPath queries to sample XML files, we elaborated upon various important definitions of XPath such as location step, context node, location path, axes, and node-test. We then discussed complex XPath queries that combine more than one simple query. We also discussed the abstract structure of Wireless Binary XML (WBXML), which is the wireless counterpart of XML. Finally we presented the design of a simple XPath processing engine... In this part, we will discuss the features of XPath which allow for complex search operations on an XML file. We will discuss predicates or filtered queries and the use of functions in XPath. We will present various XPath queries for the processing of WSDL and WML. We will also enhance the simple design of our XPath engine to include support for predicates, functions, and different data types... We discuss the syntax and use of predicates and functions in XPath, present various WSDL and WML processing examples, demonstrate how to form complex XPath queries, and enhance the design of the XPath engine introduced in the first article..."
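Predicate queries of the kind the article covers can be tried directly with the limited XPath subset in Python's ElementTree (full XPath, including functions such as contains(), needs a library like lxml). The WML-like document below is invented for illustration:

```python
# XPath predicates against a small WML-like deck, using ElementTree's
# limited XPath subset. The document is an illustrative assumption.
import xml.etree.ElementTree as ET

wml = ET.fromstring("""<wml>
  <card id="home"><p>Welcome</p></card>
  <card id="search"><p>Find</p></card>
</wml>""")

# Attribute predicate: select the card whose id is "search".
card = wml.find("card[@id='search']")
print(card.findtext("p"))

# Positional predicate: the first card in document order.
print(wml.find("card[1]").get("id"))
```

In the article's setting the same location paths would be evaluated against a WBXML-decoded tree on the device rather than against textual XML.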

  • [July 19, 2002] "Catalyst 2002 SAML InterOp." By Prateek Mishra (Netegrity). July 15, 2002. Document used as part of a press-briefing at Catalyst 2002, San Francisco. It provides a (very) short overview of SAML and the InterOp event. Provides a SAML Introduction, Report on SAML Status, SAML InterOp Details, Relationship of SAML to other efforts. SAML (Security Assertion Markup Language) is a framework for exchange of security-related information, e.g., assertions. These assertions about authentication and authorization are expressed as XML documents. SAML solves two problems: (1) Identity Federation: Provides technology to allow a business to securely interact with users originating from its vendors, suppliers, customers etc. (2) Fine Grained Authorization: Users may authenticate at one site and be authorized by another. A SAML 'profile' describes how SAML should be used to solve some business problem, e.g., Web browser profiles for Single-Sign On (part of SAML 1.0) or WS-Security profile for securing web services (currently under development by the SSTC). SAML is NOT: a new form of authentication; an alternative to WS-Security; limited to legacy applications; limited to web browser applications; limited to web services security..." Details: see "Burton Group's Catalyst Conference Features SAML Interoperability Event."
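What "assertions expressed as XML documents" looks like in practice can be sketched with a stripped-down SAML 1.0-style authentication statement, parsed here to recover who authenticated and how. The XML below is a simplified illustration, not a complete, schema-valid assertion:

```python
# Parse a simplified SAML-style authentication assertion to extract the
# subject and the authentication method. The document is a stripped-down
# illustration, not a full SAML 1.0 assertion.
import xml.etree.ElementTree as ET

SAML = "urn:oasis:names:tc:SAML:1.0:assertion"
assertion = ET.fromstring(f"""
<Assertion xmlns="{SAML}" Issuer="https://idp.example.com">
  <AuthenticationStatement
      AuthenticationMethod="urn:oasis:names:tc:SAML:1.0:am:password">
    <Subject><NameIdentifier>alice</NameIdentifier></Subject>
  </AuthenticationStatement>
</Assertion>""")

stmt = assertion.find(f"{{{SAML}}}AuthenticationStatement")
subject = stmt.findtext(f"{{{SAML}}}Subject/{{{SAML}}}NameIdentifier")
print(subject, stmt.get("AuthenticationMethod"))
```

A relying party in the InterOp scenario would also check the issuer, validity period, and signature before trusting such a statement; the sketch shows only the document shape.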

  • [July 19, 2002] "Five Things You Should Know About Internet Identity." By Richard Karpinski. In InternetWeek (July 18, 2002). "Think of this as the week that Internet Identity moved from conception to reality. To be sure, many of the pieces necessary to help companies manage user identities -- and to begin to wrestle with how to make use of customer and trading partner profiles -- have been around for some time. Enterprise directories, access management, and control systems and authentication/digital signatures have been around for years. But this week saw some real progress in next-generation identity management. Vendors demonstrated interoperability using Security Assertion Markup Language (SAML), which will allow different systems to exchange standards-based identity tokens that will enable single sign-on to become a reality. The Liberty Alliance launched the first version of its specifications, which define how companies can deliver federated identity management capabilities. A slew of vendors -- including Oblix, Netegrity, OpenNetwork, Sun, Novell, Waveset and others -- detailed plans to support these emerging standards. And even Microsoft made some news, showing up at the Liberty announcement and floating a detente balloon by agreeing to support SAML across its identity and Web services security plans. So let's boil things down. What do enterprises need to know? Here's our list of the top five things we think we've culled from this week's events [...] The Three Layers Of Identity: Enterprise, B-To-B, And Public: (1) Already today, many large enterprises are rolling out major enterprise single sign-on projects; corporate single sign-on really doesn't require any of the standards that are emerging; administration is centralized so so-called federation standards like Liberty aren't a must-have; (2) The next use case is B-to-B, and this is where SAML begins to come in. 
Consider, for example, Boeing Corp., which runs a corporate extranet that sees literally thousands of trading partners looking to gain access to its collaborative applications at any time. By deploying a SAML-based solution, Boeing can let its access systems exchange SAML security information with other, non-Boeing systems -- essentially stitching together a single identity network that spans its enterprise and its trading partners. (3) Finally, the third model is the world of public identity. And this is where the concept of federation comes in. Consider a user sitting on the Dell or IBM Web site. He's looking to buy a PC. In a Liberty scenario, the site would issue a SAML token -- Liberty is strongly based on SAML -- which would not only log the user in on that site but could potentially be passed on to other parties, such as a shipping company like FedEx or UPS, without requiring the user to log in again..." References: (1) "Liberty Alliance Specifications for Federated Network Identification and Authorization"; (2) "Security Assertion Markup Language (SAML)."

  • [July 19, 2002] "Web Services Breeds Teamwork." By Vivienne Fisher. In ZDNN (July 19, 2002). "Two international standards bodies have teamed up on a forum about Web services, in a bid to clarify what's really going on. Web services has been a hot topic among businesses this year. Earlier this month, analysts told ZDNet Australia that they saw security issues as the number one roadblock to the takeup of Web services. The World Wide Web Consortium (W3C) and the Organization for the Advancement of Structured Information Standards (OASIS) have decided to organize a forum to try to educate users about the work both standards organizations have been doing. In particular, W3C's Web foundation work on XML-SIG, XKMS, Xenc, and its model for Web services architecture and the security segment. Also under discussion will be OASIS's security-related technologies such as SAML, WS-Security, and standards for access control, provisioning, biometrics and digital rights. Earlier this month it was announced that a Web services security technical committee was being formed. This followed IBM, Microsoft and VeriSign agreeing to submit the latest version of the WS-Security specification to OASIS for development, a move industry analysts predicted as a positive one for the Web services arena... Patrick Gannon, president and CEO at OASIS, said that there were a number of technical committees outside the core profile area, with most of the vendors involved in Web services having been long-standing members of the organization... Gannon described it as 'the ROI factor' -- relevance, openness and implementability. He said that this means that members work on standards that are relevant to the needs of the marketplace, often within relatively short timeframes..." See the news clipping on the W3C/OASIS Security Forum.

  • [July 19, 2002] "Amazon Offers Free Web Services to Help Drive Site Traffic." By Whit Andrews and Dean Lombardo (Gartner). Gartner Note Number FT-17-4938. ['Free Web services will enable Amazon affiliates to offer content and features. The technology will work, but affiliate Web sites considering the service should temper any extravagant expectations.'] "... Amazon's choice of Web services to boost its formidable affiliate sales program, Associates, demonstrates: (1) The technology's resiliency and appeal as a method of creating richer integration than can be done using simple HTML hyperlinking; (2) The easy development Web services allow, when compared to many other forms of business-to-business application integration... Enterprises and would-be affiliates (which Amazon terms 'associates') should be on guard against their own inflated expectations of the development model. Amazon's decision to expose product display and shopping cart functions as XML-based services via SOAP links is not equivalent to profitable exploitation of such a model. Rather, it more likely will reveal which of Amazon's claimed 800,000 affiliates are genuinely creative and willing to add value to the shopping experience, as opposed to those that simply want to grab a cut of friends' and relatives' Amazon goods purchases. Participants in the Associates Program can earn up to 15 percent of sales generated by referring visitors. Enterprises that use Amazon's affiliate sales program to make products available for sale should consider AWS as a way to improve their links to the merchandiser. Retailers for whom affiliate sales models have proven effective should enrich them by adding Web services to improve partner interfaces and functionality. Although Amazon is using the Web Services Description Language (WSDL) for long-term service maintenance and definition, developers should not assume the deployment is synonymous with stability. 
Web services remain fairly nascent, and deployments should reflect the caution with which Gartner has advised enterprises to approach them in general..." Also in PDF format. See details in the news item "Free Web Services Facility Supports XML/HTTP and SOAP." [cache]
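The XML-over-HTTP flavor of such a service can be sketched as building a query URL and parsing an XML product list out of the response. The endpoint, parameter names, and response format below are hypothetical illustrations, not Amazon's actual interface:

```python
# Sketch of an XML-over-HTTP product query: construct the request URL,
# then parse product details from the XML response. Endpoint, parameters,
# and response vocabulary are invented for illustration.
import urllib.parse
import xml.etree.ElementTree as ET

def build_query(base, **params):
    return base + "?" + urllib.parse.urlencode(params)

url = build_query("https://xml.example.com/onca/xml",
                  KeywordSearch="python", mode="books", type="lite")

# In real use the response would come from the network, e.g.:
#   response = urllib.request.urlopen(url).read()
canned_response = b"""<ProductInfo>
  <Details>
    <ProductName>Learning XML</ProductName>
    <OurPrice>34.95</OurPrice>
  </Details>
</ProductInfo>"""

for d in ET.fromstring(canned_response).iter("Details"):
    print(d.findtext("ProductName"), d.findtext("OurPrice"))
```

The SOAP variant would wrap the same request in an envelope and be described by the WSDL file the note mentions; the XML/HTTP variant is simply a GET plus XML parsing, which is why affiliates found it easy to adopt.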

  • [July 18, 2002] "DON XML WG XML Developer's Guide." Version 1.1. From the DON [US Department of Navy] XML Working Group. Edited by Brian Hopkins. May 2002. 97 pages. "This document is an early deliverable of the overall DON XML strategy for employing XML within the department. It provides general development guidance for the many XML initiatives currently taking place within the DON while the DON XML Work Group (DON XML WG) is in the process of developing a long-term strategy for aligning XML implementations with the business needs of the department. It is intended to be a living document that will be updated frequently. This version of the guidance is primarily written to assist developers in creating schemas that describe XML payloads of information. It should be noted that payloads represent only one component required for secure, reliable information exchange. Other components include a specification for reliable messaging (including authentication, encryption, queuing, and error handling), business service registry and repository functions, and transport protocols. Emerging technologies and specifications are providing, or will shortly provide, XML-based solutions to many of these needs. The DON XML WG is developing an XML Primer that will describe each of these components and bring together the overall strategy for capitalizing on XML as a tool for enterprise interoperability... This document is primarily intended for developers already familiar with XML; however, it has a comprehensive glossary that provides good starting points for XML beginners. Some of this document focuses on XML Schemas as a tool for interoperability... This guidance applies to all activities in the DON that are implementing applications that use XML for the exchange of information with other applications via public interfaces. This version of the developers guide contains guidance of a general nature that is applicable to both document-centric and data-centric information exchanges. 
It also contains specific guidance for data-centric exchanges necessary for enterprise interoperability. Specific guidance for document-centric applications will be forthcoming in the next version. These recommendations are not intended to restrict the use of XML internal to systems; the DON XML WG recommends that applications separate internal XML grammars processed by application code from those used for external communications. This decoupling of internally processed XML from the XML that is communicated externally insulates application code from XML vocabulary evolution and allows such loosely coupled applications to stay current with the latest schemas and components promulgated by communities of interest and Voluntary Consensus Standards." References: (1) Department of the Navy XML Quickplace; (2) "US Federal CIO Council XML Working Group"; (3) "Navy Issues XML Guide"; (4) following bibliographic entry.
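The decoupling the Work Group recommends (an internal XML grammar for application code, mapped into the external vocabulary only at the point of exchange) can be sketched as a thin translation layer. In this minimal Python sketch every tag name is hypothetical, not drawn from any DON schema:

```python
# Sketch: application code works only with the internal grammar; this mapping
# layer rewrites tags into the external vocabulary just before exchange.
# All element names here are invented for illustration.
import xml.etree.ElementTree as ET

TAG_MAP = {"ship": "Vessel", "shipName": "VesselName", "hullNo": "HullNumber"}

def to_external(internal_xml: str) -> str:
    root = ET.fromstring(internal_xml)
    for elem in root.iter():                      # rename every element in place
        elem.tag = TAG_MAP.get(elem.tag, elem.tag)
    return ET.tostring(root, encoding="unicode")

internal = "<ship><shipName>Example</shipName><hullNo>DDG-1</hullNo></ship>"
print(to_external(internal))
# <Vessel><VesselName>Example</VesselName><HullNumber>DDG-1</HullNumber></Vessel>
```

When the externally promulgated schema evolves, only the mapping table changes; the code that produced the internal grammar is insulated, which is the benefit the guide claims for loose coupling.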

  • [July 18, 2002] "Navy Steams Forward on XML Standardization." By Patricia Daukantas. In Government Computer News (July 17, 2002). "Less than a year into the Navy's effort to standardize the use of Extensible Markup Language, the service's XML working group recently published the second edition of its developers' guide. Since May 1, Navy developers have had Version 1.1 of the 41-page guide that the working group published last November as Version 1.0. Both are available online. The Navy is committed to participating in XML standards organizations, said Michael Jacobs, data architecture project lead for the Navy CIO's office. The service has already joined the Organization for the Advancement of Structured Information Standards (OASIS) and is applying for membership in the World Wide Web Consortium. ... It's critical for the Navy to stay involved in development of the standards, because the Defense Department has a history of having to modify off-the-shelf applications after they are acquired, Jacobs said... The group's near-term goals include drawing up an XML implementation plan and documenting the requirements for a Navywide repository of XML schemas and code... In the developers' guide, the Navy aims to present a balance between being overly restrictive and being so loose that developers use nonstandard components at their discretion, said Brian Hopkins, an engineering consultant working with the Navy CIO's office. It's supposed to provide general guidance on XML component selection, component naming conventions and design of XML schema..."

  • [July 17, 2002] "Towards XML Based Management and Configuration." By Ted Goddard (Wind River Systems). IETF Internet-Draft. Reference: 'draft-goddard-xmlconf-survey-00.txt'. June 2002, expires: December 2002. 16 pages. "This informational document surveys a variety of existing technologies relevant to the exploration of an XML-based technology for network management and configuration. The technologies surveyed include SOAP, WSDL, SyncML, WBEM, RDF, CC/PP, and JUNOScript. The data representation capability of XML with and without schemas and DTDs is compared with SMIv2... Technologies for management and configuration cover a broad spectrum, from abstract information models, to model representation languages, to encodings, to protocols. XML itself focuses on structured data encoding, but specifications such as SOAP or WSDL use XML to represent data and to describe the communication semantics. Some specifications, such as SyncML and WBEM, include explicit communication bindings, such as to OBEX or HTTP... Several of the specifications (SOAP, RDF, SyncML) use URLs or URIs to specify the target of an operation. Thus, a basic issue for an XML-based management system may be a scheme for identifying network elements and their components by URIs... When combined with schemas, XML can be used to encode and validate data of arbitrary complexity in a way that is verbose but faithful to application design. XML encoded data can be easily compressed, transformed, or embedded into container formats by a wide variety of emerging tools. A standardized XML encoding for network management and configuration could be used for both persistent storage and interactive management, and could be transported over higher level protocols such as SOAP or SyncML, or directly over lower level protocols like HTTP..." [cache]
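The survey's premise, that XML plus a schema can encode and validate management data much as SMIv2 SYNTAX clauses constrain MIB objects, can be illustrated with a small sketch. The element names and the MTU range check below are assumptions for illustration, not taken from any standard schema:

```python
# Sketch: a network-interface configuration encoded as XML, with type and
# range checks standing in for what a schema validator would enforce.
import xml.etree.ElementTree as ET

CONFIG = """
<interface>
  <name>eth0</name>
  <mtu>1500</mtu>
  <enabled>true</enabled>
</interface>
"""

def validate_interface(xml_text):
    root = ET.fromstring(xml_text)
    mtu = int(root.findtext("mtu"))   # type constraint: must parse as an integer
    if not 68 <= mtu <= 9216:         # range constraint, like an SMIv2 SYNTAX clause
        raise ValueError("mtu out of range")
    return {"name": root.findtext("name"),
            "mtu": mtu,
            "enabled": root.findtext("enabled") == "true"}

print(validate_interface(CONFIG))
# {'name': 'eth0', 'mtu': 1500, 'enabled': True}
```

The same document could be stored persistently or shipped interactively over SOAP or HTTP, which is the dual use the draft envisions.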

  • [July 17, 2002] "Microsoft Warms to SAML." By Cathleen Moore. In InfoWorld (July 17, 2002). "Microsoft revealed plans on Tuesday [2002-07-16] to support an emerging security standard that also forms the technology underpinnings for rival Liberty Alliance's federated identity management specification. In a talk here at the Burton Group Catalyst Conference 2002, Praerit Garg, Microsoft group program manager, detailed the company's vision for federated security, which will in the future include room for SAML (Security Assertion Markup Language). Meanwhile, Liberty Alliance on Monday announced Version 1.0 of its federated identity management specification, which is based on SAML. SAML allows authentication and authorization information to be exchanged among multiple Web access management and security products, according to OASIS (Organization for the Advancement of Structured Information Standards) officials. The specification also addresses secure single sign-on, and leverages Web services standards such as XML and SOAP (Simple Object Access Protocol). In addition to its support for X.509 certificates and Kerberos, Microsoft will support SAML in the WS-Security paradigm, Garg said. WS-Security is an OASIS security specification backed by Microsoft, IBM, and Verisign. 'WS-Security is a very simple model that lets you carry multiple assertions, SAML and Kerberos,' Garg said. 'It reduces friction.' SAML is just another security token format, Garg said, and WS-Security provides the common envelope to carry multiple tokens... In response to questions from the audience about what took the company so long to embrace SAML, Garg said that last year Microsoft did not really understand what SAML was about. Also, he added that the company wanted to protect existing investments in X.509 and Kerberos. Garg added that Microsoft should have participated more actively in the standards development process. 
With a common SAML-based bridge erected, the gap between Microsoft's identity efforts and the Liberty Alliance may be shrinking. In fact, Microsoft gave its strongest indication yet that it may join forces with the Liberty Alliance..." See details of the Liberty Alliance release in the 2002-07-16 news item "Liberty Alliance Project Publishes Version 1.0 Specifications for Federated Network Identification and Authorization"; on SAML: "Security Assertion Markup Language (SAML)." On the Liberty Alliance, see: "Liberty Alliance Specifications for Federated Network Identification and Authorization."
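Garg's characterization of WS-Security as "the common envelope" for multiple token formats can be sketched as a SOAP header carrying a SAML assertion and a Kerberos token side by side. The SOAP 1.1 and SAML 1.0 namespace URIs below are the published ones, but the Security and token elements are simplified placeholders rather than conformant wsse markup:

```python
# Sketch: one SOAP envelope whose security header holds two token formats.
import xml.etree.ElementTree as ET

SOAP = "http://schemas.xmlsoap.org/soap/envelope/"
SAML = "urn:oasis:names:tc:SAML:1.0:assertion"

envelope = ET.Element(f"{{{SOAP}}}Envelope")
header = ET.SubElement(envelope, f"{{{SOAP}}}Header")
security = ET.SubElement(header, "Security")  # wsse:Security in a real message
ET.SubElement(security, f"{{{SAML}}}Assertion", {"AssertionID": "a1"})
ET.SubElement(security, "BinarySecurityToken", {"ValueType": "Kerberosv5"})
ET.SubElement(envelope, f"{{{SOAP}}}Body")

print(len(list(security)))  # 2: the same envelope carries both token formats
```

A receiver that understands the envelope can pick out whichever token format it trusts, which is why Garg calls SAML "just another security token format" under WS-Security.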

  • [July 17, 2002] "Datatypes for Document Content Validation." By Martin Bryan (The SGML Centre). July [17], 2002. Draft proposal for DSDL Part 5. "Part 5 of the Document Schema Description Language (DSDL) defines a set of primitive datatypes, a set of DSDL datatypes, a set of commonly required derived datatypes and a method for defining customized datatypes that can be used to validate the contents of specific elements and attributes within DSDL document instances. The specification also includes a set of constraints that can be used to limit the range of primitive datatypes and their derivatives." References: (1) "Document Schema Definition Language (DSDL) Proposed as ISO New Work Item" [announcement 2001-12]; (2) "RELAX NG Published as ISO/IEC DIS 19757-2 (DSDL Part 2)"; (3) web site; (4) "Document Schema Definition Language (DSDL)" [main reference document].
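The layering the draft proposes, primitive datatypes narrowed by constraints into derived datatypes, can be sketched in a few lines of Python. The 'port' and 'postcode' datatypes and their constraints below are invented examples, not datatypes from the DSDL proposal:

```python
# Sketch: a derived datatype is a primitive parser plus constraints that
# limit its value space, mirroring DSDL Part 5's derivation-by-constraint.
import re

def derive(primitive, *constraints):
    def validate(text):
        value = primitive(text)
        return all(check(value) for check in constraints)
    return validate

# Integer primitive narrowed by a range constraint.
port = derive(int, lambda v: 0 <= v <= 65535)
# String primitive narrowed by a pattern constraint.
postcode = derive(str, lambda v: re.fullmatch(r"[A-Z]{2}\d{4}", v) is not None)

print(port("8080"), port("70000"), postcode("AB1234"))  # True False True
```

Customized datatypes then fall out naturally: each new constraint composed onto a primitive yields another validator for specific element or attribute content.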

  • [July 17, 2002] "Sun's Java-Liberty Moves Risk Industry Scuffles." By [ComputerWire Staff]. In The Register (July 17, 2002). "A potential dispute is opening over proposed integration of Sun Microsystems Inc-backed web services security specifications with Java, writes Gavin Clarke. Sun is lending support to inclusion of Liberty Alliance Project specifications for federated single sign-in to web services in future versions of the Java platform. Jonathan Schwartz, Sun software group executive vice president, said the goal is to 'Liberty-enable the client and server.' Schwartz spoke as he announced Liberty-enabled Sun directory and network servers yesterday. With specifications embedded in Java platform technologies like Java 2 Enterprise Edition (J2EE), for example, products like J2EE-based application servers could theoretically ship Liberty-enabled. This would minimize development efforts for ISVs and end-users as developers would not need to add Liberty-compliant APIs to Java products and applications. And with APIs shipping in popular products like application servers, Liberty could also achieve pervasiveness virtually overnight. Sun's decision, though, risks re-opening old industry wounds. IBM told Computerwire it will not support addition of Liberty specifications unless they are first passed to an independent standards group, like the Organization for the Advancement of Structured Information Standards (OASIS). IBM -- like Microsoft -- remains a non-Liberty member, having promoted the alternative WS-Security model. Bob Sutor, director of e-business standards, said: 'IBM will not support this until the work Liberty does is moved to a real standards organization.' IBM is also concerned at the degree of control Sun wields over the Java Community Process (JCP), a 300-member body that debates and approves Java specifications. Sun has constitutional power to veto changes to the Java language. IBM is a JCP participant... 
Key to the battle are application server vendors BEA Systems Inc and Oracle Corp who have yet to declare for Liberty..." On the Liberty Alliance, see: "Liberty Alliance Specifications for Federated Network Identification and Authorization."

  • [July 16, 2002] "Liberty Alliance Details Network Identity Specifications." By Paul Krill and Matt Berger. In InfoWorld (July 15, 2002). "The Liberty Alliance Project on Monday is announcing availability of Version 1.0 of its specifications for a federated network identity system for e-commerce and Web services. According to the Liberty Alliance, the specifications represent the first step in enabling an open, federated network identification infrastructure to link similar and disparate systems. Through the specifications, users can decide whether to link accounts with various identity providers. Companies such as Sun Microsystems, Nokia, MasterCard, and American Express are members of Liberty Alliance and expressed their support of the specifications. Various vendors on Monday may provide road maps for supporting the Liberty effort in products... The next version of the specification will address permission-based attribute sharing, in which organizations can share user information according to permissions granted by the user. Version 1.0 enables 'opt-in' account linking, in which users can choose to link accounts with different service providers. Other functions of the specification include simplified sign-on for linked accounts, in which users can authenticate at one account and navigate to other linked accounts without having to again log in; authentication context, which lets institutions or companies communicate the type of authentication that should be used for user log-ins; and global log-out, in which a user can log out of one site and be automatically logged out of other linked sites. The specification leverages protocols such as SAML (Security Assertion Markup Language)... 
Said James Kobielus, senior analyst at the Burton Group in Alexandria, Va., in an e-mail response to questions about the Liberty Alliance Version 1.0 specifications: 'Users will be able to optionally link -- and de-link -- their accounts, so as to reduce the number of times they need to enter user IDs and passwords when transacting business across one or more 'federated' or affiliated organizations. The principal shortcomings of the Liberty Alliance 1.0 specifications are that they are new, unproven in the field, rely on the still immature but promising SAML 1.0 standard, and leave many complex technical integration details to be worked out by organizations that implement Liberty-enabled account linking. Liberty 1.0, like SAML 1.0, which Liberty's specs extend, still needs to be implemented and integrated in a critical mass of commercial products and services,' Kobielus said..." See details in the 2002-07-16 news item "Liberty Alliance Project Publishes Version 1.0 Specifications for Federated Network Identification and Authorization." On the Liberty Alliance, see: "Liberty Alliance Specifications for Federated Network Identification and Authorization."

  • [July 15, 2002] "Speech Vendors Target Developers with Multimodal Tools." By Ephraim Schwartz. In InfoWorld (July 15, 2002). "Dueling speech technology announcements will come to the fore on Monday when the SALT (Speech Application Language Tags) Forum ships Version 1.0 of its multimodal specification and IBM reveals its X+V technology for multimodal applications. While the W3C (World Wide Web Consortium) standards body reviews IBM's X+V as a solution to multimodal capabilities on handheld devices, sources close to the W3C say that it will soon acknowledge receipt of a SALT Forum submission for the same technology. Multimodal refers to a developer's ability to mix both a voice and visual interface into a single application. The IBM technology, X+V, short for XHTML plus Voice, will ship in October and will be included as an extension to Big Blue's WebSphere Everyplace Access in the first half of next year... IBM submitted its X+V specification to the W3C late last year and it has been acknowledged -- terminology for the standards body's willingness to consider a specification as a standard. The W3C has, in fact, set up a working group to focus on multimodal, Soares said... In October IBM also will ship a set of tools for developers to write multimodal applications using VXML. Despite appearances of dueling standards, the W3C will most likely come up with a single solution, according to Bill Meisel, president of TMA Associates in Tarzana, Calif. 'VXML will be broken into pieces and some of the same ideas and concepts in SALT will be applied to those pieces,' Meisel said. However, there is one non-technical difference between the two proposed standards that could lead to problems down the road. While the SALT specification is royalty-free, the VXML specification is not, Meisel said... 
However, it is highly unlikely that any company will try to sabotage the VXML initiative by claiming someone is violating its patent, Meisel added, although the possibility does cloud the issue. The release on Monday by the SALT Forum is the first fully functioning specification suitable for developers to take and build applications around, said Peter Gavalakis, marketing manager at Intel in Parsippany, N.J. Gavalakis also sees the possibility that the standards organizations might take a look at the different paths toward multimodal and decide to converge them..."

  • [July 15, 2002] "Federated Digital Rights Management: A Proposed DRM Solution for Research and Education." By Mairéad Martin, Grace Agnew, David L. Kuhlman, John H. McNair, William A. Rhodes, and Ron Tipton. In D-Lib Magazine Volume 8 Number 7/8 (July/August 2002). ISSN: 1082-9873. "This article describes efforts underway within the research networking and library communities to develop a digital rights management (DRM) solution to support teaching and research. Our specific focus is to present a reference architecture for a federated DRM implementation that leverages existing and emerging middleware infrastructure. The goals of the Federated Digital Rights Management (FDRM) project are to support local and inter-institutional sharing of resources in a discretionary, secure and private manner, while endeavoring to maintain a balance between the rights of the end-user and those of the owner. 'Federated' in our project title refers to the shared administration of access controls between the origin site and the resource provider: the origin site is responsible for providing attributes about the user to the resource provider. FDRM applies and extends the federated access control mechanisms of Shibboleth, a project of the Internet2 Middleware Initiative to develop 'architectures, policy structures, practical technologies, and an open source implementation to support inter-institutional sharing of web resources subject to access controls.' FDRM is being pursued by members of the VidMid Video-on-Demand Working Group, a collaboration between the Internet2 Middleware Initiative and the Video Development Initiative. The VidMid agenda is aligned with the goals of the National Science Foundation Middleware Initiative, a national effort to develop enabling middleware for applications in science, engineering, and education. While VidMid work is centered on enabling digital video applications, the application of FDRM is extensible to digital content in any format... 
FDRM employs existing Shibboleth federated authentication and authorization mechanisms and uses open source protocols, such as the Security Assertions Markup Language (SAML) and the Simple Object Access Protocol (SOAP)... FDRM is designed for use with any authentication and authorization system, be it simple .htaccess files, Windows Domain Authentication, or LDAP authentication modules (LDAP). The architecture presented here is modeled on the Internet2 Shibboleth project. The Shibboleth model and architecture make a compelling foundation on which to build FDRM, since it is designed for federated application, built on open source XML-based messaging standards and open source directories, and is extensible, as our proposed reference architecture demonstrates. Shibboleth is also predicated on the establishment of trust communities. Most significantly, Shibboleth's emphasis on user privacy is one that is wholly shared by FDRM. This emphasis will mark a significant distinction between the FDRM solution and some current commercial DRM systems. The ability to track usage, protect the integrity of a resource, and manage version control is possible in FDRM, without compromising the privacy of the individual. In addition to applying the federated authentication and authorization mechanisms of Shibboleth, FDRM federates asset management..." See also: (1) "XML and Digital Rights Management (DRM)"; (2) "Security Assertion Markup Language (SAML)."

  • [July 15, 2002] "SAML Frames ID Debate." By Brian Fonseca. In InfoWorld (July 15, 2002). "Determined to agree on a unified approach to Web services interoperability, a consortium of vendors this week will rally behind SAML (Security Assertion Markup Language) during a public demonstration of the specification. Taking center stage at The Burton Group's Catalyst 2002 conference in San Francisco this week, a range of vendors, including IBM/Tivoli, Novell, Netegrity, ePeople, RSA Security, and Oblix, will use the event to announce plans to implement the authorization aspects of SAML in upcoming products. SAML is expected to be officially ratified by the Organization for the Advancement of Structured Information Standards by November, according to industry sources. Version 1.0 of SAML is designed to facilitate the exchange of authentication information among Web access management and security products within a Web browser profile and represents a competing alternative to Microsoft's Kerberos and Passport SSO (single sign-on) technologies... The cross-enterprise Web SSO demonstration event at the conference will highlight the Identity and Access Management Federation, which features companies implementing fellow vendors' Web access management products to share, authenticate, attribute, and authorize information. A second demonstration will provide authentication at portal sites and then access Web resources managed by other federated content sites, according to Burton officials. SAML's rising popularity among both vendors and enterprises comes at a time when two dueling industry camps, the Liberty Alliance and Microsoft, fight to establish competing federated SSO systems for facilitating e-commerce. The Liberty Alliance will on Monday lift the cover off its long-awaited SSO standard. 
The unveiling comes on the heels of Microsoft's announcement last week of a deal with Arcot Systems to enable users of its system to make online transactions using Visa and MasterCard, both of which are Liberty Alliance members..." See: "Security Assertion Markup Language (SAML)"; details in the news item of 2002-07-15: "Burton Group's Catalyst Conference Features SAML Interoperability Event."

  • [July 15, 2002] "Accent on Access Control. Conference to highlight SAML, an emerging standard for identity management." By John Fontana. In Network World (July 15, 2002). "Industry heavyweights this week will throw their support behind a developing standard that promises to help network executives build centrally managed, easily sharable user identity systems. At the annual Burton Group Catalyst Conference, a parade of vendors, including RSA Security, Netegrity, Oblix and Novell, will announce support for Security Assertion Markup Language (SAML), an emerging XML-based standard for exchanging authentication and authorization information. Also at the conference, those vendors will join Baltimore Technologies, Crosslogix, Sun, IBM's Tivoli Systems and others in a SAML interoperability demonstration. The biggest shot in the arm, however, will come from the Liberty Alliance, a group of vendors and corporate users who have spent the past six months creating a single sign-on specification. The group will release its work, and announce it is supporting SAML and adding nearly 20 new members... The wave of support for SAML likely will stamp it as a de facto standard, although it won't get official approval from the Organization for the Advancement of Structured Information Standards (OASIS) until fall at the earliest. The only snag could be that Microsoft has yet to commit to SAML, instead focusing on Kerberos as a way to pass authentication information. But Microsoft's commitment to WS-Security, a set of proposed standards it created with IBM and VeriSign and now under review by OASIS, could eventually bring the company into the fold. SAML is but one important step in creating user authentication and authorization information that is portable across corporate networks so a user authenticated on one company's network can be recognized on another and granted or denied authorization to access resources based on that authentication. 
This sharing of user identity is being referred to as federated identity management and is emerging as a key technology for distributed e-commerce and Web services... SAML does not specify any policy for using identity information. The Liberty Alliance specification will build on top of SAML, adding some policy protocols. Also, SAML does not incorporate a way to establish trust between business partners exchanging identity information. And SAML, which has strong authentication services, will need the help of another emerging XML-based protocol called the Extensible Access Control Markup Language (XACML) to solve the more complex issue of authorization. A third protocol - the Service Provisioning Markup Language - also will have to be incorporated. There are other, competing efforts. Microsoft is working on integrating its Passport service with Kerberos, as opposed to SAML, to create a single sign-on credential similar to Liberty's work. Microsoft also is developing TrustBridge, another product to unify sign-on across Microsoft environments, and focusing on Extensible Rights Markup Language, an authorization protocol similar to XACML..." See: "Security Assertion Markup Language (SAML)"; details in the news item of 2002-07-15: "Burton Group's Catalyst Conference Features SAML Interoperability Event."

  • [July 12, 2002] "Grid Service Specification." By Steven Tuecke, Karl Czajkowski, Ian Foster, Jeffrey Frey, Steve Graham, and Carl Kesselman. Draft June 13, 2002. [Section 15 is to present the XML and WSDL Specifications.] "Building on both Grid and Web services technologies, the Open Grid Services Architecture (OGSA) defines mechanisms for creating, managing, and exchanging information among entities called Grid services. Succinctly, a Grid service is a Web service that conforms to a set of conventions (interfaces and behaviors) that define how a client interacts with a Grid service. These conventions, and other OGSA mechanisms associated with Grid service creation and discovery, provide for the controlled, fault resilient, and secure management of the distributed and often long-lived state that is commonly required in advanced distributed applications. In a separate document, we have presented in detail the motivation, requirements, structure, and applications that underlie OGSA. Here we focus on technical details, providing a full specification of the behaviors and Web Service Definition Language (WSDL) interfaces that define a Grid service... The Open Grid Services Architecture (OGSA) integrates key Grid technologies (including the Globus Toolkit) with Web services mechanisms to create a distributed system framework based around the Grid service. A Grid service instance is a (potentially transient) service that conforms to a set of conventions (expressed as WSDL interfaces, extensions, and behaviors) for such purposes as lifetime management, discovery of characteristics, notification, and so forth. Grid services provide for the controlled management of the distributed and often long-lived state that is commonly required in sophisticated distributed applications. OGSA also introduces standard factory and registry interfaces for creating and discovering Grid services..." Note in this connection "Euroweb 2002 Conference. 
The Web and the GRID: From E-Science to E-Business" ['How will frameworks like web services, GRID Services, and .NET address the issue of Internet-aware programs using the services that are offered by other programs? What is XML's role in managing and routing data and services?']
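The factory and registry conventions the specification introduces (create a potentially transient service instance, then discover it by handle) can be sketched with plain classes. The method names and handle format below are illustrative stand-ins, not the OGSA WSDL operations:

```python
# Sketch: a factory creates Grid service instances and registers their
# handles; clients then discover live instances through the registry.
import itertools

class Registry:
    def __init__(self):
        self._instances = {}

    def register(self, handle, service):
        self._instances[handle] = service

    def discover(self):
        return sorted(self._instances)

class Factory:
    _ids = itertools.count(1)

    def __init__(self, registry):
        self.registry = registry

    def create_service(self):
        handle = f"grid-service-{next(self._ids)}"
        self.registry.register(handle, object())  # placeholder instance
        return handle

reg = Registry()
factory = Factory(reg)
factory.create_service()
factory.create_service()
print(reg.discover())  # ['grid-service-1', 'grid-service-2']
```

A real Grid service would also carry lifetime state (the "potentially transient" part), so the registry would evict handles whose instances have expired.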

  • [July 12, 2002] "The Physiology of the Grid. An Open Grid Services Architecture for Distributed Systems Integration." By Ian Foster, Carl Kesselman, Jeffrey M. Nick, and Steven Tuecke. June 22, 2002. "We present an Open Grid Services Architecture that addresses these challenges. Building on concepts and technologies from the Grid and Web services communities, this architecture defines a uniform exposed service semantics (the Grid service); defines standard mechanisms for creating, naming, and discovering transient Grid service instances; provides location transparency and multiple protocol bindings for service instances; and supports integration with underlying native platform facilities. The Open Grid Services Architecture also defines, in terms of Web Services Description Language (WSDL) interfaces and associated conventions, mechanisms required for creating and composing sophisticated distributed systems, including lifetime management, change management, and notification. Service bindings can support reliable invocation, authentication, authorization, and delegation, if required. Our presentation complements an earlier foundational article, 'The Anatomy of the Grid,' by describing how Grid mechanisms can implement a service-oriented architecture, explaining how Grid functionality can be incorporated into a Web services framework, and illustrating how our architecture can be applied within commercial computing as a basis for distributed system integration -- within and across organizational domains..."

  • [July 11, 2002] "Standards Stalled Over Royalty Disputes." By Paul Festa and Stephen Shankland. In ZDNN (July 11, 2002). ['A key Web standards body is bracing for a vote next week that could decide once and for all how it will handle patented technology that comes with royalties attached.'] "Pro- and anti-royalty factions have debated the issue since last fall, in a high-stakes dispute that could determine what technologies may be considered for common use at the most fundamental levels of Web design. That in turn will have strategic fallout for high-tech giants struggling for dominance in creating ambitious new technologies such as interactive services -- a battle in which Microsoft and IBM stand virtually alone with the clout to push proprietary applications on everyone else. Longtime Microsoft foe Sun Microsystems, for one, is trying to ensure that next-generation 'Web services' Internet standards are royalty-free, exerting pressure on IBM, Microsoft and others to follow suit. Next week's decision could mark a major shift in this and other standards battles if patented technology is rejected by the World Wide Web Consortium (W3C). If a final effort to craft a royalty exception is not hammered out by next week, it could die altogether, negotiators close to the talks say... The W3C found itself in a heated controversy last fall after members proposed charging royalties for patented technologies that found their way into W3C recommendations. This spring, the W3C retreated from the proposal, pledging to keep its recommendations royalty-free. Despite that pledge, the W3C is still trying to decide what to do with patented technologies that owners are unwilling to contribute with no strings attached. As a result, the W3C's Patent Policy Working Group is trying to hammer out an exception to the royalty-free policy. That has patent and royalty critics up in arms, helping mire a committee searching for consensus in an impasse earlier this month. 
On a teleconference July 1, members discussed a proposal from three unnamed members to establish an exception to the royalty-free policy for a so-called RAND (reasonable and nondiscriminatory) license. Under the proposal, the W3C would split its recommendation, with the 'core' of a specification produced by the W3C without royalties, and patented 'extensions' with royalties worked out either by the W3C or other groups. The proposal did not win sufficient support to advance, and the head of the working group warned that public opinion was siding against anything like it... By dropping a royalty-free exception, the W3C would have to leave out technologies with royalties attached, unless the patent holders agreed to waive them. That could put major intellectual property holders such as Microsoft and IBM in a tough spot: Either give away their technology or remove it from consideration as a standard... Sun has also emerged as a major proponent of unpaid standards, convening a royalty-free standards group called the Liberty Alliance Project to develop online authentication services. In addition, the company held out for royalty-free provisions before joining a high-profile Web security group last month. 'We do not believe in taxing people for use of those standards,' Jonathan Schwartz, Sun's software chief, said last month..." Note Daniel Weitzner's clarification on the PPWG's proposed timeframe and the Summary of 8-July-2002 W3C Patent Policy Working Group Teleconference. See related references "Patents and Open Standards" and particularly on W3C's patent policy.

  • [July 11, 2002] "Localizable DTD Design." By Richard Ishida (Xerox Global Services). In MultiLingual Computing and Technology Volume 13 Issue 5 [#49] (July/August 2002), pages 43-49. ISSN: 1523-0309. ['Effective localization of XML documents begins with the development of an internationalized document structure. Richard Ishida writes about creating localizable DTD (Document Type Definition) design so that XML documents can be smoothly localized.'] "If you are creating XML (or SGML) documents that will be translated, there are things you should have built into your Document Type Definition (DTD) to enable localisation to go smoothly and efficiently. This article looks at some of the key issues. The article also refers to standard topics such as character encoding and language declarations, but covers other topics such as implementation of emphasis and style conventions, handling of citations, use of text in attribute values, and the need for an element like HTML's SPAN. In addition, other topics that have traditionally been associated with translation of user interface messages become applicable due to the nature of XML documents. These include the provision of designer's notes, identification of non-translatable text, and use of element IDs for automatic translation of elements. The paper assumes some familiarity with XML and DTD concepts, but remains at the conceptual level, rather than proposing specific DTD constructs..." See also these related papers: (1) "ITS Requirements," edited by Richard Ishida and Yves Savourel [Working Draft 6-June-2001]; (2) "Localisation Considerations in DTD Design," by Richard Ishida (Xerox Global Services); (3) "Localisation Considerations in DTD Design," by Richard Ishida (LISA Forum, Tokyo, June 2000; 15 pages). For general references see "Markup and Multilingualism." [Cache: ITS Requirements, DTD Design]
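One recommendation of this kind, explicitly marking non-translatable text, can be sketched as an extraction pass that skips subtrees flagged with a translate='no' attribute. The element and attribute names are illustrative, not constructs Ishida actually proposes:

```python
# Sketch: collect text for translators while skipping subtrees an author
# has marked as non-translatable.
import xml.etree.ElementTree as ET

DOC = """
<doc>
  <para>Press the <code translate="no">Submit</code> button.</para>
  <para>Thank you.</para>
</doc>
"""

def translatable_text(elem):
    parts = []
    if elem.get("translate") != "no":   # honor the non-translatable flag
        if elem.text:
            parts.append(elem.text)
        for child in elem:
            parts.extend(translatable_text(child))
    if elem.tail:                       # tail text belongs to the parent's flow
        parts.append(elem.tail)
    return parts

words = " ".join("".join(translatable_text(ET.fromstring(DOC))).split())
print(words)  # Press the button. Thank you.
```

Building such flags into the DTD up front is what lets downstream localization tooling behave this way without per-project heuristics.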

  • [July 11, 2002] "Multilingual Integration of GMS and Content Systems." By Shang-Che Cheng (Uniscape/TRADOS). In MultiLingual Computing and Technology Volume 13 Issue 5 [#49] (July/August 2002), pages 35-38. ISSN: 1523-0309. ['Merging enterprise infrastructure with a globalization management system is key to meeting e-business challenges.'] "Global 1000 companies developed and maintained a presence in many countries long before the Internet became a medium for doing business. But managing Web sites, software and content on-line is a new challenge for those companies aggressively deploying e-business applications. The demands for greater efficiency, cost savings and a common approach to the management of global content are compelling companies to seek new enterprise-scale software for the deployment of content in all markets. Seamlessly integrating current enterprise infrastructure and business processes with a globalization management system (GMS) is essential to its success. This article briefly introduces the evolution of globalization and GMS technologies. It then discusses the basic integration methods between GMS and content systems, intermedium file formats, solutions under a heterogeneous environment, and integration models in the current market. ... The Intermedium Format: Each application interface is required to deliver translated content in its original format to target locations unless certain transformations have been defined in the process. For example, a database connector pulls original content out of a database table and converts the row items into an XML format. The connector then sends the content for localization based on predefined rules. Once the translation is finished, the connector receives the translated content in the same XML format. The database connector then deploys the translated content into the targeted location. 
If the targeted location is a database, the connector will convert each XML entry into an SQL statement with proper primary key(s), such as row ID, and then execute either an update or insert operation to upload the translated content into the target database... Traditional fixed API integration methods will be replaced by self-contained, self-describing modular applications with standard interfaces that can be published, located and invoked across the Internet. Web Services and XML provide functions that enable remote applications to access both the data and services of these applications as standard objects. This will be the trend for the integration of GMS and content systems as well. The Internet brings the whole world much closer than ever before; global readiness is a must for all enterprises. Rapid integration of current enterprise infrastructure and business processes with a globalization management system is the key to success for global enterprises. Proper integration not only can save costs, but also can deliver consistent worldwide content quickly..."
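The database-connector round trip described above can be sketched as follows. Table, column, and element names are invented for illustration, and SQLite's INSERT OR REPLACE stands in for a generic update-or-insert operation:

```python
import sqlite3
import xml.etree.ElementTree as ET

def row_to_xml(row_id, columns):
    """Wrap one database row as an XML entry for the localization pipeline."""
    entry = ET.Element("entry", id=str(row_id))
    for name, value in columns.items():
        ET.SubElement(entry, name).text = value
    return ET.tostring(entry, encoding="unicode")

def xml_to_upsert(xml_entry, table):
    """Turn a translated XML entry back into a parameterized SQL upsert."""
    entry = ET.fromstring(xml_entry)
    cols = {child.tag: child.text for child in entry}
    names = ", ".join(cols)
    marks = ", ".join("?" for _ in cols)
    sql = f"INSERT OR REPLACE INTO {table} (id, {names}) VALUES (?, {marks})"
    return sql, [entry.get("id"), *cols.values()]

# Round trip: export, "translate", re-import into the target-locale table
xml_entry = row_to_xml(7, {"title": "Welcome", "body": "Hello"})
translated = xml_entry.replace("Welcome", "Bienvenue").replace("Hello", "Bonjour")
sql, params = xml_to_upsert(translated, "content_fr")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE content_fr (id TEXT PRIMARY KEY, title TEXT, body TEXT)")
conn.execute(sql, params)
print(conn.execute("SELECT title, body FROM content_fr").fetchone())
```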

  • [July 11, 2002] "Integrating Data at Run Time with XSLT Style Sheets. Creating presentation markup using XML, XSLT, and the MVC design pattern." By Andre Tost (Senior Software Engineer, IBM). From IBM developerWorks, XML Zone. July 2002. ['Many applications now take advantage of XML to format business data. This allows the use of self-describing, tagged data that can be handled on a wide range of platforms and programming languages. Integration between heterogeneous applications is made easier through the use of XML data formats. Web services technology, for example, promotes the use of XML-based message formats for backend application data. However, integrating that data into user output during run time can be a challenge. In this article, Andre Tost describes how data integration can be achieved through the use of XSLT style sheets.'] "This article describes a mechanism for developing application business data and presentation data separately, and then putting them together using a generic XSLT style sheet. The application business data is formatted in XML, while the presentation data can be created with a traditional tool. The presentation data is enhanced with additional attributes that the style sheet then uses to apply presentation details to the actual content. For the creation of the final format, no special programming is necessary, as this process is completed by the XSLT processor. To get the most out of this article, you should have a basic knowledge of XML and XSLT... All the examples shown here were produced using the Apache Xerces XML parser, version 2.0.1 and the Apache Xalan XSLT processor, version 2.3.1... The solution I introduce in this article, in which application business data and presentation data are developed separately and then put together by a generic XSLT style sheet, is by no means complete; it only shows the concept. I expect that similar (but complete and robust) solutions will be developed over time. 
Generic style sheets like the one shown in Listing 5 can then be made available and reused where applicable. For example, the Organization for the Advancement of Structured Information Standards (OASIS) has launched the Web Services for Interactive Applications initiative [WSIA]; this initiative tries to define a component model for visual Web services, also based on the MVC pattern, which will lead to a technique similar to the one presented here..." For related resources, see "Extensible Stylesheet Language (XSL/XSLT)."
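Python's standard library ships no XSLT processor, so the sketch below imitates the article's generic-stylesheet idea directly with ElementTree: a designer-authored template carries a hypothetical data-ref attribute naming the business-data element whose text should appear at that spot. All names and the attribute itself are invented for illustration, not taken from the article's listings.

```python
import xml.etree.ElementTree as ET

# Business data (the model), produced by the application
data = ET.fromstring(
    "<customer><name>Ada Lovelace</name><balance>120.50</balance></customer>"
)

# Presentation data (the view): markup annotated with a hypothetical
# data-ref attribute pointing into the model
template = ET.fromstring(
    '<html><body>'
    '<h1 data-ref="name"/>'
    '<p>Balance: <span data-ref="balance"/></p>'
    '</body></html>'
)

def merge(template, data):
    """Generic 'stylesheet': copy model text into any view node that asks for it."""
    for node in template.iter():
        ref = node.attrib.pop("data-ref", None)
        if ref is not None:
            source = data.find(ref)
            node.text = source.text if source is not None else ""
    return template

html = ET.tostring(merge(template, data), encoding="unicode")
print(html)
```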

  • [July 11, 2002] "Speech Recognition Grammar Specification Version 1.0." Edited by Andrew Hunt (SpeechWorks International) and Scott McGlashan (PipeBeach). W3C Candidate Recommendation 26-June-2002. Produced as part of the W3C Voice Browser Activity. Version URL: Latest version URL: The document "defines syntax for representing grammars for use in speech recognition so that developers can specify the words and patterns of words to be listened for by a speech recognizer... The primary use of a speech recognizer grammar is to permit a speech application to indicate to a recognizer what it should listen for, specifically: (1) Words that may be spoken, (2) Patterns in which those words may occur, and (3) Spoken language of each word. Speech recognizers may also support the Stochastic Language Models (N-Gram) Specification. Both specifications define ways to set up a speech recognizer to detect spoken input but define the words and patterns of words by different and complementary means... The syntax of the grammar format is presented in two forms, an Augmented BNF Form and an XML Form. The specification makes the two representations mappable to allow automatic transformations between the two forms... Both the ABNF Form and XML Form have the expressive power of a Context-Free Grammar (CFG). A grammar processor that does not support recursive grammars has the expressive power of a Finite State Machine (FSM) or regular expression language. For definitions of CFG, FSM, regular expressions and other formal computational language theory... This form of language expression is sufficient for the vast majority of speech recognition applications." Status: "The exit criteria for this Candidate Recommendation phase are at least two independently developed interoperable implementations of each required feature, and at least one implementation of each feature. 
Detailed implementation requirements and the invitation for participation in the Implementation Report are provided in the Implementation Report Plan. Note, this specification already has significant implementation experience that will soon be reflected in its Interoperability Report. We expect to meet all requirements of that report within the two month Candidate Recommendation period, closing 30th August 2002."
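The point about expressive power can be illustrated concretely: a grammar with no recursion is only a finite-state machine, so it compiles to an ordinary regular expression. The toy grammar below is invented for this sketch and is not taken from the specification:

```python
import re

# A toy, non-recursive grammar (invented for illustration):
#   $command = $action $object;
#   $action  = open | close;
#   $object  = [the] (door | window);
# With no recursion, each rule expands to a plain regular expression.
action = r"(?:open|close)"
obj = r"(?:the\s+)?(?:door|window)"
command = re.compile(rf"^{action}\s+{obj}$")

for utterance in ["open the door", "close window", "open the ceiling"]:
    print(utterance, "->", bool(command.match(utterance)))
```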

  • [July 11, 2002] "Speech Engine Remote Control Protocols by treating Speech Engines and Audio Sub-systems as Web Services." By Stéphane H. Maes and Andrzej Sakrajda (IBM T.J. Watson Research Center). IETF Internet Draft. SPEECHSC. Reference: 'draft-maes-speechsc-web-services-00'. Informational. June 23, 2002, expires December, 2002. "This document proposes the use of the web service framework based on XML protocols to implement speech engine remote control protocols (SERCP). Speech engines (speech recognition, speaker recognition, speech synthesis, recorders and playback, NL parsers, and any other speech processing engines [e.g., speech detection, barge-in detection, etc.]) as well as audio sub-systems (audio input and output sub-systems) can be considered as web services that can be described and asynchronously programmed via WSDL (on top of SOAP), combined in a flow described via WSFL, discovered via UDDI, and asynchronously controlled via SOAP, which also enables asynchronous exchanges between the engines. This solution presents the advantage of providing flexibility, scalability, and extensibility while reusing an existing framework that fits the evolution of the web: web services and XML protocols... The proposed framework enables speech applications to control remote speech engines using the standardized mechanism of web services. The control messages may be tuned to the controlled speech engines..." Status: "The document is informational; it illustrates how web services could be used. It is not a detailed specification. This is expected to be the output of the SPEECHSC activity, if it is decided to go in this direction. It also enumerates the requirements that have led to selecting a web service framework... The document uses the terminology SERCP (Speech Engine Remote Control Protocols) to be consistent with the terminology used in other documents exchanged at ETSI, 3GPP and OMA while distinguishing from the detailed specification proposed by MRCP." [...] 
"We propose an XML-based syntax with clear extensibility guidelines. The web service framework is inherently extensible and enables the introduction of additional parameters and capabilities. The SERCP syntax and semantics is designed to support the widest possible interoperability between engines by relying on message invariant across engine changes as discussed in section 4.2. This should enable to minimize the need for extensions in as many situations as possible. Existing speech APIs and the MRCP [Media Resource Control] syntax have been considered as starting points..." See also "Usage Scenarios for Speech Service Control (SPEECHSC)," published as IETF 'draft-maes-speechsc-use-cases-00'. [Cache: web services, use cases]

  • [July 10, 2002] "Direct Internet Message Encapsulation (DIME)." IETF Internet-Draft. Reference: draft-nielsen-dime-02. June 17, 2002; expires December 2002. By Henrik Frystyk Nielsen (Microsoft), Henry Sanders (Microsoft), Russell Butek (IBM), and Simon Nash (IBM). 26 pages (with 17 references). "Direct Internet Message Encapsulation (DIME) is a lightweight, binary message format that can be used to encapsulate one or more application-defined payloads of arbitrary type and size into a single message construct. Each payload is described by a type, a length, and an optional identifier. Both URIs and MIME media type constructs are supported as type identifiers. The payload length is an integer indicating the number of octets of the payload. The optional payload identifier is a URI enabling cross-referencing between payloads. DIME payloads may include nested DIME messages or chains of linked chunks of unknown length at the time the data is generated. DIME is strictly a message format: it provides no concept of a connection or of a logical circuit, nor does it address head-of-line problems..." See: "Direct Internet Message Encapsulation (DIME)." [cache]
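The type-length-identifier framing described above can be sketched in Python. This is a deliberately simplified illustration, not the draft's actual on-the-wire header (the real DIME record additionally packs version, message-begin/end, and chunk flag bits into a fixed 32-bit layout); the field order and header shape here are invented for the sketch. It does show the two essentials: length-prefixed fields and padding of each field to a 4-octet boundary.

```python
import struct

def pad4(b: bytes) -> bytes:
    """DIME pads each variable-length field to a 4-octet boundary."""
    return b + b"\x00" * (-len(b) % 4)

def pack_record(media_type: bytes, payload: bytes, record_id: bytes = b"") -> bytes:
    """Simplified record: lengths of id, type, and data, then the padded fields."""
    header = struct.pack("!HHI", len(record_id), len(media_type), len(payload))
    return header + pad4(record_id) + pad4(media_type) + pad4(payload)

def unpack_record(buf: bytes):
    """Recover (id, type, payload); lengths tell us where padding ends."""
    id_len, type_len, data_len = struct.unpack("!HHI", buf[:8])
    pos, fields = 8, []
    for n in (id_len, type_len, data_len):
        fields.append(buf[pos:pos + n])
        pos += n + (-n % 4)          # skip the padding octets
    return tuple(fields)

rec = pack_record(b"text/xml", b"<doc/>", b"uri:part-1")
print(unpack_record(rec))
```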

  • [July 10, 2002] "WS-Attachments." IETF Internet-Draft. Reference: draft-nielsen-dime-soap-01. June 17, 2002; expires December 2002. 16 pages (with 20 references). By Henrik Frystyk Nielsen (Microsoft), Erik Christensen (Microsoft), and Joel Farrell (IBM). "This document defines an abstract model for SOAP attachments and based on this model defines a mechanism for encapsulating a SOAP message and zero or more attachments in a DIME message. SOAP attachments are described using the notion of a compound document structure consisting of a primary SOAP message and zero or more related documents known as attachments... MIME multipart itself adds extra parsing overhead in order to determine the various parts of the MIME message. In addition, while MIME is a successful general message description format, the problem of encapsulating and delimiting messages is a much more focused problem that does not require the full expressiveness of MIME. DIME is a simple, lightweight message format that has been explicitly limited to provide a few core services needed for encapsulating messages. The purpose of this document is to describe how to use existing facilities provided by SOAP, DIME, and URIs to encapsulate SOAP messages and related documents (often known as attachments)..." [cache]

  • [July 10, 2002] "Microsoft Delivers Updated SOAP Toolkit." By Richard Karpinski. In InternetWeek (July 10, 2002). "Microsoft Wednesday [2002-07-10] released a new version of its Simple Object Access Protocol (SOAP) toolkit, an add-on that helps developers build Web services using its Visual Studio development tool. Microsoft's SOAP toolkit 3.0 includes support for DIME (Direct Internet Message Encapsulation) and WS-Attachments, both of which define how to package binary data up with SOAP messages. While SOAP provides an envelope for exchanging structured XML information, it does not have an efficient way of exchanging binary data, Microsoft said. DIME and WS-Attachments provide a binary message format that reduces the overhead of dealing with attachments in SOAP. DIME is the message format; WS-Attachments defines how to encapsulate a SOAP message and attachments in a DIME message..." See: "Direct Internet Message Encapsulation (DIME)."

  • [July 10, 2002] "BEA Upgrades Portal Server." By James Niccolai. In InfoWorld (July 10, 2002). "BEA Systems announced the availability this week of WebLogic Portal 7.0, the latest version of its software for building corporate Web sites that provide access to a variety of applications and services for employees, customers and partners. The product was made available last month as part of BEA's WebLogic Platform 7.0 release, a bundled offering that includes BEA's portal server, application server and integration server, and its Workshop developer environment. This week marks the availability of the WebLogic Portal 7.0 as a standalone product. It includes enhancements designed to speed up the process of building portals, including a portal wizard that can simplify the management of J2EE (Java 2 Enterprise Edition) applications and make it easier for developers to work with portal templates... WebLogic Portal 7.0 also ships with new portlets for creating catalogs and shopping carts, building purchasing processes, and other commerce services. Portlets are reusable elements, such as a stock ticker or an e-mail application, that are assembled to make a portal. Other features aim to improve usability for end users, such as the ability to edit the names of tabs within a portal site, and collaborate on documents over the Web..." [See the Portlet specification of JSR 168, supported by BEA: it defines a "Portlet API that provides means for aggregating several content sources and application front ends. It will also address how security and personalization are handled. Portlets are web components, like Servlets, specifically designed to be aggregated in the context of a composite page. Usually, many Portlets are invoked in a single request of a Portal page. 
Each Portlet produces a fragment of markup that is combined with the markup of other Portlets, all within the Portal page markup..."] WebLogic Portal 7.0 may be downloaded; see details in the related announcement: "BEA Delivers Industry's First Unified Portal Platform. BEA WebLogic Portal 7.0 Features Native Support for Web Services to Simplify Portal Development."

  • [July 10, 2002] "Orchestrate Services." By Jon Udell. In InfoWorld (July 05, 2002). "For years the industry has dreamed of modeling business processes in software and combining them like Tinker Toys. Web services orchestration, the new term for that old idea, becomes more interesting as raw services multiply behind firewalls. But as integration vendors point out, the orchestration layers of the Web services stack aren't yet baked. The standards pioneers -- Microsoft, IBM, and now Sun Microsystems and BEA Systems -- are busy in the kitchen. Two proposed XML grammars for describing the orchestration of Web services -- Microsoft's XLANG, used by BizTalk, and IBM's WSFL (Web Services Flow Language) -- were widely expected to have merged by now into a joint World Wide Web Consortium (W3C) submission. That hasn't happened. Meanwhile, Sun, BEA, SAP, and Intalio have introduced a third candidate: WSCI (Web Service Choreography Interface). The relationships among these three proposals -- and others, including Intalio's BPML (Business Process Modeling Language) and ebXML's BPSS (Business Process Specification Schema) -- are murky. XLANG, WSFL, and WSCI talk about two different orchestration layers. One layer deals with the public protocols of business collaboration, which WSFL and WSCI call the global model. The other layer describes private protocols, which WSFL calls the flow model. XLANG addresses both layers, but less explicitly. Ideally, a W3C recommendation will emerge that splits these apart neatly, but it's not yet clear how to do that. All three XML grammars define standard programming language constructs for sequences, loops, spawning, conditional execution, and exception handling. XLANG and WSCI have roots in a formal algebra called pi-calculus, which is used to model the kinds of parallel, message-driven computations that make orchestration a uniquely difficult problem... 
For XLANG users, help is on the way, according to Dave Wascha, lead product manager for BizTalk at Redmond, Wash.-based Microsoft. XML should be used to specify service choreography, he says, but it need not be used to implement it. A conventional syntax can do this more naturally, as Microsoft has shown in experiments using C#. In a similar vein, Java implements the conversational Web services defined in BEA's WebLogic Workshop, and Collaxa's ScenarioBeans mixes workflow tags with Java, creating a JSP (JavaServer Pages)-like metaphor that aims to be intuitive for developers. Shared public protocols, however, must be language-neutral, which mandates XML. At this level, all three proposals explore grammars that describe dynamic interplay among services offering static WSDL (Web Services Description Language) interfaces. With respect to loosely coupled, asynchronous services, this is where rubber meets road. Problems that far-flung service orchestration forces us to confront include message correlation, long-running transactions, and human-usable abstractions... BEA's WebLogic Workshop makes pairwise conversations transparently simple. A developer declares a service "conversational," and an ID is attached to subsequent message flow. By doing a little surgery on its SOAP (Simple Object Access Protocol) headers, a .Net client can join a WebLogic-style conversation. But orchestration, in the general case, is a many-to-many conversation -- and one in which not every participant will necessarily even speak SOAP..."

  • [July 10, 2002] "WSOS Tunes Up Services." By James R. Borck. In InfoWorld (July 05, 2002). "Web Services aim to reduce the complexity and cost of business process automation. But using application components distributed beyond the control of any single IT department raises serious reliability concerns. Knowing that CTOs must better orchestrate BPM (business process management) in such environments, Collaxa brings to market WSOS (Web Service Orchestration Server). WSOS provides mechanisms to control the sequencing and flow of Web services conversations by stitching ad hoc services together with underpinnings of reliability. WSOS establishes reliability by enabling recovery from failure and ensuring the safe completion of a business process by maintaining persistence, even when transactions are extended in time and involve multiple partners... Based on standards such as XML, SOAP (Simple Object Access Protocol), WSDL (Web Services Description Language), and BTP (Business Transaction Protocol), ScenarioBeans contain business logic, deployment descriptors, and the elements necessary to make an application available as a Web service, including WSDL files and SOAP listeners... The Developer Console serves as the hub for monitoring vital statistics on running services in real time. We easily traversed the hierarchical view of current transactions, monitored activity and statistics on response time and exceptions, deployed services, and tackled reporting issues. But for all its capabilities, its limited breadth also hinders its usefulness. Although Collaxa has imminent plans to release support for the open-source JBoss, provisions for any enterprise-class application servers, such as Oracle9i and IBM WebSphere, are not expected until year's end. Also, it uses the RPC (Remote Procedure Call) style of SOAP services, making it incompatible with Microsoft's document style layout. 
It has no integrated communication support for other XML-based BPM languages on the table, such as WSFL (Web Services Flow Language) or XLANG..."

  • [July 10, 2002] "Collaborative Challenges." By David L. Margulius. In InfoWorld (July 03, 2002). ['Enterprises want to leverage existing systems investments and functionality by developing new collaborative applications to facilitate business processes. In response, vendors are starting to offer XML-based middleware platforms to enable fast and economical construction of these collaborative applications.'] "Several thorny problems need to be solved to make collaborative applications work, including data transformation, business process coordination, and transactionality. Current efforts, based largely on existing open-system interfaces such as XML, are a transitional step toward a more services-oriented approach once Web services interfaces and protocols such as SOAP (Simple Object Access Protocol) and WSDL (Web Services Description Language) become widespread. (1) Data transformation and synchronization: Although many legacy systems and packaged applications speak XML, their data models and schemas are frustratingly different... (2) Managing process flows: Another obstacle to collaborative app development is the lack of standards for describing and orchestrating business process flows across multiple systems..."

  • [July 10, 2002] "WS-Next: Plotting A Web Services Security Road Map." By Richard Karpinski. In InternetWeek (July 08, 2002). [Re: the announcement that Microsoft, IBM, and VeriSign will bring the WS-Security work to OASIS for development in a Technical Committee.] "When IBM, Microsoft, and VeriSign announced last week they planned to move their would-be WS-Security standard to the OASIS group, most of the attention went to the political gamesmanship. The trio had managed to convince Sun Microsystems to join the effort, avoiding a prolonged battle over how to add baseline security -- things like encryption and digital signature support -- to Web services... Most simply, WS-Security defines a set of Simple Object Access Protocol (SOAP) headers that can be used to implement security measures for Web services. The base WS-Security specification describes how to add encryption and digital signatures to Web services (in each case supporting a W3C working group effort, XML-Encryption and XML-Signatures, respectively.) WS-Security also defines a general mechanism for passing around arbitrary security tokens, though it doesn't define how those tokens work... The overall idea with WS-Security is to build a security stack in much the same way the industry built a basic Web services protocol stack with XML Schemas, SOAP, WSDL, and so forth. The concept of simplifying and consolidating Web services security efforts is also a key driver behind WS-Security... WS-Security and SAML: SAML defines a standard, XML-based approach for passing security tokens defining authentication and authorization rights. SAML is almost two years old now, which means it pre-dates the recent wave of interest in Web services. Originally, SAML was more about distributed authentication and single sign-on. But those concepts are also integral to Web services, and with both SAML and WS-Security now housed at OASIS, backers of the two specs are eyeing each other warily. 
According to Netegrity's Chanliau, the two security approaches will work together and ultimately give enterprises some important choices about how to secure Web services. For starters, SAML has quickly adopted WS-Security as the appropriate method for "binding" SAML assertions into SOAP messages. Version 1.0 of SAML is coming up to a vote this month; it is expected to be approved as standard by fall. SAML 1.0 won't include a SAML-over-SOAP definition today -- concentrating on http instead -- but Chanliau expects future versions of the SAML spec to adopt WS-Security..." See: "Web Services Security Specification (WS-Security)."
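The idea of WS-Security as "a set of SOAP headers" can be illustrated by constructing an envelope whose Header carries a Security block with a token inside. The namespace URIs and the plain username token below are placeholders invented for this sketch, not the normative identifiers from the specification:

```python
import xml.etree.ElementTree as ET

# Placeholder namespaces for illustration; not the normative URIs.
SOAP = "urn:example:soap-envelope"
WSSE = "urn:example:ws-security"

def secured_envelope(username: str, body_xml: str) -> str:
    """Build a SOAP envelope whose Header carries a Security token block."""
    env = ET.Element(f"{{{SOAP}}}Envelope")
    header = ET.SubElement(env, f"{{{SOAP}}}Header")
    # WS-Security defines a Security header block that carries arbitrary
    # security tokens; a simple username token stands in for any of them.
    security = ET.SubElement(header, f"{{{WSSE}}}Security")
    token = ET.SubElement(security, f"{{{WSSE}}}UsernameToken")
    ET.SubElement(token, f"{{{WSSE}}}Username").text = username
    body = ET.SubElement(env, f"{{{SOAP}}}Body")
    body.append(ET.fromstring(body_xml))
    return ET.tostring(env, encoding="unicode")

msg = secured_envelope("alice", "<getQuote symbol='IBM'/>")
print(msg)
```

The application payload in the Body is untouched; security travels entirely in the header, which is what lets the same mechanism carry encryption, signatures, or SAML assertions.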

  • [July 10, 2002] "OASIS ebXML Registry REST Interface Proposal." By Matthew MacKenzie (XML Global Technologies, Inc.) and Farrukh Najmi (Sun). Reference: pl-REST-regrep3-0.4. ['An initial proposal for the REST Interface work item for OASIS ebXML Registry V3.0. It is expected that the Federated Registries sub-team of the OASIS ebXML Registry TC will improve upon this initial proposal and then submit it for consideration by the ebXML Registry TC at large.'] "This document proposes a new feature of the OASIS ebXML Registry targeted for version 3.0. REST, or REpresentational State Transfer, is an architectural style of exposing applications via the web or other URI-centric transports. The key tenet of the style is the use of URIs, or in the case of HTTP, URLs, to define the actions and parameters of an interface's invocation. REST also tends to be biased toward the HTTP GET action, as opposed to POST or PUT, mainly because POST/PUT-based applications tend to hide all of the request information in the content which is POSTed, thereby devaluing the location specificity of the URI. This document proposes a hybrid REST approach, with POST being used where GET is not practical. When the invocation parameters are too numerous or complicated, using POST is necessary; this remains a hybrid approach, however, because we try to keep the URI somewhat meaningful even when performing a POST..." Also available in PDF format.
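The hybrid GET/POST style the proposal describes can be sketched as a small request builder: encode the invocation in the URI when it fits, and fall back to POST (still keeping a meaningful URI) when the parameters are too bulky. The endpoint, action names, and length threshold below are invented for illustration:

```python
from urllib.parse import urlencode

# Hypothetical registry endpoint; the real interface is defined by the proposal.
BASE = "http://registry.example.org/ebxmlrr/rest"

def build_request(action: str, params: dict, max_uri_len: int = 256):
    """REST-style hybrid: GET with the invocation in the URI when it fits,
    otherwise POST with the bulky parameters moved into the request body."""
    query = urlencode(params)
    uri = f"{BASE}/{action}?{query}"
    if len(uri) <= max_uri_len:
        return ("GET", uri, None)
    # POST still names the action in the URI, so the location keeps meaning;
    # only the oversized parameters leave the URI.
    return ("POST", f"{BASE}/{action}", query)

method, uri, body = build_request("getRegistryObject", {"id": "urn:uuid:1234"})
print(method, uri)

method2, uri2, body2 = build_request("submitObjects", {"payload": "x" * 1000})
print(method2, uri2, len(body2))
```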

  • [July 10, 2002] "Requirements for a Web Ontology Language." W3C Working Draft 08-July-2002. Edited by Jeff Heflin (Lehigh University), Raphael Volz (FZI), and Jonathan Dale (Fujitsu). Version URL: Latest version URL: Previous version URL: The document has been produced by the W3C Web Ontology Working Group as part of the W3C Semantic Web Activity. It "specifies usage scenarios, goals and requirements for a web ontology language. An ontology formally defines a common set of terms that are used to describe and represent a domain. Ontologies can be used by automated tools to power advanced services such as more accurate Web search, intelligent software agents and knowledge management... The document motivates the need for a Web ontology language by describing six use cases. Some of these use cases are based on efforts currently underway in industry and academia, others demonstrate more long-term possibilities. The use cases are followed by design goals that describe high-level objectives and guidelines for the development of the language. These design goals will be considered when evaluating proposed features. The section on Requirements presents a set of features that should be in the language and gives motivations for those features. The Objectives section describes a list of features that might be useful for many use cases but may not necessarily be addressed by the working group..."

  • [July 10, 2002] "W3C Issues WSDL 1.2 Specification." By Tom Sullivan. In InfoWorld (July 09, 2002). "The W3C (World Wide Web Consortium) standards body on Tuesday declared WSDL 1.2 and WSDL 1.2 Bindings as Public Working Drafts, thereby bringing them closer to final standards. Cambridge, Mass.-based W3C defines WSDL (Web Services Description Language) as an XML-based language that describes a Web service, including the data exchanged, the protocol to use, and its location on the Web. WSDL 1.2 Bindings is a description about use of WSDL 1.2 with SOAP 1.2, HTTP, and MIME. WSDL, along with SOAP (Simple Object Access Protocol), XML, and UDDI (Universal Description, Discovery, and Integration), form the core set of protocols on which industry giants have more or less agreed and with which Web services currently are being constructed. According to W3C, WSDL 1.2 is easier and more flexible for developers than the previous version. The latest iteration of WSDL includes better component definition, language clarifications, a conceptual framework that defines description components, and support of the XML Schemas and XML Information Set standards. Furthermore, 1.2 removes non-interoperable features from WSDL 1.1 and works more effectively with HTTP and SOAP..." See details in the 2002-07-09 news item.

  • [July 09, 2002] [Book Announcement.] System Architecture with XML. By Berthold Daum and Udo Merten. Morgan Kaufmann Publishers, June 2002. ISBN: 1-55860-745-5. 458 pages. Information based upon text from Berthold Daum. See the associated website and online Table of Contents. Description: "XML is bringing together some fairly disparate groups into a new cultural clash: document developers trying to understand what a transaction is, database analysts getting upset because the relational model doesn't fit anymore, and web designers having to deal with schemata and rule based transformations. The key to rising above the confusion is to understand the different semantic structures that lie beneath the standards of XML, and how to model the semantics to achieve the goals of the organization. A pure architecture of XML doesn't exist yet, and it may never exist as the underlying technologies are so diverse. Still, the key to understanding how to build the new web infrastructure for electronic business lies in understanding the landscape of these new standards. If your background is in document processing, this book will show how you can use conceptual modeling to model business scenarios consisting of business objects, relationships, processes, and transactions in a document-centric way. Database designers will learn if XML is subject to relational normalization and how this fits in with the hierarchical structure of XML documents. Web designers will discover that XML puts them into a position to automatically generate visually pleasing web pages and rich multimedia shows from otherwise dry product catalogues by using XSLT and other transformation tools. Business architects will see how XML can help them to define applications that can be quickly adapted to the ever changing requirements of the market..."

  • [July 09, 2002] "RSA Bets on SAML Across Security Product Line." By Richard Karpinski. In InternetWeek (July 09, 2002). "RSA Security said it plans to support Security Assertion Markup Language (SAML) 1.0 in its Web access management platform by the end of the year. RSA will be among a group of vendors demonstrating SAML interoperability at next week's Burton Group Catalyst event. The long-awaited event will be the first public demonstration of broad interoperability of SAML implementations. SAML is an XML framework for exchanging authentication, attribute, and authorization information among different systems. Its ability to scale across different vendor systems lays the groundwork for true single sign-on capabilities that can span across an array of Web sites and applications. RSA's ClearTrust is a rules-based platform that centrally controls and manages user access privileges to Web resources. By incorporating SAML, the platform will better enable user single sign-on and allow authentication and identity information to be shared among multiple organizations and servers..." See the announcement: "RSA Security Embraces SAML Standard Across Product Lines. RSA ClearTrust Web access management software demonstrates SAML functionality at the Burton Group Catalyst Conference in San Francisco on July 15, 2002." Also: Burton Catalyst Conference 2002. On SAML: "Security Assertion Markup Language (SAML)."

  • [July 09, 2002] "Commerce One Details Upcoming 6.0 Release." By Tom Smith. In InternetWeek (July 09, 2002). "Sourcing and procurement vendor Commerce One Inc. this week filled in more details on its forthcoming Commerce One 6.0 platform, which it said will go into beta in the fourth quarter of this year. As the company has stated previously, it will add support for Web services including Universal Description, Discovery and Integration (UDDI) and Simple Object Access Protocol (SOAP). UDDI is an electronic phone book or directory of systems and applications. It provides a high-level framework for companies to register themselves and their Web services. SOAP defines the envelope in which applications can deliver Web services messages and exchange data with each other; it also describes how those messages should be processed... Commerce One said adding Web services support to the Commerce One platform -- as well as the Buy and Source applications that run on it -- has required rewriting the products to support these standards, though the company said it has already offered similar functionality on a proprietary basis, before Web services technology moved to the fore... The company's internally developed Buy application, as well as the Source application that it acquired, are also being slimmed down by virtue of moving common functions -- single sign-on, security, and registry functions, for example -- into the core Commerce One platform on which those apps run. In the case of the Buy application, that will mean a reduction in code from 3.5 million lines to about 500,000 lines, Hoffman said..."

  • [July 08, 2002] "FileMaker to Push Universal Information Exchange. Upgrade Focuses on XML Data Interchange, App Integration." By Tom Sullivan. In InfoWorld (July 08, 2002). "FileMaker on Tuesday will broaden its database strategy to include XML data interchange and application integration with the release of FileMaker Pro 6. Officials from the Santa Clara, Calif.-based company described its strategy as one of universal information exchange. Expanding the core database to integrate a variety of data sources has become a common strategy among the remaining four major database players: IBM, Oracle, Microsoft, and Sybase. FileMaker, for its part, takes a slightly different tack by targeting its database server and tools at the workgroup level. To that end, the XML import and export features enable the exchange of XML data as well as application integration, according to Dominique Goupil, president of FileMaker... Volvo Action Services, Volvo's equivalent to AAA, runs a 24-hour call center operation that dispatches employees to help customers when Volvo products are in need of repair, and has been using a beta of the pending FileMaker 6... Prior to FileMaker 6, Volvo Action Services exchanged information with the client but had to parse the data first and then feed it into XML, which took much longer, Richardson explained. In addition to XML support, FileMaker 6 includes time- and cost-saving features such as batch file import, format painter, custom dialogs, and more than 20 productivity templates..."

  • [July 07, 2002] "XHTML: The Power of Two Languages. Extensible Hypertext Markup Language is a Reformulation of HTML 4 in XML." By Sathyan Munirathinam (Software Engineer, Aztec Software). From IBM developerWorks, XML zone. July 2002. ['This article takes a pragmatic look at XHTML, a markup language that effectively bridges the gap between the simplicity of HTML and the extensibility of XML. It also covers the essential features of the various flavors of XHTML and includes discussions of the language and a number of real-world applications.'] "Being a Web developer is a tough job. Not only do you have to steer clear of the traps and pitfalls that the popular browsers throw at you on a daily basis, but you also have to keep at least half an eye on the myriad developments that may (or may not) have an impact on your job. You may have just barely mastered style sheets and DHTML, yet new techniques clamor for your attention. Which ones do you need to learn right away? Which ones can you dismiss for now? Traditional HTML may ultimately be put out to pasture with the emergence of Extensible Hypertext Markup Language, or XHTML... XHTML is a hybrid of HTML and XML that's specifically designed for Net device displays (which include Web browsers, PDA devices, and cell phones). January 26, 2002 marked the second birthday of XHTML 1.0 as the official W3C recommendation for Web markup. But XHTML has yet to toddle, yet to smile, and yet to cry loud enough to get the attention of most Web designers... XHTML can be used with cascading style sheets (CSS) to achieve presentation goals. XHTML also allows you to use Extensible Stylesheet Language (XSL) with transformations. By using this XML-based style technology, you can actually transform a document from one type to another -- say, from an HTML document to a PDF document..." Note references at XHTML.ORG. See: "XHTML and 'XML-Based' HTML Modules."
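As a concrete illustration of the "reformulation" the article describes, here is a minimal XHTML 1.0 Strict document (the title and paragraph text are invented for the example). Unlike legacy HTML, it must be well-formed XML: lowercase element names, every element explicitly closed, attribute values quoted:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
  "">
<html xmlns="">
  <head><title>Hello, XHTML</title></head>
  <body>
    <!-- even empty elements must be closed, e.g., <br /> -->
    <p>A well-formed page can be styled with CSS or fed to an XSLT processor.</p>
  </body>
</html>
```

Because the document is valid XML, the same file can be parsed by any XML toolchain, which is what enables the XSL transformations the article mentions.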

  • [July 07, 2002] "XSL Transformations to and from a SAX Stream. Integrate XSL Transformations into your SAX Applications." By Nicholas Chase (President, Chase and Chase, Inc.). From IBM developerWorks, XML zone. July 2002. ['The Transformation API for XML (TrAX) simplifies the process of performing XSL transformations by creating a situation where you only need to create the source, style, and result objects, then manipulate them using a Transformer object. When sources and results are DOM nodes or files, it's easy; but what if you want to transform to or from a SAX stream? This tip shows you how to use SAX streams as both the source and destination of an XSL transformation.'] "To perform a transformation using TrAX, you need to define the source, result, and XSL stylesheet, then create a Transformer and use it to actually perform the transformation... This tip uses JAXP. The classes are also part of the Java 2 SDK 1.4, so if you have 1.4 installed, you don't need any additional software... This article demonstrates a simple application that both consumes and produces a stream of SAX events, allowing you to integrate XSL transformations into your applications directly, without the hassle of having to create intermediate files or other objects to hold your data..." See also the source files.
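To make the TrAX pattern the tip describes concrete, here is a minimal sketch in Java (the class name, element names, and stylesheet are invented for the example; javax.xml.transform has been part of the core platform since Java 2 SDK 1.4, as the article notes). A SAXSource feeds parser events into the Transformer, and a SAXResult routes the transformation's output events to a ContentHandler, with no intermediate files:

```java
import java.io.StringReader;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.sax.SAXResult;
import javax.xml.transform.sax.SAXSource;
import javax.xml.transform.stream.StreamSource;
import org.xml.sax.InputSource;
import org.xml.sax.helpers.DefaultHandler;

public class SaxPipe {
    // Runs an XSL transformation whose input and output are both SAX event streams.
    public static String transform(String xml, String xslt) throws Exception {
        TransformerFactory factory = TransformerFactory.newInstance();
        Transformer t = factory.newTransformer(new StreamSource(new StringReader(xslt)));
        final StringBuilder out = new StringBuilder();
        // The SAXResult's handler receives the transformer's output events directly.
        DefaultHandler sink = new DefaultHandler() {
            @Override public void characters(char[] ch, int start, int len) {
                out.append(ch, start, len);
            }
        };
        // The SAXSource parses the input and streams SAX events into the transformer.
        t.transform(new SAXSource(new InputSource(new StringReader(xml))),
                    new SAXResult(sink));
        return out.toString();
    }
}
```

The same Transformer object can be reused for multiple source/result pairs, which is the main convenience TrAX offers.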

  • [July 05, 2002] "W3C XML Schema Design Patterns: Dealing With Change." By Dare Obasanjo. From July 03, 2002. ['One of the key challenges in XML systems is how to cope with change in documents. Business requirements and interactions change, and your documents need to change with them while retaining backwards compatibility. Our feature article this week, the first in a series from Dare Obasanjo, covers design patterns for W3C XML Schema that support the evolution of your XML documents over time.'] "W3C XML Schema is one way to specify the structure of and constraints on XML documents. As usage of W3C XML Schema has grown, certain usage patterns have become common and this article, the first in a series, will tackle various aspects of the creation and usage of W3C XML Schema. This article will focus on techniques for building schemas which are flexible and which allow for change in underlying data, the schema, or both in a modular manner. Designing schemas that support data evolution is beneficial in situations where the structure of XML instances may change but still must be validated against the original schema. For example, several entities may share XML documents, the format of which changes over time, but some entities may not receive updated schemas. Or when you must ensure that older versions of an XML document can be validated by newer versions of the schema. Or, perhaps, multiple entities share XML documents that have a similar structure but in which there are significant domain-specific differences. The address.xsd example in the W3C XML Schema Primer describes a situation in which a generic address format exists that can be extended to encompass localized address formats..." For schema description and references, see "XML Schemas."
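One widely used pattern for the kind of evolution described above is to leave an explicit extension point in a content model with a wildcard, so instances produced against a later schema revision still validate against the original. A minimal sketch (the address elements are invented for illustration, loosely echoing the Primer's address.xsd scenario, and are not taken from the article):

```xml
<xs:schema xmlns:xs=""
           elementFormDefault="qualified">
  <xs:element name="address">
    <xs:complexType>
      <xs:sequence>
        <xs:element name="street" type="xs:string"/>
        <xs:element name="city"   type="xs:string"/>
        <!-- extension point: later revisions may add elements from other
             namespaces here without invalidating documents against this schema -->
        <xs:any namespace="##other" processContents="lax"
                minOccurs="0" maxOccurs="unbounded"/>
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>
```

Restricting the wildcard to namespace="##other" avoids ambiguity with the schema's own declared elements, and processContents="lax" validates extensions only when a declaration is available.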

  • [July 05, 2002] "SVG Tips and Tricks: Adobe's SVG Viewer." By Antoine Quint. From July 03, 2002. ['Antoine Quint returns this month with another installment of his SVG column, Sacré SVG! Antoine takes us on a tour of the special features particular to the most popular SVG viewer currently deployed, Adobe's SVG Viewer 3.0.'] "Since Adobe's SVG Viewer is the main SVG implementation, with 160 million deployed units and counting, I thought it would be a good idea to take a look at some special features that only the Adobe SVG Viewer offers to its users. Want to see what we can do with it? Grab a copy of version 3.0 of the viewer and we'll be off. With version 3.0, Adobe has added a significant feature to its SVG Viewer: a conformant JavaScript engine. Before version 3.0, when you were using the Adobe SVG Viewer within your Web page as viewed in Internet Explorer or Netscape Navigator, you were relying on the browser's scripting engine to evaluate and run your SVG scripting. Using Adobe's embedded scripting engine offers a few significant advantages. First and foremost it is a rock-solid, compliant implementation of the ECMA specification, and Adobe has used Mozilla's acclaimed SpiderMonkey engine, opting for proven reliability. So, you can make sure that all your scripting code runs the same whether it's viewed on IE or Netscape, and whether on a Mac or a Windows box. IE and Netscape have diverging views and implementations for some of the core JavaScript functionalities (for example, the String.split() method), and they don't allow JavaScript communications between the browser and a plug-in in the same way. All of which results in cases where SVG authors had to do some browser-sniffing to serve either IE- or Netscape-specific script libraries. Although the Adobe SVG Viewer 3.0 has its own scripting engine, you can still use your browser's engine if you want.
You might actually want to author SVG content for use in a specific browser, which would allow you to take full advantage of the browser's engine. Also, the Adobe engine is not turned on by default, so existing content authored prior to version 3.0 will not break. Turning Adobe's scripting engine on is fairly easy and can be done in a clean way using an attribute in Adobe's namespace... If you want to know more, look at the set of pages devoted to ASV3 on the SVG Wiki, which is maintained by Niklas Gustavsson. Think of Adobe's work as an anticipation of what's next with SVG. And don't hesitate to voice your concerns to the W3C or to Adobe..." See: "W3C Scalable Vector Graphics (SVG)."
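The column leaves the opt-in attribute unnamed in the excerpt above; the form commonly shown for ASV3 is an a:scriptImplementation attribute in Adobe's extension namespace (treat the exact attribute and namespace URI below as details to verify against the SVG Wiki pages the article cites):

```xml
<svg xmlns=""
     xmlns:a=""
     a:scriptImplementation="Adobe">
  <!-- with the attribute set, scripts run in the viewer's embedded
       SpiderMonkey engine rather than the host browser's engine -->
  <script type="text/ecmascript">
    // identical behavior on IE and Netscape, Mac and Windows
  </script>
</svg>
```

Because the attribute lives in a foreign namespace, conformant SVG viewers that don't recognize it simply ignore it, so the document remains portable.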

  • [July 05, 2002] "Sorting in XSLT." By Bob DuCharme. From July 03, 2002. ['The July 2002 "Transforming XML" column by Bob DuCharme covers sorting in XSLT.'] "XSLT's xsl:sort instruction lets you sort a group of similar elements. Attributes for this element let you add details about how you want the sort done -- for example, you can sort using alphabetic or numeric ordering, sort on multiple keys, and reverse the sort order... Next month, we'll see how xsl:sort can help find the first, last, biggest, and smallest value in a document according to a certain criterion. If you can't wait, see my book XSLT Quickly, from which these columns are excerpted..." See also the ZIP file with source code. For related resources, see "Extensible Stylesheet Language (XSL/XSLT)."
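A short sketch of what the column describes (the employee and salary element names are invented for the example, not taken from the column): xsl:sort instructions placed at the start of an xsl:for-each, sorting on multiple keys and mixing alphabetic with numeric order:

```xml
<xsl:for-each select="employee">
  <!-- primary key: alphabetic sort on last name -->
  <xsl:sort select="lastname"/>
  <!-- secondary key: numeric sort on salary, highest first -->
  <xsl:sort select="salary" data-type="number" order="descending"/>
  <xsl:value-of select="lastname"/>
</xsl:for-each>
```

The data-type="number" attribute matters: without it, "10" sorts before "9" because the keys are compared as strings.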

  • [July 05, 2002] "Watching TAG Again." By Kendall Grant Clark. From July 03, 2002. ['Kendall Clark has taken over at the helm of the XML-Deviant column, and this week returns his gaze to the W3C's Technical Architecture Group, popularly known as TAG. Kendall gives the TAG a health check and notes they're building up quite a backlog of issues.'] "Every few months I report on the state of the W3C's Technical Architecture Group (TAG). The TAG is probably the single most influential group of web architects, if for no other reason than the scope of W3C standards work, for which, institutionally, it serves as something like a court of last appeal. In my last report, called 'Tag Watch', I reviewed the TAG's issues list and found that it was declining to address between a quarter and a third of the requests that it received, and that it had accepted but not yet resolved eight issues. Upon reviewing the public TAG materials recently, I was surprised to learn that several issues pending in April are still unresolved, including an algorithm for creating a URI from a qualified name, the nature and identity of 'namespace documents' (which is partially resolved), the nature and identity of XML documents 'composed of content in mixed namespaces' (which appears to have last been formally discussed in early May), the extent of HTTP dereferencing, and, finally, the use of HTTP as a substrate for higher-level protocols. I have previously suggested that the TAG may well become a bottleneck in the development of W3C standards. While it's not yet clear whether that suggestion will be borne out, the number of issues which the TAG agrees to address continues to multiply. Since my last report, it has accepted seven new issues, raising the total from 16 to 23. Of those seven, one was pretty quickly resolved, four others have been assigned to TAG members, and two have been accepted but not yet assigned...
Each time I review the TAG's progress, I come away thinking that it's some kind of very low-grade miracle that, on the good days, the Web works as well as it does. The details do in fact matter, but they don't all matter equally. The Web's fundamentals seem as sound as ever, which is a good thing since, increasingly, some of the higher levels are murky and ill-conceived. Straightening out the details, answering the tough calls, and solving the corner cases can be thankless work, but someone has to do it..."

  • [July 05, 2002] "House Makes Resolutions in XML." By Susan M. Menke. In Government Computer News (July 02, 2002). "The House of Representatives is pioneering its shift to Extensible Markup Language with simple resolutions, which started in January. 'Our goal is to begin production of some introduced bills using XML by January 2003,' said Joe Carmel, chief of legislative computer systems. Testing is now under way on XML output of new bills, he said. The House last year completed more than 100 document type definitions (DTDs) for its entire output of bills, resolutions, correspondence and other production elements... Current House output is searchable only by bill numbers or keywords, but in XML it would be searchable by titles, names, tables, subheadings and other components... A sample of the XML coding for a House bill shows how each line, name and term has an identifying tag, created by exporting the document from a word processor such as Microsoft Word or Corel WordPerfect into a special XML template. The tags automatically control typography and create entries for tables of contents and indexes. They can serve for paper or electronic publication. The House DTDs, which the Government Printing Office helped design, are in the public domain..." See: "United States Congress: XML for Legislative Documents."

  • [July 05, 2002] "XML Takes Pain Out of Contracts." By Tom Smith. In InternetWeek (July 05, 2002). "Two early users of an XML-based transaction and document exchange system are achieving significant cost and cycle-time reductions in executing contracts with customers. The State of Utah's Department of Community and Economic Development and software developer Novell Inc. are in the early stages of deploying NxLight Inc.'s NxLight Transaction Network. The State of Utah has deployed the system to cut the time required to process contracts with companies that are receiving economic development incentives. In Novell's case, the company is using the system to manage the thousands of boilerplate contracts it executes with K-to-12 schools. Today, neither company has integrated the contracts it's managing into other corporate systems, but both indicated that the system's use of XML as the means of packaging and delivering information will make it easy to integrate contracting activities into financial systems, customer relationship management (CRM), and even enterprise resource planning (ERP) systems. Their efforts with the NxLight Transaction Network address a number of business processes that are core elements of supply-chain strategy: customer service, contract management, and contract negotiation, to name a few... Utah's Department of Community and Economic Development/Business and Economic Development Division has deployed NxLight's system on a hosted basis to accelerate execution of contracts with employers that apply for state incentives that reward their job creation results... NxLight's system builds and transmits what the company refers to as smart packages -- XML-based documents with embedded workflow rules and business logic. It also includes templates for common business processes. A number of security options are available but, in a nutshell, companies have the ability to use digital certificates to validate the identity of those signing contracts. 
A business partner only needs a browser to participate in NxLight transactions. The State of Utah is using NxLight on a hosted basis. With the software in place, the state approves the application for incentive funds, and notifies the company by e-mail that a contract is now available for review. The company is given an account to log in to NxLight. The company can access and print the contract, but cannot make changes. The state has exerted change control so that only its employees can revise a contract. In a typical scenario, Roberts will collaborate by phone with the company, make necessary changes, save those changes, and then let the company see the updated document in real time... Roberts said the state selected NxLight in large measure for its workflow functions, its ability to create an audit trail of all activities with the document, and the ability to support digital signatures. "Looking to the future, we're going to have to make this jive with state finances, state archives, and different departments," Roberts said, and XML should facilitate that integration..."

  • [July 05, 2002] "EveryPath Brings Order to Mobile Computing. [Interview.]" By Michael Vizard. In InfoWorld (July 03, 2002). ['EveryPath CEO Mark Tapling and CTO Prakash Iyer explain the benefits of an XML-based gateway to enterprise applications.'] "Managing multiple mobile computing platforms is rapidly turning into a nightmare for IT organizations. One company that hopes to bring some order to the chaos is EveryPath, which has developed an XML-based Enterprise Mobile Application Gateway. In an interview with InfoWorld Editor in Chief Michael Vizard, company CEO Mark Tapling and CTO Prakash Iyer explain why EveryPath thinks an XML-based gateway to enterprise applications is the most flexible, robust approach to mobile computing in the enterprise... We have an XML API layer, so we have the ability of building on top of connectors and toolkits. The first product we've released is a connector for the Siebel application, so it has the ability of selectively extracting subsets from the Business Object layer of Siebel... The XML API is kind of the standardized interface between your back end and our server. All you have is just read-only data that you want to display. But when you're looking at something like Siebel, where you have to do conflict resolution, transactional applications, then the complexity is based on that. It doesn't really matter whether you're building in this infrastructure or any other infrastructure, you have some added costs [when building] in that transactional data integrity and conflict resolution and all of that stuff..."

  • [July 03, 2002] "XML Watch: Finding Friends With XML and RDF. The Friend-of-a-Friend vocabulary can make it easier to manage online communities." By Edd Dumbill (Editor and publisher). From IBM developerWorks, XML Zone. June 2002. ['Edd Dumbill explores an XML and RDF application known as Friend-of-a-Friend (FOAF). FOAF allows the expression of personal information and relationships, and is a useful building block for creating information systems that support online communities. Code samples demonstrate the basics.'] "RSS creates a predictable way for scraps of content to be aggregated, sequenced, and searched. Sites like Syndic8 and Meerkat enable you to keep track of who's saying what about your topic -- or person -- of interest. RSS is also pretty simple, and you can often find it used in examples in XML tutorial material. Part of its appeal is the way you can connect your content to the larger Web, enabling others to find you more easily... FOAF is simply an RDF vocabulary. Its typical use is akin to that of RSS: You create one or more FOAF files on your Web server and share the URLs so software can use the information inside the file. Like creating your own Web pages, the creation of your FOAF data is decentralized and within your control. An example application that uses these files might be a community directory where members maintain their own records. However, as with RSS, the really interesting parts of FOAF come into play when the data is aggregated and can then be explored and cross-linked. FOAF has the potential to become an important tool in managing communities. In addition to providing simple directory services, you could use information from FOAF in many ways. For example: (1) Augment e-mail filtering by prioritizing mails from trusted colleagues; (2) Provide assistance to new entrants in a community; (3) Locate people with interests similar to yours.
The rest of this article describes FOAF's basic features and gives some pointers to current implementations and future considerations..." See "RDF Site Summary (RSS)."
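A minimal FOAF file of the kind described above is plain RDF/XML using the FOAF vocabulary (the names and mailbox here are invented for the example):

```xml
<rdf:RDF xmlns:rdf=""
         xmlns:foaf="">
  <foaf:Person>
    <foaf:name>Jane Hacker</foaf:name>
    <foaf:mbox rdf:resource=""/>
    <!-- foaf:knows links people to each other, which is what makes
         aggregated FOAF data explorable as a graph of relationships -->
    <foaf:knows>
      <foaf:Person>
        <foaf:name>John Doe</foaf:name>
      </foaf:Person>
    </foaf:knows>
  </foaf:Person>
</rdf:RDF>
```

Publishing such a file on a Web server and sharing its URL is all that is required; aggregators can then cross-link the foaf:knows statements across many such files.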

  • [July 03, 2002] "Microsoft Releases Visual J# .Net." By Darryl K. Taft. In eWEEK (July 01, 2002). "Microsoft Corp. has announced the availability of its Visual J# .Net tool at Microsoft TechEd Europe in Barcelona, Spain. Visual J# .Net is a Java development tool for building applications and XML Web services on the .Net Framework. Its release rounds out Microsoft's language offerings within its Visual Studio .Net platform, including Visual C++ .Net, Visual C# .Net and Visual Basic .Net. The new product features tight integration with Visual Studio .Net; integration with the .Net Framework, including cross-language integration; and tools to upgrade existing Visual J++ 6.0 -- which Visual J# .Net replaces -- to the new Visual Studio .Net format, said Tony Goodhew, Microsoft's product manager for Visual J# .Net. Goodhew said several enterprise customers have expressed interest in developing with Visual J# .Net, with organizations such as Alibre, Kana and the Wedding Channel developing applications with it... In addition, Goodhew said Visual J# .Net provides for an easy transition for Java-language developers into the world of XML Web services. He also said it improves interoperability between Java programs with existing software written in other languages because of its tight integration with the .Net Platform and its Common Language Runtime. However, in its press announcement for the product, Microsoft notes: 'Visual J# .NET has been independently developed by Microsoft and is the property of Microsoft Corp. It is not endorsed or approved in any way by Sun Microsystems Inc.'..." See the announcement: "Microsoft Rounds Out Developer Languages With Launch of Visual J# .NET. Industry's First Java-Language Tool With Native XML Web Services Support Provides Increased Productivity, Smoother Integration for Customers."

  • [July 02, 2002] "SAML Promises Web Services Security." By James Kobielus. In Network World (July 01, 2002). "Security Assertion Markup Language 1.0 is a new proposed standard for interoperability among Web services security products. As corporations increasingly deploy access management solutions and other security products in Web services environments, SAML 1.0 has the potential to be a critical interoperability standard for securing these online environments from end to end, both within organizations and from business to business. SAML 1.0, nearing ratification by the Organization for the Advancement of Structured Information Standards, works with XML and Simple Object Access Protocol (SOAP). SAML 1.0 defines SOAP-based interactions among security and policy domains, supporting Web single sign-on (SSO), authentication and authorization. The standard defines request and response 'assertion' messages that security domains exchange to vouch for authentication decisions, authorization decisions, and attributes that pertain to named users and resources. SAML 1.0 also defines functional entities such as authentication authorities, attribute authorities, policy decision points and policy enforcement points. In a SAML-enabled Web SSO scenario, users log on to their home or 'source' domains through authentication techniques such as ID/password. The source domain communicates this authentication decision, plus other information that provides a security context for that decision, to one or more affiliated or federated destination domains through messages that contain SAML 'authentication assertions' and 'attribute assertions.'" See also "SAML Gains Steam." References: "Security Assertion Markup Language (SAML)."
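A sketch of the kind of SAML 1.0 authentication assertion a source domain sends to a destination domain (the issuer, subject name, assertion ID, and timestamps are invented for the example):

```xml
<saml:Assertion xmlns:saml="urn:oasis:names:tc:SAML:1.0:assertion"
                MajorVersion="1" MinorVersion="0"
                AssertionID="a1b2c3" Issuer=""
                IssueInstant="2002-07-01T09:00:00Z">
  <!-- the authentication statement vouches that this subject
       authenticated at the source domain, and how -->
  <saml:AuthenticationStatement
      AuthenticationMethod="urn:oasis:names:tc:SAML:1.0:am:password"
      AuthenticationInstant="2002-07-01T09:00:00Z">
    <saml:Subject>
      <saml:NameIdentifier>jdoe</saml:NameIdentifier>
    </saml:Subject>
  </saml:AuthenticationStatement>
</saml:Assertion>
```

Such assertions travel between domains inside SOAP messages, which is the interaction pattern the article describes.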

  • [July 02, 2002] "Business Process Specification Hits Draft Form." By Carolyn A. April. In (July 01, 2002). "The Business Process Management Initiative (BPMI) has made available the first public draft of a specification aimed at providing a standard way to model business processes across heterogeneous systems both inside and beyond the firewall. BPML (Business Process Modeling Language) 1.0 is an XML Schema that defines a formal model for expressing business processes that represent a range of enterprise activities including transactions, data management, concurrency, exception handling, and operational semantics, according to officials. The specification is intended to work in concert with the recently proposed WSCI spec, which would allow developers to "choreograph" events and transactions that take place between systems when applications and services are accessed over the Internet. BPML 1.0's role would be to model business processes into standard, private implementations that sport WSCI-defined public interfaces, officials said... Burlingame, Calif.-based Intalio is one of 16 founding members of BPMI, which also include such companies as Aventail, in Seattle; VerticalNet, in Horsham, Pa.; Bowstreet, in Portsmouth, N.H.; and Verve, in San Francisco..." See the news item " Releases Business Process Modeling Language Working Draft Specification."

  • [July 02, 2002] "Still No Universal Workflow." By James Kobielus. In Network World (July 01, 2002). "Workflow technologies are everywhere, having been embedded in a range of development tools, network applications and Web services. Workflow standards are everywhere, too, but they never seem to jump the gap from hopeful press releases to broad adoption. So it's with considerable skepticism that we should greet the recent announcement that the Workflow Management Coalition (WfMC) and the Business Process Management Initiative (BPMI) have agreed to converge their efforts to define XML-based workflow-process definition standards. Potentially, the alliance could bring WfMC's XML Process Definition Language (XPDL) and BPMI's Business Process Modeling Language (BPML) under a common standards initiative. This alliance looks good on paper, but it doesn't bring the workflow industry much closer to its longtime goal of defining universal standards and services that span all vendors and applications. The core problem is that there are too many workflow standards. More to the point, no new workflow standard ever seems to gain the industry momentum necessary to push others to extinction. The WfMC/BPMI alliance says nothing about whether other groups will converge their specifications into a common industry standard. Rival workflow-process definition standards include the Object Management Group's Unified Modeling Language, the Organization for the Advancement of Structured Information Standards' ebXML Business Process Schema, the World Wide Web Consortium's Web Services Conversation Language and RosettaNet's Partner Interface Process... Where standards are concerned, the workflow industry's best hope is that a sufficient number of platform vendors get religion soon about the need for general-purpose workflow services that span application categories. Universal implementation of workflow standards won't take place until Microsoft supports one or another open standard..."

  • [July 02, 2002] "Is XML Too Prolific? General Services Administration study to advise on need for government repository." By William Matthews. In Federal Computer Week (July 01, 2002). "The beauty of Extensible Markup Language is that it's universal and can make information understandable to dissimilar computer systems. But an ugly side of XML threatens to emerge. Many versions of it are being developed, and their differences threaten the very universality that makes XML so attractive in the first place. There's fear that divergent XML dialects may already be developing in different government agencies. If allowed to flourish, these dialects could re-create the very problem XML was intended to solve -- too many languages that make disparate computer systems unable to easily communicate with one another. Last month, the General Services Administration hired consulting firm Booz Allen Hamilton to advise the federal government on whether it needs to create a repository of government XML data structures. The answer will almost certainly be yes, XML experts say. The repository would contain XML elements that represent the agreed-upon standards for how particular pieces of digital data should be labeled in XML. GSA and Booz Allen expect to issue an interim recommendation on the repository late this month and a final recommendation in October, according to Marion Royal, GSA's expert on XML. When creating digital documents, templates or other data, government authors and developers would consult the repository to find the proper data tags. If none exist, they would create them and submit those creations to the repository for later use by other authors. Thus the repository would keep XML from devolving into multiple, incompatible idioms..." See also: (1) "US General Accounting Office Releases XML Interoperability Report"; (2) "GSA's Government Without Boundaries Project (GwoB) Publishes Draft XML Schemas"; (3) "DISA to Establish a US DoD XML Registry and Clearinghouse." 
General references in: "US Federal CIO Council XML Working Group."

  • [July 02, 2002] "Directory Assistance For Web Services?" By Wylie Wong, Mike Ricciuti, and Larry Dignan. In CNet (July 01, 2002). "A project to build a giant Web services directory touted by the likes of Microsoft and IBM has yet to catch on, as companies stumble over technology hurdles and come to grips with market hype. Born during the business-to-business e-commerce craze, the directory project was touted as a "Yellow Pages" for e-commerce applications and services. The basis of the directory is a Web services specification called Universal Description, Discovery and Integration (UDDI), which identifies and catalogs Web services so they can be easily found online. The specification is finding a home, albeit slowly, in big companies as a way to build directories for internal Web services projects -- allowing the companies to better catalog and communicate services across departments. Yet the bigger dream to build the UDDI-based public Web services directory, known as the UDDI Business Registry, is just that for now -- a dream, experts say... Microsoft, IBM and Ariba proposed the directory plan nearly two years ago. The idea was to build an online database using UDDI that would help companies find Web-based software services that could be used as part of their own business systems. In theory, companies registered on the directory could use it as a sort of Yahoo for business services. For example, an e-commerce site could use the directory to search for a business that handles credit card transaction services. If a match were found, all the elements of the transaction--even the price and payment -- could be handled electronically. Pushing all the communications onto the Web would streamline the process and speed business considerably, proponents have said. Microsoft, IBM, SAP and Hewlett-Packard -- some of the same companies that helped devise the UDDI specification -- have built Web sites for registering Web services in the public UDDI directory database. 
Japan's NTT Communications has a similar Web site in the works... Building a service, brick by brick: UDDI is one of four technologies that form the basis of most Web services development. The others are Extensible Markup Language (XML), the Simple Object Access Protocol (SOAP) and Web Services Description Language (WSDL). To understand how these protocols work together, consider an ordinary phone call. In Web services parlance, XML represents the conversation, SOAP describes the rules for how to call someone, and UDDI is the phone book. WSDL, finally, describes what the phone call is about and how one can participate... It could take four to five years before the public UDDI registry fully lives up to its promise, analysts say. One of the main issues preventing a public directory is trust. Despite the ease of finding companies on the directory, very few businesses would be willing to take a chance on an unknown Web service provider..." See: "Universal Description, Discovery, and Integration (UDDI)."
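The division of labor in the phone-call analogy above can be sketched concretely. The following is a minimal, illustrative example using only Python's standard library: it composes a UDDI v2 `find_business` inquiry (UDDI supplies the query vocabulary) wrapped in a SOAP envelope (SOAP frames the call), serialized as XML (the conversation itself). The business name queried is hypothetical, and no registry endpoint is contacted; WSDL, which would describe a matching service, is not shown.

```python
# Sketch: build a UDDI v2 find_business inquiry inside a SOAP envelope.
# XML carries the data, SOAP frames the request, UDDI defines the query.
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
UDDI_NS = "urn:uddi-org:api_v2"

def build_find_business(name: str) -> bytes:
    """Return a serialized SOAP envelope carrying a find_business query."""
    envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
    find = ET.SubElement(body, f"{{{UDDI_NS}}}find_business",
                         {"generic": "2.0"})
    ET.SubElement(find, f"{{{UDDI_NS}}}name").text = name
    return ET.tostring(envelope, encoding="utf-8")

# Hypothetical query: look up providers of credit card transaction services.
msg = build_find_business("Example Credit Card Services")
print(msg.decode())
```

In a real exchange this envelope would be POSTed over HTTP to a registry's inquiry endpoint, and the registry would answer with a businessList of matching providers.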

  • [July 02, 2002] "Breathing New Life Into UDDI." By Adrian Mello. In ZDNet Tech Update (June 26, 2002). "Most technologists say Web services will define the next phase of the Internet. But how do you know which services are available? Where can you find them? And who supplies them? These are questions that the Universal Description, Discovery, and Integration (UDDI) technology was supposed to answer. Unfortunately, enthusiasm for UDDI has dampened considerably since the standard's creation a couple of years ago. "UDDI has basically fallen off the radarscope of most organizations looking to implement more advanced forms of business-to-business functionality," says Ken Vollmer, research director with the Giga Information Group. Validation and security problems are mostly to blame. Public UDDI registries contain a lot of useless listings -- some businesses in the directory don't exist, others link to the wrong sites. Many businesses listed in the directory have no plans to offer Web services or simply don't understand how to use the directory. Other businesses don't even remember registering in the first place. The diluted quality of the public registry makes it a poor bet for enterprises in search of solid services from reliable suppliers... One company (E2open) is attempting to solve these problems by creating the first private UDDI directory. E2open, which operates a consortium e-market serving major electronics companies, also runs a private directory -- called the E2open Process Directory (E2PD). This private directory contains information such as trading partner profiles, IP addresses, services, and business processes. E2open's members -- which include Mitsubishi Electric, Ricoh, Sanyo, and Sharp -- use the directory as an integration hub and to find new suppliers based on their ability to provide available parts and meet service-level requirements... E2PD helps convert a wide variety of data formats and protocols for more than 2,000 trading partners. 
For example, the directory can convert a purchase order a buyer sends in RosettaNet format to the EDI format used by a supplier. This ability to convert data formats and protocols not only simplifies integration among large enterprises, but it also helps them enlist smaller suppliers that typically can't afford to handle complex integration tasks. (Small suppliers often handle orders solely by e-mail.) As small suppliers become more sophisticated and adopt standards such as RosettaNet, E2PD permits them to easily update formats..."
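The conversion-hub role described above can be sketched in miniature. The example below is a hypothetical illustration, not E2open's actual implementation: a single canonical purchase-order record is dispatched through per-partner serializers, so a buyer's choice of format is decoupled from the wire format each supplier understands. The "RosettaNet" and "EDI" outputs are toy stand-ins for RosettaNet PIP XML and an X12 850 segment.

```python
# Sketch of a format-conversion hub: one canonical purchase order,
# serialized per trading partner. Formats are simplified stand-ins.
from dataclasses import dataclass

@dataclass
class PurchaseOrder:
    po_number: str
    part: str
    quantity: int

def to_rosettanet(po: PurchaseOrder) -> str:
    # Toy XML loosely in the spirit of a RosettaNet purchase-order request.
    return (f"<PurchaseOrderRequest>"
            f"<poNumber>{po.po_number}</poNumber>"
            f"<part>{po.part}</part>"
            f"<quantity>{po.quantity}</quantity>"
            f"</PurchaseOrderRequest>")

def to_edi(po: PurchaseOrder) -> str:
    # Toy segment loosely modeled on an X12 850 purchase order.
    return f"BEG*00*NE*{po.po_number}~PO1**{po.quantity}*EA***BP*{po.part}~"

# A directory like E2PD would record each partner's preferred format;
# the hub dispatches on that preference.
SERIALIZERS = {"rosettanet": to_rosettanet, "edi": to_edi}

def convert(po: PurchaseOrder, target_format: str) -> str:
    return SERIALIZERS[target_format](po)

order = PurchaseOrder("PO-1001", "SHARP-LCD-42", 250)
print(convert(order, "edi"))
```

The design point the article makes is visible even in this toy: a buyer emits one canonical order, and adding a small supplier that only handles a simpler format means registering one new serializer, not changing every trading partner.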
