The OASIS Cover Pages: The Online Resource for Markup Language Technologies
Last modified: December 02, 2002
XML Articles and Papers October 2002

XML General Articles and Papers: Surveys, Overviews, Presentations, Introductions, Announcements

October 2002

  • [October 31, 2002] "Analyze Schemas with the XML Schema Infoset Model." By Shane Curcuru. From DevX XML Zone. October 2002. ['IBM's new XML Schema Infoset Model provides a complete modeling of schemas themselves, including the concrete representations as well as the abstract relationships within a schema or a set of schemas. Learn how to use this powerful library to perform complex queries on your own schemas.'] "As the use of schemas grows, so does the need for tools to manipulate those schemas. IBM's new XML Schema Infoset Model provides a complete modeling of schemas themselves, including the concrete representations as well as the abstract relationships within a schema or set of schemas. This library easily queries the model of a schema for detailed information. You can also use it to update the schema to fix any problems found and write the schema back out. Although there are a number of parsers and tools that use schemas to validate or analyze XML documents, tools that allow querying and advanced manipulation of schema documents themselves are still being built. The XML Schema Infoset Model (AKA the Java packages org.eclipse.xsd.*, or just 'the library') provides a rich API library that models schemas -- both their concrete representations (perhaps in a schema.xsd file) and the abstract concepts in a schema as defined by the specification. As anyone who has read the schema specs knows, they are quite detailed. The XML Schema Infoset Model strives to expose all the Infoset details within any schema. This allows you to efficiently manage your schema collection, and empower higher-level schema tools such as schema-aware parsers and transformers... The XML Schema Infoset Model also includes the UML diagrams used in building the library interfaces themselves; these diagrams show the relationships between the library objects, which very closely mimic the concepts in the schema specifications..." 
Note: The IBM XML Schema Infoset Model "is a reference library for use with any code that examines, creates, or modifies XML Schemas (standalone or as part of other artifacts, such as XForms or WSDL documents)." On October 23, 2002 IBM released a downloadable Version 1.0.1 stable build (20021023_1900TL); see the Developer FAQ and complete documentation. The earlier news item: "IBM Publishes XML Schema Infoset API Requirements and Development Code." General references in "XML Schemas."
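
The library described above answers questions about a schema through its Java API (org.eclipse.xsd.*). As a minimal stand-in for readers without that toolkit, the stdlib Python sketch below asks one of the same questions -- which elements and types a schema document declares globally -- by inspecting only the concrete XSD document; the schema content is an invented example, and the sketch does not model the abstract schema components the IBM library also exposes.

```python
# Toy illustration of querying a schema document for its globally
# declared elements and types -- the kind of query the IBM XML Schema
# Infoset Model serves through org.eclipse.xsd.* (this is NOT that
# library, just a stdlib sketch over the concrete representation).
import xml.etree.ElementTree as ET

XSD = "{http://www.w3.org/2001/XMLSchema}"

SCHEMA = """<?xml version="1.0"?>
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:element name="book" type="BookType"/>
  <xs:complexType name="BookType">
    <xs:sequence>
      <xs:element name="title" type="xs:string"/>
      <xs:element name="isbn" type="xs:string"/>
    </xs:sequence>
  </xs:complexType>
</xs:schema>"""

def global_declarations(schema_text):
    """Return ([global element names], [global complexType names])."""
    root = ET.fromstring(schema_text)
    elements = [e.get("name") for e in root.findall(XSD + "element")]
    types = [t.get("name") for t in root.findall(XSD + "complexType")]
    return elements, types

elements, types = global_declarations(SCHEMA)
# elements == ["book"], types == ["BookType"]
```

A schema-aware tool built on such queries could, for instance, flag a type that is declared but never referenced.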

  • [October 31, 2002] "DAML+OIL: An Ontology Language for the Semantic Web." By Deborah L. McGuinness (Stanford University), Richard Fikes (Stanford University), James Hendler (University of Maryland), and Lynn Andrea Stein (Franklin W. Olin College of Engineering). In IEEE Intelligent Systems [ISSN: 1094-7167] Volume 17, Number 5 (September/October 2002), pages 72-80. "By all measures, the Web is enormous and growing at a staggering rate. This growth has made it both increasingly difficult and increasingly important for humans and programs to quickly and accurately access Web information and services. A semantic Web, in which meanings of terms are captured and exploited, can provide the foundation for convenient Web content access. The DARPA Agent Markup Language (DAML) program aims to provide a language and toolset that enables the Web to transform from a platform that focuses on presenting information to a platform that focuses on understanding and reasoning with information. In this article, we describe the DAML language; its goal is to capture term meanings and thereby provide a Web ontology language. In addition to a brief history of the language's evolution, we introduce the ontology language DAML+OIL by way of examples and include an axiomatization of the language... We're developing DAML+OIL in stages. Our initial aim was to capture term descriptions, as we've described here. The DAML program is now working on a query language and the integration of a rule-encoding option. The next major language enhancement, DAML-Logic (DAML-L), will address encoding inference and general implications. The DAML Services group also built a Web service ontology, DAML-S, which uses DAML+OIL to provide a foundation for Web agents. DAML+OIL was submitted as the starting point for the W3C Semantic Web Activity's OWL. The W3C's Web Ontology Working Group has produced a set of requirements for OWL, as motivated by a collection of use cases. 
DAML+OIL meets the current requirements draft reasonably well, and the initial OWL language description is quite similar to DAML+OIL. We believe that DAML+OIL is a useful starting point for describing Web content, building on decades of research in frame-based systems, description logics, and Web languages. Given this foundation, and the research benefits in languages, complexity, and usability it provides, DAML+OIL could serve as a sound foundation for the next evolution of Web access. Researchers have already accepted DAML+OIL as a starting point for Web semantics representation and used it for applications ranging from military intelligence to medical and genetic database integration. Among the current development efforts are those focusing on using DAML+OIL for managing large Web sites and document and image collections, integrating disparate databases, and providing Web services' interoperability..." General references in "DARPA Agent Mark Up Language (DAML)" and "Ontology Interchange Language (OIL)." Related topics in "Markup Languages and Semantics."
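
The "understanding and reasoning" the authors describe rests on machine-checkable term relationships such as class subsumption. The toy Python sketch below illustrates only that one idea -- following subclass-of links transitively, as a description-logic reasoner does over a DAML+OIL class hierarchy; the class names are invented, and real DAML+OIL ontologies are written in RDF/XML, not Python.

```python
# Toy sketch of transitive class subsumption, the simplest form of
# the reasoning an ontology language like DAML+OIL enables. The
# hierarchy below is an invented example, not a real ontology.
SUBCLASS_OF = {
    "TabletPC": "PortableComputer",
    "PortableComputer": "Computer",
    "Computer": "Artifact",
}

def is_subclass(cls, ancestor):
    """Follow subclass-of links transitively up the hierarchy."""
    while cls in SUBCLASS_OF:
        cls = SUBCLASS_OF[cls]
        if cls == ancestor:
            return True
    return False

assert is_subclass("TabletPC", "Computer")       # inferred, not stated
assert not is_subclass("Artifact", "TabletPC")   # subsumption is directed
```

A search agent armed with such inferences can answer a query for "Computer" with pages about tablet PCs even though the pages never use the word.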

  • [October 31, 2002] "Web Ontology Language (OWL) Test Cases." W3C Working Draft 24-October-2002. Edited by Jeremy J. Carroll (HP) and Jos De Roo (AGFA). Latest version URL: http://www.w3.org/TR/owl-test/. Produced by the members of the W3C Web-Ontology (WebOnt) Working Group. Part of the W3C Semantic Web Activity. The WD document "contains and presents test cases for the Web Ontology Language (OWL) approved by the Web Ontology Working Group. Many of the test cases illustrate the correct usage of the Web Ontology Language (OWL), and the formal meaning of its constructs. Other test cases illustrate the resolution of issues considered by the working group... OWL is used to publish and share sets of terms called ontologies, providing accurate Web search, intelligent software agents, and knowledge management... it facilitates greater machine readability of web content than XML, RDF, and RDF-S support by providing an additional vocabulary for term descriptions. The OWL Web Ontology Language is being designed by the W3C Web Ontology Working Group as a revision of the DAML+OIL web ontology language." This document is subsidiary to the other Web Ontology Language recommendation track documents, including OWL Web Ontology Language 1.0 Reference [W3C Working Draft 29 July 2002], OWL Web Ontology Language 1.0 Abstract Syntax [W3C Working Draft 29 July 2002], and OWL Formal Semantics. For an introduction to OWL, see Feature Synopsis for OWL Lite and OWL [W3C Working Draft 29 July 2002]. See "OWL Web Ontology Language."

  • [October 31, 2002] "Ontologies Come of Age." By Deborah L. McGuinness (Associate Director and Senior Research Scientist, Knowledge Systems Laboratory, Stanford University, Stanford, CA, USA). [WWW] Published in Spinning the Semantic Web: Bringing the World Wide Web to Its Full Potential [edited by Dieter Fensel, Jim Hendler, Henry Lieberman, and Wolfgang Wahlster; MIT Press, 2002]. "Ontologies have moved beyond the domains of library science, philosophy, and knowledge representation. They are now the concerns of marketing departments, CEOs, and mainstream business. Research analyst companies such as Forrester Research report on the critical roles of ontologies in support of browsing and search for e-commerce and in support of interoperability for facilitation of knowledge management and configuration. One now sees ontologies used as central controlled vocabularies that are integrated into catalogues, databases, web publications, knowledge management applications, etc. Large ontologies are essential components in many online applications including search (such as Yahoo and Lycos), e-commerce (such as Amazon and eBay), configuration (such as Dell and PC-Order), etc. One also sees ontologies that have long life spans, sometimes in multiple projects (such as UMLS, SIC codes, etc.). Such diverse usage generates many implications for ontology environments. In this paper, we will discuss ontologies and requirements in their current instantiations on the web today. We will describe some desirable properties of ontologies. We will also discuss how both simple and complex ontologies are being and may be used to support varied applications. We will conclude with a discussion of emerging trends in ontologies and their environments and briefly mention our evolving ontology evolution environment..." See related topics in "Markup Languages and Semantics." General references in "XML and 'The Semantic Web'."

  • [October 31, 2002] "Microsoft .NET Speech SDK 1.0 Beta 2: New Features and Enhancements." Microsoft Corp. Product Fact Sheet. October 2002. 4 pages. "The Microsoft .NET Speech SDK is a set of ASP.NET controls, a Microsoft Internet Explorer Speech Add-in, sample applications and documentation that allows Web developers to create, debug and deploy speech-enabled ASP.NET applications. The tools are integrated seamlessly into Microsoft Visual Studio .NET, allowing developers to leverage the familiar development environment. Microsoft Corp.'s approach to speech-enabled Web applications is built around a newly emerging standard: Speech Application Language Tags (SALT). SALT has been submitted to the World Wide Web Consortium (W3C) for adoption as a standard for telephony and multimodal applications, which incorporate speech-enabled elements within a visual Web interface. The beta 2 release of the Microsoft .NET Speech SDK includes a complete tool set for creating and testing SALT-based voice-only telephony applications. It also supports the development of multimodal speech applications on clients such as desktop PCs or Tablet PCs using Internet Explorer browser software... The Microsoft .NET Speech SDK supports SALT 1.0. SALT extends HTML and other markup languages with tags and scriptable objects to perform voice output, spoken-language input, telephony management and messaging. With the new speech add-in for Internet Explorer, developers can type SALT code directly into the browser. The SDK includes a set of robust grammar libraries, written in the W3C-approved Speech Recognition Grammar Specification (SRGS) format, which helps developers obtain abstract, complex concepts from the user. For example, gathering recognizable date or time input is quite complex. If the user says 'Thanksgiving' or 'Monday the 5th,' applications can use grammar libraries to make an intelligent determination of the exact date and year... 
In beta 1 of the SDK, Microsoft implemented its own grammar format. With this release, the Grammar Editor now opens and saves files only in the W3C-approved SRGS format. All elements, properties and validations have been updated for SRGS..." See: (1) the associated announcement; (2) "Intel and Microsoft Collaborate on SALT"; (3) "SALT Forum Contributes Speech Application Language Tags Specification to W3C"; (4) general references in "Speech Application Language Tags (SALT)." [source .DOC]
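
The fact sheet's "Monday the 5th" example hides a real resolution problem: the grammar yields a weekday and a day-of-month, and the application must find the calendar date that satisfies both. The stdlib Python sketch below shows one naive way to do that resolution; it is a stand-in for illustration only, not the .NET Speech SDK's grammar libraries, and its phrase handling is deliberately minimal.

```python
# Naive resolution of a "Monday the 5th"-style phrase to a concrete
# date: scan forward from a reference date until both the weekday and
# the day-of-month match. A toy stand-in for the SRGS grammar
# libraries' date handling described in the fact sheet.
import datetime

WEEKDAYS = ["monday", "tuesday", "wednesday", "thursday",
            "friday", "saturday", "sunday"]

def resolve(phrase, today):
    """Return the next date after `today` matching the phrase."""
    words = phrase.lower().replace("the ", "").split()
    weekday = WEEKDAYS.index(words[0])
    day = int("".join(ch for ch in words[1] if ch.isdigit()))
    date = today
    for _ in range(366 * 2):          # scan ahead up to two years
        date += datetime.timedelta(days=1)
        if date.weekday() == weekday and date.day == day:
            return date
    return None
```

For instance, from this article's date (2002-10-31) the next month whose 5th falls on a Monday is May 2003, so the sketch resolves the phrase to 2003-05-05.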

  • [October 31, 2002] "Microsoft Preps .Net Speech Platform." By Mary Jo Foley. In eWEEK (October 30, 2002). On Wednesday [2002-10-30], Microsoft Corp. took another step toward delivering its .Net platform that will allow developers, customers and integration partners to write and deploy speech-enabled applications and Web services. Microsoft announced at the SpeechTEK Expo here an early beta, or technical preview, of its .Net Speech Platform, which it will make available to fewer than 100 selected testers participating in the company's Joint Development Program. The official Beta 1 release of the platform is due out by next summer, with the final .Net Speech Platform offering due by the end of 2003. The .Net Speech Platform -- a bunch of technologies running on top of Windows Server -- includes a Microsoft speech-recognition engine, a text-to-speech engine, the SALT (Speech Application Language Tags) interpreter, a SALT browser and a telephony interface, among other elements. Microsoft's own product groups are dabbling with the .Net Speech Platform, said James Mastan, director of marketing for .Net speech technologies. He declined to offer specifics, but confirmed that business-to-consumer speech applications might be of interest to the MSN team, while business-to-employee, voice-activated, self-service applications might appeal to the Business Solutions unit that oversees MSCRM, Great Plains and Navision products. Microsoft also announced at SpeechTEK the Beta 2 release of the .Net Speech software development kit. The SDK can integrate directly into Visual Studio .Net and provide developers with pre-built speech components they can use to develop multimodal (a k a, speech plus visual) speech applications that can run on desktop PCs, the soon-to-be-released Tablet PCs and any other device running Internet Explorer. The Beta 2 release includes support for the W3C's SALT 1.0 specification, as well as support for SALT-based, voice-only telephony applications. 
Microsoft officials said they expect the final .Net Speech SDK to ship by mid-2003..." See references in the preceding bibliographic entry.

  • [October 31, 2002] "Create Flexible and Extensible XML Schemas. Building XML Schemas in an Object-Oriented Framework." By Ayesha Malik (Senior Consultant, Object Machines). From IBM developerWorks, XML zone. October 2002. ['XML schemas offer a powerful set of tools for constraining and formalizing the vocabulary and grammar of XML documents. With XML rapidly emerging as the data transport format of the future, it is clear that the structure of the XML, as outlined by schemas, must be created and stored in an organized manner. Developers experienced in object-oriented design know that a flexible architecture ensures consistency throughout the system and helps to accommodate growth and change. This instructional article uses an object-oriented framework to show you how to design XML schemas that are extensible, flexible, and modular.'] "When leveraging established patterns of object-oriented programming in constructing XML schemas, I use the three main principles of object-oriented design: encapsulation, inheritance, and polymorphism. To help discuss object-oriented frameworks in this context, I use an example of a fictitious company, Bond Publishing... Design patterns for decoupling: Recently, some design patterns have emerged that address decoupling and cohesiveness in XML schemas. We have already discussed how to create reusable components. Now, you'll learn how to vary the granularity of datatypes. This is similar to trying to answer the question 'How can I refactor my code and how much refactoring is appropriate for a given situation?' There are currently three design patterns that represent three levels of granularity when creating components... Object-oriented programming places a great deal of emphasis on packaging classes according to their services. The package structure organizes the code and facilitates modularity and maintenance. You can achieve similar benefits by organizing your XML schemas according to their functions... 
If your system is going to use XML to transport data information, either internally or externally, then you should seriously consider how to properly design your XML schemas. In this article, you have seen how to create schemas that use inheritance, encapsulation and polymorphism, and even had a glimpse of emerging design patterns in XML schema design. Leveraging these object-oriented frameworks helps you design XML schemas that are modular and extensible, maintain data integrity, and can be easily integrated with other XML protocols..." Related references in "XML Schemas."
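
The inheritance pattern the article advocates -- a reusable base complexType extended via xs:extension -- can be seen in a generated schema fragment. The stdlib Python sketch below builds such a fragment programmatically; the type names (PublicationType, BondType) are invented to echo the article's fictitious Bond Publishing example and are not taken from the article's actual schemas.

```python
# Generate a small XSD fragment showing inheritance via xs:extension:
# a reusable base type (encapsulation) extended by a derived type.
# Type names are invented illustrations, not the article's schemas.
import xml.etree.ElementTree as ET

XS = "http://www.w3.org/2001/XMLSchema"
ET.register_namespace("xs", XS)

def element(parent, tag, **attrs):
    return ET.SubElement(parent, "{%s}%s" % (XS, tag), attrs)

schema = ET.Element("{%s}schema" % XS)

# Base type: the encapsulated, reusable component.
base = element(schema, "complexType", name="PublicationType")
seq = element(base, "sequence")
element(seq, "element", name="title", type="xs:string")

# Derived type: inherits PublicationType and adds one field,
# the schema analogue of subclassing.
derived = element(schema, "complexType", name="BondType")
ext = element(element(derived, "complexContent"), "extension",
              base="PublicationType")
element(element(ext, "sequence"), "element",
        name="maturityDate", type="xs:date")

xsd_text = ET.tostring(schema, encoding="unicode")
```

Because BondType only names its base rather than copying its content, a later change to PublicationType propagates to every derived type -- the consistency benefit the article attributes to object-oriented schema design.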

  • [October 31, 2002] "Creating Web Services with AXIS. Apache's Latest SOAP Implementation Bootstraps Web Services." By Martin Streicher. In Linux Magazine ('August 2002'). "Apache AXIS is a substantial and comprehensive open source Java (and eventually C++) toolkit for building and deploying Web service clients and servers. Based on standards (HTTP, the Simple Object Access Protocol (SOAP), Web Services Description Language (WSDL), and XML), AXIS includes APIs, tools, and lots of sample code that you'll find invaluable whether you're deploying your first simple Web service, a full-blown commercial service, or a Java applet that interacts with another vendor's Web service. For example, if you're developing a Web service client, you'll need to know how to communicate with a remote Web service. Specifically, you'll need to know the service's URL, its service and method names, and the types and number of parameters for each method. In the realm of Web services, all of that information is captured in a service's WSDL file. AXIS offers a tool, called WSDL2Java, that interprets WSDL files and emits Java code that encapsulates all Web service intercommunication. Where you previously wrote tens of lines of code to send SOAP messages manually with Apache SOAP, AXIS can reduce the effort to invoke a remote procedure to just two or three calls to create and initialize two objects. On the other hand, if you're developing the server code for a new Web service, AXIS can help there, too. As mentioned above, all Web services describe themselves with WSDL files. Rather than write WSDL files from scratch (which have to accurately reflect the public methods of your Java classes), AXIS's Java2WSDL tool can generate WSDL files directly from Java source code... Finally, AXIS makes deploying and managing Web services a snap. The fastest way to create an AXIS Web service is to simply drop a Java source file into the AXIS Web applications directory. 
The other technique, Web Services Deployment Descriptors (WSDD), are about as easy to use, but give you more control and more flexibility. For example, WSDD files can enable or disable individual methods in your Web service. AXIS also offers a number of system administration tools that make management of Web services more tractable. Services can be deployed and un-deployed using AXIS's AdminClient tool, and to help others consume your Web services, AXIS automatically generates WSDL files from any service deployed on your site..."
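
Whatever stub code WSDL2Java generates, what ultimately crosses the wire is a SOAP envelope. As a language-neutral illustration of that wire format, the stdlib Python sketch below assembles an RPC-style SOAP 1.1 request; the service namespace, method, and parameter are invented placeholders, and a real client would POST the result over HTTP with a SOAPAction header.

```python
# Build an RPC-style SOAP 1.1 request envelope -- the wire format
# that AXIS-generated client stubs produce. Method and namespace
# below are invented placeholders for illustration.
import xml.etree.ElementTree as ET

SOAP_ENV = "http://schemas.xmlsoap.org/soap/envelope/"

def soap_request(method, namespace, **params):
    ET.register_namespace("soapenv", SOAP_ENV)
    envelope = ET.Element("{%s}Envelope" % SOAP_ENV)
    body = ET.SubElement(envelope, "{%s}Body" % SOAP_ENV)
    # The RPC call element lives in the service's own namespace;
    # each keyword argument becomes a parameter child element.
    call = ET.SubElement(body, "{%s}%s" % (namespace, method))
    for name, value in params.items():
        ET.SubElement(call, name).text = str(value)
    return ET.tostring(envelope, encoding="unicode")

request = soap_request("getQuote", "urn:example-quotes", symbol="IBM")
```

Comparing this dozen-line assembly with the "two or three calls" an AXIS stub needs makes the article's productivity claim concrete.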

  • [October 30, 2002] "Microsoft Unveils .Net Speech Platform." By Ephraim Schwartz. In InfoWorld (October 30, 2002). "Microsoft unveiled Wednesday the first technical preview of its .Net Speech Platform and also announced availability of the second beta release of its .Net Speech Software Developer Kit (SDK) at the Speech TEK conference in New York. As unveiled, the speech platform contains the Microsoft speech recognition engine, the middleware to connect into a telephony system, the SALT (Speech Application Language Tags) interpreter, a SALT voice browser, and the SpeechWorks text-to-speech engine. The .Net Speech Platform will give developers and customers a foundation to design a single application that can run a speech-enabled application in a variety of venues, including telephony, desktops, and in a multimodal format on mobile devices. The platform is expected to enter beta testing next summer and ship by the end of 2003. The integration also allows an application that incorporates speech to run on any Web server running Microsoft ASP.Net (Active Server Pages), according to [Microsoft's] Mastan. The kit's toolset includes the grammar development tool, a prompt creation tool for creating and managing prompts, a debugging tool, the ASP.Net SALT Web controls, and the add-ins for Internet Explorer for multimodal clients. The SDK will also include a tool for creating and testing SALT-based telephony-only applications, and will use the W3C (World Wide Web Consortium) formats for speech grammar... Recent Microsoft partnerships with Intervoice, a leading IVR (Interactive Voice Response) designer, and Intel, a supplier of Dialogic speech boards for telephony, also indicate that Microsoft's speech technology is moving toward mainstream acceptance..." See: (1) the announcement: "Microsoft Releases New Features, Enhancements to .NET Speech SDK. Announces Technical Preview of .NET Speech Platform And Joint Development Program. 
Ongoing Product Development Efforts Ready Market for Mainstream Adoption of SALT-Based Applications and Solutions."; (2) "Intel and Microsoft Collaborate on SALT"; (3) "SALT Forum Contributes Speech Application Language Tags Specification to W3C"; (4) "Speech Application Language Tags (SALT)."

  • [October 30, 2002] "Character Entities: An XML Core WG View." By [Paul Grosso] for the W3C XML Core Working Group. 'A consensus statement from the XML Core WG as of 2002-October-23.' 2002-10-23 or later. "Character entities is an informal name for XML internal general entities (whether internally or externally declared) that provide a name for a single Unicode character. Character references, whether decimal or hexadecimal, offer the same power as character entities, but not the same ease of use. Therefore the ability to use character entities is recognized as important. However, there is absolutely no need to introduce a new mechanism into XML to declare them. The existing mechanism, DTDs, is entirely adequate to the purpose. Although some subsets of XML have outlawed DTDs in the name of interoperability, all conforming XML processors (parsers) must be able to recognize at least some DTD information, specifically including the declaration of character entities in the internal subset. In addition, all but the most limited XML processors can also process the external DTD subset at least to the extent of being able to recognize and act on character entity declarations. At worst, then, the character entities actually used in a given document (generally a small subset of those available) can be declared in the internal subset, and are 100% interoperable across processors... People have sometimes asked for a more general character naming mechanism, equivalent in power to SGML SDATA declarations, allowing for the use of characters that are not encoded in Unicode (either by policy or because the encoding effort has not yet reached them). There is no need for such a facility, because of the Unicode Private Use Area (PUA). This provides a supply of 6400 + 65534 * 2 characters, far more than any application will need..." 
Note of 2002-10-30 from John Cowan on XML-DEV: "In light of ongoing discussions relating to "character entities" including suggestions that a future version of XML should be augmented to include some new syntax to accommodate them, the XML Core Working Group has developed a public statement describing the current consensus within the WG on certain aspects of this topic..."
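
The WG's claim -- that character entities declared in the internal subset are interoperable across conforming parsers -- is easy to demonstrate with an ordinary stdlib parser. The Python sketch below declares one character entity in the internal DTD subset and shows the parser expanding it; the document and entity name are invented for illustration.

```python
# A character entity declared in the internal DTD subset, expanded
# by a stock XML parser exactly as the XML Core WG statement says
# any conforming processor must do.
import xml.etree.ElementTree as ET

DOC = """<!DOCTYPE doc [
  <!ENTITY agrave "&#224;">
]>
<doc>voil&agrave;</doc>"""

root = ET.fromstring(DOC)
# &agrave; was expanded to U+00E0 during parsing.
expanded = root.text
```

Had the entity not been declared, the same parser would reject the document with an undefined-entity error -- which is why declaring the handful of entities a document actually uses in its internal subset is, as the WG puts it, 100% interoperable.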

  • [October 30, 2002] "Developing Grid Computing Applications, Part 1. Introduction of a Grid Architecture and Toolkit for Building Grid Solutions." By Liang-Jie Zhang, Jen-Yao Chung, and Qun Zhou (IBM). From IBM developerWorks, Web services. October 2002. ['According to Gartner, many businesses will be completely transformed over the next decade by using Grid-enabled Web services to integrate across the Internet to share not only applications but also computer power. In this article, Liang-Jie Zhang, Jen-Yao Chung, and Qun Zhou from IBM introduce developers to the basic idea of Grid computing and the Open Grid Services Architecture (OGSA). They describe how developers can use the latest Globus Toolkit (Open Grid Services Infrastructure technology preview) to discover a Grid service, create a Grid service interface, and invoke a Grid service instance. Some ideas to help developers integrate Web services and Grid computing are also described.'] "The driving force behind Grid technology standardization is the Global Grid Forum. The integration of Grid and Web services is technically complicated, but natural. The GGF is a community-initiated forum of individual researchers and practitioners who work on distributed computing, or 'grid' technologies. GGF is the result of a merger of the Grid Forum, the eGrid European Grid Forum, and the Grid community in the Asia Pacific. GGF efforts are also aimed at the development of a broadly based Integrated Grid Architecture that can serve to guide the research, development, and deployment activities of the emerging Grid communities. The Open Grid Service Interface Working Group of the GGF is defining the Open Grid Services Architecture (OGSA). OGSA is a distributed interaction and computing architecture based around the Grid service to assure interoperability on heterogeneous systems so that different types of systems can communicate and share information. 
It leverages the emerging Web services to define the Web Services Description Language interfaces. All services adhere to specified Grid service interfaces and behaviors. The Grid can be defined at three levels: (1) Enterprise (Enterprise Grid); (2) Partner (Partner Grid); (3) Service (Service Grid). However, the following components in the OGSA specification are still in the early stages of development: Factory, Registry, Discovery, Lifecycle, Query service data, Notification, and Reliable invocation. On the other hand, OGSA is a model for system composition to perform a specific task or solve a challenging problem by using distributed resources over the network... Conclusion: Grid computing uses the Internet to connect clusters of computers or business processes into the force of one single 'supercomputer.' Eventually, the Internet will become a single, unified computing platform, providing faster access to infrastructure and other business application resources. This is the era of Service Computing. In this article, we have introduced developers to Service Computing. We have shown how to use the latest Globus Toolkit to discover a Grid service, create a Grid service interface, and invoke a Grid service instance. Some ideas to help developers integrate Web services and Grid computing have also been described... The second installment of this series will focus on the Grid solution creation process including Grid architecture design, Grid service development, and Grid service deployment." See: "Web Services Description Language (WSDL)."
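
The discover / create / invoke sequence the authors walk through maps onto the OGSA component roles named above (Registry, Discovery, Factory). The toy Python sketch below shows only that interaction pattern in miniature; all names are invented, and real OGSA services are described in WSDL and invoked over SOAP via the Globus Toolkit, not via local objects.

```python
# Toy model of the OGSA interaction pattern: discover a service in
# a registry, use its factory to create a transient instance, then
# invoke the instance. Names are invented illustrations.
class GridService:
    """A transient service instance created by a Factory."""
    def __init__(self, name):
        self.name = name
    def invoke(self, job):
        return "%s ran %s" % (self.name, job)

class Factory:
    """OGSA Factory role: creates transient service instances."""
    def __init__(self, name):
        self.name = name
    def create(self):
        return GridService(self.name)

class Registry:
    """OGSA Registry/Discovery roles: maps names to factories."""
    def __init__(self):
        self._factories = {}
    def register(self, factory):
        self._factories[factory.name] = factory
    def discover(self, name):
        return self._factories[name]

registry = Registry()
registry.register(Factory("compute"))
instance = registry.discover("compute").create()   # discover + create
result = instance.invoke("job-42")                 # invoke
```

Separating the factory from the instance is what lets a Grid manage service lifecycles: instances can be created on whichever resource is available and destroyed when the job completes.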

  • [October 30, 2002] "Building Multi-Platform User Interfaces with UIML." By Mir Farooq Ali, Manuel A. Pérez-Quiñones, Marc Abrams, and Eric Shell. Published in Proceedings of CADUI 2002, May 2002. 12 pages, with 21 references. "There has been a widespread emergence of computing devices in the past few years that go beyond the capabilities of traditional desktop computers. However, users want to use the same kinds of applications and access the same data and information on these appliances that they can access on their desktop computers. The user interfaces for these platforms go beyond the traditional interaction metaphors. It is a challenge to build User Interfaces (UIs) for these devices of differing capabilities that allow the end users to perform the same kinds of tasks. The User Interface Markup Language (UIML) is an XML-based language that allows the canonical description of UIs for different platforms. We describe the key aspects of our approach that makes UIML successful in building multi-platform UIs, namely the division in the representation of a UI, the use of a generic vocabulary, a process of transformations and an integrated development environment specifically designed for transformation-based UI development. Finally, we describe the initial details of a multi-step usability engineering process for building multi-platform UIs using UIML... [We show] some of our research in extending and utilizing UIML to generate multi-platform user interfaces. We are using a single language, UIML, to provide the multi-platform development support needed. This language is extended via the use of a logical model, alternate vocabularies and transformation algorithms. Our approach allows the developer to build a single specification for a family of devices. UIML and its associated tools transform this single representation to multiple platform-specific representations that can then be rendered in each device. 
We have presented our current research in extending UIML to allow building of interfaces for very different platforms, such as VoiceXML, WML and desktop computers." See: (1) the 2002-10-29 news item "OASIS Members Propose TC for User Interface Markup Language (UIML)."; (2) "XML Markup Languages for User Interface Definition" [cache]
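
The paper's central move is one abstract UI description transformed through per-platform vocabularies into platform-specific renderings. The toy Python sketch below mimics that pipeline with a single abstract part list and two invented target "vocabularies" (an HTML-like one and a VoiceXML-like one); it is an illustration of the transformation idea only, not a UIML renderer, and real UIML is XML with separate structure, style, content, and behavior sections.

```python
# Toy version of the UIML pipeline: one abstract UI description,
# two platform-specific transformations. The "vocabularies" here
# are invented stand-ins, not real UIML renderers.
UI = [("greeting", "Hello"), ("ask_name", "What is your name?")]

def to_html(ui):
    """Visual rendering: each abstract part becomes a paragraph."""
    return "\n".join('<p id="%s">%s</p>' % part for part in ui)

def to_voicexml(ui):
    """Voice rendering: each part becomes a spoken prompt."""
    return "\n".join("<prompt>%s</prompt>" % text for _, text in ui)

html = to_html(UI)
voice = to_voicexml(UI)
```

Because both renderings derive from the single UI description, a change made once -- say, rewording the greeting -- propagates to every platform, which is the maintenance benefit the authors claim for the single-specification approach.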

  • [October 30, 2002] "WS-I Release Profile for Building Web Services." By Paul Krill. In InfoWorld (October 29, 2002). "The Web Services Interoperability Organization (WS-I) on Tuesday announced availability of the WS-I Basic Profile Working Draft, which features specifications and guidelines for developing interoperable Web services. The Basic Profile consists of implementation guidelines recommending how a set of core Web services specifications, including SOAP 1.1, WSDL 1.1, UDDI 2.0, XML 1.0, and XML Schema, are to be used for developing interoperable Web services. The WS-I Basic Profile Working Group is seeking public feedback on the draft, with plans to release a final version in early 2003. 'The Basic Profile is the first deliverable from the WS-I and it's essentially a set of guidelines for people building Web services applications to follow to make their applications interoperable,' said Steven VanRoekel, director of Web services marketing at Microsoft, in Redmond, Wash., and a member of the WS-I marketing committee. WS-I is looking to follow up on work being done at standards bodies such as OASIS (Organization for the Advancement of Structured Information Standards) and W3C (World Wide Web Consortium) and bring Web services interoperability to fruition, VanRoekel said. 'We're looking to take work from standards bodies downstream,' he said. 'We coalesce [standards] into a way to build applications that are interoperable.' Component technologies are found within the scope of the Basic Profile for messaging, description, discovery, and security. Messaging is defined as the exchange of Web protocol elements, usually over a network, while description involves the enumeration of messages associated with a Web service and implementation details. Discovery includes metadata that enables advertisement of a Web service's capabilities, while security is intended to provide integrity, privacy, authentication, and authorization. 
The security element of the profile describes Secure HTTP, for example, but not the proposed WS-Security standard from OASIS, VanRoekel said. 'We're just not there yet. You have to solve the foundational issues first,' said VanRoekel..." See: (1) the text of the announcement, "WS-I Publishes Basic Profile Working Draft. Now Available for Public Comment."; (2) more detailed references in the news item of 2002-10-18: "Web Services Interoperability Organization Publishes Basic Profile Version 1.0."; (3) "Web Services Interoperability Organization (WS-I)." [PR source .DOC]

  • [October 29, 2002] "UDDI and WSIL for e-Science." By Rob Allan, Dharmesh Chohan, Xiao Dong Wang, Mark McKeown, John Colgrave, and Matthew Dovey. Technical Report from the CLRC e-Science Centre and Distributed Computing Programme. October 14, 2002. 19 pages. ['A UK e-Science UDDI registry is being developed to integrate e-Science projects and Virtual Organisations into a Web and Grid services framework. This paper provides the technical background to this project, which complements other sources of information such as the NeSC project database.'] "In this paper we describe how a private UK e-Science UDDI registry or Web Services Inspection document hosted by the Grid Support Centre might be used to register information about e-Science Virtual Organisations and to enable inter-working between them by exposing their contacts and service points. We propose using UDDI and WSIL to provide APIs for information about UK e-Science projects and also show how individual projects might use the same technology. Examples of the latter are the CLRC Integrated e-Science Environment project (IeSE) and EPSRC's MyGrid. These show how UDDI could be used within a single e-Science project for discovery of its own businessEntities and services by high-level components such as applications and portals. We believe that providing interfaces to e-Science projects using (proposed) Web services standards, such as UDDI and WSIL, will facilitate commercial uptake. A partly moderated top-level service will build confidence, allow for testing but still provide the capability to register with the worldwide Universal Business Registry via the publisherAssertion capability as projects become more mature and wish to expose their services to international partners. It nevertheless remains to be seen how the proposed services could be used to enable electronic contract negotiation via the so-called 'tModels'. 
Finally, appendices describe UDDI and WSIL implementations and a proposed architecture for accessing Web services through a firewall using a proxy service. Implementations of this architecture will show if the performance is acceptable for a variety of purposes... We outline how UDDI might be used as a programmatic Web services information directory for e-Science projects and for services within a single project providing its community with multiple 'views' of sub-projects. UDDI will probably need to be supplemented with a 'top-level' WSIL document accessible from the UK Grid Support Centre Web site. The UDDI and WS-Inspection sources can be cross referenced. A private UDDI for the projects of the UK e-Science Programme offers the chance to test the publication of Web services and provide some degree of assurance that the published services will be acceptable to the wider community. Services may also be published to the Universal Business Registry and the publisherAssertion mechanism of UDDI v3.0 can be used to guarantee that they do belong to the programme..." See: (1) "Universal Description, Discovery, and Integration (UDDI)"; (2) "Web Services Inspection Language (WSIL)." [PDF format, cache]
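The programmatic "views" the paper describes rest on UDDI's SOAP-based inquiry API. As a rough sketch (the project name is invented, and this shows only the shape of a UDDI v2 find_business message, not the full inquiry API), a client might construct a registry query like this:

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
UDDI_NS = "urn:uddi-org:api_v2"

def build_find_business(name: str) -> bytes:
    """Wrap a UDDI v2 find_business inquiry in a SOAP envelope."""
    envelope = ET.Element("{%s}Envelope" % SOAP_NS)
    body = ET.SubElement(envelope, "{%s}Body" % SOAP_NS)
    # generic="2.0" identifies the UDDI API version being used
    find = ET.SubElement(body, "{%s}find_business" % UDDI_NS, generic="2.0")
    ET.SubElement(find, "{%s}name" % UDDI_NS).text = name
    return ET.tostring(envelope, encoding="utf-8")

# In practice the message would be POSTed to the registry's inquiry URL.
request = build_find_business("MyGrid")
```

The registry answers with a businessList of matching businessEntities, which is the discovery step the paper proposes exposing to portals and applications.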

  • [October 29, 2002] "An Introduction to WSIL." By Timothy Appnel. From O'Reilly OnJava.com (October 16, 2002). "The Web Service Inspection Language (WSIL) is an XML document format to facilitate the discovery and aggregation of Web service descriptions in a simple and extensible fashion. While similar in scope to the Universal Description Discovery and Integration (UDDI) specification, WSIL is a complementary, rather than a competitive, model to service discovery. Since its release, UDDI has been widely criticized for its implementation, and its relevance questioned repeatedly at this stage of Web services architecture development. Created by a group of IBM and Microsoft engineers and released in November 2001, WSIL is significant because of its simpler document-based approach, which is more lightweight and straightforward and better leverages existing Web architectures. This approach could lead to this specification's rise to prominence. In this article, I will cover the core elements of the WSIL specification, including how WSIL inspection documents are located. Additionally, I will take a cursory look at the specification's extensibility with service descriptions such as WSDL, and point out some problematic issues in the specification. First, let's return to the complementary and contrasting approaches UDDI and WSIL employ in service discovery. UDDI implements service discovery using a centralized model of one or more repositories containing information on multiple business entities and the services they provide. You could compare UDDI to the Yellow Pages in your phone book, where multiple businesses are grouped and listed with a description of the goods or services they offer and how to contact them. The specification provides a high level of functionality through Simple Object Access Protocol (SOAP) by requiring specifically an infrastructure to be deployed with substantial overhead and costs associated to its use... 
WSIL approaches service discovery in a decentralized fashion, where service description information can be distributed to any location using a simple extensible XML document format. Unlike UDDI, it does not concern itself with business entity information, nor does it specify a particular service description format. WSIL works under the assumption that you are already familiar with the service provider, and relies on other service description mechanisms such as the Web Services Description Language (WSDL)... WSIL is a simple, lightweight mechanism for Web service discovery that complements, rather than competes with, UDDI. WSIL's model is a decentralized one that is document-based, and leverages the existing Web infrastructure already in place. While UDDI can be seen as a phone book, WSIL is more like a business card. Looking at it in another way, WSIL is like the RSS of Web services..." See: "Web Services Inspection Language (WSIL)."
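The document-based approach is easiest to see in a concrete example. The following is a minimal WS-Inspection document in the style of the November 2001 specification; the service, WSDL location, and linked inspection document are hypothetical:

```xml
<?xml version="1.0"?>
<inspection xmlns="http://schemas.xmlsoap.org/ws/2001/10/inspection/">
  <service>
    <abstract>Hypothetical stock quote service</abstract>
    <!-- Points to a WSDL description elsewhere on the Web -->
    <description referencedNamespace="http://schemas.xmlsoap.org/wsdl/"
                 location="http://example.com/quote.wsdl"/>
  </service>
  <!-- Inspection documents can chain to further inspection documents -->
  <link referencedNamespace="http://schemas.xmlsoap.org/ws/2001/10/inspection/"
        location="http://example.com/services/inspection.wsil"/>
</inspection>
```

Because it is just a document served over HTTP, no registry infrastructure is required: a consumer fetches it like any other Web resource.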

  • [October 29, 2002] "Streamline Your Code and Simplify Localization Using an XML-Based GUI Language Parser." By Paul DiLascia. In MSDN Magazine (October 29, 2002). .NET GUI Bliss. ['MotLib.NET is a library of C# classes that bring many MFC goodies to Windows.Forms. MotLib.NET implements the amazing MGL ("miggle", mot's gui language), a baby XUL-like GUI definition langauge that lets you code your UI in XML.'] "While Windows Forms in .NET has lots of cool features, if you're used to MFC, there are a couple of things you'll find missing, like doc/view, command routing, and UI update. The .NET answer to this is a code generator that writes new code for every single element. But there's a better way. In this article, Paul DiLascia shows how to develop an XML-based GUI language parser for .NET that lets you code resources, menus, toolbars, and status bars in XML instead of with procedural code. He also shows how a user interface based on XML can easily be localized using standard .NET techniques, and introduces his very own library, MotLib.NET, with lots of GUI goodies for your programming pleasure... when it comes to GUI, .NET has a weak spot. Windows Forms has all the essentials, and many welcome improvements like anchor points, docking, and automatic recreation when you change window styles... In the pages that follow, I'll show you how to build a system that closes the GUI gap in Windows Forms. I'll show you how to have your Windows Forms and favorite MFC goodies, too. You'll learn to localize with ease and flair by coding your GUI in XML. This article covers resources, commands, menus and toolbars; a future article will deal with forms... I knew I wanted something like RC files that would let me express menu and other UI definitions in a separate and therefore more easily translatable file, using some kind of special language. What better language to use than XML? In fact, such a language already exists: XUL ('zool'), the XML User-interface Language. 
XUL is a dialect of XML for describing user interfaces. XUL was developed by the Java language folks for Mozilla (the Netscape engine). XUL is quite extensive, with commands for menus, toolbars, buttons, edit controls, and all sorts of widgets, well beyond the scope of this article. But it's not hard to write a mini-XUL that supports only the widgets you need. XUL -- or something like it -- is just the ticket to GUI greatness! In the end, I wrote a new library, MotLib.NET, with classes to parse GUI definitions in my own invented XML dialect, MGL (pronounced "miggle"). MGL stands for Mot's GUI Language. Having already named my earlier MFC class library PixieLib to emphasize its smallness, I decided to continue the tradition of cutesy pet names to underline my emphasis on small, tight code... MGL is a dialect of XML and follows the usual XML semantics, with case-sensitive elements and attributes that are, by convention, lowercase. Commands are represented by Command objects, which have properties such as Id, Prompt, Tip, and Tag. Command IDs are strings, not integers... MGL elements generally mirror .NET classes. For example, whereas XUL has <popup> to create submenus, MGL uses <menuitem> the same way as .NET. To build a submenu, create menuitems within menuitems... MGL does <toolbar> and <statusbar> too... " General references: "XML Markup Languages for User Interface Definition."
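As a rough sketch of what such an XML GUI dialect looks like (the element and attribute names below are illustrative guesses consistent with the article's description, not DiLascia's actual MGL schema), a command and menu definition might read:

```xml
<mgl>
  <commands>
    <!-- Command IDs are strings, with Prompt/Tip properties -->
    <command id="FileOpen" prompt="Open a document" tip="Open"/>
    <command id="AppExit" prompt="Quit the application" tip="Exit"/>
  </commands>
  <menu>
    <!-- Submenus are built by nesting menuitems, mirroring .NET -->
    <menuitem text="File">
      <menuitem text="Open..." command="FileOpen"/>
      <menuitem text="Exit" command="AppExit"/>
    </menuitem>
  </menu>
  <toolbar>
    <button command="FileOpen"/>
  </toolbar>
</mgl>
```

Localizing the UI then reduces to translating this file, with no procedural code touched.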

  • [October 28, 2002] "Sun Unveils Upgraded App Server." By Paul Krill. In InfoWorld (October 28, 2002). "Sun Microsystems on Monday will release two editions of the Sun ONE Application Server, featuring integration with the Sun ONE Studio for Java Enterprise Edition Web services tools. Sun ONE Application Server 7 features an implementation of J2EE 1.3 and Java Web Services Developer Pack (WSDP) to provide standard implementations of Web services standards including WSDL, SOAP, ebXML, and UDDI... Version 7 is integrated with Sun ONE Studio for Java 4.1, an updated toolset that includes support for WSDP. This integration, plus pre-built components provided in the Sun ONE Application Framework 2.0, helps streamline development of Java Web services and increase developer productivity, according to Sun. 'This is actually the first application server that's both J2EE 1.3-compliant and also supports the complete Java WSDP,' said Rich Schultz, group product marketing manager for Sun ONE Java Web Services, in Santa Clara, Calif. The two editions being released include the core Platform Edition and Standard Edition. The Platform Edition is available as a free download on Solaris and Windows platforms. This edition is expected to be integrated into the Solaris 9 operating environment in January. Versions for SunLinux, HP-UX, and AIX are expected to be available in 60 to 90 days. Maintenance and support are available for an additional cost..." See other references in the announcement: "Sun Microsystems Radically Changes Application Server Market With Release of Sun One Application Server 7. Free Platform Edition With New Integrated Java Tools Makes Creating Web Services Faster and More Economical. Sun ONE Application Server 7 is More Than 50 Percent Faster than IBM WebSphere in JSPs, Servlets, and JDBC."

  • [October 28, 2002] "Sun One Application Server 7 Debuts." By Clint Boulton. In InternetNews.com (October 28, 2002). "Sun Microsystems Monday launched a refresher of its application server platform geared toward leveraging the Java language into its product line for more fluid Web services deployment. The Palo Alto, Calif. concern said Sun One Application Server 7 is J2EE 1.3 compliant and supports the Java Web Services Developer Pack to provide tools and platform for Web services such as SOAP (define) and WSDL (define). Rick Schultz, Grid Marketing Manager for Sun One Java Web Services, told internetnews.com that the release marks a new modular approach, complete with new codebase and architecture for Sun's application server development. Sun One Application Server 7 will unfold in three versions, including a platform and standard edition, which are available immediately, and an enterprise edition, slated for a March 2003 release. The Sun One Application Server 7 Platform edition is free and will be tightly integrated into the company's flagship operating system offering Solaris 9 in January, according to Schultz. Its architecture consists of J2EE 1.3, Web Services and an HTTP Server. This package is available in Solaris and Windows today; Linux within 60 days; and HP-UX and AIX in 90 days. The standard edition, complete with remote managing capabilities and multi-tier deployment, will cost $2,000 per CPU. As the company's fullest Web services software package yet, the enterprise edition will feature higher functionality through clustering features, courtesy of the company's acquisition of Clustra Systems last March, as well as Web tier load balancing and advanced session replication..." See details in the announcement.

  • [October 28, 2002] "XML Watch: Exploring Alternative Syntaxes for XML. Weighing the Pros and Cons." By Edd Dumbill (Editor and publisher, xmlhack.com). From IBM developerWorks, XML zone. October 2002. ['XML's syntax has brought many benefits due to its interoperability, yet it can be tiresome to author XML documents. Edd Dumbill examines a range of alternative syntaxes for XML, and discusses their benefits and drawbacks.'] "One of the paradoxes of XML is that despite having a heritage from the document-creation community, it can often be remarkably frustrating to author by hand. The extra typing required to open and close tags and escape special characters not only wastes time, but introduces more possibility for error. If you don't want to buy an editor to help you get around this -- and many people don't, for various reasons including taste, principle, and the sheer intractability of creating a general-purpose XML editor -- then you're stuck editing in longhand. SGML, the document-oriented ancestor of XML, had a way round this. SGML included ways of adding shortcuts to reduce the amount of tagging required, and could even completely redefine document syntax. However, when XML was created, this functionality was omitted to simplify the language and increase interoperability. Over time, though, many of the features in SGML have been reimplemented for XML -- either by standards organizations, or just by community efforts. This is somewhat ironic as, in the early days of XML, its proponents took great delight in proclaiming the simplicity of XML over SGML. Now, with all of XML's bolt-ons, the complexities of the two technologies are at least comparable! The purpose of this article is to survey some of the most popular alternative syntaxes developed for XML, and highlight their areas of usefulness. I will not attempt to list them all, as many people have already made endeavours in this area. 
Alternative syntaxes have been created for various reasons: to save effort, to mimic favorite environments, to better illustrate the underlying data model, or to work better with existing tools. In answer to the obvious question about decreasing interoperability through other syntaxes, note that none of these syntaxes purport to be an exchange syntax -- that is still left to the XML 1.0 syntax... The main motivation for creating non-XML syntaxes is the difficulty inherent in authoring XML. As Scott Sweeney noted, even the best commercial XML editors require a degree of point and clicking that gets in the way of rapid, free-form content creation. At the end of the day, the contracted general XML syntaxes such as SOX and SLiP have very little to differentiate between them: Their main benefit seems to be in the ability to omit closing tags. One downside to such contracted syntaxes is in the loss of interoperability and future-proofing. Most of the efforts come from single third-party sources. It's not entirely clear what support there is for diverse character encodings, as well as the less frequently used parts of XML such as processing instructions. Also in most cases there is only one tool originator in place, so the ideas may well just die out..."
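The appeal of contracted syntaxes -- and the reason they still round-trip to XML 1.0 for exchange -- can be illustrated with a toy converter. The indentation-based shorthand below is hypothetical (it is not the actual SOX or SLiP syntax), but it shows the core idea of inferring closing tags from indentation:

```python
import xml.etree.ElementTree as ET

def shorthand_to_xml(text: str) -> ET.Element:
    """Convert an indentation-based shorthand (one element per line,
    'name: text' for leaf content) into an ElementTree element."""
    root = None
    stack = []  # (indent, element) pairs for currently open elements
    for line in text.splitlines():
        if not line.strip():
            continue
        indent = len(line) - len(line.lstrip())
        name, _, content = line.strip().partition(": ")
        # Dedenting implicitly closes elements -- no end tags needed
        while stack and stack[-1][0] >= indent:
            stack.pop()
        if stack:
            elem = ET.SubElement(stack[-1][1], name)
        else:
            elem = ET.Element(name)
            root = elem
        if content:
            elem.text = content
        stack.append((indent, elem))
    return root

doc = shorthand_to_xml("""\
book
  title: XML Shorthand
  author
    name: A. Writer
""")
```

A real converter would also have to handle attributes, character escaping, processing instructions, and encodings -- exactly the areas the article flags as unclear in the third-party efforts.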

  • [October 28, 2002] "XML and Database Mapping in .NET." By Niel Bornstein. From XML.com October 23, 2002. ['Niel Bornstein looks at XML database binding in ADO.NET. Continues the series approaching Microsoft .NET's XML APIs from a Java programmer's perspective.'] "The designers of the .NET runtime put a lot of thought into the issue of binding XML to a database. The path from XML to database and back again starts with ADO.NET, .NET's database persistence layer, which can be thought of as analogous to Java's JDBC. The System.Data.SqlClient namespace has all the classes you'd expect to see... There is an entire suite of classes in the namespace System.Data.OleDb, whose names start with OleDb, which work with any OLE DB provider. If you've used JDBC, you already know basically how to use these classes. Along with these classes, ADO.NET also introduces the DataSet. A DataSet instance represents an entire database, including the ability to track changes made to individual data elements and to persist them to the underlying database when necessary. The DataSet can maintain its state while disconnected from the actual database. The DataSet can be used to build a data model without writing any SQL. It can also be used to persist the data model to an XML Schema or to read an existing XML Schema and dynamically build the data model. The DataSet can also read and write its data to XML..."

  • [October 28, 2002] "Whither Web Services?" By Edd Dumbill. From XML.com October 23, 2002. ['A view on the changing nature of web services. The mainstream tech media, once so keen to hype up web services, is now starting to take a more skeptical view. Does this mean that web services are on the way down? Not at all... things are just getting started.'] "...the concept of web services is basically what got a lot of us excited about XML in the first place, and that the state of web services as a whole has never been better. Let me expand on this a little. First, attempts to take control of a top-down architecture have failed. Despite numerous proposals, there's been no global 'eureka!' moment. You can still deploy a web service and have it implemented pretty much as you like. There's no license fee to access a web service superhighway. Second, since the world hasn't changed overnight, developers are able to investigate alternatives. While I'm not in any particular camp in the protocols debate, it has been heartening that plain old XML & HTTP (aka REST) and BEEP have had a hearing alongside the bigger initiatives, as well as grass roots inventions like Jabber. Heterogeneity rules, for the moment at least, giving us hope of better technical solutions in the long run. Third, and most excitingly, any developer that wants to create a web service, invent an ad hoc application protocol, and do useful work can do so -- and can share this with others by making their protocol open. All of these points apply equally well to XML: you can choose from different architectural styles, different schema languages, and invent your own schemas to your heart's content. Apart from the immediacy of document exchange, it's hard to draw the line between creating an XML vocabulary and a web service..."

  • [October 28, 2002] "XML 1.1: Here We Go Again." By Kendall Grant Clark. From XML.com October 23, 2002. ['Recently, the W3C released XML 1.1 as a Candidate Recommendation: this is the stage of a specification where testing and trial implementation is invited. As might be expected, this caused no small stir in the XML community.'] In this week's XML-Deviant column I report on developments in XML, the base layer of the Web's architecture. XML 1.1 Candidate Recommendation: We find XML at or near the bottom of the stack. Stability in the base is crucial to any sound architectural design. That is, turbulence in the base is harmful, not just to a design but to its implementation, too. We should expect, then, XML to change very slowly, if or when it changes at all. The ideal outcome for the 1.0 version of the XML specification was for it to have been left untouched, forgotten in some dusty corner of the W3C for a good five years. We almost made it. The fifth anniversary of the first edition of XML 1.0 is next February. Of course the XML Namespaces specification should count as a change to XML 1.0. It being promulgated as a separate document didn't minimize significantly the turbulence caused by its adoption. And the only sort of change namespaces can reasonably be thought to be is a major one. So, while it's clear that XML 1.0 has flaws, a good argument can be made that, for the sake of stability and maturity at higher levels, XML 1.0 should be left alone for, say, another three years... So in this case, as in many technological cases, making a decision is a matter of balancing costs and benefits. There are at least three questions to answer -- first, should XML 1.0 be revised now; second, having been revised, should anyone implement it; third, having been implemented, should anyone adopt it? There isn't a single logic covering each of these questions. 
The cost-benefit analysis is significantly different depending on whether you're a specification writer, a parser or other tool supplier, or an XML end user... In some sense the success or failure of XML 1.1 rests with the middle group, the vendors and tool makers. As Michael Kay said, 'A lot depends on the major parsers, though. If they decide their users aren't interested in XML 1.1 so they're not going to rush to implement it, then obviously XML 1.1 is dead. If they decide they're going to implement it whether users want it or not, then people will gradually adopt it without really noticing they have done so'..." See "W3C Publishes Extensible Markup Language (XML) 1.1 as a Candidate Recommendation."

  • [October 28, 2002] "Captured in XML." By Jon Udell. In InfoWorld (October 25, 2002). "XML documents, encapsulated in SOAP messages, are the packets of the business Web. Standards efforts under way focus on how to create, transform, interpret, sign, and encrypt these packets as they flow among communicating applications and services. Because XML documents will often model the real business documents that support business processes, such as purchase orders, it's clear that people will need to be able to write them, too. Sadly, the tools that capture nearly all of our keystrokes -- e-mail, word processors, Web pages -- can't compose valid XML. Solving this problem is as critical as any challenge facing Web services today. Ideally every operating system would offer a standard XML editing component, embeddable in Web pages and GUI applications. Wired to a DTD (Document Type Definition) or XML Schema, this component would allow users to interactively create or modify valid instances of the DTD or schema. Microsoft is taking several steps in this direction. In Office 11, both Word and Excel (but not Outlook) can display and edit schema-defined XML. A recent demo of these features left us with questions that only the beta code (expected shortly) can answer, but the basic strategy is laudable in one way and flawed in another... Ektron has made a business out of improving [MS DHTML edit control]. Ektron's eWebEditPro enables the Microsoft control to embed in Netscape as well as IE; wraps a JavaScript API around the control so that developers can add or subtract features to customize it for specific applications; and can reduce the awful HTML generated by the control to nice, clean XHTML (Extensible HTML). This format, which combines the familiarity of HTML with the mechanical regularity of XML, is a great way to simplify the management of semistructured and unstructured content... Separately, and not yet integrated with CMS200, Ektron offers eWebEditPro+XML. 
In this version of the edit control, XML content is displayed in nested frames (defined in an XML configuration file), and made available for structured editing. The XML content is associated with a DTD or schema whose constraints are expressed in the UI. The available choices for the "city" pick-list, for example, are controlled by an XML Schema simpleType that enumerates them. This is a tour de force that pushes the edit control far beyond its intended use. Predictably, the results are not always seamless... A co-chair of the W3C XForms working group, Sebastian Schnitzenbaumer is CEO of a company that offers a transitional product that tracks this emerging standard. Mozquito Technologies' Web Access 2.0 makes XForms-like technology deliverable in today's browsers. An XForms form, for example, uses XML Schema data types to constrain the values permitted in a form. As rendered by Web Access 2.0, the form is defined in HTML, uses generated JavaScript to handle validation, and returns valid XML... We like the concepts that Ektron and Mozquito are developing. XML data capture is too important to be siloed within applications such as Word and Excel. It's a capability that needs to be readily accessible to developers and deployable everywhere."
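The schema-driven pick-list Udell describes can be pictured with a small fragment. Assuming the conventional xsd prefix bound to the XML Schema namespace (the city values are invented for illustration), an enumerated simpleType looks like this:

```xml
<xsd:simpleType name="cityType">
  <xsd:restriction base="xsd:string">
    <xsd:enumeration value="Boston"/>
    <xsd:enumeration value="Chicago"/>
    <xsd:enumeration value="Seattle"/>
  </xsd:restriction>
</xsd:simpleType>
```

A schema-aware editing component reads the enumeration facets and renders them directly as the options of a pick-list, so the captured document can only ever contain a valid value.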

  • [October 28, 2002] "Authoring Challenges for Device Independence." Edited by Rhys Lewis (Volantis Systems). W3C Working Draft 18-October-2002. Latest version URL: http://www.w3.org/TR/acdi/. First public Working Draft. Produced by members of the W3C Device Independence Working Group (DIWG), part of the Interaction Domain. "The document provides a discussion of several challenges that web site authors commonly face when making content and applications available to users with devices of various capabilities. The document examines the effects on authors and the implications for authoring techniques that assist in the preparation of sites that can support a wide variety of devices." See also the DIWG charter and mailing list archives.

  • [October 28, 2002] "Migratable User Interface Descriptions in Component-Based Development." By Kris Luyten, Chris Vandervelpen, and Karin Coninx (Expertise Centre for Digital Media, Limburgs Universitair Centrum, Belgium). Presented at the Ninth International Workshop on Design, Specification, and Verification of Interactive Systems (DSV-IS 2002), Rostock, Germany, June 12-14, 2002. 15 pages, with 14 references. "In this paper we describe how a component-based approach can be combined with a user interface (UI) description language to get more flexible and adaptable UIs for embedded systems and mobile computing devices. We envision a new approach for building adaptable user interfaces for embedded systems, which can migrate from one device to another. Adaptability to the device constraints is especially important for adding reusability and extensibility to UIs for embedded systems: this way they are ready to keep pace with new technologies... To describe a UI on a sufficiently abstract level the Extensible Markup Language (XML) is used. Listing 1.1 provides an example of how a UI can be described in XML. A list of advantages is given in ['An XML-Based Runtime User Interface Description Language for Mobile Computing Devices,' Proceedings of the Eight Workshop of Design, Specification and Verification of Interactive Systems, June 2001]. One of the major advantages is that XML does not force any level of abstraction, so this level can be adapted to the requirements of the situation. Note that an XML document can be presented as a tree which turns out to be a great advantage in our approach. There are other approaches for describing User Interfaces, but we believe that an XML-based description offers the best solution in our component-based approach because it relies heavily on hierarchical structures. [XML syntax goals: platform independent, declarative, consistency, constraints, extensible, reusable, transformations]... 
the component-oriented approach suggested in this paper has several advantages for developers of embedded systems and mobile computing devices in particular. It is (1) Flexible: changing the UI can be done by another renderer component or letting components provide another UI description; (2) Reusable: providing a high level description of the UI related to the functionality a component offers, allows easier reusability of previously designed UIs in contrast with hard-coded UIs; (3) Adaptable: by abstracting the UI, device constraints can be taken into account when rendering the concrete UI..." See also the presentation slides. General references: "XML Markup Languages for User Interface Definition" [cache]
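A sketch of such an abstract, tree-shaped UI description (the vocabulary here is invented for illustration; it is not the authors' actual DTD) might look like this:

```xml
<ui title="Volume Control">
  <group name="settings">
    <!-- Abstract interactors: each renderer maps these to
         whatever concrete widgets the target device offers -->
    <range id="volume" label="Volume" min="0" max="10" value="5"/>
    <choice id="output" label="Output">
      <option>Speaker</option>
      <option>Headphones</option>
    </choice>
    <action id="apply" label="Apply"/>
  </group>
</ui>
```

Because the description names interaction intent rather than widgets, a phone renderer might present the range as a numeric field while a desktop renderer uses a slider.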

  • [October 28, 2002] "Specifying User Interfaces for Runtime Modal Independent Migration." By Kris Luyten, Tom Van Laerhoven, Karin Coninx, and Frank Van Reeth. Paper presented at CADUI 2002, Fourth International Conference on Computer-Aided Design of User Interfaces, Université de Valenciennes, France, May 15-17, 2002. 12 pages, with 18 references. "In this paper an approach will be presented which provides a uniform interface to such services, without any dependence on modality, platform or programming language. Through the usage of general user interface descriptions, presented in XML, and converted using XSLT, a uniform framework is presented for runtime migration of user interfaces. As a consequence, future services will become easily extensible for all kinds of devices and modalities. An implementation serving as a proof of concept, a runtime conversion of a joystick in a 3D virtual environment into a 2D dialog-based user interface, is developed... At runtime a user interface description is generated by means of XML as a description language. The XML description provides an abstraction of the user interface using Abstract Interaction Objects (AIO) which can be mapped onto Concrete Interaction Objects (CIO)... Our work does not aim to extract a Task-based description of the User Interface but merely tries to abstract the 'contents' of the user interface, not the presentation. Although we acknowledge the idea of splitting the interface model into a User-task model, a domain model, a user model, a presentation model and a dialog model for getting a better mapping of AIOs on CIOs, we do not consider it to be useful when focusing on runtime migration of existing user interfaces. A working user interface is 'serialized' into an XML description at runtime and this description can be transported to another system using for example the Internet or an infrared communication protocol. 
Once arrived at the target system the XML document can be parsed and converted into a working user interface ('deserialized'). An advantage of this approach can be found in the abstraction of the original user interface that the XML document provides. While parsing the XML document that contains an abstraction of the user interface, the renderer of the target platform is free to choose other ways to present the same functionality in the user interface... As an abstraction of the user interface we have defined a DTD. This DTD restricts the possible elements and ordering used in the XML document. Our current DTD is inspired by [Müller/Forbrig/Cap 'Model-Based User Interface Design Using Markup Concepts'] and enriched by a subset of interactors... We are planning to replace it with an equivalent XML Schema as soon as there are more mature tools available to use this... The transformation of the original user interface description into a higher level description is realized using XSLT. Every system or user interface toolkit has to define a conversion providing the appropriate XSLT. Once the higher level description is produced a system specific XML document for the target system has to be produced. For every system an XSLT is defined which maps the AIOs defined in the abstract user interface description to CIOs for that particular system..." See also the presentation slides. General references: "XML Markup Languages for User Interface Definition" [cache]
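The AIO-to-CIO mapping step can be pictured as an ordinary XSLT template. The interactor and widget element names below are hypothetical, not those of the authors' DTD:

```xml
<xsl:stylesheet version="1.0"
                xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <!-- Map an abstract "choice" interactor onto a concrete
       2D combo box for this particular target system -->
  <xsl:template match="choice">
    <combobox id="{@id}">
      <xsl:apply-templates select="option"/>
    </combobox>
  </xsl:template>
  <xsl:template match="option">
    <item><xsl:value-of select="."/></item>
  </xsl:template>
</xsl:stylesheet>
```

A different target system (a speech interface, say) would ship its own stylesheet mapping the same abstract interactors onto its own concrete interaction objects.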

  • [October 26, 2002] "Jabber, WebMethods Tackle Enterprise Integration." By Richard Karpinski. In InternetWeek (October 21, 2002). "Integration specialist WebMethods on Wednesday said it was partnering with Jabber Inc., best known for its instant messaging technology, to develop real-time event notification services for the financial services industry. The deal highlights that Jabber's XML-based platform, though perhaps best known as an open-source and commercial alternative to public instant messaging services, can also act as a platform for any app that needs real-time messaging... WebMethods will provide the integration technology to tie multiple applications together to enable STP; Jabber's messaging technology will deliver real-time event notification to give customers a window into the process. The Jabber Communications Platform offers an XML-based platform for instant messaging and presence-enabled applications..." See: (1) the announcement: "Jabber and webMethods Form Strategic Alliance. webMethods and Jabber Team to Provide Real-Time Information to Financial Services Companies Using webMethods' Market-Leading Integration Platform."; (2) general references in "Jabber XML Protocol."

  • [October 26, 2002] "XSL-FO Inline Elements." By Dave Pawson. Sample Chapter 6 (pages 112-125) in XSL-FO: Making XML Look Good in Print (Sebastopol, CA: O'Reilly, August 2002). "[Extensible Stylesheet Language-Formatting Objects, or XSL-FO, is a set of tools developers and web designers use to describe page printouts of their XML (including XHTML) documents. If you need to produce high quality printed material from your XML documents, then XSL-FO provides the bridge.] In this chapter, we will cover what is perhaps the simplest area of XSL-FO: styling the inline content. This is analogous to the word processor's application of bold or italics to particular words. Inline content can be defined as content that, when formatted, does not extend beyond the formatted line extent, i.e., it does not wrap into a new line. Typical source content that may need marking for fo:inline might include content that needs to be emphasized for a specific purpose, such as emphasis, computer commands, instructions, and cross-references. The formatted output might be italicized, underlined, boldface, or hyperlinked. Other visual forms of emphasis include font changes and nontext output, such as inline graphics, horizontal lines, or dot leaders. These are all possible within fo:inline..." Note also: (1) the document "XSL Frequently Asked Questions" maintained by Dave Pawson; (2) "Printing from XML: An Introduction to XSL-FO," by Dave Pawson. General references in "Extensible Stylesheet Language (XSL/XSLT)." [cache]
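A minimal fragment (not from the chapter itself) showing fo:inline in use; the properties shown (font-weight, font-style, font-family) are standard XSL-FO 1.0 inline properties:

```xml
<fo:block xmlns:fo="http://www.w3.org/1999/XSL/Format">
  Type <fo:inline font-family="monospace">ls -l</fo:inline> to list files;
  see the <fo:inline font-style="italic">Options</fo:inline> section for
  <fo:inline font-weight="bold">important</fo:inline> caveats.
</fo:block>
```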

  • [October 26, 2002] "EMC Adopts SMI Standard." By Scott Tyler Shafer. In InfoWorld (October 25, 2002). "Hoping to keep momentum alive around its WideSky initiative, EMC has announced it will incorporate open-standards specifications into its developer suite by early next year. The announcement this week follows the development of an industry standards initiative known as the Storage Networking Industry Association (SNIA) Storage Management Initiative (SMI), which was previously referred to as the CIM (Common Information Model)/WBEM (Web-Based Enterprise Management)/Bluefin specification. The SMI standard addresses a number of critical enterprise storage concerns in that it allows storage management software platforms, or clients, to discover, collect data from, and manage multivendor devices of all types, called providers, in a SAN. Hopkinton, Mass.-based EMC's decision to support open standards in its WideSky platform gives it the capability of collecting data from the SMI-compliant storage devices that are likely to come to market early next year from most vendors after the SMI standard is ratified, which is estimated to happen in March or April. ['the new WideSky Developers Suite now supports two new interfaces: Java Native Interface (JNI) and Extensible Markup Language (XML) along with current support of a C interface. This allows developers to create applications in their choice of industry-standard languages and reduce their time-to-market in introducing new innovative software products'] In the same vein, Burlington, Mass.-based startup AppIQ this week released its own SDK that assists hardware vendors to speed the development of SMI-compliant products... The activity comes as the semi-annual Storage Networking World conference in Orlando, Fla., kicks off next week. Vendors, including Hewlett-Packard, EMC, and AppIQ, will demonstrate a SAN managed through a single management platform built on the SMI standard..." 
References: (1) "Storage Vendors Announce CIM Product Rollout and Joint Interoperability Testing"; (2) SNIA Storage Management Initiative; (3) "DMTF Common Information Model (CIM)."

  • [October 26, 2002] "Web Services Key To Groove 2.5." By Barbara Darrow. In CRN (October 25, 2002). "Groove Networks is working on the next version of its collaborative software, adding support for important Web services protocols and integration with Microsoft SharePoint Team Services... Groove 2.5, slated to ship by year's end, will add support for SOAP, WSDL and UDDI, cornerstone Web services protocols. Groove already supports XML. The company, founded by Lotus Notes guru Ray Ozzie, made its name by offering an easy-to-use way for people to chat in near-realtime and share documents and images privately and securely. With version 2.5, the company is bringing Web services into Groove and allowing interoperability, said Dana Gardner, an analyst at The Aberdeen Group..."

  • [October 26, 2002] "Liberty, WS-Security Uniting Over SAML Standards." By Vance McCarthy. From Integration Developer News. October 21, 2002. Case Study. "Last month, the Liberty Alliance Project elected a new president, Michael Barrett, vice president for Internet strategy at American Express. Since coming to office, Barrett has left little doubt that he will push those vendors sparring over identity and security standards -- notably Sun, IBM and Microsoft -- to reach an agreement on interoperability. So far, the 'peace talks' between Liberty and Microsoft Passport seem to be going well, thanks in large part to all-party discussions on security taking place under the OASIS (Organization for the Advancement of Structured Information Standards) umbrella, and some XML-based security brokering technology being specified inside OASIS called SAML (Security Assertion Markup Language). 'It's pretty obvious that we'll use SAML as a glue between different identity approaches [such as Liberty and Passport],' OASIS SAML technical committee co-chair Jeff Hodges told Integration Developer News. 'And while SAML is not an authentication technology in and of itself, SAML can be used as a tool to glue together disparate authentication domains.' For its part, Liberty is built on SAML, but does not define any authentication mechanisms. SAML is a framework, and one needs to profile it and put it into context to make use of it. Further, the Java Community Process (JCP) has a proposal to natively support SAML (JSR 155) for use in J2EE... In good news for developers worried about interoperability, SAML is also being endorsed by Microsoft execs. 'Members of the OASIS security committee want to see all our work reconciled, and we want to see SAML token support in WS-Security,' Adam Sohn, a product manager in Microsoft's .NET platform strategy group, told IDN in an interview this summer. 
Sohn added that WS-Security's decision to support SAML (and Liberty) will not prompt WS-Security to 'downplay' plans to support a variety of security mechanisms already at work within the enterprise, including PKI, Kerberos and even SSL. 'WS-Security will look at Liberty and SAML as just another credential type, and we expect to have support in WS-Security this year,' Sohn added... 'There are a lot of touching points across Liberty, SAML and WS-Security, and it's hard to look at a crystal ball to see exactly what will happen. But we are starting to see some convergences and rapprochement between all these groups,' Slava Kavsan, Chief Technologist at RSA Security, and chairman of the Liberty Alliance's Trust and Security Group, told IDN. Notably, RSA has been a key figure in pushing compatibility among Sun, Microsoft and IBM approaches. 'There is good news. First, WS-Security will mention SAML in its next draft.' Kavsan looks at the interoperability issues on identity as similar to other compatibility questions that exist on a number of web services fronts between IBM, Microsoft and Sun. 'We're still in a basic architectural world of the Microsoft client needing to talk to a Java or Sun server,' he said. 'Even though Liberty is working on its own browser-based client spec, Liberty needs to and intends to support Microsoft's client base'..." See: (1) "Web Services Security Specification (WS-Security)"; (2) "Liberty Alliance Specifications for Federated Network Identification and Authorization."

  • [October 26, 2002] "Tip: Traversing an XML Document with a TreeWalker. Navigate your DOM tree while maintaining parental relationships." By Nicholas Chase (President, Chase and Chase, Inc). From IBM developerWorks, XML zone. October 2002. ['XML's Document Object Model provides objects and methods that enable a developer to navigate a document's tree, but typically the process involves NodeLists and recursive methods that make it easy to get lost within the structure. The DOM Level 2 Traversal module provides a new object, the TreeWalker, which simplifies this process and makes navigation more reliable. This tip demonstrates the process of determining whether a TreeWalker is available and how to use it to extract information from a document. This tip uses JAXP, but the sample application will also work with Xerces-Java 2 and the concepts are applicable for any XML parser environment.'] "The Document Object Model (DOM) provides interfaces such as Node, which includes methods such as getFirstChild() and getNextSibling(). Along with methods such as getChildNodes(), these methods provide a way to navigate an XML document, typically using recursion to analyze each set of children. A TreeWalker does much the same thing, but in a more organized way. Rather than recursing through a method, a TreeWalker actually navigates the structure, using methods such as nextNode(). Each time it moves, the currentNode property changes to the new node, so the TreeWalker always knows where it is within the document..." See also Document Object Model (DOM) Level 2 Traversal and Range Specification Version 1.0 (W3C Recommendation 13-November-2000).
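The tip itself uses the Java DOM API; as a language-neutral sketch of what TreeWalker.nextNode() does, here is the equivalent document-order walk written against Python's standard xml.dom.minidom. The next_node helper is hypothetical (it is not part of any DOM binding) and exists only to show the "first child, else next sibling, else climb to an ancestor's sibling" rule that replaces recursion:

```python
import xml.dom.minidom

def next_node(node, root):
    """Document-order successor, mimicking TreeWalker.nextNode():
    first child, else next sibling, else nearest ancestor's next sibling."""
    if node.firstChild is not None:
        return node.firstChild
    while node is not None and node is not root:
        if node.nextSibling is not None:
            return node.nextSibling
        node = node.parentNode
    return None

doc = xml.dom.minidom.parseString("<a><b/><c><d/></c></a>")
node, names = doc.documentElement, []
while node is not None:
    if node.nodeType == node.ELEMENT_NODE:
        names.append(node.nodeName)
    node = next_node(node, doc.documentElement)
print(names)  # ['a', 'b', 'c', 'd']
```

As with the real TreeWalker, the walk always knows where it is in the document, so parental relationships are preserved without a recursive call stack.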

  • [October 26, 2002] "Adding Custom Functions to XPath." By Prajakta Joshi (Microsoft Corporation). MSDN Library. October 8, 2002. ['Guest author Prajakta Joshi discusses how to create custom functions for XPath using System.Xml APIs from the .NET Framework SDK. Topics include adding extension functions to XPath 1.0, a look forward to XPath 2.0, and using extension functions in XSLT.'] "Requests for extension functions are a commonly discussed topic on the XML and XSL public newsgroups. The motivation to write this article arose from observing a large number of user posts on this topic. An XPath expression in XPath 1.0 can return one of four basic XPath data types: String, Number, Boolean, Node-set. XSLT variables introduce an additional type into the expression language, result tree fragment. The core function library in XPath and a few XSLT-specific additional functions offer basic facilities for manipulating XPath data types... As W3C XML Schema data types become more integrated with XPath 2.0, the XQuery 1.0 and XPath 2.0 functions and operators will offer a richer function library to XML developers than what currently exists in XPath 1.0. This does not completely eliminate the need for user-defined functions in XPath 2.0. Mechanisms for extensions will be indispensable for powerful and extensible query languages for XML... "
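The usual way such an extension surfaces in XSLT is through a namespace-qualified function call. In the sketch below the urn:my-extensions namespace and the to-upper-case function are invented for illustration; in .NET the binding of that namespace to actual code would typically be supplied via XsltArgumentList.AddExtensionObject, while other processors have their own binding mechanisms:

```xml
<!-- Hypothetical extension namespace and function name -->
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
    xmlns:ext="urn:my-extensions">
  <xsl:template match="price">
    <!-- ext:to-upper-case is resolved by the host processor,
         not by the XSLT 1.0 core function library -->
    <xsl:value-of select="ext:to-upper-case(@currency)"/>
  </xsl:template>
</xsl:stylesheet>
```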

  • [October 25, 2002] "Adobe Fires Server Barrage." By [Seybold Staff.] In The Bulletin: Seybold News and Views On Electronic Publishing Volume 8, Number 4 (October 23, 2002). "This week Adobe unleashed a torrent of both new and re-branded products that strengthen the company's commitment to introduce server-based products for large organizations. Five server products -- two new and three acquired from Accelio -- join Adobe Graphics Server in creating a suite of server products that Adobe will sell through OEM, indirect and direct sales channels. In addition, Adobe announced an unusual point release of Acrobat Reader, a special update aimed primarily at review-and-approval processes in large organizations... The new Document Server is an amalgamation of Adobe's Graphic Server, Acrobat Distiller, parts of the FrameMaker engine, and a PDF form processor, all packaged together into a batch-formatting engine accessed through an API. Think of it as Distiller on steroids. It can manipulate PDF files (add and delete pages, add overlays, add or extract comments), assemble custom booklets from PDF pages (renumber, generate headers and footers) and extract or populate Acrobat forms. Using a new XSLFO-to-MIF converter and the FrameMaker processor, it can create PDF documents from XML source files (if they've been transformed to XSLFO). Using the embedded Adobe Graphics Server, it can grind up graphics (in PSD, EPS, SVG, TIFF, JPEG, GIF, PNG, or BMP formats) into PDF files... The second server product is a special version of the Document Server that applies 'usage rights' to individual PDF files or forms. These usage rights are not specific to individuals (as they are in the usage rights for e-books); rather, they are specific software features that the server turns on or off for that document, and recognized, for now, only by the new 5.1 version of Acrobat Reader... 
Adobe is targeting all of these products at government, financial and manufacturing sectors, where review-and-approval processes are complex and frequently involve people outside as well as inside the organization. For example, the U.S. Internal Revenue Service is testing the Reader Extensions server and Reader 5.1 with tax forms, which citizens could digitally sign and return to the government agency... Adobe's server volley countered last week's Microsoft Xdocs announcement that left some analysts worried that Adobe's PDF franchise was in jeopardy. Adobe has built the PDF/Acrobat franchise over a decade, and Xdocs, while competitive with Accelio's products, does not directly counter PDF's inherent fidelity to sophisticated print layouts. Adobe's message is that 'PDF is not just final form,' and its new emphasis on document processes and electronic forms applications will prove its case..." See other details in "Enhanced Adobe Document Servers Support XML-Based Workflow and Digital Signature Facilities."

  • [October 25, 2002] "Use Recursion Effectively in XSL. An Introduction to Recursion in XSL and Techniques for Optimizing Its Use." By Jared Jackson (Researcher, IBM). From IBM developerWorks, XML zone. October 2002. ['Using XSL transformations effectively and efficiently requires understanding how to use XSL as a functional language, and this means understanding recursion. This article introduces the key concepts of recursion and its particular use in XSL. Techniques for optimizing XML translations and avoiding errors while using recursion are also explained. Each concept and technique is accompanied with example code for the reader's reference.'] "Today's programming world is one dominated by imperative programming languages. All of the most popular languages -- Java technology, C (and its various flavors), Visual Basic, and others -- at a high conceptual level, work in basically the same way: You set some variables, call functions or operations that change those variables, and return the result. It is a powerful approach to programming, but it is certainly not the only one. Another breed of programming languages, while less familiar, are at least equally as powerful as their procedural counterparts. These languages are termed functional or declarative languages. Programs written in these languages may only declare variables once and can never change the value stored by a variable once it's declared. XSL as a programming language is both declarative and functional. This means that developers accustomed to writing in Java code or C and learning XSL often find themselves in foreign territory when using XSL's more advanced features. Due to the growing importance to both application and Web developers of XML and the related XSL technology, the effective use of XSL transformations cannot be ignored. Thus, it is increasingly important to learn how to program in the declarative fashion. 
This means becoming intimately familiar with recursion, both its basic usage and the methods for using it effectively and efficiently for the problems at hand... Once you've gained an understanding of the use of recursion, the declarative style of programming in XSL should no longer be an obstacle, but an efficient way of expanding the abilities of XML transformation. The only question remaining is which type of recursion is best for any individual situation... If your XSL engine recognizes tail recursion, then that is the best technique to use; if you cannot be assured that the transformation technology will recognize tail recursion, then the combination technique is generally preferred... Recursion can be a difficult concept to understand at first, but its usefulness and elegance become clearer with use..." [Sidebar: 'Is XSL a functional language?': "... Dimitre Novatchev, author and XSL developer, has written an article demonstrating how functions (templates in XSL) can indeed be passed as a data type using XML namespaces in a clever manner. Whether this qualifies templates as first class data types is certainly debatable, but Novatchev's technique makes it clear that XSL can operate as a weakly-typed functional language..."] General references in "Extensible Stylesheet Language (XSL/XSLT)."
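A classic example of the tail-recursion technique the article discusses is a named template that sums a node-set by carrying an accumulator parameter. This sketch is not from the article itself; because the recursive call is the template's last action, an engine that recognizes tail recursion can execute it in constant stack space:

```xml
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <!-- Tail-recursive sum over a node-set, carrying the running
       total as a parameter instead of combining results on return -->
  <xsl:template name="sum">
    <xsl:param name="nodes"/>
    <xsl:param name="total" select="0"/>
    <xsl:choose>
      <xsl:when test="not($nodes)">
        <xsl:value-of select="$total"/>
      </xsl:when>
      <xsl:otherwise>
        <xsl:call-template name="sum">
          <xsl:with-param name="nodes" select="$nodes[position() &gt; 1]"/>
          <xsl:with-param name="total" select="$total + $nodes[1]"/>
        </xsl:call-template>
      </xsl:otherwise>
    </xsl:choose>
  </xsl:template>
</xsl:stylesheet>
```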

  • [October 24, 2002] "OASIS Tees Up Digital Signatures, Time Stamping." By Richard Karpinski. In InternetWeek (October 24, 2002). "The OASIS standards group Thursday formed a new technical committee to develop XML protocols for digital signatures and cryptographic time stamping in a Web services transaction. OASIS has become a font of XML standards, housing specs ranging from ebXML to WS-Security to SAML. Their latest work, somewhat surprisingly, appears to overlap work being done at the W3C on XML digital signatures. However, the W3C's work is on baseline XML standards such as XML Signatures and XML Encryption, both of which will ultimately play a role in the new OASIS standards, according to Karl Best, director of technical operations for OASIS. The OASIS digital signature standard will be more application-oriented and focus specifically on the use of the technology within a Web services context, the group said... Specifically, the work of the new OASIS Digital Signature Services Technical Committee will enable Web services to produce and verify digital signatures and provide techniques for proving that a given signature was created within its private key validity period..." See details in the news item "OASIS Members Propose Digital Signature Services Technical Committee."

  • [October 24, 2002] "OASIS Group Forms to Tackle Digital Signature Quagmire." By Brian Fonseca. In (October 21, 2002). "Organization for the Advancement of Structured Information Standards (OASIS) consortium members on Thursday set about to build standards to enable trusted and simplified digital signature processing for Web services environments by forming a new OASIS Digital Signature Services Technical Committee. The new OASIS group will work to develop open XML protocols to centrally control, pass along, and validate digital signatures and provide cryptographic time-stamping services as well as to ensure a presented signature was created within its private key validity timeframe, said Robert Zuccherato, chair of the newly formed OASIS committee. "We saw a need for standards dealing with signature processing. The W3C has done a lot of good work of defining syntax of XML signatures and a lot of management around that, but actual signature processing, creating or verifying signatures, pass and validation around that is a hard problem to solve," said Zuccherato, a research scientist at Dallas-based Entrust. He said the OASIS Digital Signature Services Technical Committee hopes to complete work on the new standard within a year. The group's first meeting is scheduled for December 2, 2002 and application for membership is still open. Companies signed to the Web services digital signature crusade so far include Entrust, Tibco Software, VeriSign, WebMethods, Iona, and NIST, among others..." See: (1) the news item "OASIS Members Propose Digital Signature Services Technical Committee"; (2) the TC website; (3) "Digital Signatures."
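For context, the W3C XML Signature syntax that the OASIS work would build on has this general shape (digest and signature values elided; the #order reference is a hypothetical same-document pointer to the signed element):

```xml
<Signature xmlns="http://www.w3.org/2000/09/xmldsig#">
  <SignedInfo>
    <CanonicalizationMethod
        Algorithm="http://www.w3.org/TR/2001/REC-xml-c14n-20010315"/>
    <SignatureMethod
        Algorithm="http://www.w3.org/2000/09/xmldsig#rsa-sha1"/>
    <Reference URI="#order">
      <DigestMethod Algorithm="http://www.w3.org/2000/09/xmldsig#sha1"/>
      <DigestValue>...</DigestValue>
    </Reference>
  </SignedInfo>
  <SignatureValue>...</SignatureValue>
</Signature>
```

The new TC's protocols would let a Web service delegate the creation and verification of such structures to a central signing service.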

  • [October 24, 2002] "Microsoft Promises Arbitrary XML in Office 11." By [Seybold Staff.] In The Bulletin: Seybold News and Views On Electronic Publishing Volume 8, Number 4 (October 23, 2002). "Microsoft has begun delivering advance test copies of the next version of Microsoft Office, code-named Office 11, and revealed that it includes XML-enabled versions of Microsoft Word and Excel. We expect the new version, which is due out in mid-2003, to have a profound impact on the authoring-technology aspects of the publishing community... The XML support is built into the base product, with color-coded tag markup and a pane for viewing the structure of the XML element tree. Users can turn tags on or off, and can trim the list of available elements to only those that are valid... Office 11 is separate from Xdocs, Microsoft's forthcoming XML-based forms application. According to a Microsoft spokesperson, the company has not decided whether Xdocs will be included as part of Office 11 or sold as a separate product, like Visio or MapPoint... for most of the professional publishing community, Office 11 will be a must-evaluate product in 2003. Ever since the demise of SGML Author (Microsoft's SGML add-in that fizzled in the mid-1990s), the publishing community has wished for native SGML, and subsequently XML, support within Word. Although other new collaborative features of Office 11 look compelling, native XML support is the one feature that, by itself, could make the upgrade worthwhile..." See: (1) the announcement: "Microsoft Releases First Beta of 'Office 11'. Next Version of Office to Connect People, Information and Business Processes."; (2) general references in "Microsoft 'XDocs' Office Product Supports Custom-Defined XML Schemas."

  • [October 24, 2002] "MS Office XML." By Tim Bray. Posting to XML-DEV. October 24, 2002. "I got an extended (hours-long) demo of ['Office 11'] Word & Excel & XDocs from JeanPa [Jean Paoli] and a product manager whose name I don't have handy, two or three months ago, so things may have changed but here's what I saw: Both Word and this new XDocs thing can edit arbitrary XML docs per the constraints of any old XSD schema. No DTD support. There are some of the usual XML editor goodies such as suggesting what elements can go here and picking attributes. They have pretty cool facilities for GUIfied schema customization. Neither of them can help much with mixed content, which has always separated the men from the boys in the *ML editing sweepstakes. I'm not sure that either of them are really being positioned as general-purpose XML content creation facilities up against Arbortext, Altova, and Corel. I'm not sure that market is big enough to interest MS anyhow. XDocs is (strictly my opinion) an attempt to build a desktop application constructor at a level that is a bit more declarative and open than VB, but richer and more interactive than a Web browser. I'm not really convinced yet - I think MS would agree there's still quite a bit of product management to do - but it does seem to be a pretty clever piece of software. I'm pretty sure it's safe to interpret the advent of XDocs as MSFT's declaration that they're not going to do anything with XForms. What actually turns my crank is that you can save word docs as XML and they have their own 'WordML' tag set that gets generated. I took a close look at this and it's pretty interesting. Very verbose - every word on the page gets its own markup. Suppose you have the word 'foo' in bold with single-underline, the WordML looks something like [...] 
When you get something like a Word table or floating text box the markup gets really severely dense and ugly, but I didn't see anything that seemed egregiously wrong, it's not pretending to do anything more than capture all the semantics that Word carries around inside, which are correspondingly severely dense and ugly. And HTML tables get pretty hideous too. Why did I like this? I didn't see anything that I couldn't pick apart straightforwardly with Perl, and if someone asked me to write a script to pull all the paragraphs out of a Word doc that contain the word 'foo' in bold, well you could do that. Which seems pretty important to me. The idea is that you can have a Word document with all that formatting and then you can mix that up pretty freely with your own schema stuff, and have validation, then you can save it as Word (your markup plus Word's) or as pure XML (discards Word's markup, leaving just yours). The old Corel WPerfect SGML editor used to be able to do this too. WordML and VML (for graphics) and your own schemas all get namespaces and they seem to use them sensibly. JeanPa even talked to me about using real HTTP URIs pointing at schemas.microsoft.com and having RDDL or equivalent there. This gave me an opportunity for sarcastic remarks about 'Imagine that, a URL on microsoft.com that stays stable for more than a week...' ... Anyhow, if they really do something like what they showed me, I'd call it a positive step..." See the Microsoft announcement and "Microsoft 'XDocs' Office Product Supports Custom-Defined XML Schemas."
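The kind of script Bray describes (he mentions Perl; Python is used here) is easy to sketch. The markup below is a hypothetical, heavily simplified WordML-like structure: the w:p/w:r/w:rPr/w:b/w:t element names and the urn:example:wordml namespace are illustrative guesses for this example, not the actual 'Office 11' schema, which had not been published:

```python
import xml.etree.ElementTree as ET

# Hypothetical, simplified WordML-like markup: paragraphs (w:p) contain
# runs (w:r); a run's properties (w:rPr) may mark it bold (w:b); the
# text lives in w:t. These names are illustrative, not the real schema.
DOC = """
<w:document xmlns:w="urn:example:wordml">
  <w:p><w:r><w:rPr><w:b/></w:rPr><w:t>foo</w:t></w:r>
       <w:r><w:t> matters</w:t></w:r></w:p>
  <w:p><w:r><w:t>plain foo here</w:t></w:r></w:p>
</w:document>
"""

NS = {"w": "urn:example:wordml"}
root = ET.fromstring(DOC)

# Collect the full text of every paragraph containing a bold run of 'foo'
hits = []
for p in root.findall("w:p", NS):
    for r in p.findall("w:r", NS):
        bold = r.find("w:rPr/w:b", NS) is not None
        text = r.findtext("w:t", default="", namespaces=NS)
        if bold and "foo" in text:
            hits.append("".join(run.findtext("w:t", default="", namespaces=NS)
                                for run in p.findall("w:r", NS)))
            break
print(hits)  # ['foo matters']
```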

  • [October 24, 2002] "Co-Inventor of XML Says Office 11 is 'A Huge Step Forward for Microsoft'." By Tim Bray and XML-J Industry Newsletter. In XML Journal (October 24, 2002). ['Now that the newly XML-enabled version of Microsoft Office, code-named "Office 11," is in its first official beta release, XML-J Industry Newsletter went straight to Tim Bray, co-inventor of eXtensible Markup Language, and asked for his exclusive views on this improvement... Bray did receive extended hands-on demos of the alpha and beta software, he says, which gave him the opportunity to test-drive and evaluate the suite.'] "When asked how XML-enabling will make a difference in MS Office, Bray quickly zeroes in on what in his view is the key differentiator in an XML-enabled Office suite vs the current one. 'The important thing,' he explains, 'is that Word and Excel (and of course the new XDocs thing) can export their data as XML without information loss. It seems Word can also edit arbitrary XML languages under the control of an XML Schema, but I'm actually more excited by the notion of Word files also being XML files.' So it's a breakthrough? Bray has no doubts whatsoever: 'The XML-enabling of Office was obviously a major investment and is a major achievement,' he declares, without hesitation. 'Built around an open, internationalized file format,' he continues, warming to his theme, 'Office 11 is going to be a huge step forward for management, independent software developers, and Microsoft'..." See also the announcement: "Microsoft Releases First Beta of 'Office 11'. Next Version of Office to Connect People, Information and Business Processes."

  • [October 24, 2002] "Sun Joins WS-I." By Paul Krill. In InfoWorld (October 24, 2002). "Sun Microsystems, which had been shut out of the Web Services Interoperability Organization (WS-I) board and thus had refused to join the organization, is joining WS-I as a contributing member and intends to run for the group's policy-making board in March 2003. Founded in February, WS-I is intended to be an open industry effort to promote Web services interoperability across platforms, applications, and programming languages... 'From Day 1, we've been supportive of WS-I and the work they're doing with interoperability. That hasn't been the issue. Our issue has been the governance model that has not allowed Sun to participate in WS-I,' said Ed Julson, Sun group marketing manager for Web services standards and technologies, in Santa Clara, Calif. 'I think we're a credible player in the industry. We have a long history of innovation in the standards arena and driving network computing,' Julson said. Sun intends to promote Web services standards at organizations such as the World Wide Web Consortium (W3C) and then align that work on converged standards with WS-I profiles, Julson said. 'Over the past six to eight months, Web services standards [have] completely exploded in terms of complexity so we have a situation now where we have this tremendous number of specifications, [which] in many cases are overlapping. We have to reduce this complexity and converge the overlapping specifications,' said Julson..." See: (1) the announcement: "Sun Microsystems Joins WS-I. Web Services Leader Plans to Run for Board Election."; (2) "Web Services Interoperability Organization (WS-I)."

  • [October 23, 2002] "The Goals Are Clear -- How To Get There?" By Katherine Burger. From CMP Insurance and Technology Magazine (October 23, 2002). "If insurers want to succeed and profit in today's competitive, cost-conscious financial services arena, they are going to have to transform the ways they do business. Industry-specific standards are a key aspect of any strategy for getting closer to customers and becoming more efficient, but deploying them has been a challenge. Strategies for standards success were revealed at last month's Insurance Standards Leadership Forum in New York City, co-produced by CMP Insurance & Technology and ACORD. 'A shared vision is important,' warned keynote speaker Richmond Waller, executive vice president of e-business, Zurich North America (Schaumburg, IL). That vision includes a 'shared business mind-set, shared logic, shared language, shared processes, and shared measurement of your business value.' ... When it comes to collaboration within the insurance industry among different parties, 'if we don't change, we are in trouble,' says speaker Chris Milton, vice president and reinsurance officer at AIG (New York) -- referring to the insurance industry's need to adopt Internet standards to increase transparency. A 'new economy' and 'new business models for self service' require that carriers adopt Internet standards, such as ACORD's XML standards, in order to improve efficiency, create strategic partnerships, improve customer retention, expand into new markets and increase transparency -- 'which is vital for future success,' Milton told the conference attendees. Central to overcoming some of the obstacles, Milton says, are XML standards, which will allow carriers to first create transparency in their internal business silos by allowing the creation of enterprise data warehouses. 
Carriers will then eventually be able to interact with business partners, such as reinsurance companies, by being able to transmit data in XML-standard format -- eliminating the need for re-keying of data... One organization that has made a serious commitment to implementing XML is New York-based MetLife, and the firm's director, enterprise technology, Charlie Dietz, told attendees 'XML is a recent development but already an integral part of our business,' in areas such as agent portal links, institutional business portals and key infrastructure systems. Dietz describes himself as 'an XML evangelist' and he advised executives to develop a corporate strategy 'for how to use standards, which ones you are going to use, and how you are going to use them'..."

  • [October 23, 2002] "Phaos Looks to Boost Liberty Single Sign-On." By Paul Krill and James Niccolai. In InfoWorld (October 22, 2002). "Phaos Technology this week released the Phaos Liberty Toolkit to enable developers to build applications adhering to the Liberty Alliance single sign-on specification for federated network identity. With the toolkit, Java developers can build applications that enable single sign-on capabilities, support the consolidation of enterprise authentication schemes, and allow migration from legacy infrastructure to XML-based Web services, Phaos said. The toolkit features integrated XML digital signatures and XML encryption as well as privacy and identity mechanisms by integrating hardware operations. The toolkit is expected to be used for federated authentication applications, such as those used for trading relationships among buying entities, manufacturers, and end-consumers, said Roger Sullivan, president of Phaos. It represents an advancement over PKI, according to Sullivan. 'The problem with PKI is that, in order to proliferate PKI, everybody's got to have a card, a certificate,' Sullivan said. 'Until everybody has certificates it can't begin to initiate my trading relationship. The SAML component [of Liberty] provides assistance with this. It is to some degree self-authenticating -- if we agree on a relationship and some rules and you and I can set up a trading relationship with our subordinate employees who are authenticated by the rules that you and I have set up -- I don't have to ping that third-party every transaction. The user presents their credentials as part of the buying transaction. I believe this will help proliferate authentication.' The toolkit expands on the earlier Phaos Liberty SDK with a fully integrated security library. It supports XML, SAML, and SSLava toolkits... 
Phaos this week also announced Phaos XML Toolkit 2.0, a Java toolkit for building interoperable and secure XML-based applications that benefit from code portability and scalability of Java. Also released was Phaos SAML 1.0, providing a protocol consisting of XML-based request and response message formats to communicate assertions of an entity's attributes, authentication, and authorization..." See: (1) the Phaos Technology Corp. announcement: "Phaos Technology Releases Liberty Toolkit 2.0. e-Security Provider Offers Developer Toolkit for Liberty Alliance Specifications."; (2) "Liberty Alliance Specifications for Federated Network Identification and Authorization."
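At bottom, the SAML piece of these toolkits is XML messaging. As a rough illustration (this is not Phaos's API; the element names follow the SAML 1.0 assertion vocabulary, and the issuer and subject values are invented), a bare-bones authentication assertion of the kind such a toolkit exchanges can be built with a standard XML library:

```python
import xml.etree.ElementTree as ET

# SAML 1.0 assertion namespace, as defined by the OASIS specification.
SAML_NS = "urn:oasis:names:tc:SAML:1.0:assertion"
ET.register_namespace("saml", SAML_NS)

def build_assertion(issuer, subject_name):
    """Build a bare-bones <saml:Assertion> carrying one authenticated subject."""
    assertion = ET.Element(f"{{{SAML_NS}}}Assertion",
                           {"Issuer": issuer, "MajorVersion": "1", "MinorVersion": "0"})
    stmt = ET.SubElement(assertion, f"{{{SAML_NS}}}AuthenticationStatement")
    subject = ET.SubElement(stmt, f"{{{SAML_NS}}}Subject")
    name_id = ET.SubElement(subject, f"{{{SAML_NS}}}NameIdentifier")
    name_id.text = subject_name
    return assertion

doc = build_assertion("https://idp.example.com", "alice@example.com")
xml_text = ET.tostring(doc, encoding="unicode")
```

A real deployment would add conditions, timestamps, and an XML Signature over the assertion; the sketch only shows the basic assertion-as-document idea that makes SAML transport-neutral.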

  • [October 23, 2002] "Phaos Releases Liberty Identity Toolkit." By Richard Karpinski. In InternetWeek (October 22, 2002). "Phaos Technology [has] released a toolkit that lets developers build applications based on the digital identity specifications of the Liberty Alliance. Liberty is a multi-vendor consortium building federated identity management and single sign-on specifications. The group released its first specifications earlier this year. The Phaos Liberty Toolkit supports Liberty's sign-on authentication and authorization specifications. The Java-based toolkit lets developers build single sign-on support into apps, and supports the consolidation of multiple enterprise authentication schemes via new XML-based Web services architectures. The toolkit also supports XML digital signatures and XML encryption... Phaos also announced today the release of the Phaos XML Toolkit 2.0, a Java toolkit for building secure XML-based apps, as well as Phaos SAML 1.0, which provides a protocol to communicate assertions of an entity's security attributes, authentication, and authorization..." See: (1) the Phaos Technology Corp. announcement: "Phaos Technology Releases Liberty Toolkit 2.0. e-Security Provider Offers Developer Toolkit for Liberty Alliance Specifications."; (2) "Liberty Alliance Specifications for Federated Network Identification and Authorization."

  • [October 22, 2002] "Carriers Worry about XML Fragmentation." By Gregory MacSweeney. From CMP Insurance Technology. October 22, 2002. "XML may be powerful, easy to work with, and could change business transactions forever, but even with XML standards being developed in many industries -- including ACORD's standards for insurance -- the rapid adoption may lead to XML fragmentation as many companies are developing first and checking standards later. With all of the XML hype, it is no wonder that many companies -- such as MetLife, Fidelity, Reuters and FedEx -- have already launched enterprise-wide XML initiatives. But since many standards are not finalized, and many executives 'can't afford to wait' as business pressures mount, it is important to take a closer look at how XML is being used... And even with the ACORD (Pearl River, NY) XML standards, which Wroe describes as 'very strong,' rapid development may lead to fragmentation. 'One of XML's strengths is it is so easy to work with,' Wroe adds. 'It may allow people to be overly creative. There can be semantic fragmentation that can start to cut back on capability.' Susan Ousey, senior vice president, ACORD, adds, 'You can enable a lot of applications very quickly with XML. But there could be fragmentation in the corporate XML infrastructure. Companies need to find a way to take advantage of XML as a developing technology and at the same time build and manage it so there are no problems down the road.' ... To address some of the concerns in insurance, ACORD's Ousey recently attended the Lighthouse Council Summit, an event sponsored by Swingtide, where leaders from AON Corp., Chubb Corp., MetLife and Northwestern Mutual Insurance, among others, gathered to discuss issues related to developing with XML... The attendees concluded that although there is a definite ROI case for using XML, there were some concerns. 
The attendees worried about the future interoperability of XML, security and compliance and performance measurement activity between/among XML services, according to Swingtide's Wroe..." For information on XML-based standards in the insurance industry, see the XML.org Focus Area on Insurance.

  • [October 22, 2002] "The XPointer xpath1() Scheme." By Simon St.Laurent (O'Reilly & Associates). IETF Network Working Group, Internet-Draft. Reference: 'draft-stlaurent-xpath-frag-00.txt'. October 20, 2002, expires April 20, 2003. Also in HTML format. "This document specifies an xpath1() scheme for use in XPointer-based fragment identifiers. This scheme, like other XPointer Framework schemes, is designed primarily for use with the XML Media Types defined in RFC 3023, to identify locations within a given XML representation of a resource. The xpath1() scheme uses XPath 1.0 syntax... As the W3C's xpointer() scheme already provides a superset of the functionality provided by the xpath1() scheme, some consideration of why the xpath1() scheme is useful seems worthwhile. The xpointer() scheme, designed to support the out-of-line linking capabilities of XLink, provides support for character ranges which may arbitrarily cross node boundaries. While this is extremely useful for many hypertext applications, it is unnecessary for a wide variety of simpler projects, and XPath 1.0 is generally far more widely supported than the xpointer() scheme. While the XPointer Framework explicitly supports multiple levels of conformance, the xpointer() scheme states that 'Conforming XPointer processors claiming to support the xpointer() scheme must conform to the behavior defined in this specification and may conform to additional XPointer scheme specifications.' Conforming xpointer() processors must implement both XPath and the xpointer() scheme's own extensions, and while applications might use only the subset of xpointer() that is pure XPath, processors built for that approach are non-conformant. The XPointer set of specifications also includes shorthand pointers (based on ID values with their own complications) and support for an element() scheme that is effectively a subset of XPath, but these offer considerably less functionality than XPath. 
The xpath1() scheme strikes a balance between the simple implementation but limited functionality of shorthand pointers and the element() scheme, and the complex implementation but great capabilities of the xpointer() scheme. Perhaps more importantly, it strikes that balance using processing capabilities that are already widely deployed..." Simon notes: "Future drafts will include examples, but I suspect the readers of this list know what XPath 1.0 expressions look like. This Internet-Draft has no connection to the W3C XLink WG except insofar as it builds on the XPointer Framework and uses other W3C work as references." See "XML Linking Language." [cache]
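Since the draft defers examples to a future revision, here is a hedged sketch of the kind of location an xpath1() pointer would address, evaluated with the limited XPath 1.0 subset in Python's standard library (the document, fragment identifier, and element names are invented for illustration):

```python
import xml.etree.ElementTree as ET

# A small document standing in for an XML resource retrieved over HTTP.
document = ET.fromstring(
    "<report><section id='intro'><para>Overview</para></section>"
    "<section id='detail'><para>First</para><para>Second</para></section></report>"
)

# A fragment identifier such as  report.xml#xpath1(/report/section[2]/para[1])
# would identify the first <para> of the second <section>. ElementTree's
# XPath subset can evaluate the path relative to the document element:
target = document.find("./section[2]/para[1]")
```

This is exactly the draft's point: positional and structural XPath 1.0 addressing like this is already implemented in widely deployed processors, without requiring the character-range machinery of the full xpointer() scheme.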

  • [October 22, 2002] "Understanding WS-Security." By Scott Seely (Microsoft Corporation). From MSDN Library. October 2002. ['This article looks at how to use WS-Security to embed security within the SOAP message itself, exploring the concerns WS-Security addresses: authentication, signatures, and encryption. This article assumes that you are already familiar with XML Canonicalization, XML Signature, and XML Encryption.'] "... the bigger problems involve sending the message along a path more complicated than request/response or over a transport that does not involve HTTP. The identity, integrity, and security of the message and the caller need to be preserved over multiple hops. More than one encryption key may be used along the route. Trust domains will be crossed. HTTP and its security mechanisms only address point-to-point security. More complex solutions need end-to-end security baked in. WS-Security addresses how to maintain a secure context over a multi-point message path. WS-Security addresses security by leveraging existing standards and specifications. This avoids the necessity to define a complete security solution within WS-Security. The industry has solved many of these problems. Kerberos and X.509 address authentication. X.509 also uses existing PKI for key management. XML Encryption and XML Signature describe ways of encrypting and signing the contents of XML messages. XML Canonicalization describes ways of making the XML ready to be signed and encrypted. What WS-Security adds to existing specifications is a framework to embed these mechanisms into a SOAP message. This is done in a transport-neutral fashion. WS-Security defines a SOAP Header element to carry security-related data. If XML Signature is used, this header can contain the information defined by XML Signature that conveys how the message was signed, the key that was used, and the resulting signature value. 
Likewise, if an element within the message is encrypted, the encryption information such as that conveyed by XML Encryption can be contained within the WS-Security header. WS-Security does not specify the format of the signature or encryption. Instead, it specifies how one would embed the security information laid out by other specifications within a SOAP message. WS-Security is primarily a specification for an XML-based security metadata container... WS-Security allows for a SOAP message to identify the caller, sign the message, and encrypt message contents. Whenever possible, existing specifications are reused to reduce the amount of invention required to securely deliver a SOAP message. Because all of the information is delivered within the message itself, the message becomes transport neutral. The message would be secure if it was delivered by HTTP, e-mail, or on CD-ROM..." See: "Web Services Security Specification (WS-Security)."
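As a rough sketch of that container idea (this is not the normative WS-Security syntax; the wsse namespace URI shown is the draft-era value and is an assumption, and the UsernameToken content is simplified), a SOAP envelope whose header carries security data can be assembled like this:

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"      # SOAP 1.1 envelope
WSSE_NS = "http://schemas.xmlsoap.org/ws/2002/07/secext"   # draft-era WS-Security (assumed)
ET.register_namespace("soap", SOAP_NS)
ET.register_namespace("wsse", WSSE_NS)

def secure_envelope(username, body_payload):
    """Wrap a payload in a SOAP envelope whose Header carries a
    wsse:Security block holding a simple UsernameToken."""
    env = ET.Element(f"{{{SOAP_NS}}}Envelope")
    header = ET.SubElement(env, f"{{{SOAP_NS}}}Header")
    security = ET.SubElement(header, f"{{{WSSE_NS}}}Security")
    token = ET.SubElement(security, f"{{{WSSE_NS}}}UsernameToken")
    ET.SubElement(token, f"{{{WSSE_NS}}}Username").text = username
    body = ET.SubElement(env, f"{{{SOAP_NS}}}Body")
    body.append(body_payload)
    return env

payload = ET.Element("Ping")  # invented application payload
msg = ET.tostring(secure_envelope("alice", payload), encoding="unicode")
```

Because the security data rides inside the message rather than in the transport, the same envelope stays secure whether it travels over HTTP, e-mail, or a message queue, which is the article's central point.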

  • [October 22, 2002] "XML-Packed Office 11 Goes Into Beta." By Matt Berger. From ITworld.com (October 22, 2002). "The next version of Microsoft Corp. Office, intended as a more corporate-focused version of its productivity application suite, has been delivered to a few thousand early beta testers, the company said Tuesday. Code-named Office 11, the software is being designed to include wide support for the industry standard data format XML (Extensible Markup Language), said David Jaffe, lead product manager for Office. For one, users will be able to save Word or Excel files in XML, which will allow the data inside those files to be shared with any other software that also supports the standard file format. Word and Excel also will be able to retrieve XML data from any number of sources, including the Web and a company's internal data resources, Jaffe said. One new feature being added to Office 11 that makes use of XML is called Smart Documents. It is a programmable task pane that can be customized to display information that is stored on the Web or on a company's internal network. Similar to the Smart Tags feature included in Office XP, Smart Documents is context sensitive, in that it will display data that is relevant to specific information inside a document... The move toward XML is part of a broader effort at Microsoft, embodied in its .Net initiative, to allow customers to access data, services and applications from disparate computer systems on a variety of computing devices. Microsoft recently detailed a new application that will join the Office family called XDocs, which relies solely on the XML file format. That software is being designed for corporate users to build forms that collect and distribute data in XML. For example, it could act as a user-facing interface for inputting data into a customer relationship management database or other back-end computer systems..." See further: (1) the announcement: "Microsoft Releases First Beta of 'Office 11'. 
Next Version of Office to Connect People, Information and Business Processes."; (2) "Microsoft 'XDocs' Office Product Supports Custom-Defined XML Schemas."

  • [October 22, 2002] "Microsoft Unleashes Office 11 On Beta Testers." By Ed Scannell. In InfoWorld (October 22, 2002). "Microsoft will deliver to selected beta testers on Tuesday an early version of its long-awaited Office 11 desktop suite that will feature versions of Word, Excel, and Access that fully support XML... With the added XML support, users can now create more dynamic documents that inherently contain XML code, allowing them to be more easily accessed and shared among a wide range of users. 'Users can build smart documents that allow them to start building solutions from within a Word document, rather than starting to build solutions starting from outside the document, which is the way it is done now,' Marks said. As one example, Marks said Excel users doing an expense report can now directly connect to multiple internal and external data sources, such as a database containing credit card information, and import that data directly into the expense report and then share it across multiple environments. Similarly, users can connect business processes involving much more sophisticated applications, such as CRM, in order to connect transactions more quickly as well as establish more efficient communications with business partners and suppliers... Microsoft is also building into each of the Office applications the Research Task Pane, an XML-based capability that allows users to do searches of any XML-based data source available on the Web from within a Word document. Trying to improve the suite's collaborative capabilities, the new betas will feature tighter connections to the company's SharePoint Team Services 2.0, which is scheduled to be delivered around the same time as Office 11, namely mid-2003. 
Microsoft is also building into the applications many of SharePoint's tools that company officials hope will encourage users to do both more lower-level, ad hoc collaborations, as well as providing better access to SharePoint's custom Web sites where they can, for instance, manage on-line meetings... The new suite will also feature built-in instant messaging, allowing users to be aware of who is coming online or going offline from within a document. Users can also kick off a session from within a spreadsheet or word processor. Office 11 will also feature an easier-to-use Outlook mail client that allows users to more easily sort through thousands of e-mails. The product will feature a much larger area on-screen dedicated to the e-mail's content, making it easier for users to read. Outlook also now has the ability to detect a slow online connection and will automatically download only the necessities of a file, such as its content, but eliminate things such as headers..." See the announcement: "Microsoft Releases First Beta of 'Office 11'. Next Version of Office to Connect People, Information and Business Processes."

  • [October 22, 2002] "ECDSA with XML-Signature Syntax." IETF Internet Draft. By Simon Blake-Wilson (BCI), Gregor Karlinger (Chief Information Office Austria), and Yongge Wang (University of North Carolina at Charlotte). June 30, 2002; expires December 31, 2002. Reference: 'draft-blake-wilson-xmldsig-ecdsa-03.txt'. Appendix A provides the Aggregate XML Schema; Appendix B provides the Aggregate DTD. "This document specifies how to use ECDSA (Elliptic Curve Digital Signature Algorithm) with [IETF/W3C] XML Signatures. The mechanism specified provides integrity, message authentication, and/or signer authentication services for data of any type, whether located within the XML that includes the signature or included by reference... The Elliptic Curve Digital Signature Algorithm (ECDSA) is the elliptic curve analogue of the DSA (DSS FIPS 186-2) signature method. It is defined in the ANSI X9.62 standard... Like DSA, ECDSA incorporates the use of a hash function. Currently, the only hash function defined for use with ECDSA is the SHA-1 message digest algorithm. ECDSA signatures are smaller than RSA signatures of similar cryptographic strength. ECDSA public keys (and certificates) are smaller than similar strength DSA keys, resulting in improved communications efficiency. Furthermore, on many platforms ECDSA operations can be computed faster than similar strength RSA or DSA operations. These advantages of signature size, bandwidth, and computational efficiency may make ECDSA an attractive choice for XMLDSIG implementations." General references in "Digital Signatures." [cache]
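To make the draft's role concrete, here is a hedged sketch that builds an XML Signature ds:SignedInfo naming an ECDSA-SHA1 algorithm identifier (the "xmldsig-more" URI is my assumption of the identifier this draft line of work defines) and carrying a SHA-1 digest; the elliptic-curve signing step itself is omitted, since it requires a cryptographic library, and the reference URI and content are invented:

```python
import base64
import hashlib
import xml.etree.ElementTree as ET

DS_NS = "http://www.w3.org/2000/09/xmldsig#"                    # XML Signature namespace
ECDSA_SHA1 = "http://www.w3.org/2001/04/xmldsig-more#ecdsa-sha1"  # assumed identifier
ET.register_namespace("ds", DS_NS)

def signed_info(reference_uri, referenced_bytes):
    """Build a ds:SignedInfo that names ECDSA-SHA1 as the SignatureMethod
    and carries the SHA-1 digest of the referenced content."""
    si = ET.Element(f"{{{DS_NS}}}SignedInfo")
    ET.SubElement(si, f"{{{DS_NS}}}SignatureMethod", {"Algorithm": ECDSA_SHA1})
    ref = ET.SubElement(si, f"{{{DS_NS}}}Reference", {"URI": reference_uri})
    ET.SubElement(ref, f"{{{DS_NS}}}DigestMethod",
                  {"Algorithm": "http://www.w3.org/2000/09/xmldsig#sha1"})
    digest = ET.SubElement(ref, f"{{{DS_NS}}}DigestValue")
    digest.text = base64.b64encode(hashlib.sha1(referenced_bytes).digest()).decode()
    return si

si = signed_info("#payload", b"<data>hello</data>")
```

The draft's contribution is exactly this kind of plumbing: fixing the algorithm identifiers and key-value encodings so that an ECDSA signature slots into the existing XMLDSIG element structure unchanged.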

  • [October 22, 2002] "Tip: Use a SAX Filter to Manipulate Data. Change the Events Output by a SAX Stream." By Nicholas Chase (President, Chase and Chase, Inc). From IBM developerWorks, XML zone. October 2002. ['The streaming nature of the Simple API for XML (SAX) provides not only an opportunity to process large amounts of data in a short time, but also the ability to insert changes into the stream that implement business rules without affecting the underlying application. This tip explains how to create and use a SAX filter to control how data is processed.'] "This tip looks at an application that determines which employees to notify of a particular emergency situation, and then acts accordingly. The tip demonstrates a simple way to alter the processing of a SAX application using an XML filter. In this case, the filter has been pre-determined, but you can build an application to accommodate different situations by choosing filter behavior at run-time. You might accomplish this by replacing the DataFilter class, by passing a parameter at run-time, or even by using a factory to create the filter class in the first place. A SAX application can also chain filters together so that the output of one filter is used as the input for another, allowing for complex programming in modular chunks..." Has source code.
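A minimal filter of the kind the tip describes can be sketched in a few lines with Python's xml.sax (the tip's own code is in Java; the element names below are invented for the example). The filter sits between the parser and the downstream handler and rewrites events in flight, without either side knowing:

```python
import io
import xml.sax
from xml.sax.saxutils import XMLFilterBase, XMLGenerator

class RenameFilter(XMLFilterBase):
    """Rename <employee> elements to <contact> as events stream through;
    all other events pass to the downstream handler unchanged."""
    def startElement(self, name, attrs):
        super().startElement("contact" if name == "employee" else name, attrs)

    def endElement(self, name):
        super().endElement("contact" if name == "employee" else name)

source = "<staff><employee>Ada</employee><employee>Grace</employee></staff>"
out = io.StringIO()

reader = xml.sax.make_parser()
filtered = RenameFilter(reader)             # filter wraps the real parser
filtered.setContentHandler(XMLGenerator(out))  # downstream handler serializes
filtered.parse(io.StringIO(source))
result = out.getvalue()
```

Chaining works the same way: wrap one filter in another, and each stage sees the previous stage's (possibly rewritten) event stream.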

  • [October 21, 2002] "MusicXML in Practice: Issues in Translation and Analysis." By Michael Good (Recordare LLC). Originally published as pages 47-54 in Proceedings of the First International Conference MAX 2002: Musical Application Using XML (Milan, September 19-20, 2002); see the conference website. "Since its introduction in 2000, MusicXML has become the most quickly adopted symbolic music interchange format since MIDI, with support by market and technology leaders in both music notation and music scanning. This paper introduces the key design concepts behind MusicXML, discusses some of the translation issues that have emerged in current commercial applications, and introduces the use of MusicXML together with XML Query for music analysis and information retrieval applications... MusicXML has built on the collective work of the XML and music representation communities to become the most widely adopted symbolic music interchange format since MIDI. As the language develops, it will encounter further challenges in the areas of translation and analysis. Our commercial experience to date bodes well for handling translation issues as MusicXML expands to include more data for tablature, percussion notation, and sequencer applications. Our recent XQuery experience gives us new hope that industry-standard XML database tools, combined with MusicXML-based representations, will provide powerful new tools for problems in musical data analysis and information retrieval..." General references in "XML and Music."
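As a toy illustration of the analysis queries the paper runs with XQuery (the fragment below follows MusicXML's general partwise shape but is heavily simplified, and the notes are invented), extracting every pitch step from a measure is a one-line path expression:

```python
import xml.etree.ElementTree as ET

# A simplified MusicXML-style fragment: one part, one measure, a C major triad.
score = ET.fromstring("""
<score-partwise>
  <part id="P1">
    <measure number="1">
      <note><pitch><step>C</step><octave>4</octave></pitch></note>
      <note><pitch><step>E</step><octave>4</octave></pitch></note>
      <note><pitch><step>G</step><octave>4</octave></pitch></note>
    </measure>
  </part>
</score-partwise>""")

# "Which pitch steps occur in measure 1?" -- the kind of question the
# paper poses to an XML database, here answered with a path query.
steps = [s.text for s in
         score.findall(".//measure[@number='1']/note/pitch/step")]
```

Real retrieval tasks (find all occurrences of a melodic motif, say) need the joins and sequence logic of full XQuery, but the point stands: once notation is XML, standard XML query tools apply directly.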

  • [October 21, 2002] "Digital Photo Print Standard Progresses." By David Becker. In CNET News.com (October 21, 2002). "Major camera makers and other digital photography companies are pushing forward with a program to help consumers get prints from their neighborhood drugstore. The International Imaging Industry Association (I3A) -- a nonprofit trade group supported by Eastman Kodak, Hewlett-Packard, Fujifilm and others -- announced plans for the Common Picture Exchange Environment (CPXe) earlier this year. CPXe will consist of an online directory maintained by I3A that will help consumers quickly find photofinishers in their neighborhood that process digital photos, plus software standards to enable cameras and photo applications to access the directory and transfer images. HP and Kodak representatives attending last week's Digital Imaging '02 forum here said work on CPXe is moving along quickly. The software standard should be ready by December, said Mark Cook, vice president of strategic initiatives for Kodak. That launch date should allow camera makers and others to build CPXe support into their products starting next year..." See project details in "Common Picture Exchange Environment (CPXe)."

  • [October 21, 2002] "Tools Help to Ease Web Services Development." By Darryl K. Taft. In eWEEK (October 21, 2002). "M7 Corporation Inc. and Novell Inc. are rolling out tools to help developers build and deploy Web services... M7 this month unveiled Version 2.0 of Application Assembly Platform, which offers an enterprise object repository, support for Web services and new workflow capabilities. The enterprise object repository lets developers define and reuse business rules, processes and objects and unite existing technology with new technology, including Web services, according to officials with the Cupertino, Calif., company. The Web services support in the product lets developers plug Web services into applications and reuse them. M7's enhanced workflow engine has a visual editor that can access the repository. In addition, M7 supports JavaServer Pages 2.0 as well as BEA Systems Inc.'s WebLogic 7.0; IBM's WebSphere 4.0; and JBoss 2.4, open-source software from JBoss Group LLC. M7 CEO Mansour Safai said M7 2.0 helps shield developers from some of the complexity of J2EE (Java 2 Enterprise Edition)... Novell, of Provo, Utah, this month unveiled Novell Extend 4 Enterprise, its development environment for Web applications and XML Web services. Key enhancements in Novell Extend 4 Enterprise are support for J2EE 1.3 and the IBM and BEA application servers, the company said. Meanwhile, Actional Corp. announced partnerships with two companies aimed at easing Web services development. Actional, of Mountain View, Calif., and WebPutty Inc., of San Jose, Calif., are teaming to enable developers to integrate enterprise applications from such vendors as SAP AG, Siebel Systems Inc. and PeopleSoft Inc. with their own custom products. The agreement will enable developers to use Actional's SOAPswitch to expose their applications as Web services and then use the WebPutty Application Platform to add new functionality to the enterprise application, the companies said..."

  • [October 21, 2002] "Progress Buys XML Toolmaker eXcelon." By Scarlet Pruitt. In InfoWorld (October 21, 2002). "E-Business technology provider Progress Software is purchasing XML (Extensible Markup Language) toolmaker eXcelon in a move that will allow the company to expand its e-business application integration products. Progress said Monday that both companies' boards of directors approved the $24 million all-cash purchase, which is expected to close within 90 days, pending approval by eXcelon stockholders. The buy is aimed at accelerating the product strategy forged by Progress subsidiary Sonic Software, which launched a distributed, standards-based integration product dubbed SonicXQ earlier this year. SonicXQ has been called an 'Enterprise Service Bus,' which connects Web applications for e-businesses, allowing the applications to more easily communicate with each other, a Progress spokeswoman said. With the purchase of eXcelon, Sonic Software will gain the company's Stylus Studio, eXtensible Information Server (XIS) and Business Process Manager (BPM) technologies, providing components that will expand SonicXQ's platform. Stylus Studio is an integrated development environment designed to allow XML developers to create and test XSL (Extensible Stylesheet Language) stylesheets, as well as perform XML-to-XML mappings. BPM is an XML-based business document rules engine, whereas eXcelon's XIS is an XML database management system..." See the announcement "Progress Software Corporation Signs Definitive Agreement to Acquire Excelon Corporation. Acquisition Accelerates Corporate Strategies for Service Oriented Architecture, Standards-Based Integration, and XML Capabilities."

  • [October 21, 2002] "Price Modeling in Standards for Electronic Product Catalogs Based on XML." By Oliver Kelkar (Fraunhofer IAO, Germany), Jörg Leukel (University of Essen, Germany), and Volker Schmitz (University of Essen, Germany). Pages 366-375 in Proceedings of the 11th International World Wide Web Conference (WWW 2002), May 7-11, Honolulu, Hawaii, USA. "The fast spreading of electronic business-to-business procurement systems has led to the development of new standards for the exchange of electronic product catalogs (e-catalogs). E-catalogs contain various information about products; essential is price information. Prices are used for buying decisions and following order transactions. While simple price models are often sufficient for the description of indirect goods (e.g., office supplies), other goods and lines of business make higher demands. In this paper we examine what price information is contained in commercial XML standards for the exchange of product catalog data. For that purpose we bring the different implicit price models of the examined catalog standards together and provide a generalized model... The aim of many pricing strategies is to sell equal or similar products to different customers paying different prices. Differential pricing tries to gain higher profits in imperfect markets. For this to be effective the market must be divisible and the different segments must have different levels of demand. In general we can assume seven types of pricing: (a) Individual Pricing; (b) Product Form Pricing; (c) Quantity Pricing; (d) Bundled Pricing; (e) Customer Segment Pricing; (f) Geographical Pricing; (g) Promotional Pricing... Starting points for answering the question of which price information is modeled in catalog standards are the specifications, documentations and (if available) the data models. The analysis covered 20 XML standards in all. Due to limited space we will concentrate on six selected standards. 
The selection contains the most important standards being used in B2B e-commerce: (1) cXML and xCBL: two standards developed by major e-business software companies; (2) BMEcat: genuine catalog standard developed in Germany; (3) EAN.UCC: standard by EAN International; (4) OAGIS: documents will be integrated into the ebXML framework - we refer to the document ECATALOG; (5) RosettaNet: a horizontal standard. The result of our empirical analysis is a general price model, which will be presented as an XML Schema... We found that the spectrum of real world price models is covered in a limited way by available standards. Speaking for the suppliers and buyers, it is necessary to represent more complex price models in catalog documents. For example, the industrial trade uses multi-staged discount systems along the trade levels. As long as this is not covered, we see a major obstacle for the fast success of e-business applications. On the other hand it is necessary to reduce the complexity of price models to be able to develop, deploy and handle a generalized model. Our model can be used as an intermediate standard for the transformation of different price models. It shows that the definition of mapping statements is complex due to the complexity of price models; it requires special domain knowledge and leads to non-trivial mapping rules... Besides price modeling, the representation of complex goods is another unsolved problem. This underlines our conviction that further research and standardization must be done to come to universal and accepted business documents..." See also the presentation slides. [cache]
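One dimension of the generalized model, quantity pricing (type (c) above), can be sketched as a simple tier lookup; this is an illustration of the concept, not the paper's schema, and the tier boundaries and prices below are invented:

```python
from dataclasses import dataclass

@dataclass
class PriceTier:
    min_quantity: int   # tier applies from this quantity upward
    unit_price: float

def unit_price(tiers, quantity):
    """Return the unit price of the highest tier the ordered quantity reaches."""
    applicable = [t for t in sorted(tiers, key=lambda t: t.min_quantity)
                  if t.min_quantity <= quantity]
    return applicable[-1].unit_price

# Invented catalog entry: cheaper per unit at 10+ and 100+ pieces.
catalog_tiers = [PriceTier(1, 10.0), PriceTier(10, 9.0), PriceTier(100, 7.5)]
price_for_50 = unit_price(catalog_tiers, 50)   # falls in the 10+ tier
```

The paper's observation is that even this common pattern is encoded differently (or not at all) across cXML, xCBL, BMEcat and the rest, which is why an intermediate model is needed for catalog transformation.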

  • [October 21, 2002] "A Modeling Approach for Product Classification Systems." By Jörg Leukel, Volker Schmitz, and Frank-Dieter Dorloff (Department of Information Systems, University of Essen, Germany). Pages 868-874 in Proceedings of the Second International Workshop on Electronic Business Hubs: XML, Metadata, Ontologies, and Business Knowledge on the Web (WEBH 2002 / DEXA 2002), September 2-6, 2002, Aix-en-Provence, France. "Standardized product classification systems play a major role for searching and comparing offered products on electronic markets. Especially in the case of large multi-vendor product catalogs, classified data becomes an important asset and success factor. The best-known systems are UNSPSC and eCl@ss; however, they are still developing, and new systems are emerging as well. Classification systems differ from each other not only in content but also in structure. The management and exchange of the systems between market partners must be able to get along with these differences. A common structure model, which can be used to specify XML business documents, is missing so far. This paper discusses the design of classification systems and develops a data model using XML Schema. The model can be used for the transmission of classification systems, thus it is an innovative extension of existing product catalog standards... We developed an XML Schema that covers all design parameters and is able to describe all classification systems..." See also the following reference and the publications listing from the BLI (Beschaffung, Logistik und Informationsmanagement) research team, University of Essen. [cache]

  • [October 21, 2002] "Modeling and Exchange of Product Classification Systems Using XML." By Jörg Leukel, Volker Schmitz, and Frank-Dieter Dorloff (Department of Information Systems, University of Essen, Germany). In Proceedings of the Fourth IEEE International Workshop on Advanced Issues of E-Commerce and Web-Based Information Systems (WECWIS 2002) (June 26-28, 2002; Newport Beach, California). "... This paper discusses the design of classification systems and argues to develop standardized messages using XML Schema for the transmission of classification systems... The result of our empirical analysis of classification systems and XML catalog standards is a set of design parameters. These parameters are divided into the areas attributes and classification groups. Table 1 shows which design parameters are implemented in four selected product classification systems: eCl@ss, ETIM, RosettaNet Technical Dictionary (RNTD), and EGAS, which adds sets of attributes to UNSPSC. ETIM and RNTD are vertical systems developed for the wholesale of electro technical products and for electronic and IT components, respectively. The systems themselves are documented by non-formal and formal specifications, though only RNTD is specified by an XML document. All other systems use simple Excel or comma-separated value (CSV) files as containers and provide very few semantics... We observe that many catalog standards are confined to the classification of products by giving a reference to the classes and attributes. cXML, eCX, and EAN.UCC belong to this group... ebXML is a framework and does not specify business documents. In contrast, the following standards provide special document or data element definitions for classification systems: BMEcat, OAGIS and xCBL... All things considered, none of the four selected industrial classification systems realizes all design parameters. The systems themselves are documented quite differently. 
The system specifications are often provided in proprietary formats; hence their processing in catalog systems is less automated... To solve the described problems we will develop an XML model that covers all design parameters and is able to describe all classification systems. The benefit of using the XML Schema language instead of ERM, UML, or RDF is that it immediately provides a format which can transfer real classification systems in all their details..." [cache]
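The kind of schema the paper proposes can be sketched as follows. All element names below (ClassificationGroup, GroupId, Name, Attribute) are hypothetical illustrations of how classification groups with multilingual names and typed attributes might be modeled — they are not the authors' actual schema. The snippet uses Python's standard library only to confirm the fragment is well-formed XML:

```python
import xml.etree.ElementTree as ET

# Hypothetical sketch of a classification-system schema: a group with an
# identifier, multilingual names, and a set of typed attributes.
SCHEMA = """<?xml version="1.0" encoding="UTF-8"?>
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:element name="ClassificationGroup">
    <xs:complexType>
      <xs:sequence>
        <xs:element name="GroupId" type="xs:string"/>
        <xs:element name="Name" maxOccurs="unbounded">
          <xs:complexType>
            <xs:simpleContent>
              <xs:extension base="xs:string">
                <xs:attribute name="lang" type="xs:language"/>
              </xs:extension>
            </xs:simpleContent>
          </xs:complexType>
        </xs:element>
        <xs:element name="Attribute" minOccurs="0" maxOccurs="unbounded">
          <xs:complexType>
            <xs:attribute name="name" type="xs:string" use="required"/>
            <xs:attribute name="datatype" type="xs:string"/>
          </xs:complexType>
        </xs:element>
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>"""

root = ET.fromstring(SCHEMA)
# The schema document is itself well-formed XML with one top-level element.
print(root.tag)
```

Because the schema is expressed in XML Schema rather than ERM or UML, the same document can be shipped directly between market partners, which is the "format immediately" benefit the authors cite.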

  • [October 21, 2002] "Stage Set for Wireless, PDA Web Services Specs." By Vance McCarthy. From Integration Developer News. October 21, 2002. "After a feverish summer agenda, the SyncML Initiative has pulled together a number of key specs and test results for setting standards for the 'mobile extended enterprise.' The work on mobile services interoperability and management wraps up just as SyncML joins the Open Mobile Alliance (OMA), the newly created federation of mobile standards groups. 'The interoperable tests for device management we had last month were very successful,' Doug Heintzman, chairman of the SyncML Initiative, told IDN. 'We are just one step away from declaring interoperable products under the Device Management spec.' That will likely be done next month, during SyncML's November meeting in Europe, he added... Just in the last few months, Oracle Corp. has become much more hands-on with SyncML's work... SyncML's DataSync, which has been out since December 2000, is now 'very robust,' Heintzman said, and a new wave of more robust testing tools has become available. More than 100 devices now comply with the DS spec (on both the client and the server side). These products come from Ericsson, IBM, Lotus, Oracle, Matsushita, Motorola, Nokia, Openwave, Starfish Software, and Symbian, among others. As for SyncML's Device Management 1.0 spec, the success from those interoperability tests (held last month in Las Vegas) 'gave the DM committee the confidence to sit down for another 2 1/2 weeks beyond the SyncFest to do the final cleanup and refer the DM spec and interoperability test suite to the board for final review.' That review will be completed as early as next week, Heintzman added... Starfish Software, a founder of SyncML, just had the latest versions of its TrueSync data synchronization products certified under SyncML DataSync 1.1.
Starfish focuses its business on providing data synchronization capabilities to software providers, including IBM's Websphere and PeopleSoft. 'The big guys are making it a core part of their product, and there's reason to assume that data synchronization will be an even bigger differentiator when it comes to supporting mobile workers and wireless clients,' Diane Law, Starfish's director of marketing, told IDN. Law puts it this way: 'SyncML's Device Management specification solves much the same problem for handhelds and PDAs as we had for laptops years ago. It helps answer the questions: What's on these devices? How do I get access to that data, application, or whatever?' she said, adding that once the SyncML device management spec gets broad support it will spur IT's interest in mobile use. 'By having over-the-air device management, and access to the device, we can update, synchronize, and even for security purposes lock up and retrieve the data,' Law said. Starfish also has APIs to support synchronization from Microsoft Exchange and Lotus' Domino. The company also has the capability to support Microsoft CE clients from a Java or Windows back-end server, she added. Starfish is working with OEMs, and Law said she expects further announcements later this year. Despite SyncML's momentum, Heintzman doesn't expect a complete 'rubber stamp' of their work to date... See "The SyncML Initiative."

  • [October 18, 2002] "First Look at the WS-I Basic Profile 1.0. Features of the Profile." By Chris Ferris (Sr. Software Engineer, IBM). From IBM developerWorks, Web services. October 2002. ['The Web Services Basic Profile 1.0 released by the Web Services Interoperability group represents an important milestone for the technology as a published description of what standards and technologies will be required for interoperability between Web services implementations on different software and operating system platforms.'] "On October 17, 2002, WS-I, the Web Services Interoperability Organization, released its first public Working Group Draft of the WS-I Basic Profile version 1.0 specification. This publication represents an important milestone for WS-I and the Basic Profile Working Group, one of the three initial WS-I technical Working Groups chartered with deliverables associated with the WS-I Basic Profile version 1.0. This draft document, while not complete, does represent the consensus of the members of the WS-I Basic Profile Working Group. It is expected that the document will undergo further change to incorporate more examples and more detailed rationalization for the constraints and requirements imposed by the profile... The WS-I Basic Profile 1.0 specification is a rather complex document. You might ask: 'I develop Web services for a living; what relevance has this specification to my work? Do I need to read and understand all of this material?' A majority of the specification is targeted at the audience of platform infrastructure and tools developers working on vendor-specific implementations of SOAP processors, WSDL parsers, code generators, and the like. The specification represents the consensus agreement of the members of the Basic Profile Working Group.
Since the WG members include people (such as me) who represent platform and/or tool vendors, you could reasonably look at this document as a concerted effort by those tools and platform vendors to ensure that their respective products will either generate or host interoperable Web services instances. This means that while you'll probably want to be familiar with all of the profile specification's contents, there are specific sections you will need to pay close attention to as you implement Web services. We will examine each substantive section of the profile specification and discuss its relevance to a Web service practitioner. Section 4 of the WS-I Basic Profile specification relates to SOAP and use of the HTTP binding for SOAP. As such, it is mostly of interest to those developers writing SOAP processor implementations rather than Web services developers. Section 5 pertains to conformant use of WSDL, and as such should be of interest to Web services practitioners, especially those who hand-craft their WSDL descriptions. We will explore some of this section's particularly interesting aspects below. Section 6 pertains to Web service discovery using UDDI. This, too, should be of interest to Web services practitioners. It describes conformant approaches to registration and categorization of a Web service in a UDDI registry. Section 7 relates to security of Web services using HTTP/S and should also be of interest to Web services practitioners who require security for the Web services they develop..." Article also in PDF format. See the 2002-10-18 news item "Web Services Interoperability Organization Publishes Basic Profile Version 1.0" and general references in "Web Services Interoperability Organization (WS-I)."

  • [October 18, 2002] "What is XQuery?" By Per Bothner. From XML.com (October 16, 2002). ['Article in XML.com's series of primers on core XML technologies. Per Bothner introduces the W3C's XML query language, XQuery 1.0. XQuery is designed to enable the query and formatting of XML data. Per provides an overview of XQuery's features and resources for learning more.'] "The W3C is finalizing the XQuery specification, aiming for a final release in late 2002. XQuery is a powerful and convenient language designed for processing XML data. That means not only files in XML format, but also other data including databases whose structure -- nested, named trees with attributes -- is similar to XML. XQuery is an interesting language with some unusual ideas. This article provides a high-level view of XQuery, introducing the main ideas you should understand before you go deeper or actually try to use it. The first thing to note is that in XQuery everything is an expression which evaluates to a value. An XQuery program or script is just an expression, together with some optional function and other definitions. So 3+4 is a complete, valid XQuery program which evaluates to the integer 7. There are no side-effects or updates in the XQuery standard, though they will probably be added at a future date. The standard specifies the result value of an expression or program, but it does not specify how it is to be evaluated. An implementation has considerable freedom in how it evaluates an XQuery program, and what optimizations it does. XQuery borrows path expressions from XPath. XQuery can be viewed as a generalization of XPath. Except for some obscure forms (mostly unusual 'axis specifiers'), all XPath expressions are also XQuery expressions. For this reason the XPath specification is also being revised by the XQuery committee, with the plan that XQuery 1.0 and XPath 2.0 will be released about the same time...
One difference to note between XPath and XQuery is that XPath expressions may return a node set, whereas the same XQuery expression will return a node sequence. For compatibility these sequences will be in document order and with duplicates removed, which makes them equivalent to sets. XSLT is very useful for expressing very simple transformations, but more complicated stylesheets (especially anything with non-trivial logic or programming) can often be written more concisely using XQuery..." See "XML and Query Languages."
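The document-order point can be demonstrated with Python's standard library: the ElementTree module supports a limited XPath subset (used here as a stand-in, since no XQuery engine ships with the standard library), and its results come back as a sequence in document order, never reordered or deduplicated into a set:

```python
import xml.etree.ElementTree as ET

# A path expression over this document selects the <book> elements as a
# sequence in document order -- the behavior XQuery/XPath 2.0 guarantees
# for compatibility with XPath 1.0 node sets.
doc = ET.fromstring(
    "<books>"
    "<book year='1999'>SGML Handbook</book>"
    "<book year='2002'>XQuery Primer</book>"
    "<book year='2001'>Schema Design</book>"
    "</books>"
)
titles = [b.text for b in doc.findall("book")]
print(titles)  # ['SGML Handbook', 'XQuery Primer', 'Schema Design']
```

Note the result order follows the document, not the `year` attribute values; a real XQuery engine would let you reorder with a FLWOR `order by` clause.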

  • [October 18, 2002] "Beep BEEP!" By Rich Salz. From XML.com (October 16, 2002). ['In this month's XML Endpoints column Rich Salz concludes his look at methods for transporting binary data in SOAP, with an examination of BEEP. Rich ends with a plea to Microsoft's Don Box to consider BEEP to overcome the limitations of HTTP in web services.'] "This article is the last in a series examining how one might go about sending binary data as part of a SOAP message. This month we look at BEEP, the Blocks Extensible Exchange Protocol. The primary inventor of BEEP is Marshall Rose, a long-time 'protocol wonk' within the IETF community. Marshall has authored more than 60 RFCs, covering everything from core SNMP details to a DTD for IETF RFCs. Like SOAP 1.2, BEEP is described in a transport-neutral manner and is defined in RFC 3080. The most common transport is TCP, and the TCP mapping is found in RFC 3081... BEEP actually addresses a wider range of problems than just SOAP and binary data. According to the RFC, BEEP is 'a generic application protocol kernel for connection-oriented, asynchronous interactions.' In other words, it's like an application-level transport protocol. Unlike many application protocols, it supports more than just lockstep request-response interchanges, which makes it suitable for applications such as event logging (see RFC 3195, for 'Reliable Syslog'), and general peer-to-peer interactions. One of the more interesting implications of BEEP's design principles is that it supports multiplexing -- that is, multiple channels communicating over a single transport stream. BEEP allows different applications, or multiple instances of the same application, to use the same stream for independent activities, including independent security mechanisms. For example, a single BEEP-over-TCP link between a browser and a web server would efficiently allow the browser to fetch a page over SSL-encrypted HTTP, while also streaming in multiple images over unencrypted HTTP.
A BEEP message is the complete set of data to be exchanged. BEEP defines three styles of message exchange, distinguished by the type of reply the server (or, more accurately, the message recipient) returns: (1) Message/reply, the server performs a task, and sends a positive reply back. (2) Message/error, the server does not perform the task, and sends a negative reply back. (3) Message/answer, the server sends back zero or more answers, followed by a termination indicator. This style is appropriate for streaming data, for example..." See: "Blocks eXtensible eXchange Protocol Framework (BEEP)."
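The three exchange styles can be sketched as a toy model (not a BEEP implementation; the frame labels RPY, ERR, ANS, and NUL follow RFC 3080's terminology, but everything else here is illustrative):

```python
# Toy sketch of BEEP's three exchange styles, distinguished by what the
# message recipient returns: one positive reply (RPY), one negative
# reply (ERR), or a stream of answers (ANS) closed by a NUL terminator.

def one_to_one_reply(handler, msg):
    """Message/reply or message/error: exactly one frame comes back."""
    try:
        return ("RPY", handler(msg))   # task performed, positive reply
    except Exception as exc:
        return ("ERR", str(exc))       # task refused, negative reply

def one_to_many(answers):
    """Message/answer: zero or more ANS frames, then a NUL terminator.
    Suitable for streaming data back to the requester."""
    for a in answers:
        yield ("ANS", a)
    yield ("NUL", None)                # termination indicator

print(one_to_one_reply(str.upper, "ping"))      # ('RPY', 'PING')
print(list(one_to_many(["chunk1", "chunk2"])))  # ends with ('NUL', None)
```

The message/answer style is what makes BEEP usable for streaming and event-logging applications, where the reply is not a single lockstep response.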

  • [October 18, 2002] "A Tour of 4Suite." By Uche Ogbuji. From XML.com (October 16, 2002). ['In his latest installment of Python and XML, Uche Ogbuji provides a tour of the core XML processing facilities of 4Suite, an XML application platform for Python.'] "Mike Olson and I began the 4Suite project in 1998 with the release of 4DOM, and it quickly picked up an XPath and XSLT implementation. It has grown to include Python implementations of many other XML technologies, and it now provides a large library of Python APIs for XML as well as an XML server and repository system. In this article and the next, I'll introduce just the basic Python library portion of 4Suite, which includes facilities for XML parsing (complementing PyXML), RELAX NG, XPath, XPatterns, XSLT, RDF, XUpdate and more. If you are unfamiliar with any of these technologies, see the resources section at the end where I provide relevant pointers. Finally, after reviewing 4Suite, I'll summarize events in the Python-XML world since the last article... In the general case, the only prerequisite for 4Suite is Python 2.1 or more recent. PyXML is required if you wish to parse XML in DTD validation mode, or if your Python install does not have pyexpat built in (many Python distributions do)..." Note also the O'Reilly publication Python & XML, by Christopher A. Jones and Fred L. Drake, Jr. See: "XML and Python."

  • [October 18, 2002] "The Digital Talking Book." By Ken Pittman. From XML.com (October 16, 2002). ['We have an investigation of how XML is being used to implement the Digital Talking Book and enhance talking book facilities available to the visually impaired.'] "The talking book is not new. We've enjoyed talking books for nearly as long as we've had recording devices. The question being asked today is whether the use of XML can permit the Digital Talking Book (DTB) to provide new ways of managing information, particularly for the seeing and seeing-impaired communities. This article will look at the technical elements of DTBs: the standards that facilitate their use, effective information management, copyright and security issues, and future applications. The core application of DTB is the conversion of a book into speech. In predigital applications, a narrator reads the book and the publisher reproduces the narration with a commercial replay device. Analog output provides users with basic functionality: play, fast-forward, stop, and rewind. Any charts, graphs, and visual aids used in the original book are lost in analog, linear narrations. That's especially problematic for scientific texts. Despite significant efforts by Recording For the Blind & Dyslexic and the National Library Service for the Blind and Physically Handicapped (NLS), a division of the U.S. Library of Congress, less than 10% of published books ever make it into an accessible format... The US Congress is considering the Instructional Material Accessibility Act. Still in committee, this bill would require material used in educational institutions to provide or produce an electronic output of the text compatible with a given DTB standard to facilitate a wider use of textbook materials within the visually challenged community. Publishers and educational institutions are supporting the bill.
Whether the audio portions of books prove to be a marketable opportunity for publishers is yet to be determined. What is clear is that the visually-impaired community is receiving support from the highest institutions to produce much more robust reading solutions for both educational and leisure reading..." See: "NISO Digital Talking Books (DTB)."

  • [October 18, 2002] "Do Web Standards and Patents Mix?" By David Clark. In IEEE Computer Volume 35, Number 10 (October 2002), pages 19-22. "... a debate has erupted between those who say the Internet should be built of freely available, standardized technology components and those who argue that useful technology development should be fairly compensated via royalties or licensing fees. Some observers, such as Tim O'Reilly, founder of O'Reilly & Associates, a technology publisher, say this controversy threatens the Internet's future. They say that patent-based standards requiring royalty payments will slow or discourage adoption of new technologies, except by richer companies, thereby inhibiting development of the Web. They also say that these standards give patent holders too much power over technologies and how they are used. Other observers contend it was inevitable that as the Web evolved into a potential way to generate revenue, companies would want to retain and sell new technologies they develop, rather than releasing them for general use. Moreover, they say, the ability to patent and profit from their work provides an incentive for companies to continue developing new technologies. Recent issues have demonstrated just how controversial the use of patented technologies in computer-industry standards can be. For more information, see the four case study sidebars. [W3C Patent Policy Framework, Web Services, MPEG-4, JPEG] ... According to Professor Mark Lemley of the University of California, Berkeley's Boalt Hall School of Law, standards organizations often require participants to sign an agreement to license, either on royalty-free or reasonable and nondiscriminatory (RAND) terms. However, he noted, 'It is not clear what [RAND] obligations mean in practice.'
Meanwhile, most organizations discourage companies from discussing the specifics of their patents or possible licensing terms during the standards development process, to accelerate the proceedings and avoid the appearance of collusion among working group members. Lemley noted that not discussing licensing details in advance can lead to problems if an organization adopts a patent-based standard but then disagrees with the patent holder over what constitutes reasonable terms. This issue is important because the terms of a standards organization's RAND requirements don't generally specify what 'reasonable and nondiscriminatory' fees are. This can lead to disputes between the organizations and patent holders, explained W3C staff member Philippe Le Hégaret... Some say patents have no place in such a rapidly changing field. Because of the financial burden and usage limitations they create, said JPEG member Richard Clark, CEO of the Elysium Ltd. Web consultancy, 'patents are a direct threat to smaller businesses, individual entrepreneurs, and, in particular, the [open-source] software movement.' Also, he explained, an information-technology patent generally lasts longer than the technology itself, thereby seriously limiting marketplace participation and competition. Other opponents say that patents will halt the kind of free and rapid development needed for tomorrow's Internet. Patents can be exclusionary and thereby restrict the cooperation necessary to evolve core Internet technologies, said Giga's Telleen. Companies have plenty of opportunity to make money without restricting basic Internet-infrastructure innovation and development, he explained. However, MPEG LA's Horn and others say licensing fees can be fair, reasonable, and acceptable, and that the current system for using patented technology in standards will correct itself.
'The market will eventually create the balancing points where reasonable sellers are willing to make patented technology available in open markets to reasonable buyers who are willing to pay a price that they can absorb in their product development,' he explained..." See: "Patents and Open Standards."

  • [October 18, 2002] IPTC NewsML NewsAgency Implementation Guidelines. From IPTC (International Press Telecommunications Council; Comité International des Télécommunications de Presse). Edited/Compiled by David Allen, Managing Director, IPTC. 32 pages. Version 1.1. October 2002. Updated to refer to the NewsML v1.1 DTD and Schema. "NewsML is an extensible management format for news information of all types. It consists of a single XML Document Type Definition (DTD) but allows a number of different types of document instances to be valid against the DTD. The document types are NewsML with contained NewsItem and NewsComponents, TopicSets and Catalogs. A Catalog is used by a provider to give information to users on how to locate the various parts of NewsML that are used in the providers services. The TopicSets provide data in controlled vocabularies to populate those parts of NewsML where vocabulary information is required, in particular DescriptiveMetadata. This Guideline provides advice on good practices for News Agencies using NewsML in their services... NewsML is intended to be used widely for news and has to exist in a standard form to allow providers and recipients to process the information in a consistent manner. NewsML has been Trademarked by IPTC and the published DTD must be used to validate all instances claiming to be NewsML..." Also available in HTML format. See: (1) the NewsML website; (2) the 2002-10-18 news item, "International Press Telecommunications Council Approves NewsML Version 1.1 Specification"; (3) general references in "NewsML." [cache]
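The document types named in the guideline can be arranged into a skeletal instance. This is only an illustrative shape built with Python's standard library: a real instance must validate against the published NewsML 1.1 DTD, and the attribute names used here (Href, FormalName) are assumptions for the sketch:

```python
import xml.etree.ElementTree as ET

# Skeletal NewsML-style instance showing only the document types named
# above: NewsML containing a Catalog, a NewsItem with a NewsComponent,
# and a TopicSet. Not DTD-valid NewsML -- structure is illustrative.
newsml = ET.Element("NewsML")
ET.SubElement(newsml, "Catalog", Href="http://example.org/catalog.xml")
item = ET.SubElement(newsml, "NewsItem")
ET.SubElement(item, "NewsComponent")
ET.SubElement(newsml, "TopicSet", FormalName="Genre")

serialized = ET.tostring(newsml, encoding="unicode")
print(serialized)
```

In a real service the Catalog would point users at the provider's TopicSets and other resources, and the TopicSet would carry the controlled-vocabulary entries that populate DescriptiveMetadata.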

  • [October 17, 2002] "Yes, You Can Secure Your Web Services Documents, Part 2. XML Signature Ensures Your XML Documents' Integrity." By Ray Djajadinata. In Java World (October 11, 2002). ['In the first installment of this two-part series, Ray Djajadinata discussed the foundation of confidentiality for WS-Security: XML Encryption. In this second installment, he introduces XML Digital Signature, a standard that handles a document's integrity. He explains the standard, what you should know about it, and shows how to write XML Signature code using an implementation currently available: IBM XML Security Suite.'] "In this article, we look closer at the integrity part of the equation: XML Signature, which turns out to be a bit more complicated than its confidentiality counterpart... The main difference between XML Signature and other digital signature standards such as PKCS#7 (Public-Key Cryptography Standard #7) is that it is XML-aware. That is, with XML Signature, not only can you sign arbitrary digital content as a whole, but you can also sign just specific parts of XML documents. This proves useful when parts of XML documents can be moved, copied, encrypted, signed, and verified in any order, which is quite likely to happen as we move more towards shuffling around XML messages instead of binary payload over the wire. However, as you will see later, XML awareness doesn't come free. Compared to XML Signature, XML Encryption is relatively straightforward. You select some nodes to encrypt, and that's basically it. You decrypt the ciphertext with the correct key, and you retrieve the plaintext -- as simple as that. Not so with XML Signature! XML Signature is more complicated because signatures need to be verified correctly... As of now, XML Signature is already more developed than its cousin XML Encryption. 
The official Java API for XML Signature should be coming out soon, so be on the lookout; JavaOne 2002 featured a high-level overview of the API, and from the look of it, I gather it will be quite easy to learn and use. For example, the API features a class called XMLSignatureMarshaller, which converts XML Signatures to/from XML. If this is not the template feature we've come to know, I don't know what is. The API also includes Transforms, URIDereferencer, and so on, whose names, after reading this article, should clue you in to their functionalities. Last but not least, XML Signature operates under the assumption that key distribution and registration have already been handled. This article's examples are trivial -- we created the key and then used it to verify the signatures. In the real world, things are definitely more complicated. Keys must be registered and distributed properly, and trust relationships must be established. XKMS (XML Key Management Specification), related to Java Specification Request 104, tries to address these issues. In any case, many new and interesting things are emerging from XML security, and with its strong tie to Web services security, you can bet an official Java API will follow soon..." See also Part 1, "XML Encryption Keeps Your XML Documents Safe and Secure." General references in "XML Digital Signature (Signed XML - IETF/W3C)."
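A large part of why XML awareness "doesn't come free" is canonicalization: before digesting, the signed XML must be reduced to a canonical form so that equivalent serializations verify identically. The sketch below shows just that canonicalize-then-digest step with Python's standard library (not a full XML Signature implementation; `ET.canonicalize` requires Python 3.8+):

```python
import hashlib
import xml.etree.ElementTree as ET

# Two serializations of the same infoset: attribute order and incidental
# whitespace inside the tag differ, but Canonical XML normalizes both.
doc_a = '<order id="1" currency="EUR"><item>book</item></order>'
doc_b = '<order currency="EUR" id="1" ><item>book</item></order>'

def digest(xml_text):
    c14n = ET.canonicalize(xml_text)   # W3C Canonical XML form
    return hashlib.sha1(c14n.encode()).hexdigest()

# Same canonical form, so the digests (and hence signatures over them) match.
print(digest(doc_a) == digest(doc_b))  # True
```

A real XML Signature would then sign this digest with a private key and record the canonicalization and digest algorithms in the SignedInfo element; the point here is only why the canonicalization step is mandatory.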

  • [October 17, 2002] "Finessing PKI. [Web Services Development.]" By Jon Udell. In InfoWorld (October 17, 2002). "Last week I attended a conference on digital identity, and I came away with some new perspectives on PKI... There is still some reason to think that today's card issuers -- governments, banks -- may yet become tomorrow's identity providers. Phil Windley, CIO of the state of Utah, thinks so. 'Like it or not, we are in the identity business,' he said at the conference. In the United States, the driver's license is the gold standard. Phil told me that he thinks it will eventually morph into a general-purpose smartcard ID which, to avoid the stigma of a national ID, will retain a state affiliation, but in practice will federate. Several other presenters at the conference thought that banks are the more likely source of digital IDs. Either way, the notion is that deployment of digital IDs, by way of these card-issuing entities, will create an identity platform that developers of other kinds of applications can build on. Maybe that will happen, but it's pure speculation for now. Meanwhile, it's getting harder to ignore the flurry of Web SSO (single sign-on) schemes and the emerging consensus around SAML (Security Assertion Markup Language). Passport is leaning in that direction, and at the conference, Craig Mundie announced that Microsoft will expand its shared source licensing program to facilitate integration with Passport. The SSO scheme proposed by the Liberty Alliance uses SAML to federate identity. So does Shibboleth, which is part of the Internet2 Middleware Initiative and has traction in higher-ed circles. All these systems are agnostic about the method of authentication. The credential can be a name and password, a digital ID, or biometric evidence. If you let go of the idea that a digital ID is a necessary means of authentication, you can think about targeted and strategic deployment of PKI.
Another speaker at the conference was Jamie Lewis, CEO of the Burton Group and a leading analyst of enterprise directory services. He suggests an interesting evolutionary path. PKI-based trust relationships might, in the near term, be most practical among institutions rather than individuals. Rather than issue certificates to every employee, for example, a company might certify itself and then sign SAML assertions on behalf of its employees who might still authenticate by name/password..."

  • [October 17, 2002] "Thinking XML: Shedding Light on PRISM. A Standard Metadata Vocabulary for Publishing." By Uche Ogbuji (Principal Consultant, Fourthought, Inc). From IBM developerWorks, XML zone. October 2002. ['PRISM is a standard for metadata related to publishing. It allows the formal description of content and related resources by providing standardized properties, controlled vocabularies, and extensibility mechanisms that enable users to define their own controlled vocabularies. In this column, Uche Ogbuji introduces PRISM by example.'] "The various industries related to publishing were among the earliest to support XML and to explore its value in practice. This is not surprising, as the publishing industry has been a stalwart of SGML, the parent of XML. The Information and Content Exchange protocol, or ICE, emerged in 1998 as one of the earliest major industry standards to use XML. ICE is a protocol for directing the distribution of content electronically to various partners presenting the content on the Internet. XML is well-suited to another important requirement in the publishing industry: content metadata management. ICE provides the mechanism for exchanging content, but even the ICE specification admits that there needs to be a formal means for describing that content. To meet this need, the publishing industry has developed Publishing Requirements for Industry Standard Metadata (PRISM), an XML metadata standard for directing the processing of content. PRISM covers a wide variety of content, from catalogs to books -- and a wide variety of media, from various forms of electronic publishing to various forms of print. PRISM is being developed by a working group of IDEAlliance (formerly known as GCA), a consortium of publishers involved with electronic technological infrastructure. PRISM members include technology vendors such as Adobe Systems, Inc., news agencies such as Reuters, and publishers such as Condé Nast.
In this article, I introduce PRISM, focusing on the current draft of the PRISM 1.2 specification. Readers should be familiar with XML and RDF..." See also the following bibliographic entry, the recent announcement, and "Publishing Requirements for Industry Standard Metadata (PRISM)."

  • [October 17, 2002] "PRISM: Publishing Requirements for Industry Standard Metadata." By IDEAlliance and the PRISM Working Group. Specification Version 1.2 (e). First Public Draft. September 4, 2002. 95 pages. "The Publishing Requirements for Industry Standard Metadata (PRISM) specification defines a standard for interoperable content description, interchange, and reuse in both traditional and electronic publishing contexts. PRISM recommends the use of certain existing standards, such as XML, RDF, the Dublin Core, and various ISO specifications for locations, languages, and date/time formats. Beyond those recommendations, it defines a small number of XML namespaces and controlled vocabularies of values, in order to meet the goals listed above. The PRISM working group, a joint effort of representatives from publishers and vendors in an initiative organized under IDEAlliance, prepared this specification... PRISM's scope is driven by the needs of publishers to receive, track, and deliver multi-part content. The focus is on additional uses for the content, so metadata concerning the content's appearance is outside PRISM's scope. The working group focused on metadata for: (1) General-purpose description of resources as a whole (2) Specification of a resource's relationships to other resources (3) Definition of intellectual property rights and permissions (4) Expressing inline metadata -- that is, markup within the resource itself. Like the ICE protocol, PRISM is designed to be straightforward to use over the Internet, support a wide variety of applications, not constrain data formats of the resources being described, conform to a specific XML syntax, and be constrained to practical and implementable mechanisms." Version 1.2e Status: "This is the first public draft for the 1.2 release of the PRISM Metadata specification. Several elements have been added, and some existing elements renamed.
Many of the definitions of the elements have been rewritten for clarity and precision, and more examples of their use have been provided. Future drafts of the 1.2 specification will add even more examples of the use of the PRISM elements. The 1.2 specification makes a number of changes relative to the 1.0 version: (1) The 'profile' was removed; a new section was added which discusses how this spec may be used by other specifications, such as the article DTD. (2) Additions: the working group decided to add the Section, Page, Volume, Number, IssueName, and Edition elements; the terms stockQuote, newsResult, and portrait were added to the controlled vocabulary of content genre. (3) xml:lang: PRISM has strengthened its recommendation around the use of xml:lang to indicate the language of the metadata record. (4) namespace URLs: PRISM namespaces and vocabularies were updated from 1.0 to 1.2." See: (1) the announcement of September 10, 2002: "PRISM Gains Traction and Announces Companion Specifications. Working Group Members Hearst, LexisNexis, Time Inc. and Others Develop Standardized Markup to Automate Common Publishing and Content Processes."; (2) "Publishing Requirements for Industry Standard Metadata (PRISM)." [cache v1.2e]
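A PRISM description record pairs Dublin Core properties with elements from PRISM's own namespace. The sketch below is illustrative only: the Dublin Core namespace URI is standard, but the PRISM namespace URL and the element names prism:volume and prism:number are assumptions modeled on the Volume and Number additions described above, not copied from the 1.2 draft:

```python
import xml.etree.ElementTree as ET

DC = "http://purl.org/dc/elements/1.1/"
# Assumed PRISM namespace URL for this sketch (check the 1.2 draft for
# the authoritative value).
PRISM = "http://prismstandard.org/namespaces/basic/1.2/"
ET.register_namespace("dc", DC)
ET.register_namespace("prism", PRISM)

# A minimal description record mixing Dublin Core and PRISM-style
# properties for a magazine article.
desc = ET.Element("Description")
ET.SubElement(desc, f"{{{DC}}}title").text = "Finessing PKI"
ET.SubElement(desc, f"{{{DC}}}creator").text = "Jon Udell"
ET.SubElement(desc, f"{{{PRISM}}}volume").text = "24"
ET.SubElement(desc, f"{{{PRISM}}}number").text = "42"

xml_text = ET.tostring(desc, encoding="unicode")
print(xml_text)
```

In a real PRISM record this Description would sit inside an RDF wrapper and identify the resource being described with an rdf:about URI.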

  • [October 17, 2002] "XML Schema for Media Control." By Orit Levin (RADVISION), Sean Olson (Microsoft Corporation), and Roni Even (Polycom). IETF Internet Draft. Reference: 'draft-levin-mmusic-xml-media-control-00.txt'. October 2002, expires April 2003. "This document defines an XML Schema for Media Control in a tightly controlled environment. The current version includes commands for managing video streams only. Implementation of this schema for interactive video applications in SIP environments significantly improves the user experience. Both end users and conferencing servers should implement this approach. SIP typically uses RTP for transferring real-time media. RTP is augmented by a control protocol (RTCP) to allow monitoring of the data delivery in a manner scalable to large multicast networks. An RTCP feedback mechanism has been introduced in order to improve basic RTCP feedback time in case of loss conditions across different coding schemes... As the first step, the defined XML document will be conveyed using a SIP INFO method with the Content-Type set to application/xml. This approach benefits from SIP's built-in reliability. The authors plan to register the defined schema with IANA according to the guidelines specified in 'The IETF XML Registry' and to issue separate SIPPING usage document(s). The document(s) will describe procedures for conveying an XML document defined according to the Schema by means of SIP INFO and SIP NOTIFY. The authors hope that the XML schema documented here will provide a base for a standard Tight Media Control protocol definition within the IETF. It is expected that in the future SIP will define standard means for running this protocol as part of the SIP architecture..." [cache]
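The transport proposal -- a media-control XML document carried in a SIP INFO request with Content-Type application/xml -- can be sketched as plain message assembly. The body's element names and all addresses below are assumptions for illustration, not the draft's normative schema:

```python
# Sketch: wrap a (hypothetical) media-control XML document in a SIP INFO
# request, as the draft proposes. Addresses and tags are illustrative.
body = (
    '<media_control>\n'
    '  <vc_primitive>\n'
    '    <to_encoder><picture_fast_update/></to_encoder>\n'
    '  </vc_primitive>\n'
    '</media_control>\n'
)

headers = [
    "INFO sip:remote@example.com SIP/2.0",
    "Via: SIP/2.0/UDP host.example.com",
    "From: <sip:local@example.com>;tag=1",
    "To: <sip:remote@example.com>;tag=2",
    "Call-ID: abc123@host.example.com",
    "CSeq: 1 INFO",
    "Content-Type: application/xml",
    f"Content-Length: {len(body.encode('utf-8'))}",
]
request = "\r\n".join(headers) + "\r\n\r\n" + body
print(request)
```
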

  • [October 17, 2002] "WS-I to Expand Its Board." By Stephen Lawson. In InfoWorld (October 17, 2002). "The Web Services Interoperability Organization (WS-I) said Thursday [2002-10-17] it will add two seats to its board of directors, a move that may help heal a rift between the group and Sun Microsystems. The group of more than 150 companies, formed in February to foster interoperability of Web services software from different vendors, Thursday announced it will elect two new board members next March. The two new board members will be elected by the full membership and will have the same rights and responsibilities as the current nine board members, but they will serve limited terms. The nine current board members, including Microsoft and IBM, are permanent. The board expansion was proposed in June and has now been approved by the group's membership. The news brought a guarded welcome from Sun, based in Santa Clara, Calif., which in the past had indicated it wanted to be brought into the group as a permanent board member. WS-I expects to release by the end of this year a set of guidelines for how to use Web services standards in an interoperable way, a set of tools developers can use to test their software, and sample Web services applications. The new board seats were created to let more member companies participate in guiding the organization, said Rob Cheng, chairman of the marketing and communications committee of WS-I. The board is responsible for maintaining the goals and objectives of the organization and creating new working groups, which tackle particular issues, he said. Nominations will be accepted from Jan. 1, 2003 through Feb. 15, 2003. The new board seats will be open to any company that has been a member of WS-I for at least 90 days and belonged to at least one of its working groups for 60 days, Cheng said..." See: (1) the announcement, "WS-I Community Approves Board Expansion. 150+ Member Community Votes to Add Two Elected Board Seats. 
Nominations Set for January 2003."; (2) general references in "Web Services Interoperability Organization (WS-I)."

  • [October 17, 2002] "Overview of the 2002 IAB Network Management Workshop." By Juergen Schoenwaelder (University of Osnabrueck). IAB Internet-Draft. October 09, 2002. Reference: 'draft-iab-nm-workshop-00.txt'. "This document provides an overview of a workshop held by the Internet Architecture Board (IAB) on Network Management. The workshop was hosted by CNRI in Reston, VA, USA from June 4 through June 6, 2002. The goal of the workshop was to continue the important dialog which has been started between network operators and protocol developers and to provide advice to the IETF on where future work on network management should be focussed. This report summarizes the discussions and lists the conclusions and recommendations to the Internet Engineering Task Force (IETF) community. Some vendors started in the late 1990s to use the Extensible Markup Language (XML) for describing device configurations and for protocols that can be used to retrieve and manipulate XML formatted configurations: (1) XML is a machine readable format which is easy to process, and there are many good off-the-shelf tools available. (2) XML can describe structured data of almost arbitrary complexity. (3) The basic syntax rules behind XML are relatively easy to learn. (4) XML provides a document-oriented view of configuration data. (5) XML has a robust schema language, XSD, for which many good off-the-shelf tools exist. (6) XML alone is just syntax. XML schemas must be carefully designed to make XML truly useful as a data exchange format. (7) XML is rather verbose. This either increases the bandwidth required to move management information around (which is an issue in e.g. wireless or asymmetric cable networks) or it requires that the systems involved have the processing power to do on-the-fly compression/decompression. (8) There is a lack of commonly accepted, standardized, management-specific XML schemas. ... 
The workshop recommends with strong consensus from the operators and rough consensus from the protocol developers that the IETF/ IRTF should spend resources on the development and standardization of XML-based device configuration and management technologies (such as common XML configuration schemas, exchange protocols and so on)..."
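Point (7), XML's verbosity, is easy to demonstrate: a repetitive configuration document compresses dramatically, which is the trade-off the workshop notes between bandwidth and on-the-fly compression. The element names below are invented, not any vendor's configuration schema:

```python
import gzip

# A repetitive (invented) interface configuration, of the kind the
# workshop discusses: XML is verbose, but highly compressible.
entries = "".join(
    f"<interface><name>eth{i}</name><mtu>1500</mtu>"
    f"<enabled>true</enabled></interface>"
    for i in range(50)
)
config = f"<config><interfaces>{entries}</interfaces></config>".encode()

compressed = gzip.compress(config)
ratio = len(compressed) / len(config)
print(f"{len(config)} bytes raw, {len(compressed)} compressed ({ratio:.0%})")
```
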

  • [October 17, 2002] "Weekly Review: Microsoft Faces Web Services Threat." By Phil Wainewright. In ASPnews.com (October 14, 2002). ['In securing its position within the enterprise, Microsoft has all but surrendered the one territory that really matters -- the hosted server.'] "XDocs is being billed as a forms-creation tool, but a more accurate description would be to call it an application development tool for information workers. What XDocs does is much more revolutionary than merely creating electronic versions of paper-based forms to be filled in on-screen. The important technology of XDocs isn't on the screen, it's what happens behind it. Building a form in XDocs automatically creates an XML-based query and database structure. Users can build forms that read in data from remote databases, or which collect data that can be saved to a database or passed on to other forms and applications for further processing. When I saw XDocs, I immediately thought of two other technologies that have impressed me recently. One is Macromedia's MX architecture, which makes it easy for developers to create user-friendly complex forms that connect to remote data sources. The other is the connectable modules developed by U.K.-based ASP Xara Online, which allow nontechnical users to construct complex forms, reports and applications using a completely visual interface. But while MX has the back-end connectivity and Xara has the ease of use, neither has the promised end-to-end elegance of XDocs, which of course will come conveniently packaged in the familiar Office user interface. Jupiter is notable for unifying the backend servers into which the Office 11 suite of applications -- XDocs included -- will link. Microsoft is very astute in bringing together three separate products to form this single platform. 
Whereas previously the activities of e-commerce, content management and B2B integration were treated as separate technology propositions, Web services architectures allow all these components to become part of a single business automation infrastructure, co-ordinated by emerging business process technologies such as the joint Microsoft-IBM specification BPEL4WS (Business Process Execution Language for Web Services)..." XDocs references: see the news item of 2002-10-09, "Microsoft 'XDocs' Office Product Supports Custom-Defined XML Schemas."

  • [October 16, 2002] "VoiceXML, CCXML, and SALT." By Ian Moraes. In XML Journal Volume 3, Issue 9 (September 2002), pages 30-34. "There's been an industry shift from using proprietary approaches for developing speech-enabled applications to using strategies and architectures based on industry standards. The latter offer developers of speech software a number of advantages, such as application portability and the ability to leverage existing Web infrastructure, promote speech vendor interoperability, increase developer productivity (knowledge of a speech vendor's low-level API and resource management is not required), and easily accommodate, for example, multimodal applications. Multimodal applications can overcome some of the limitations of a single-mode application (GUI or voice), thereby enhancing a user's experience by allowing the user to interact using multiple modes (speech, pen, keyboard, etc.) in a session, depending on the user's context. VoiceXML, Call Control eXtensible Markup Language (CCXML), and Speech Application Language Tags (SALT) are emerging XML specifications from standards bodies and industry consortia that are directed at supporting telephony and speech-enabled applications. The purpose of this article is to present an overview of VoiceXML, CCXML, and SALT and their architectural roles in developing telephony as well as speech-enabled and multimodal applications... Note that SALT and VoiceXML can be used to develop dialog-based speech applications, but the two specifications have significant differences in how they deliver speech interfaces. Whereas VoiceXML has a built-in control flow algorithm, SALT doesn't. Further, SALT defines a smaller set of elements compared to VoiceXML. While developing and maintaining speech applications in two languages may be feasible, it's preferable for the industry to work toward a single language for developing speech-enabled interfaces as well as multimodal applications. 
This short discussion provides a brief introduction to VoiceXML, CCXML, and SALT for supporting speech-enabled interactive applications, call control, and multimodal applications and their important role in developing flexible and extensible standards-compliant architectures. This presentation of their main capabilities and limitations should help you determine the types of applications for which they could be used. The various languages expose speech application technology to a broader range of developers and foster more rapid development because they allow for the creation of applications without the need for expertise in a specific speech/telephony platform or media server. The three XML specifications offer application developers document portability in the sense that a VoiceXML, CCXML, or SALT document can be run on a different platform as long as the platform supports a compliant browser. These XML specifications are posing an exciting challenge for developers to create useful, usable, and portable speech-enabled applications that leverage the ubiquitous Web infrastructure..." See: (1) "VoiceXML Forum"; (2) "Speech Application Language Tags (SALT)"; (3) "Voice Browser Call Control (CCXML)." [alt URL]
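A minimal VoiceXML dialog gives a feel for the declarative, form-based style the article describes. This sketch follows the general shape of the VoiceXML 2.0 drafts of the period; the form, field, and URL names are invented, and the document is only checked for well-formedness, not validated:

```python
import xml.etree.ElementTree as ET

# A minimal (invented) VoiceXML dialog: prompt for a field, then submit
# the collected value to a server. Checked for well-formedness only.
vxml = """<vxml version="2.0" xmlns="http://www.w3.org/2001/vxml">
  <form id="order">
    <field name="drink">
      <prompt>Would you like coffee or tea?</prompt>
      <grammar src="drink.grxml" type="application/srgs+xml"/>
    </field>
    <block>
      <submit next="http://example.com/order" namelist="drink"/>
    </block>
  </form>
</vxml>
"""

root = ET.fromstring(vxml)
print(root.tag)  # the namespace-qualified document element
```
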

  • [October 16, 2002] "Introducing Topic Maps. A Powerful, Subject-Oriented Approach to Structuring Sets of Information." By Kal Ahmed (Techquila). In XML Journal Volume 03, Issue 10 (October 2002). "Topic maps are a standard way of representing the complex relationships that often exist between the pieces of information that we use in day-to-day business processes. This article begins by discussing what topic maps are, what they can do, and what people are currently using them for. However, my main goal is to introduce the basic concepts of topic maps and their representation in XML... Almost all XML vocabularies are designed with a single purpose: to describe information in a way that enables automated processing. We use XML to describe document structures so our documents can be rendered as HTML, WML, PDF, or some other presentation format. We also use XML so that business systems can interchange data reliably. Both the ability to render content to different output formats and the reliability of data interchange arise from a predefined agreement about how individual pieces of markup describe the information they wrap... Current applications can be broken down into three major purposes: (1) A way to improve accessibility to information; (2) A flexible, extensible data structure for standalone applications; (3) An integration layer between existing applications. 'Topic maps for information accessibility' is by far the largest category of current applications of topic maps. Topic maps are being used as the structuring paradigm for portals to all sorts of information. Because they're subject oriented, they provide an intuitive way for users to find their way around large sets of information. Using a topic map, the information provider can create a high-level overview of the concepts covered by the documents. Users can then navigate the overview to find the subject area of interest before accessing the related documents. 
You can also go the other way - from a document to a list of all subjects in the document and from there to other documents related to the same subject(s). Because all of this structure is explicit in the topic map, navigating from subject to document and back again can be done without requiring the user to construct search terms. Many commercial topic map tools focus on making information more findable and presenting navigation structures for topic maps. Thus, once you have your information structured using topic maps, it's possible to get a configurable, off-the-shelf application to do the work of presenting that structure to the end users. The topic map paradigm is a powerful, subject-oriented approach to structuring sets of information. The building blocks of topic maps are topics that have names and point to occurrences of the subject in external resources. Associations relate topics to each other. Each topic in an association is the player of a defined role. Topics, occurrences, and associations can all be typed by other topics. In addition, topic names, occurrences, and the roles they play in associations can be specified to be valid only in a given context by the use of a scope, which is defined by a collection of topics..." See: "(XML) Topic Maps." [alt URL]
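The building blocks summarized above -- topics with names, occurrences pointing into external resources, and typed associations in which each topic plays a role -- can be mimicked with a toy in-memory model. All names and URLs below are invented; this is a sketch of the concepts, not a topic map API:

```python
# Toy model of topic map building blocks: topics, occurrences, and
# typed associations with named roles. All names are invented.
class Topic:
    def __init__(self, name):
        self.name = name
        self.occurrences = []   # URLs of resources about this subject

class Association:
    def __init__(self, assoc_type, **roles):
        self.type = assoc_type  # the association's type is itself a topic
        self.roles = roles      # role name -> Topic playing that role

puccini = Topic("Puccini")
tosca = Topic("Tosca")
tosca.occurrences.append("http://example.com/operas/tosca.html")

composed_by = Topic("composed by")
link = Association(composed_by, composer=puccini, work=tosca)

# Navigate subject -> document without constructing search terms:
docs = link.roles["work"].occurrences
print(docs)
```
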

  • [October 16, 2002] "Combining the Power of XQuery and XSLT. Toward Fulfilling the Promise of XML." By Jim Gan (Ipedo). In XML Journal Volume 03, Issue 10 (October 2002). With 5 figures. "Although XSLT and XQuery are designed to meet different requirements, there are similarities between them... I'll briefly compare XQuery and XSLT in terms of data model, content construct, modularization, and expressive power... The XML access data model in the match attribute and select attribute of XSLT instructions is based on the XPath 1.0 data model, which is basically a tree structure with nodes. There are four data types in the XPath 1.0 data model: node-set, string, number, and boolean. There are seven XPath nodes: root node, element node, text node, attribute node, processing instruction node, namespace node, and comment node. Note that XPath nodes are slightly different from the familiar W3C DOM nodes. There's no namespace node in DOM, nor is there a CdataSection or EntityReference in XPath nodes. The root node in the XPath node model corresponds to a document in DOM. Element, attribute, processing instruction, comment, and text in the XPath 1.0 model correspond one to one to their counterparts in DOM. XQuery is a strongly typed query language, with a data model based on XPath 2.0. It has document node, element node, attribute node, namespace node, processing instruction, comment, and text node. Document node is essentially the same as root node in XPath 1.0. Each node has a node identity concept and a type, which is derived from the corresponding XML Schema definition of the node. Instead of node set, sequence is introduced in the XQuery data model. A sequence is an ordered collection of zero or more items. An item may be an atomic value or a node. Both data models have the same document order concept. The document order of nodes is defined in order of node appearances in the physical XML document... XSLT and XQuery are two key technologies for XML processing. 
XSLT is good for transforming a whole XML document into another XML document with a different schema or into a formatted output for presentation. XQuery can be used for XML-to-XML transformation too, but not for formatting XML documents. Like SQL for relational data, XQuery is a powerful language for intelligently searching data stored as XML. Another key feature of XQuery is that it can query various data sources as long as they can be viewed as XML. More and more existing non-XML information is being converted into XML, and all XML technologies can be used together for various applications. To fulfill the promise of XML, a complete XML platform is needed to support these technologies so as to gain real business benefits out of them. This includes XML parsing, XML Schema validation, XSLT, XQuery, XML data storage, and indexing. Supporting some XML technologies at the toolkit level, or one XML technology at a time, just isn't good enough..." See "XML and Query Languages." [alt URL]
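The data-model contrast the article draws -- a shared document-order concept, duplicate-free node-sets in XPath 1.0, and ordered sequences that may mix nodes and atomic values in XQuery -- can be illustrated with ElementTree standing in for a real XPath/XQuery engine:

```python
import xml.etree.ElementTree as ET

doc = ET.fromstring("<r><a>1</a><b>2</b><a>3</a></r>")

# Document order, common to both data models: nodes in the order they
# appear in the physical document.
in_doc_order = [el.tag for el in doc.iter() if el is not doc]
print(in_doc_order)

# An XPath 1.0 node-set is a duplicate-free collection of nodes; an
# XQuery sequence is ordered and may mix nodes with atomic values.
node_set = set(doc.findall("a"))                    # nodes only, no duplicates
sequence = list(doc.findall("a")) + [42, "text"]    # ordered, mixed items
print(len(node_set), len(sequence))
```
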

  • [October 16, 2002] "XML and the Semantic Web. It's Time to Stop Squabbling -- They're Not Incompatible." By Jim Hendler and Bijan Parsia (University of Maryland). In XML Journal Volume 03, Issue 10 (October 2002). "... Every Web link is an often vague, usually ambiguous, and almost always underspecified assertion about the things it connects. RDF lets us eliminate that vagueness and nail down the ambiguity. RDF Schema (RDFS) and the new Web Ontology Language (OWL) allow us to model the meaning of our assertion links in precise, machine-processible ways. RDFS is a simple language for creating Web-based "controlled vocabularies" and taxonomies. The language defines several RDF predicates for making links between concepts. Most notably, RDFS defines class and property relations and provides a mechanism for restricting the domains and ranges of the properties. In RDFS, for example, we can express site-specific Yahoo-like Web site categories as a hierarchy of classes with sets of named (sometimes inherited) properties. This allows other sites to connect their own information to these terms, providing better interoperability for use in B2C, B2B, or other Web transactions... The Semantic Web is being built on models based on the RDF representation of Web links. To achieve their full impact, however, the enhanced models enabled by the Semantic Web crucially need to be tied to the document-processing and data-exchange capabilities enabled by the spread of XML technologies. If XML- and RDF-based technologies were incompatible, as some people seem to think they are, it would be a true shame. But, in fact, they aren't. While the underlying models are somewhat different, the normative document exchange format for RDF, RDFS, and OWL is XML. Thus, to those preferring to think of the whole world as XML based, RDF, RDFS, and OWL may simply be thought of as yet another XML language to be managed and manipulated using the standard toolkit. 
To the RDF purist, the documents and datasets being expressed in XML and XML Schema can anchor their models with interoperable data. To those focused on the world of Web services, SOAP and WSDL can carry, in their XML content, RDF models expressing information that can be easily found, linked, and discovered. Of course, as is the case with any groups doing overlapping tasks, there is friction between some in the RDF community and some in the XML world. RDF folks often complain about the (for them) superfluous intricacies of XML. XML experts shake their heads at the way the RDF/XML serialization abuses QNames and XML Namespaces and treats certain attributes and child elements as equivalent. However, these kinds of complaints are nothing new. In fact, they're common in the XML community itself: witness the fury that some XML people express over XSLT's use of QNames as attribute content (to pick one example). Similarly, the RDF world has plenty of dark and overcomplicated corners. Both sets of languages are also continuing to evolve, and each is also exploring new non-XML syntaxes (consider Relax-NG, XQuery, and XPath)... The Semantic Web offers powerful new possibilities and a revolution in function. These capabilities will arrive sooner if we stop squabbling and realize that the rift between the XML- and RDF-based languages is now down to the minor sorts of technical differences easily ironed out in the standards process or kludged by designing interoperable tools..." See: "XML and 'The Semantic Web'." [alt URL]
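The RDF side of the argument rests on a simple model: a graph of (subject, predicate, object) triples, over which RDFS semantics -- such as the transitivity of rdfs:subClassOf -- can be computed. A minimal sketch, with invented class names, approximating one such inference rule:

```python
# A toy RDF graph as a set of (subject, predicate, object) triples.
# Class names are invented; this approximates one RDFS inference rule.
triples = {
    ("Opera", "rdfs:subClassOf", "MusicalWork"),
    ("MusicalWork", "rdfs:subClassOf", "CreativeWork"),
    ("Tosca", "rdf:type", "Opera"),
}

def superclasses(cls):
    """Transitive closure of rdfs:subClassOf starting from cls."""
    found = set()
    frontier = {cls}
    while frontier:
        nxt = {o for (s, p, o) in triples
               if p == "rdfs:subClassOf" and s in frontier}
        frontier = nxt - found
        found |= nxt
    return found

# Tosca is an Opera, hence also a MusicalWork and a CreativeWork.
types = {"Opera"} | superclasses("Opera")
print(sorted(types))
```
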

  • [October 16, 2002] "Managing Juniper Networks' Routers Using XML. An XNM-Based RPC Mechanism." By Mark Ethan Trostler (Juniper Networks). In XML Journal Volume 03, Issue 10 (October 2002). With 6 figures. "XML is the ideal platform for network management. XML's self-describing, text-based syntax is able to encapsulate the complex, hierarchical, and volatile configuration of any networked device. Controlling a network full of devices with proprietary and constantly evolving configuration syntax requires a Herculean effort. XML and XML Schemas help by providing a canonical representation and description of each device's configuration data and model. Even better, the vast and growing array of XML tools makes it even simpler and more intuitive to work with any XML document. To leverage these inherent strengths, Juniper Networks implements an XML-based RPC mechanism with which all of Juniper's M- and T-series routers can be queried, configured, and managed. An idea this powerful deserves its own acronym: XNM (XML Network Management)... JUNOS and XNM: The XNM server software contains several features that make using the XNM API both flexible and robust. Using a 'commit-based' configuration model, a client writes configuration data to a 'candidate' configuration. A client configuring a router works on a shared or private candidate configuration. Once the configuring is done, a 'commit' operation must be issued to transfer the candidate configuration to the router's running configuration. By default, the candidate configuration is shared with anyone who happens to be configuring the router at any given instant... After our quick tour of the raw XNM API and protocol we never need to see it again. As there are many XML tools for many programming languages, as well as standalone applications, we can (and should) hide all these details behind a nice, clean, user-friendly API. Juniper has created an open-source, object-oriented Perl client. 
Several examples are included with the Perl client for doing basic operations like getting and setting configuration, as well as the Looking Glass Web application described below. Examples of canonicalizing namespaces are also provided. This client is meant as a library for programmers to use... Using the Perl client and the XSLT language, we provide a pretty Web interface to our router's most common operational parameters using surprisingly little code. Other, higher-level tools could also enhance the raw XNM protocol to provide just about any functionality to a single or large set of routers, from simple query-based tools to a full-blown NMS..." [alt URL] See also the Juniper Networks white paper "XML-based Network Management."
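The commit-based model described above -- clients stage edits in a candidate configuration, and only an explicit commit copies it to the running configuration -- can be sketched in a few lines. The class and method names here are invented and do not reflect the actual XNM protocol or Juniper's Perl client API:

```python
import copy

# Sketch of a commit-based configuration model: edits go to a candidate
# configuration and take effect only on commit. Names are invented.
class Router:
    def __init__(self, running):
        self.running = running
        self.candidate = copy.deepcopy(running)   # shared candidate

    def load(self, path, value):
        self.candidate[path] = value              # edits are staged only

    def commit(self):
        self.running = copy.deepcopy(self.candidate)

r = Router({"system/host-name": "r1"})
r.load("system/host-name", "core-1")
assert r.running["system/host-name"] == "r1"      # unchanged before commit
r.commit()
print(r.running["system/host-name"])
```
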

  • [October 16, 2002] "IBM Standardizing Workflow Capabilities in WebSphere." By John Fontana. In Network World (October 16, 2002). "IBM Wednesday said its WebSphere Application Server 5.0 would include a new Web services workflow engine that is a precursor to support of a proposed workflow standard it is developing with Microsoft... In WebSphere 5.0, which IBM will announce next month and ship in December, the company is adding its first true Web services workflow engine, according to company officials. IBM is doing the most work around its workflow definition language, which is needed to describe a set of processes that make up a workflow. The previous workflow engine in WebSphere Enterprise Edition 4.1 supported a proprietary language protocol. The new engine will be based on IBM's own Web Services Flow Language (WSFL) and will eventually support the Business Process Execution Language for Web Services (BPEL4WS). IBM and Microsoft are working on developing the protocol and hope to have it approved as a standard. IBM will add support for BPEL4WS as part of a service pack after WebSphere 5.0 ships. Microsoft last week said it would ship support for BPEL4WS in the first half of next year as part of its Jupiter project, which is a combination of the 2002 versions of BizTalk, Commerce and Content Management servers. Microsoft and IBM are trying to create business process automation and integration technology for their Web services platforms. Experts say business process automation is the next major area of maturity for Web services, a set of XML-based components and applications that can be integrated internally or with business partners. IBM says it will integrate its workflow engine with the transaction monitor in WebSphere for better exception handling when something goes wrong during a business process. IBM also has added logic on how to handle the rollback of a workflow and support for human intervention in the workflow process..."

  • [October 16, 2002] "IBM to Flood Websphere 5.0 With Workflow." By Carolyn A. April. In InfoWorld (October 16, 2002). "Tightening ties between application development and integration technologies, IBM on Wednesday detailed plans to significantly bolster the workflow capabilities in the upcoming version of WebSphere Application Server Version 5.0... IBM is highlighting the upcoming workflow features in the application server. The more robust workflow engine will leverage WebSphere Studio's integrated toolset in a way that IBM officials say sets Big Blue apart from rivals BEA Systems, Microsoft, and pure-play integration specialists. Central to the workflow engine is the runtime and tools support for building both J2EE and Web services-based workflow applications, according to Stefan Van Overtveldt, program director of WebSphere technical marketing at IBM. Using workflow to develop Web services plays into the strategy for making integration and Web services technologies key components of the core development environment, said Van Overtveldt. 'The workflow engine is aimed at developers building new application logic to be linked together and exposed as a Web service.' Of particular note, IBM will be taking its new workflow engine to market in November minus support for BPEL4WS, the proposed business process flow language specification it has been pushing along with Microsoft and BEA. Instead the upcoming workflow engine will be fueled by a proprietary flow definition language that he called a 'precursor' to BPEL4WS and that is technologically similar. The spec will not be ready in time, explained Van Overtveldt. A service pack for BPEL4WS will be distributed to WebSphere Application Server 5.0 users when the standard is fully cooked, he said. Building workflow and business process integration into the development environment is a trend being driven in part by the Web services phenomenon, said David McCoy, an industry analyst at Gartner, based in Atlanta... 
The reliance on flow composition as a way to package Web services up into a new application is a concept that Gartner has dubbed SODA -- or services-oriented development of applications, said McCoy."

  • [October 16, 2002] "Web Services Get Down and Dirty." By [ComputerWire Staff]. In The Register (October 16, 2002). "Systems management will likely be the final area where a major XML web services standard is published, with specifications expected from vendors and standards bodies early next year, writes Gavin Clarke. IBM web services guru Bob Sutor told ComputerWire a specification featuring the WS- moniker would soon appear tackling areas such as systems diagnostics, monitoring and repair. Sutor said this would likely close a period of intense recent activity. During that intense period, IBM has worked with Redmond, Washington-based Microsoft Corp and a revolving door of affiliates to develop different XML specifications and protocols, such as WS-Security and Simple Object Access Protocol (SOAP)... Systems management, so far overlooked, will likely be the last specification. 'Next year you will see some things come from us and somewhere in standards organizations. Our folks in Tivoli are interested in this and some folks in Oasis are interested in this,' Sutor said. He was unable to cite any specific examples of activity taking place inside either IBM's Tivoli unit or OASIS (the Organization for the Advancement of Structured Information Standards). Instead, he said things were still at a 'talking stage'. Sutor said it is possible to standardize basic systems management functionality - such as dials and session information - as an XML-based standard, and thus a web service. This would mean users and vendors do not have to needlessly re-invent basic functionality in future systems management software products or services. He said specifications for systems management would be valuable to IBM's strategies of autonomic computing - where systems would adapt to changing conditions and even heal themselves - and grid computing, of systems sharing networked computing resources such as processing for specific tasks..."

  • [October 16, 2002] "IBM Working on Advanced Data Integration, Grid Computing." By Paul Krill. In InfoWorld (October 16, 2002). "IBM researchers are working on projects to improve heterogeneous data integration and to enable grid computing. One endeavor, known as Project Clio, involves providing a mapping tool to look at data schemas and collections and deduce what transformations need to be made to integrate data sets, according to Laura Haas, IBM distinguished engineer for information integration and DB query processing, in San Jose, Calif... Project Clio technology is database-blind and generates SQL, so users can use any SQL system for data transformation, Haas said. The technology, which may appear in products next year or in 2004, functions with IBM's federated database technology for extracting data from various sources. The project will enable a better understanding of relationships between data in various databases, said Haas. The Clio architecture features a GUI presenting schema and data views; a correspondence engine for managing schemas, mapping and schema engines; an integration knowledgebase; and an underlying database. An offshoot of Project Clio, code-named chocolate, is intended to enable mapping of XML documents to other XML documents. 'One of the problems that's emerging is everybody's publishing in XML but we don't have standards for what a customer looks like in XML,' Haas said. Chocolate in this instance would devise a common format for defining a customer, she said. IBM anticipates there will be different ways of packaging the technology and views it as a tool for federated data sources, Haas said. XML documents can be mapped from the way they are originally structured to the way they are to be presented to the user, said Haas. Pieces of the chocolate technology may appear in the DB2 database in the coming year..."
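The XML-to-XML mapping problem that "chocolate" targets -- two sources publishing customers under different schemas, rewritten into one common format -- can be illustrated with a toy correspondence table. Both schemas and the mapping below are invented for illustration:

```python
import xml.etree.ElementTree as ET

# Toy schema mapping: rewrite one (invented) customer format into a
# common (equally invented) target format via field correspondences.
source_a = ET.fromstring("<cust><nm>Ada</nm><ct>London</ct></cust>")

mapping = {"nm": "name", "ct": "city"}   # source tag -> common tag

common = ET.Element("customer")
for child in source_a:
    ET.SubElement(common, mapping[child.tag]).text = child.text

print(ET.tostring(common, encoding="unicode"))
```
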

  • [October 15, 2002] "Asynchronous Operations and Web Services, Part 3: Add Business Semantics to Web Services. Three new specifications open a world of e-business possibilities." By Holt Adams (Engagement Manager and Solutions Architect, IBM jStart). From IBM developerWorks, Web services. October 2002. ['In previous installments of this series, Holt Adams explained the relevance of asynchronous operations for Web services and saw some patterns for building asynchronous services. Now he tackles three new specifications -- Business Process Execution Language for Web Services, Web Services Coordination, and Web Services Transaction -- and explains how they open up a world of possibilities for Web services developers. You'll see how these three specifications can support asynchronous operations and create an operational programming environment that mirrors real-life business interactions.'] "The development of the Business Process Execution Language for Web Services, Web Services Coordination, and Web Services Transaction specifications is a critical step toward enabling enterprises to leverage Web services in modeling their real-world processes. With BPEL, you can describe a long-running, multistep process of a business transaction using activities to encompass atomic transactions for individual process steps. Once a process is described using the BPEL XML flow language, a business process engine must interpret the process description in order to execute and manage the business transaction. The business process engine will provide the infrastructure services and resources for enabling long-running stateful business transactions that require interactions between the process, internal services, and external partner services (which themselves could be individual services exposed by the partners' process flows). 
However, at this time, before enterprise processes can be fully realized in an operational programming environment, IBM and other vendors will need to update their business process engines to interpret the BPEL XML flow language and to support the coordination and transactional facilities outlined in the WS-C and WS-T specifications. Likewise, these three critical specifications will need to be moved into a standards organization and approved as open standards. Currently, IBM, Microsoft, and BEA are working towards that goal. Once business process engines generally support BPEL, WS-C, and WS-T, the IT staff of any enterprise that seeks to automate steps in its business processes by leveraging Web services technology and the new specifications will need to: (1) Describe the enterprise's processes using the BPEL XML flow language; (2) Provide the business logic for the activities; (3) Provide the Web services interfaces for legacy applications -- for example, transaction resource managers -- that will be utilized in the process flow activities for realizing the business transaction; (4) Inform the business process engine how to locate the Web services utilized within the process flow activities..." See: "Business Process Execution Language for Web Services (BPEL4WS)."

  • [October 15, 2002] "Expedia Books Itself a Hotel Deal." By Troy Wolverton. In CNET News.com (October 15, 2002). "Hotels could soon find it easier to sell rooms online, thanks to an acquisition announced on Tuesday by online travel company Expedia. Expedia is purchasing Newtrade Technologies, a Montreal-based software development company. Newtrade is developing an XML-based system that will allow hotels to send information about their room availability and pricing to various distribution networks via the Internet. Expedia and Newtrade plan to introduce the new technology early next year, the companies said... Part of the problem hotels have faced is that unlike airfares, which are largely distributed online, lodging information has generally been distributed via fax to companies such as Expedia or to global distribution systems, said Jared Blank, an online travel analyst for Jupiter Research. It's been done that way for a long time, and the distribution companies have had little incentive to change, he said. Newtrade's system may not be an immediate incentive for distributors to change their ways, but it could help Expedia itself get access to a wider selection of rooms and rates, Blank said. Newtrade's technology will work with other companies besides just Expedia, Charlier said. Because it is based on XML (Extensible Markup Language), Newtrade's software will be able to connect the various systems used by hotels to manage their room inventory with the systems used by the various distribution networks, she said..." ['Newtrade has developed and implemented XML interfaces that conform to OTA specifications, ensuring seamless connectivity between hotel legacy systems and the multiple points of distribution. 
Using Web services, their solutions connect to a travel supplier's inventory management system, provide yield and rate management capabilities and handle software, hardware and networking issues to lower infrastructure and operating costs.'] See details in the announcement: "Expedia, Inc. to Acquire Newtrade Technologies Inc. Acquisition will fuel Newtrade's development of innovative open-standard technology for electronic distribution of hotel rates and availability." On OTA, see "OpenTravel Alliance (OTA)."

  • [October 15, 2002] "XML Canonicalization, Part 2." By Bilal Siddiqui. From XML.com (October 09, 2002). ['The second part of Bilal Siddiqui's series on XML Canonicalization. Canonicalization, the process of transforming an XML document into a known standard representation, plays a vital role in the use of XML for secure and verifiable transactions. Bilal covers the grittier parts of the specification and he discusses the canonicalization of document fragments.'] "In the previous installment of this article, I introduced Canonical XML and discussed when and why you need to canonicalize an XML file. I also demonstrated a step-by-step process that results in the canonical form of an XML document. In this second and final installment, I'll take the concept further and explain the canonicalization requirements of CDATA sections, processing instructions, comments, external entity references and XML document subsets... Exclusive canonicalization applies only while canonicalizing fragments of XML files and differs from (inclusive) canonical XML in the following two ways: (1) Attributes from the xml namespace are not imported from ancestors into orphan nodes; (2) Omitted namespace declarations are included in exclusive canonical form to an element only if: [i] The namespace declaration is used by the element or any of its child attributes, [ii] The namespace declaration is not already in effect in the exclusive canonical form..."
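The effect Siddiqui describes -- reducing equivalent serializations of a document to one standard form -- can be observed with Python's standard library, whose xml.etree.ElementTree.canonicalize function (Python 3.8+) implements W3C XML canonicalization. This is an illustrative sketch, not code from the article; the sample document is invented.

```python
from xml.etree.ElementTree import canonicalize

# Two serializations of the same logical document: the attribute
# order differs, the whitespace inside the start tag differs, and
# one uses the empty-element shorthand.
doc_a = '<order b="2" a="1"><item/></order>'
doc_b = '<order a="1"  b="2"><item></item></order>'

# Canonicalization sorts attributes, normalizes whitespace inside
# tags, and expands empty elements into start/end tag pairs, so
# both inputs reduce to identical canonical output.
canon_a = canonicalize(doc_a)
canon_b = canonicalize(doc_b)
assert canon_a == canon_b
```

Options such as with_comments=True and strip_text=True control the treatment of comments and whitespace-only text, which is exactly the kind of detail the article walks through for CDATA sections, processing instructions, and comments.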

  • [October 15, 2002] "Introducing [DOM] Mutation Events [for SVG]." By Antoine Quint. From XML.com (October 09, 2002). ['Antoine Quint treats us to another of his "Sacre SVG" columns this month; he discusses events that can be fired in the Document Object Model when some part of it changes -- mutation events. He shows how these can be used to connect the DOM and objects in your own scripts.'] "[The article should give you] a better understanding of what DOM mutation events are and how they can help you in designing SVG extensions that behave much like other SVG elements. You'll also see that all is not perfect: there are limitations to using mutation events with SMIL and CSS. But all in all mutation events can be very useful for component design... but there is a major drawback to using mutation events. In short, no shipped version of the Adobe SVG Viewer actually implements mutation events. It is a sad state of affairs to have such an amazing feature not available, but the good news is that Adobe has recently called for feedback about which missing features are most wanted by the community. If you care about mutation events, make yourself heard by voting for mutation events in an upcoming release of the viewer at the SVG-Developers YahooGroups poll. But Adobe's SVG Viewer is not the only game in town. Apache Batik implements mutation events, though it doesn't implement some features necessary for the TextWrap class..." See: "W3C Scalable Vector Graphics (SVG)."

  • [October 15, 2002] "Printing from XML: An Introduction to XSL-FO." By Dave Pawson. From XML.com (October 09, 2002). ['Dave gives a straightforward primer for those wishing to start experimenting with XSL-FO to get printed pages from XML data.'] "One of the issues many users face when introduced to the production of print from XML is that of page layout. Without having the page layout right, it's unlikely that much progress will be made. By way of introducing the W3C XSL Formatting Objects recommendation, I want to present a simplified approach that will enable a new user to gain a foothold with page layout. The aim of this article is to produce that first page of output -- call it the 'Hello World' program -- with enough information to allow a user to move on to more useful things. I'll introduce the most straightforward of page layouts for XSL-FO, using as few of the elements needed as I can to obtain reasonable output. One of the problems is that, unlike the production of an HTML document from an XML source using XSLT, the processing of the children of the root elements is not a simple xsl:apply-templates from within a root element. Much more initial output is required in order to enable the formatter to generate the pages... processing is a two-stage process at its simplest: give your source document and the above XSLT stylesheet to an XSLT processor, and the output should be a valid XSL-FO document. This can then be fed to an XSL-FO engine -- RenderX or Antenna House (both commercial, with trial options) or to PassiveTeX or FOP (non-commercial offerings)..." See the author's book XSL-FO: Making XML Look Good in Print, published by O'Reilly in August 2002. For related resources, see "Extensible Stylesheet Language (XSL/XSLT)."

  • [October 15, 2002] "Web Services Architecture Requirements." Revised W3C Working Draft 11-October-2002. Edited by Daniel Austin (W. W. Grainger, Inc.), Abbie Barbir (Nortel Networks, Inc.), Christopher Ferris (IBM), and Sharad Garg (The Intel Corporation). Version URL: http://www.w3.org/TR/2002/WD-wsa-reqs-20021011. "This document describes a set of requirements for a standard reference architecture for Web services developed by the Web Services Architecture Working Group. These requirements are intended to guide the development of the reference architecture and provide a set of measurable constraints on Web services implementations by which conformance can be determined... The Web Services Architecture Working Group has decided to make use of two [analytical] methods concurrently, in the hope that together each of these methods will produce a well-defined set of requirements for Web Services Architecture. The two methods chosen are the Critical Success Factor (CSF) Analysis method, which will be supplemented through the gathering of Usage Scenarios. Both of these methods are useful but represent different approaches to the problem of gathering requirements... The use of Web services on the World Wide Web is expanding rapidly as the need for application-to-application communication and interoperability grows. These services provide a standard means of communication among different software applications involved in presenting dynamic context-driven information to the user. In order to promote interoperability and extensibility among these applications, as well as to allow them to be combined in order to perform more complex operations, a standard reference architecture is needed. The Web Services Architecture Working Group at W3C is tasked with producing this reference architecture..."

  • [October 15, 2002] "IBM Recruits Friends for Storage Standard." By Stephen Shankland. In CNET News.com (October 15, 2002). "IBM, Hitachi, Sun Microsystems and Veritas will announce a new alliance Tuesday to advocate a standard to simplify storage system management. The alliance is geared to support a standard called the Common Information Model -- CIM, code-named Bluefin. The standard aims to make it easier to manage special-purpose networks that are used to wire together storage systems. Management software is critical to making such networks function, but proprietary controls have made it difficult for one company's software to administer another company's hardware. To join the alliance, members must commit to shipping CIM-compliant products in 2003, said Thomas Butler, vice president for product planning and marketing at Hitachi Data Systems. The alliance will work to ensure the standard does what it's set out to do, Butler said... But Hewlett-Packard criticized the effort, saying it prefers to work through the Storage Networking Industry Association that initially developed the CIM standard..." See the news item "Storage Vendors Announce CIM Product Rollout and Joint Interoperability Testing."

  • [October 15, 2002] "Foursome Forms CIM, Bluefin Coalition." By Evan Koblentz. In eWEEK (October 15, 2002). "The storage divisions of IBM, Hitachi Ltd., Sun Microsystems Inc. and Veritas Software Corp. today will announce collective support for the Common Information Model and Bluefin specifications. CIM is an evolving language for storage hardware and software to speak the same management language. Its details, along with the Bluefin implementation standard, are still evolving from the Storage Networking Industry Association's Storage Management Initiative. The four companies are agreeing to three criteria: to promise to ship products that are CIM- and Bluefin-compliant in 2003; to test each other's products together in SNIA plugfests; and to share their CIM provider technology, according to a joint statement to be released today. Whereas the Storage Management Initiative exists for development of the actual standards, the new, unnamed coalition exists to actually implement them, according to James Staten, a spokesman for Sun, in Santa Clara, Calif... Missing from the unnamed coalition are Hewlett-Packard Co. and EMC Corp. Steve Jerman, storage management architect for HP and the original CIM author, and EMC's John Tyrell, technology architect, together co-chair SNIA's CIM committee, known as the Disk Resource Management Work Group. 'HP was invited to join but did not do so because we felt that it was totally redundant and potentially confusing, not to mention irrelevant. There already is an alliance to support CIM, and HP does not see the need for anyone to create another,' said HP spokesman Mark Stouse, in Houston..." See the news item "Storage Vendors Announce CIM Product Rollout and Joint Interoperability Testing."

  • [October 15, 2002] "Collaborative Commerce: Compelling Benefits, Significant Obstacles." NerveWire white paper. June 2002. 32 pages. Research study on the business impact and barriers of integrating the business processes and information systems of companies with their trading partners. Forwarded by Veronica Zanellato (Senior Director, Corporate Communication, NerveWire, Inc.) ['Tighter integration of business processes and information systems with customers, suppliers and business partners can dramatically increase revenue, reduce costs and cycle-times, and improve customer retention for those who overcome obstacles.'] Study Results at a Glance: "(1) 'E-business' is not dead; it's alive and well and creating measurable value, in the form of 'Collaborative Commerce'; (2) Collaborative Commerce is the use of Internet technologies to integrate a company's core business processes with those of its customers, suppliers and business partners; (3) Companies with high levels of Collaborative Commerce have cut costs and cycle times significantly, increased revenue and new product and service introductions, and achieved other major business benefits; (4) Most of the companies studied have only achieved low levels of collaboration, and have generated modest performance improvements; (5) The biggest obstacles to Collaborative Commerce include overcoming distrust and internal functional 'stovepipes' that prevent companies from being integrated internally; (6) Getting past the obstacles requires overcoming distrust and organizational stovepipes, validating the benefits with customers, creating a blueprint to guide the business process and systems integration, and devising a strategy for gaining and keeping the support of key collaboration partners..." See the associated announcement: "Collaborative Commerce Survey Reveals that High Levels of Integration with Customers, Suppliers and Business Partners Generate Significant Business Benefits."

  • [October 15, 2002] "Netegrity Secures Web Services." By Richard Karpinski. In InternetWeek (October 15, 2002). "Netegrity on Tuesday released a new security platform to help enterprises manage and secure their Web services. Netegrity's TransactionMinder, at the simplest level, helps companies control who can access a Web service and what can be done with that service. A lack of inherent security mechanisms in key XML protocols is often cited as one of the key barriers to wider enterprise use of Web services. Web services, because of their highly distributed and ad hoc nature, create new security challenges. In addition, a Web service often is 'pinged' not by an individual but by another Web service, something traditional access control products aren't designed to handle. Netegrity said its new platform helps meet some of the stickiest of those problems. For instance, built-in authentication helps determine what person or application is requesting access to the Web service, and whether or not they are a trusted user or partner. Authorization services determine the individual's rights to access the service, based on user profile or roles. And a separate auditing process oversees the whole process to provide detailed reports on the activity that has taken place with the Web service. TransactionMinder works with Web services standards such as SOAP and WSDL, and supports SAML and XML Digital Signatures for authentication. It supports an array of Web services frameworks, including Microsoft's .Net and Apache..." See: (1) the announcement "Netegrity Addresses Web Services Security With Release of TransactionMinder. Enables Companies to Use Web Services to Unlock and Integrate Mission Critical Applications For Internal Users and External Partners."; (2) "Security Assertion Markup Language (SAML)."

  • [October 15, 2002] "Plug a Swing-based Development Tool into Eclipse. How to Integrate a Swing Editor into the Eclipse Platform." By Terry Chan (Software engineer, IBM Canada Ltd). From IBM developerWorks, Open Source Projects. October 2002. ['Learn how to integrate a standalone Swing-based editor into the Eclipse Platform as a plug-in. Using simple techniques, you can share resources between the Swing tool, the Eclipse Platform, and various SWT widgets -- and these resources can communicate through mutual awareness. Tool vendors who want to bring Eclipse-based development tools to market with a minimal amount of re-coding will also find this article helpful.'] "The Eclipse Platform provides a robust set of services and APIs for tool development. It smoothes the integration between tools from disparate vendors, creating a seamless environment for different types of development work. One of the software components of the Eclipse Platform is SWT. Although not a core component set of the Platform, SWT is integral because it provides a set of Java-based GUI widgets to the product and plug-in developers. SWT is operating system independent and very portable, yet its underlying JNI interface delivers native Platform look-and-feel and performance. Overall, SWT provides a good solution for developers and tool vendors who want to write plug-ins that operate well in the Platform's various frameworks and that are visually appealing. However, SWT suffers from its rather low level of interoperability with Java's Swing GUI widgets. For instance, Swing and SWT employ completely different event handling mechanisms. This difference often makes a GUI that is a hybrid of Swing and SWT unusable... Despite all the inherent advantages offered by the Eclipse Platform, the poor interoperability between Swing and SWT makes the development effort less attractive. 
This article shows you how to: (1) Launch a Swing-based editor to edit any Java file in the Eclipse Platform Workbench with the name 'ThirdParty.java'; (2) Bring source code changes made in the Swing editor back into the Workbench; (3) Use the Preference Page framework to control the attributes of the Swing editor; (4) Make the Swing editor 'Workbench-aware'; (5) Launch an SWT widget from the Swing editor. This article introduces some simple techniques to achieve the above, without using any unsupported APIs. No internal classes are referenced, and all general plug-in rules are adhered to. To get the most out of these techniques, you should have basic knowledge in writing plug-ins and using the Plug-in Development Environment, as well as access to the source code of the Swing-based editor... The techniques described here offer an interim solution that can help you quickly integrate Swing-based tools into the Eclipse Platform. However, whenever possible, you should use tightly integrated SWT/JFace components over existing Swing widgets..."

  • [October 15, 2002] "Iona Connects CORBA, Web Services." By Richard Karpinski. In InternetWeek (October 15, 2002). "Iona on Tuesday took another step in a campaign to help its customers migrate from CORBA distributed computing architectures to newer Web services-based approaches. The vendor -- a major provider of CORBA object request broker (ORB) and related technologies -- has made an equally big push into Web services. Last month, it released a new version of its XMLBus product specifically focused on CORBA-to-Web services migration. Today, it unveiled CORBA Connect, a new implementation and services program to help CORBA customers expose their back-end CORBA services via XML interfaces and to move new development work to a more lightweight Web services architecture. CORBA Connect includes a version of Iona's XMLBus product, the services of an Iona-certified technical architect and dedicated customer support. It is designed as a rapid implementation program to get customers quickly moved over to Web services protocols, Iona said. Iona said it currently has more than 4,500 customers that work with its CORBA products. Iona's XMLBus 5.4 includes tools, such as an 'operation flow designer,' to help developers map complex CORBA systems into more business-focused Web services..." See the announcement: "Iona's New Corba Connect Program Offers Seamless Corba to Web Services Migration. New Software, Services, and Training Package Enables Customers To Tap Existing CORBA Solutions with Web Services for Improved Collaboration, New Business Initiatives and Greater Operational Efficiency."

  • [October 15, 2002] "Voice Browser Call Control: CCXML Version 1.0." W3C Working Draft 11-October-2002. Edited by RJ Auburn (Voxeo). "This document describes CCXML, or the Call Control Extensible Markup Language. CCXML is designed to provide telephony call control support for VoiceXML or other dialog systems. CCXML has been designed to complement and integrate with a VoiceXML system. Because of this you will find many references to VoiceXML's capabilities and limitations. One will also find details on how VoiceXML and CCXML can be integrated. However it should be noted that the two languages are separate and are not required in an implementation of either language. For example CCXML could be integrated with a more traditional IVR system and VoiceXML or other dialog systems could be integrated with some other call control systems... This specification describes markup designed to provide telephony call control support for VoiceXML or other dialog systems. CCXML is far from complete. This draft is meant to give people access to an early version of the language so that people can understand the direction that the working group is moving... Every executing VoiceXML program has an associated CCXML program. It runs on a thread separate from the VoiceXML dialog. When an event is delivered to a user's voice session (now a coupling of an active VoiceXML dialog and its CCXML program), it is appended to the CCXML program's queue of events. The CCXML program spends almost all its time removing the event at the queue's head, processing the event, and then removing the next event. Meanwhile, the VoiceXML dialog can interact with the user, undisturbed by the incoming flow. Most VoiceXML programs never need to consider event processing at all. Writing a CCXML program, then, mainly involves writing the handlers which are executed when certain events arrive. 
There are mechanisms for passing information back and forth between VoiceXML and CCXML, but the important points are that CCXML: (1) lives on its own thread, and (2) carries the burden of rapid asynchronous event handling..." See: "Voice Browser Call Control (CCXML)."
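The queue discipline described above -- the CCXML program living on its own thread and repeatedly removing the event at the head of its session queue while the VoiceXML dialog runs undisturbed -- can be sketched in a few lines. This is a toy model, not CCXML itself: the event names and handler shapes are hypothetical, and only the event-loop structure is illustrated.

```python
import queue
import threading

class CCXMLSession:
    """Toy model of a CCXML program's event loop: events delivered to
    the session are appended to a queue, and the program spends its
    time removing the event at the queue's head and dispatching it."""

    def __init__(self, handlers):
        self.events = queue.Queue()   # thread-safe event queue
        self.handlers = handlers      # event name -> handler function
        self.log = []                 # record of handled events

    def deliver(self, name, payload=None):
        # Called from other threads (telephony platform, VoiceXML dialog).
        self.events.put((name, payload))

    def run(self):
        while True:
            name, payload = self.events.get()   # block at the queue head
            if name == "ccxml.exit":
                break
            handler = self.handlers.get(name, lambda payload: None)
            self.log.append((name, handler(payload)))

# Hypothetical handlers for two telephony events.
session = CCXMLSession({
    "connection.alerting": lambda p: f"answering call from {p}",
    "connection.disconnected": lambda p: "cleaning up",
})
loop = threading.Thread(target=session.run)
loop.start()                     # the "CCXML program" lives on its own thread
session.deliver("connection.alerting", "tel:+15551234")
session.deliver("connection.disconnected")
session.deliver("ccxml.exit")
loop.join()
```

Because the queue serializes delivery, handlers never run concurrently with each other, which is what lets the dialog thread proceed "undisturbed by the incoming flow."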

  • [October 15, 2002] "Intel, Microsoft Dip into Speech with SALT." By Thor Olavsrud. In InternetNews.com (October 14, 2002). "Aiming to help businesses extend their Web presences with speech, Intel and Microsoft Monday [2002-10-14] announced they are jointly developing technologies and a reference design based on the Speech Applications Language Tags (SALT) 1.0 specification submitted to the World Wide Web Consortium (W3C) in August. The SALT specification defines a set of lightweight tags as extensions to common Web-based programming languages, allowing developers to add speech functionality to existing Web applications... The joint effort by Intel and Microsoft will leverage Intel's telephony building blocks -- namely Intel Architecture servers, NetStructure communications boards and telephony call management interface software -- and Microsoft's .NET Speech platform to give enterprise customers a set of tools with which to build and deploy their own speech applications, and also to give ISVs, OEMs, VARs and SIs a toolset with which to build and deploy such applications for enterprise customers. Intel and Microsoft said their tools will support both telephony and multimodal applications on a range of devices. The partners believe the value proposition of such technology is clear: it stands to reduce costs associated with call center agents. A typical customer service call costs $5 to $10 to support, while an automated voice recognition system can lower that to 10 cents to 30 cents per call. Additionally, voice recognition technology can be used to give employees access to critical information while on the move..." See: (1) the announcement: "Intel and Microsoft Collaborate on Development of Speech Application Language Tags (SALT)-based Speech-Enabled Web Solutions. Initiative to Bring Microsoft and Intel Speech Software and Hardware Technologies to Enterprise and Channel Customers."; (2) the Intel overview: "Speech Application Language Tags (SALT). 
Creating a Speech Technology Standard for the Internet." General references in: "Speech Application Language Tags (SALT)."

  • [October 15, 2002] "Deploying SALT Telephony Call Control on an e-Business Site." By Glen Shires and Jim Trethewey (Intel). In Intel Developer Update Magazine (July 2002). 8 pages. "Speak to a Web page? Telephony call control for e-Commerce? The technology has arrived. Developers can now add these features to both new and existing e-applications. SALT (speech application language tags) lets users talk (replacing or supplementing the keyboard, mouse, or stylus) to access information online, order products, and so on. The telephony call-control features also let users make or receive calls through their computer or participate in phone conferences directly from a Web page in their browser. Call-control features are available in a JavaScript 'include' library file named 'saltcc.js'. 'saltcc.js' is royalty-free, modifiable source code available under license from Intel. To embed telephony features on your Web site, simply place this file on your Web server, and add the mark-up tags to your Web pages so that the user's PC can make the call... This article describes only one of the many ways you can use SALT call-control capabilities to integrate innovative communications features within a Web site. A draft version of the SALT specification is available now, and the fully approved version of the specification should be available in early July 2002. Because SALT tools and software development kits are also already available, you can start experimenting now with both call control and speech I/O. By starting now, you can get a jump on business and technology plans that include the use of telephony features for online customers..." See also the article in HTML format, and the earlier article in the series: "Telephony Call Control Now Available for HTML, XHTML, and XML," by Thom Sawicki and Kamal Shah. General references: "Speech Application Language Tags (SALT)." [cache]

  • [October 14, 2002] "Are Enterprises Ready for Standardized Device Management via SyncML?" By Adam Stone. In Internet.com MCommerce Times (October 14, 2002). ['The SyncML Initiative brings together big-name mobility players like Ericsson, IBM, Motorola, Nokia and Palm, among others, who joined forces in an effort to address some of the underlying technological challenges that were hindering the adoption of wireless computing.'] "Participants in the industrywide SyncML Initiative say they have good news for the many enterprise IT executives who are wrestling with device-management issues. Launched in February 2000, the SyncML Initiative brought together such big-name mobility players as Ericsson, IBM, Motorola, Nokia and Palm, among others, who joined forces in an effort to address some of the underlying technological challenges that were hindering the adoption of wireless computing. In particular, they set out to devise uniform standards to govern such areas as synchronization and device management. They hit the ground running, and by December 2000 the first devices using the SyncML standards already were on the shelves. Today there are at least 80 SyncML-compliant servers and clients in the commercial marketplace. The SyncML Initiative agreed recently to consolidate its efforts with those of other standards groups under the newly created umbrella body, the Open Mobile Alliance (OMA). But the work goes on at full speed. On October 8, 2002, in fact, members of the initiative voted to send out into the world the first version of a new specification intended to set a baseline specification in the arena of device management. That specification should be ready for publication by month's end, according to Douglas Heintzman, chair of the SyncML Initiative and manager of strategy and standards for IBM's pervasive computing division..." See "The SyncML Initiative."

  • [October 14, 2002] "InterDo Protects SOAP-Based Web Services." By Dennis Fisher. In eWEEK (October 14, 2002). "Kavado Inc. on Monday unveiled a new module for its flagship product, which now is capable of protecting Web services that rely on the SOAP protocol. The company's InterDo software is designed to monitor application traffic at the business logic level and compare it against known policies for violations. In the Web services environment, it will provide an additional layer of security on top of user authentication and message encryption. InterDo will compare SOAP (Simple Object Access Protocol) messages to the standard SOAP definitions and look for discrepancies that could damage Web applications. In addition, the software will protect against a variety of known application security issues, including SQL injection, buffer overruns and application-language mismatches... Currently, most of the talk of security as it applies to Web services has centered on the need for a strong identity management system available to anyone who wants it. There has been little attention given to the security of the data and services themselves. A growing number of vendors, including Kavado and Sanctum Inc., are starting to fill this void with application-level security offerings..." See the announcement: "KaVaDo Makes SOAP & WebServices Safe With Launch of New Security Module. Emerging Web Technologies Secured by Web Application Protection Product, InterDo."

  • [October 11, 2002] "Exchanging Documents in XML Tuple Spaces." By Hans Hartman. In Seybold Report: Analyzing Publishing Technology [ISSN: 1533-9211] Volume 2, Number 13 (October 14, 2002). ['If Rogue Wave is right, what was once just an academic concept may become a mechanism for exchanging XML documents in ways that are simpler and more powerful than Web services.'] "A tuple space is a bit of shared storage -- a block of RAM, a database, a remote server -- that stores data in ordered lists called 'tuples.' The tuples have just a few key properties: (1) They can be accessed concurrently by several external applications. As with other ways of organizing data, the host system must guarantee that the tuples are completely written or removed from a space. (2) They are persistent, remaining available long after the application that created them has gone on to other tasks. (3) They do not have names or other external identifiers and are stored in no particular order. Instead, they are located by querying their content... There are already a few implementations. Tuple spaces have been developed for various programming languages, including Python, Linda and Java. Both Sun Microsystems (through JavaSpaces) and IBM (Tspaces) have developed tuple spaces for communication between Java objects... Rogue Wave Software has taken the idea of loosely coupling applications a step further by combining the notion of tuple spaces with that of self-describing XML documents... Rogue Wave's underlying philosophy is to keep the XML tuple-space system simple and flexible. Rogue Wave's product, called Ruple, has only four types of instructions: (1) Write: create a new document, specifying an access control list and an expiration date. (2) Read: issue an XQL query and read the first document that matches. (3) ReadMultiple: read all documents from the space that match an XQL query (up to a specified maximum). (4) Take: read and remove a document that matches an XQL query. 
Simple as they are, these four commands can support a rich assortment of document exchanges. In Ruple, XML documents are written to the space for a limited time period, allowing applications to use them on a 'leasing' basis. The company is also considering adding some type of messaging mechanism to notify partners when new documents are placed in the space. Web services have gained rapid popularity because of the pragmatic, simple and flexible ways they allow applications to link over the Internet. XML tuple spaces take the same benefits of pragmatism, simplicity and flexibility a few steps further. They enable applications to exchange XML documents asynchronously and with multiple parties, while leaving both the sending and receiving parties much flexibility in how they use these documents..." See: "Tuple Spaces and XML Spaces."
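The four Ruple operations map naturally onto a very small client interface. A minimal in-memory sketch in Python follows; the class and method names are hypothetical, not Rogue Wave's actual API, and a plain content predicate stands in for an XQL query:

```python
import time

class XmlSpace:
    """In-memory sketch of an XML tuple space (hypothetical API)."""

    def __init__(self):
        self._docs = []  # list of (document, lease-expiry time)

    def write(self, doc, ttl_seconds):
        # Documents are written on a 'leasing' basis: they expire.
        self._docs.append((doc, time.time() + ttl_seconds))

    def _live(self):
        now = time.time()
        self._docs = [(d, t) for d, t in self._docs if t > now]
        return [d for d, _ in self._docs]

    def read(self, match):
        # Read the first live document matching the content query.
        for doc in self._live():
            if match(doc):
                return doc
        return None

    def read_multiple(self, match, limit):
        # Read all matching documents, up to a specified maximum.
        return [d for d in self._live() if match(d)][:limit]

    def take(self, match):
        # Read and remove a matching document.
        doc = self.read(match)
        if doc is not None:
            self._docs = [(d, t) for d, t in self._docs if d is not doc]
        return doc
```

For example, `space.write("<order id='1'/>", 60)` leases an order document for a minute, and `space.take(...)` removes it once a partner has consumed it. A real space would evaluate XQL over the stored documents and enforce access control lists.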

  • [October 11, 2002] "Consortium Aims to Boost Electronic Tax Filing." By Gretel Johnston. In InfoWorld (October 11, 2002). "The U.S. Department of Justice (DOJ) has given its blessing to the creation of a consortium that would work with the Internal Revenue Service (IRS) to provide free electronic tax preparation and filing services. The consortium, which still awaits the approval of the U.S. Treasury Department and the Office of Management and Budget (OMB), has not been formed, but is expected to be in place for the 2002 tax season, according to officials who have been working with the government to move forward with the public-private project. The program would help the IRS achieve its goal of receiving 80 percent of all U.S. tax returns electronically by 2007, and would assure companies that provide electronic tax preparation and filing services that the government will not get into the business of providing those services for free. The DOJ did not list the companies that will comprise the consortium; however, Scott Gulbransen, a spokesman for the tax software specialist Intuit, confirmed Thursday that Intuit will be a member... An estimated 1 million taxpayers took advantage of Intuit's Tax Freedom project when filing their 2001 returns, Gulbransen said. The 5-year-old program, which provides free tax preparation and filing services to people with adjusted gross incomes of $25,000 or less, is an example of how free services will be provided through the IRS Web site once the program is fully established, Gulbransen said..." See (1) "US Internal Revenue Service Establishes Online XML Developers' Forum for Employment Tax E-file System"; (2) "XML Markup Languages for Tax Information."

  • [October 10, 2002] "New W3C Standards for XML Will Enhance Web Service Security." By Ray Wagner and John Pescatore (Gartner Research). Gartner Research Note FT-18-4797. 08-October-2002. "On 3-October-2002, the W3C released its proposed specifications for the encryption of XML data and documents. The new standards mandate the use of XML Encryption Syntax and Processing (XML-Enc) and Decryption Transform for XML Signature (XML-DSig)... The proposed W3C standards -- building blocks for the development of future security standards -- represent an important step toward the ability to perform complex XML-based transactions securely. XML-DSig and XML-Enc make it possible to sign and encrypt elements of XML messages at a highly granular level, a key element in developing an environment in which integrated applications implement workflows across multiple enterprises via secure Web services. Ensuring the security and privacy of information passing through intermediaries in a Web service, while still allowing the intermediaries to perform work using other elements of the message or transaction, requires precise control over the elements of the transaction. XML-DSig's and XML-Enc's digital signature and encryption mechanisms provide for this control. Widespread adoption of Web Services Security (WS-Security) will further weave XML-DSig and XML-Enc into the fabric of Web services. IBM, Microsoft and VeriSign jointly developed WS-Security, and the Security Assertions Markup Language (SAML) initiative and the Liberty Alliance have accepted it as an underlying technology... Serious challenges to Web security remain, however, especially with respect to enterprises' ability to manage the public and private keys required to implement signing and encryption. The XML Key Management Specification (XKMS) offers a simplified approach to integrating public key management capabilities with applications..." Also in PDF format. 
See references in the news item: "W3C Specifications for XML Encryption Released as Proposed Recommendations."
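The granularity the Gartner note describes — encrypting a single element while intermediaries can still read and act on the rest of the message — looks like this on the wire. The fragment below is adapted from the style of the XML Encryption specification's examples; the payload element and cipher value are illustrative:

```xml
<PaymentInfo xmlns="http://example.org/payment">
  <Name>John Smith</Name>
  <!-- Only CreditCard is encrypted; intermediaries can still read Name. -->
  <EncryptedData xmlns="http://www.w3.org/2001/04/xmlenc#"
      Type="http://www.w3.org/2001/04/xmlenc#Element">
    <EncryptionMethod
        Algorithm="http://www.w3.org/2001/04/xmlenc#tripledes-cbc"/>
    <CipherData>
      <CipherValue>A23B45C56</CipherValue>
    </CipherData>
  </EncryptedData>
</PaymentInfo>
```

The Type attribute records that a whole element (rather than just its content) was replaced, so the recipient can restore the document structure after decryption.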

  • [October 09, 2002] "Finding Your Way Through Web Service Standards: Will My Web Service Work With Your Client? Understanding the Intricacies of SOAP." By Jordi Albornoz (Software Engineer, Advanced Internet Technology, IBM). From IBM developerWorks, Web services. October 2002. ['Web services are defined by a myriad of standards. Each standard is general enough to be independent from the other standards but specific enough to only address a small piece of the Web services puzzle. The interaction between SOAP, WSDL, XML Schema, HTTP, etc. can become very complicated... This series will guide you through the prevailing Web services standards describing their specific use and explaining what it really means to support each one, how they interact, and most importantly, where compatibility problems are likely to occur. The articles will also discuss the relevant changes to come as many of these standards are being revised. In this first article in the series, Jordi Albornoz will introduce the issue of complex interaction of standards and describe some of the issues around SOAP.'] "... communicating with a Web service does not simply involve understanding WSDL, SOAP, and XML Schema. That is not enough information. Instead it involves, among other things, understanding particular extensions to WSDL, the specific transports over which to send SOAP envelopes, and the data encoding expected by the service. XML is often joked about as being a standard whose sole purpose is to create more standards; the family of Web services standards seems to share that property. No doubt we will soon see higher level specifications making use of WSDL and SOAP in some specific manner. In Web services terminology, these higher level specifications would often be referred to as bindings. Moreover, each of the main specifications contains ambiguities that can stifle interoperability, thus requiring understanding of the specific interpretations of the specs. 
Each standard has extensibility points, optional sections, and ambiguities which contribute to divergence in functionality of implementations.... The generality of SOAP means that not all systems that make use of SOAP can communicate easily. Since SOAP is transport independent and does not enforce a data encoding format, one must be aware of more about a service than simply that it supports SOAP. The most important thing to understand is that SOAP is simply a message format. In the next article in this series, I will explain another commonly misunderstood aspect of SOAP which is responsible for many interoperability issues, namely, data encoding... one must be aware of the data encoding to properly communicate with a Web service. But that is not enough in many cases. I will attempt to clarify the purpose of different data encodings and dig deeper into the specific issues surrounding the particularly confusing SOAP Section 5 Encoding. I will also begin to discuss WSDL and its relation to SOAP and other Web services standards..." See "Simple Object Access Protocol (SOAP)."
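The article's central point — that SOAP is simply a message format, independent of transport and data encoding — can be made concrete in a few lines of standard-library Python (the getQuote payload is invented for illustration):

```python
import xml.etree.ElementTree as ET

SOAP_ENV = "http://schemas.xmlsoap.org/soap/envelope/"

def make_envelope(payload_xml):
    """Wrap an XML payload in a minimal SOAP 1.1 envelope."""
    envelope = ET.Element("{%s}Envelope" % SOAP_ENV)
    body = ET.SubElement(envelope, "{%s}Body" % SOAP_ENV)
    body.append(ET.fromstring(payload_xml))
    return ET.tostring(envelope, encoding="unicode")

def extract_payload(envelope_xml):
    """Pull the first child of Body back out, whatever its encoding."""
    root = ET.fromstring(envelope_xml)
    body = root.find("{%s}Body" % SOAP_ENV)
    return body[0]
```

Note what the envelope does not specify: the transport (HTTP, SMTP, JMS), the encoding of getQuote's contents, and any WSDL binding are all separate agreements — exactly the interoperability gaps the article describes.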

  • [October 09, 2002] "Progress in the VoiceXML Intellectual Property Licensing Debacle." By Jonathan Eisenzopf (The Ferrum Group). From VoiceXMLPlanet.com. October 2002. "In January of 2002 the World Wide Web Consortium released a rule that requires Web standards to be issued royalty free (RF). Some VoiceXML contributors hold intellectual property related to the VoiceXML standard. Some of those companies have already issued royalty free licenses, while others have agreed to reasonable and non-discriminatory (RAND) licensing terms... The fact that not all contributors have switched to a royalty free licensing model has been a thorn in the progress of the VoiceXML standard. I've voiced my concerns previously on this issue, specifically in SALT submission to W3C could impact the future of VoiceXML... Recently, IBM and Nokia changed their licensing terms from RAND to RF. At the VoiceXML Planet Conference & Expo on September 27 [2002], Ray Ozborne, Vice President of the IBM Pervasive Computing Division assured the audience at the end of his keynote speech that IBM would be releasing all intellectual property that related to the VoiceXML and XHTML+Voice specifications royalty free and encouraged the other participants to do the same... If VoiceXML is going to survive as a Web standard, then all contributors must license their IP royalty free, otherwise, the large investment that's been made will go down the drain. My hope is that the voice browser group at the W3C will either resolve these licensing issues in the next six months or jettison VoiceXML and replace it with SALT. Either way, I believe that it would be prudent for voice gateway vendors to be working on a SALT browser so that customers have the option down the road..." Note: the W3C website indicates [2002-09-25] that the Voice Browser Working Group was rechartered as a royalty free group operating under W3C's [then] Current Patent Practice. 
See: (1) "VoiceXML Forum"; (2) "Speech Application Language Tags (SALT)"; (3) "Patents and Open Standards."

  • [October 09, 2002] "An Extensible Markup Language Schema for Call Processing Language (CPL)." By Xiaotao Wu, Henning Schulzrinne, and Jonathan Lennox (Department of Computer Science, Columbia University). Internet Engineering Task Force (IETF), Internet Draft. Reference: 'draft-wu-cpl-schema-00.txt'. October 7, 2002, expires March 2003. "The Call Processing Language (CPL) is a language that can be used to describe and control Internet telephony services. It is based on the Extensible Markup Language (XML), a common hierarchical format for describing structured data. This document provides an Extensible Markup Language (XML) Schema for the Call Processing Language (CPL). The original CPL specification only provides a Document Type Declaration (DTD) to describe the structure of the language. Compared with XML DTDs, XML schemas have many advantages such as performing stricter type checking, providing pre-defined data types and being able to derive new data types from existing ones... And we recommend that all future extensions of CPL should use schema definitions only..." See (1) IETF IP Telephony Working Group; (2) "Call Processing Language (CPL)." [cache]
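The schema advantages the draft cites — stricter type checking, pre-defined datatypes, and derivation of new types — can be seen in a small fragment. This one is illustrative only and is not taken from the draft itself: it derives a URI type by restriction, something a DTD (which could only declare the attribute as CDATA) cannot express:

```xml
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <!-- Derive a new type from the built-in xs:string,
       restricting it to SIP or TEL URIs. -->
  <xs:simpleType name="addressURI">
    <xs:restriction base="xs:string">
      <xs:pattern value="(sip|sips|tel):.+"/>
    </xs:restriction>
  </xs:simpleType>
  <xs:element name="address">
    <xs:complexType>
      <xs:attribute name="is" type="addressURI"/>
    </xs:complexType>
  </xs:element>
</xs:schema>
```

A validating schema processor rejects a malformed address at parse time, where a DTD-based processor would accept any character data.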

  • [October 09, 2002] "Web Services-Based Specification For Mobile Payments Released." By Paul Krill. In InfoWorld (October 09, 2002). "Paycircle, an industry organization representing more than 30 companies including Hewlett-Packard, Oracle, and Sun Microsystems, on Wednesday released a specification for conducting mobile commerce via a Web services paradigm. Other major vendors such as BEA, IBM, and Microsoft are not members of the organization... PayCircle's specification uses Web services specifications WSDL and SOAP as well as XML to provide for payment services, according to Jacob Christfort, vice president of product development at Oracle's mobile products and services division, in Redwood Shores, Calif. In the payment scheme, WSDL describes the information going back and forth over the wire, SOAP makes the service request, and the SOAP message is formatted using XML, he said... A key intention of PayCircle's effort is to better enable 'micro-payments,' which are transactions below $10, such as sending of a stock quote, that can cost more for the payment service than the actual data service itself costs, Christfort said..." See references in the 2002-10-09 news item: "PayCircle Releases WSDL Specification Version 1.0 for Mobile Payment."
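The division of labor Christfort describes — WSDL naming the operations and messages, SOAP carrying the requests — can be sketched in a minimal WSDL 1.1 fragment. The service and message names below are hypothetical, not taken from the PayCircle specification:

```xml
<definitions name="MicroPayment"
    targetNamespace="http://example.org/micropay"
    xmlns="http://schemas.xmlsoap.org/wsdl/"
    xmlns:tns="http://example.org/micropay"
    xmlns:xsd="http://www.w3.org/2001/XMLSchema">
  <!-- WSDL describes what goes over the wire; SOAP carries it. -->
  <message name="chargeRequest">
    <part name="account" type="xsd:string"/>
    <part name="amountUSD" type="xsd:decimal"/>
  </message>
  <message name="chargeResponse">
    <part name="receiptId" type="xsd:string"/>
  </message>
  <portType name="PaymentPort">
    <operation name="charge">
      <input message="tns:chargeRequest"/>
      <output message="tns:chargeResponse"/>
    </operation>
  </portType>
</definitions>
```

A binding section (omitted here) would then tie PaymentPort to SOAP over a concrete transport.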

  • [October 09, 2002] "Middleware 'Dark Matter'." By Steve Vinoski (Vice President of Platform Technologies and Chief Architect, IONA Technologies). In IEEE Distributed Systems Online [ISSN: 1541-4922] (September/October 2002). "... the 'mass' of the middleware universe is much greater than the systems -- such as message-oriented middleware (MOM), enterprise application integration (EAI), and application servers based on Corba or J2EE -- that we usually think of when we speak of middleware. We tend to forget or ignore the vast numbers of systems based on other approaches. We can't see them, and we don't talk about them, but they're out there solving real-world integration problems -- and profoundly influencing the middleware space. These systems are the dark matter of the middleware universe... There are no hard and fast rules to help you decide whether to use traditional middleware or middleware dark matter to solve your integration or distributed computing problems. These decisions depend on your skill set, financial situation, the scope and schedule of your project, and the capabilities of the systems and technologies you're integrating. Middleware dark matter sometimes (but not always) lends itself to solutions that are easy to create but fail to evolve, scale, or perform as well as a traditional middleware solution. On the other hand, using traditional middleware is no guarantee that your system will evolve gracefully, or perform or scale well... XML is quickly replacing specialized formats and their associated tools in both the dark matter and traditional middleware communities, as using standard XML and its tools and practices is less expensive than developing and maintaining proprietary tools and formats. Web services might very well start to illuminate our middleware dark matter. 
Given that a significant portion of the existing middleware dark matter is probably implemented in Visual Basic, for instance, Microsoft's promise that .Net will put a Web services substrate beneath VB -- as well as Perl, Python, and other middleware dark matter languages -- is indeed significant. Even as vendors of traditional middleware are moving to augment their Corba, J2EE, MOM, and EAI systems with Web services, the middleware dark matter community is building modules for SOAP, the Web Services Description Language, and other technologies related to Web services (for example, a search of CPAN in August for SOAP turned up 226 hits). Web services might therefore be a genuine point of convergence for traditional middleware and middleware dark matter. If this convergence actually occurs and Web services do manage to illuminate the dark matter, there's no question that the 'laws of physics' that currently govern our middleware universe will change..."

  • [October 09, 2002] "Managing Scientific Metadata Using XML." By Ruixin Yang, Menas Kafatos, and X. Sean Wang (George Mason University). In IEEE Internet Computing Volume 6, Number 4 (July/August 2002), pages 52-59. "In this article, we present our XML-based Distributed Metadata Server (Dimes) -- which comprises a flexible metadata model, search software, and a Web-based interface -- to support multilevel metadata access, and introduce two prototype systems. Our Scientific Data and Information Super Server (SDISS), which is based on Dimes and GDS, solves accurate data-search and outdated data-link problems by integrating metadata with the data systems. On the implementation front, we combine independent components and open-source technologies into a coherent system to dramatically extend system capabilities. Obviously, our approach can be applied to other scientific communities, such as bioinformatics and space science... Several Dimes servers manage metadata and support data search and access for projects in our institution. One operational Dimes is for the Seasonal to Interannual Earth Science Information Partner (SIESIP) project. The project's XML document contains 186 nodes associated with 13 data products. The system operates on an SGI Origin 200 computer with dual CPUs. Users can automatically access data set information using different protocols. We derived all the examples given previously from this version of the prototype... We have developed two Web-based prototypes for exploring Dimes' capabilities. A Web-based Dimes client usually includes a Web interface, an XML translator, and an XML-to-HTML mapper suite. When a Web user submits a query, the client passes the query to a specific XML translator, which automatically translates the query into one or more predefined types of queries in XML format, and then sends them to the XML query engine... We use Java servlets and XSL Transformations for the translator and mapper tools. 
Since mapping is a dynamic process, the resultant interface also depends on the query results... The flexibility we achieve by using XML without a predefined standard simplifies system porting across scientific disciplines. Dimes is an enabling system, rather than a particular solution system. Finally, we can integrate it with other data scientific servers using the same overall super server concept..." See following entry and R. Yang et al., "An XML-Based Distributed Metadata Server (Dimes) Supporting Earth Science Metadata," Proceedings of the 13th International Conference on Scientific and Statistical Database Management, 2001, pages 251-256.
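The query path described above — a user's query translated into a search over an XML metadata tree — can be sketched with Python's standard library. The metadata sample and function are invented for illustration; Dimes itself used Java servlets and XSL Transformations:

```python
import xml.etree.ElementTree as ET

# Invented metadata document in the spirit of the SIESIP example:
# a tree of data products and their parameters.
METADATA = """
<dataset name="SIESIP">
  <product name="precipitation">
    <parameter name="rain_rate" units="mm/hr"/>
  </product>
  <product name="sst">
    <parameter name="sea_surface_temp" units="K"/>
  </product>
</dataset>
"""

def find_nodes(root, keyword):
    """Translate a free-text keyword into a tree search: return
    every element whose name attribute contains the keyword."""
    return [el for el in root.iter() if keyword in el.get("name", "")]

root = ET.fromstring(METADATA)
hits = find_nodes(root, "rain")
```

Because the tree carries its own structure, the same search code works unchanged when a different discipline loads a differently shaped metadata document — the flexibility the authors attribute to using XML without a predefined standard.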

  • [October 09, 2002] "A Web-Based Scientific Data and Information Super Server With Flexible XML Metadata Support." By Ruixin Yang, X. Sean Wang, Yixiang Nie, Yujie Zhao, and Menas Kafatos (Center for Earth Observing and Space Research - CEOSR, George Mason University). Extended abstract prepared for 'The 19th International Conference on Interactive Information Processing Systems (IIPS) for Meteorology, Oceanography, and Hydrology', 83rd Annual Meeting of the American Meteorological Society (9-13 February 2003, Long Beach Convention Center, Long Beach, California). "In the information age and with explosively increasing volumes of remote sensing, model and other Earth Science data available, scientists, including meteorologists, are now facing challenges to find and to access interesting data sets effectively and efficiently through the Internet. Metadata have been recognized as a key technology to ease the search and retrieval of Earth science, NWP and other meteorological data. In this paper, we first discuss the DIstributed MEtadata Server (DIMES) prototype system. Designed to be flexible yet simple, DIMES uses XML to represent, store, retrieve and interoperate metadata in a distributed environment. DIMES accepts metadata in any well-formed XML format and thus assumes the 'tree' semantics of metadata entries. In addition to regular metadata search, DIMES provides a web-based metadata navigation interface by using the 'nearest neighbor search.' We also discuss a system designed to integrate an existing data access and analysis server, namely the GrADS/DODS server, and the DIMES to form a Scientific Data Information Super Server (SDISS), which supports both metadata and data. The SDISS guarantees the consistency between the content of the data server and the content of the metadata server..." See also preceding bibliographic entry.

  • [October 09, 2002] "A Global Universal Description, Discovery, and Integration Repository Is Unlikely Part of Web Services in Short Term, Says IDC." By [IDC Staff]. Summary from IDC Bulletin #27838 "UDDI in the eMarketplace: The Right Role for the Repository?," by Rob Hailstone (August 2002). "The concept of a global universal, description, discovery, and integration (UDDI) repository has received much criticism as a standard component of Web services architecture due to the lack of business demand for such an implementation. According to IDC, the need for a UDDI repository is apparent at the corporate level, but on a global scale it is difficult to envision how a repository could be policed such that it serves as a benefit, rather than hazard, to business. 'The secret lies in the repository being operated by a trusted third party, and the most likely type of organization to fill this role will be the emarketplace,' said Rob Hailstone, director of IDC's Systems Infrastructure Software research. 'IDC believes that repositioning emarketplaces around Web services and featuring the UDDI repository would be viable within the next two to three years. However, initial adoption rates are likely to be low until demand is driven by leading user organizations.' According to IDC, the second phase of Web services deployment, where usage is predominantly between organizations, has to be successful if the more visionary later phases are to have any chance of becoming reality. UDDI will play an important role enabling this second phase, but implementation must be in a controlled manner if the expected benefits are to be achieved. IDC believes that UDDI deployment within the infrastructure of emarketplaces provides the level of control required within a format that creates business benefits for service providers and consumers as well as the operators of emarketplaces..." See: "Universal Description, Discovery, and Integration (UDDI)."

  • [October 09, 2002] "Working XML: Use Eclipse to Build a User Interface for XM. Making Improvements Based on Experience and User Feedback." By Benoît Marchal (Consultant, Pineapplesoft). From IBM developerWorks, XML zone. October 2002. ['Anyone familiar with XM -- the low-cost, open-source content management solution based on XSLT -- knows that for all its good points, it still lacks a decent user interface. In this article, columnist Benoît Marchal uses the Eclipse platform's open universal framework to build a user interface for XM.'] "After wrapping up my discussion of XI in my previous column, I decided it was time to revisit XM, the simple XML content management and publishing solution with which I kicked off this column. In the last year, I have been using XM for various projects and I have heard from readers as well. Throughout the year, I collected suggestions on what should be improved. In this article, I will show you how to create a user interface for XM. I chose to use Eclipse, an open-source project, to build an adaptable programming environment. I believe this article provides useful information for any would-be Eclipse developers, even if they have no interest in XML... In a nutshell, Eclipse is a modern version of Emacs, the popular programmer's editor. Emacs is popular because it has been customized for just about any programming language on earth. However, with all due respect, the Emacs user interface is a relic from the past that never adapted to GUI. Eclipse aims to be as flexible as Emacs while offering a more modern user interface... In addition, Eclipse is more than an editor. This highly modular and extensible IDE integrates all the tools developers need. It offers project wizards, an integrated compiler, a debugger, and more. The standard distribution ships with Java tailoring, but other languages such as C/C++ and even COBOL are supported through plug-ins, which guarantee portability. 
In fact, Eclipse IDE uses plug-ins -- including the Java editor, compiler, project wizards, and class browsers -- to create most of what you see. I could easily write similar plug-ins for XM to put together an XM IDE... Eclipse is the most ambitious editor API I have ever used. If you have used customized programmer editors before, the breadth of the Eclipse API may surprise you... I believe that Eclipse offers a promising framework on which to build the UI. It suffers from a lack of documentation, so learning took more time than I would have liked. In upcoming articles, I will write more useful plug-ins for XM..."

  • [October 09, 2002] "ITxpo: Ballmer Unveils Plans For New Office App." By Ed Scannell. In InfoWorld (October 09, 2002). "During his keynote address on Wednesday at the Gartner Symposium/ITxpo conference here, Microsoft CEO Steve Ballmer unveiled plans for developing a new desktop application, code-named XDocs, that promises to streamline the cumbersome process of gathering and distributing data from multiple sources. Scheduled for delivery in mid-2003 and expected to be a member of the Office family, XDocs can be integrated with a large number of business processes because it can support user-defined XML schemas that can be integrated with XML Web services. By so doing, Microsoft hopes to provide users with another way to reuse corporate data and improve information flow. 'The big breakthrough for Office lies in XML,' Ballmer said, adding that XML enables a slew of new features and capabilities, such as collaboration, better management, the ability for users to find the information that they want, and new Office application categories. With users stockpiling enormous amounts of new data that is strung out over a number of islands from desktops to large back-end servers and several places in between, it is getting more difficult for users to even locate -- never mind more effectively use -- that data competitively, Microsoft officials contend..." See references in the news item "Microsoft 'XDocs' Office Product Supports Custom-Defined XML Schemas."

  • [October 09, 2002] "Microsoft Adds XDocs to Office Family." By Peter Galli and Mary Jo Foley. In eWEEK (October 09, 2002). "Microsoft officials are promoting XDocs as a smart client like Office. 'Think of it as a hybrid information gathering tool for organizations that blends the benefits and richness of a traditional word processing program with the data capturing ability and rigor of a forms package into the XDocs templates,' Scott Bishop, an Office product manager, told eWEEK. There are two components to XDocs: the designer, which allows users to create templates and schemas; and the editor, which allows information to be inputted and viewed. The idea is to provide users with a set of 25 templates based on industry-standard XML. That will enable developers, third parties, corporate IT programmers and technical users to create additional templates, based on the XML schema they define, specific to their business or industry. Information will then be entered onto these templates. 'That allows customers to decide, through their own schema, what that data should look like. And because it's XML, we can then parse that data out of the document and send it to any XML-enabled back-end system from where it can also then be retrieved. It thus complements customers' existing infrastructures,' Bishop said. However, he declined to say how XDocs would fit with the existing suite of Office products or the next-version Office 11. He also would not confirm whether it will debut as a stand-alone product, as part of the Office suite or simply as a technological component of Office 11... The XDocs team is headed up by Peter Pathe, a corporate vice president who reports directly to Raikes, who is spearheading Microsoft's 'Structured Document Services' work. Microsoft has declined to comment on exactly what the Structured Document Services is. 
But according to the Microsoft corporate Web site, the SDS team 'is responsible for developing new products for knowledge workers that build on the foundations of the industry standard Extensible Markup Language (XML)'..." See: (1) references in the news item "Microsoft 'XDocs' Office Product Supports Custom-Defined XML Schemas"; (2) "Microsoft Office 11 and XDocs."
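Bishop's point — that because the form is XML, its fields can be parsed out of the document and handed to any XML-enabled back end — is easy to sketch. The expense-report schema below is hypothetical, since XDocs schemas were to be user-defined:

```python
import xml.etree.ElementTree as ET

# Hypothetical form instance, as a user-defined schema might shape it.
FORM = """
<expenseReport xmlns="http://example.org/expenses">
  <employee>Jane Doe</employee>
  <item><desc>Taxi</desc><amount>23.50</amount></item>
  <item><desc>Lunch</desc><amount>11.00</amount></item>
</expenseReport>
"""

NS = {"e": "http://example.org/expenses"}

def to_backend_record(xml_text):
    """Parse fields out of the form so they can be handed to a
    back-end system, independent of how the form was presented."""
    root = ET.fromstring(xml_text)
    return {
        "employee": root.findtext("e:employee", namespaces=NS),
        "total": sum(float(a.text)
                     for a in root.findall(".//e:amount", NS)),
    }
```

The same document could equally be transformed into whatever vocabulary the receiving system expects, which is the "complements customers' existing infrastructures" argument in a nutshell.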

  • [October 08, 2002] "eXcelon Stylus Studio 4.0: A Now-Mature XSLT Editor." Reviewed by Kurt Cagle. In New Architect Volume 7, Issue 11 (November 2002), page 46. "Today, eXcelon Stylus Studio has gone from being a promising concept to a superb, mature editor for XSLT. It has also extended its capability to other languages, including XML Schema Definition language (XSD) and Java servlets... Stylus works primarily as a text editor for XSLT. In addition to savvy tag completion, the editor also includes intelligent support for XHTML, XSLT, and XSD that shows the valid tags available within each element. The editor also recognizes the XSL Formatting Objects (XSL FO) namespace, which is beginning to make inroads in printed page formatting. The Stylus editor works natively with the Xalan-Java XSLT processor, which is relatively fast and currently very standards compliant (and includes a number of useful extensions of its own). However, it can also work with Microsoft's MSXML3 and MSXML4 processors, and it lets you substitute your own XSL Processor for the resident processor. Writing XSLT can be a hair-pulling operation at the best of times, and a critical part of any decent XSLT editor is the debugger. Stylus Studio includes a fair one, though I found its interface slightly kludgy compared to the competition. However, with Stylus Studio I could readily determine the errors, see the contents of currently defined variables and attributes, step over or through XSLT code one instruction at a time, and even watch in an output pane as a specific command altered the output... eXcelon Stylus Studio is definitely a useful tool for the XML/XSLT developer. It gives you remarkably powerful features for working with XSLT parameters, named or matched templates, different output formats, and more. 
Stylus Studio is reasonably priced, and it works well both as a standalone application for developing XSLT and in conjunction with other workflow products, including eXcelon's XML management system..."

  • [October 08, 2002] "Can Public Web Services Work?" By Amit Asaravala. In New Architect Volume 7, Issue 11 (November 2002), pages 34-38. ['Early adopters of Web services could win big -- or they might be left holding the bag when the services craze blows over.'] "... At a time when fears are high, SOAP's lack of security standards is a significant problem. When writing standard SOAP clients, developers can route all SOAP calls through a single SOAP object, and use the same XML parser when receiving SOAP messages in return. But when additional nonstandard security procedures are required, developers must write extra handling code for each client. It should be noted, though, that SOAP also gets a bad rap where security is concerned. It's true that SOAP traffic can pass through firewalls on port 80 along with other more benign HTTP requests. Critics claim that because port 80 is usually left open when organizations host Web sites or want Web access, firewalls aren't optimized to catch incoming SOAP messages. In truth, administrators have more control than this. One easy security measure is to customize filters to catch all messages with a MIME content type of text/xml-soap. They can also force clients to submit M-POST requests rather than using the traditional POST method. With M-POST requests, one or more fields in the message header are mandatory. Firewall admins can require incoming SOAP headers to state the action they wish to perform, and then filter messages based on this information. (Savvy admins will also verify that the action in the SOAP body matches that in the header.) One problem that Web services can't get around, however, is latency. Although low-bandwidth connections, congested routers, and overloaded servers have always been the nemesis of distributed computing projects, Web services add still another bottleneck: the SOAP parser. 
Because a SOAP message is an XML document, any client or server that receives one must build a document object tree in memory before acting on it. Anyone who accesses a Web site will have to wait... Web services will survive because the concept has already won the popularity contest. There is a need for distributed applications; Web services are built on simple protocols; and many Web developers find SOAP familiar and easy to use. All these reasons spur the adoption of this new technology..."
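The admin-side check the article describes -- verifying that the action declared in the SOAP header matches what the body actually requests -- can be sketched with Python's standard XML parser. The envelope, the header element name, and both namespaces below are illustrative inventions, not taken from the article; the parse call also illustrates the latency point, since the whole document tree must be built before any check can run:

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

# A hypothetical SOAP envelope whose header declares the intended action.
envelope = """<?xml version="1.0"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Header>
    <Action xmlns="urn:example:filter">GetQuote</Action>
  </soap:Header>
  <soap:Body>
    <GetQuote xmlns="urn:example:quotes">
      <Symbol>IBM</Symbol>
    </GetQuote>
  </soap:Body>
</soap:Envelope>"""

def action_matches(doc: str) -> bool:
    """Return True if the header's declared action names the body's first child."""
    root = ET.fromstring(doc)  # the full-tree parse is the latency cost noted above
    header_action = root.findtext(f"{{{SOAP_NS}}}Header/{{urn:example:filter}}Action")
    body = root.find(f"{{{SOAP_NS}}}Body")
    first_call = list(body)[0]
    # Strip the namespace from the body element's tag before comparing.
    local_name = first_call.tag.split("}")[-1]
    return header_action == local_name

print(action_matches(envelope))  # prints True for this envelope
```

A firewall or gateway applying this rule would reject any message where the two disagree, which is the mismatch the "savvy admins" remark warns about.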

  • [October 08, 2002] "Management: XML Goes to Washington." By Gary H. Anthes. In Computer World (October 07, 2002). ['The feds are tapping XML for interoperability, public access and record-keeping.'] "Like private industry, the federal government is beginning to broadly embrace XML, the open standard for exchanging information among disparate computer systems. Government IT managers say the deployment of XML will facilitate interoperability among federal systems and between the government and corporations. They say it will also ultimately simplify government record-keeping and reduce the waste, fraud and abuse that result from poor records management practices. But meeting such goals isn't without challenges, including a proliferation of nonstandard data definitions and structures, a lack of a cohesive federal strategy for XML adoption and concerns about security. The private sector faces similar challenges in its deployment of XML, so how the government proceeds could provide some guidelines for the corporate world. Nevertheless, a number of initiatives aimed at bringing the government the benefits of XML are under way in Washington... Marion Royal, co-chairman of the XML Working Group and an agency expert at the General Services Administration, says he's working with industry groups for banking, insurance, automotive and the like to avoid 'reinventing the wheel.' For example, he says, the Department of Energy may be able to use XML data structures, schemas and terminology developed by the oil industry. Royal says the governmentwide XML registry will give the private sector 'a single schema to use to do business with all government agencies, regardless of whether it's business reporting, actual sales transactions or information discovery.' That will result in standard data structures and definitions and possibly reusable software components, which are lacking today. 
The XML group is also working with the Organization for the Advancement of Structured Information Standards on e-Business XML (ebXML) technical specifications. EbXML allows trading partners to discover one another and conduct business over the Internet. 'EbXML holds a lot of promise,' Royal says... Royal says vertical industry groups are addressing industry-specific needs for their XML-based standards but that not enough is being done to address 'horizontal' functions, such as user authentication. He blames this in part on IT vendors that fear losing a competitive advantage based on their proprietary approaches to security..." See XML.gov.

  • [October 08, 2002] "Microsoft Gives Sneak Peek at Jupiter, Greenwich Technologies." By Cathleen Moore. In InfoWorld (October 08, 2002). "Sketching the rough details of its vision for a connected business environment around Web services, Microsoft revealed a new e-business server project, code-named Jupiter, and gave more details around its vision for real-time computing in the enterprise... Reduced complexity and bolstered connectivity will come from Jupiter, a set of technologies designed to componentize and unify Microsoft's e-business server line of products including BizTalk Server, Commerce Server, and Content Management Server, according to David Kiker, general manager of e-business servers, who also participated in the keynote address. Essentially, Jupiter will offer business process management capabilities and Web services technology such as the BPEL4WS (Business Process Execution Language for Web Services) to allow the servers to work together as a unified platform. Jupiter also will provide tighter integration with Visual Studio .Net and Office in order to improve the experience for developers and information workers. For example, through direct connectivity to Microsoft Office, Jupiter could allow a business user to monitor and manage a real-time business process from inside an Office application such as Word or Excel, Kiker said. Jupiter will be delivered in two phases over the next year to 18 months. The first set of technologies is scheduled to be introduced in the second half of 2003, with the focus initially on services for business process management, XML Web services, and internal and external integration. Specific offerings were not detailed, but will include process automation, workflow, integration technology, support for BPEL4WS, and an integrated developer environment, according to Kiker..." See details in the announcement: "Microsoft Details Vision for the Connected Business and Unveils 'Jupiter'. 
The 'Jupiter' Vision Aims to Unify and Extend Current E-Business Server Technologies And Include Standardized Business Process Management Capabilities, Deeper Support For XML Web Services, and Richer Developer and Information Worker Experiences."

  • [October 08, 2002] "Second-Generation RosettaNet." By Richard Karpinski. In InternetWeek (October 08, 2002). "Arrow Electronics -- an early backer of RosettaNet e-business standards -- is looking to drive use of the technology even deeper into its supply chain. For Arrow, one of the largest electronics distributors in the world, supply-chain connectivity is a major challenge because the company touches so many other parts of the value chain. Arrow today already supports about 1,200 'machine-to-machine' connections via electronic data interchange (EDI). For instance, about 800 accounts get manufacturer retail price requirements through EDI-based collaborative techniques, said Paul Katz, Arrow's vice president, supply-chain solutions. Beyond that, Arrow also has been committing to RosettaNet standards for several years. That industry initiative, which defines a series of tech-industry-specific XML standards to support common supply-chain transactions, has been working hard to get broad adoption of its efforts. Some big users, perhaps most notably Cisco and Intel, have been strong backers of RosettaNet. And their efforts have the potential to drive the standards down through second-tier suppliers. Arrow, while not quite on the scale of a company like Cisco, is aiming to grow its RosettaNet transactions as well. Today, according to Katz, the vendor supports almost 20 different PIPs -- or XML partner interface processes -- with about two dozen trading partners. Katz is hoping to double that number in the next year or so. Helping in this effort is Viacore, which positions itself as a RosettaNet 'business-tone' provider, the analogy being that a supply-chain customer like Arrow can tap into Viacore's 'always-on' transaction network to move RosettaNet transactions to suppliers. 
Indeed, Viacore's network-based approach to the supply chain solves perhaps the biggest problem faced by companies like Arrow -- simply getting enough trading partners on-line to make a networked supply chain really pay dividends... But perhaps even more importantly, Katz hopes to move Arrow and its business partners from moving 'purely informational' PIPs to newer XML interfaces that exchange more meaningful transaction data. 'RosettaNet has the capacity to cover not just information but also commercial transactions,' said Katz. 'My forecast is that we'll begin to get to those types of capabilities.' RosettaNet holds some strong advantages over EDI. While EDI transactions move to and fro in batches -- often with a significant time delay -- RosettaNet's XML transactions move via the Internet in real-time. That gives Arrow 'a real-time view' into how its business is progressing on a minute-by-minute basis. 'With EDI, there are classic latency problems,' Katz said. 'The acknowledgement process can turn from hours into days. With RosettaNet, transactions come over in a matter of minutes, exceptions can take longer, but get resolved in hours'..." See "RosettaNet."

  • [October 08, 2002] "IBM Paves The Way For Open Standards." By [SME IT Staff]. In SME IT Guide October 2002 Issue. "IBM Singapore is setting up an $18 million centre that is aimed at helping industries, government institutions and users in Singapore move to IT operating environments based on open standards and Linux. The IBM Open Computing Centre is made up of two nodes -- the IBM-Nanyang Polytechnic Web Services Innovation Zone (WIZ) and the IBM Linux Integration Zone (LIZ). To mark the launch of the IBM Computing Centre, a memorandum of understanding on the establishment of WIZ was signed by Nanyang Polytechnic's Principal and Chief Executive Officer, Lin Cheng Ton, and General Manager of IBM ASEAN/South Asia, Satish Khatu. While WIZ aims to promote Open Web Services Standards such as XML, SOAP, WSDL and UDDI to allow software developers to work with open standards programming models and deploy Web services over multiple platforms, LIZ will promote the adoption of Linux by business partners, customers and developers by providing them with the platform to build, test and deploy a variety of Linux-based solutions... LIZ, located at IBM's regional headquarters at Changi Business Park, will not only promote the adoption of Linux-based solutions but also provide training, technical support and integration services for IBM Linux-based middleware, products and solutions. Apart from the initial $18 million investment, IBM has also committed another $22 million over the next three years..."

  • [October 08, 2002] "Document Object Model (DOM) Level 3 Validation Specification Version 1.0." W3C Working Draft 08-October-2002. Edited by Ben Chang (Oracle), Joe Kesselman (IBM), and Rezaur Rahman (Intel Corporation). Latest version URL: http://www.w3.org/TR/DOM-Level-3-Val. Last Call Working Draft for review by W3C members and other interested parties. "This specification defines the Document Object Model Validation Level 3, a platform- and language-neutral interface. This module provides the guidance to programs and scripts to dynamically update the content and the structure of documents while ensuring that the document remains valid, or to ensure that the document becomes valid. ... [Validation] describes the optional DOM Level 3 Validation feature. This module provides APIs to query information about the XML document. A DOM application can use the hasFeature method of the DOMImplementation interface to determine whether a given DOM supports these capabilities or not. This module defines 1 feature string: 'VAL-DOC' for document-editing interfaces. This chapter focuses on the editing aspects used in the XML document-editing world and usage of such information..." Also in PDF format. See (1) W3C DOM Activity; (2) local references in "W3C Document Object Model (DOM)."
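The hasFeature probe the draft describes can be tried against any DOM implementation. As a small sketch, Python's bundled minidom reports the Level 2 Core feature but not the Level 3 'VAL-DOC' feature string from this draft, since it implements only DOM Level 2:

```python
from xml.dom import getDOMImplementation

impl = getDOMImplementation()

# minidom implements DOM Level 2 Core, so this probe succeeds...
assert impl.hasFeature("core", "2.0")

# ...while the Level 3 Validation feature string defined by this
# Working Draft is not implemented by the standard-library DOM.
assert not impl.hasFeature("VAL-DOC", "3.0")
```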

  • [October 08, 2002] "Swingtide Founders Jack Serfass and David Wroe: XML's Ticking Bomb." By Jack Serfass and David Wroe. In CNet News.com (October 07, 2002). [Note: XML-DEV comments from Len Bullard, Mike Champion, Dare Obasanjo and others under the thread title "Sky is falling again."] "XML is a promising and far-reaching development in computing. Yet the mere fact we're all speaking the same language doesn't mean we're making beautiful music. Just because an XML message is "valid" (meets standards for the Extensible Markup Language) does not mean it can work with other valid XML messages, or that it will behave as expected. As a result, we're losing control of the hundreds of disparate XML vocabularies we've developed, the mountains of XML-ized data we've generated and the millions of XML-enabled software components we've created. They're all incredibly valuable in their own right, but a huge challenge to manage -- particularly at the point they connect to execute business processes. If we don't harness XML properly, billions of dollars will be spent fixing mistakes... Even though XML is a lingua franca, the price of its expedience has become complexity and inefficiency. Chief information officers are encountering layers upon layers of XML data generated by multiple platforms through myriad applications. Companies may be sitting on a goldmine of XML services and be unaware of it. They may also be sitting on potential problems..."

  • [October 08, 2002] "OAGIS 8: Practical Integration Meets XML Schema." By Mark Feblowitz (Frictionless Commerce). Originally published in XML Journal Volume 3, Issue 9 (September 2002), pages 22-28. ['Frictionless Commerce's XML Architect Mark Feblowitz played a lead role in architecting OAGIS 8, a milestone in XML-enabled integration. In this XML Journal article, he authors a comprehensive overview of OAGIS 8, which infuses an established, technology-neutral integration standard with the enhanced capabilities of XML Schema.'] "This article describes the Open Applications Group Integration Specification, discusses the enhancements made possible by rearchitecting to Schema, and explores the challenging aspects of applying current Schema technology. Despite those challenges, OAGI architects were able to work with Schema to craft a new OAGIS that sustains proven strengths and adds desirable and innovative features, most notably, Overlay Extensibility. ... Like all XML interchange, a BOD is generated by the sending application, which serializes content from database tables or business objects, conformant to the BOD's respective schema. The BOD is consumed by the receiving application; it's typically validated against its schema and checked for adherence to BOD constraints (more on this later) and then mapped into tables or business objects. The application continues its processing, likely following up with subsequent BOD interchange. BOD processing architectures vary, based on transaction loads, performance needs, application architectures, and the like. BODs can be validated and constraint-checked using either DOM- or SAX-based architectures. Validation can be disabled for performance purposes among stable, trusted interchange partners. Much of what's needed for typical business transactions is captured in core OAGIS. But needs differ; industries, organizations, companies conduct business according to their specialized vocabularies. 
OAGIS was designed with this in mind. OAGIS 8 supports two forms of extensibility: UserArea Extensibility and Overlay Extensibility. When core OAGIS carries most of what's needed, an OAGIS user can add content to any BOD by populating one of the optional UserArea elements. When BOD extensions are numerous, or when additional BODs and/or components are needed, the OAGIS user can build Overlay Extensions, creating new vocabularies by building on core OAGIS components and BODs. With Overlay Extensibility new layers are defined in their respective namespaces. Specialized BODs and components are defined by extending BODs from lower layers and/or by composing new BODs from a combination of existing, extended, and new components. To support this, OAGIS BODs -- and most constituent parts -- are overlay extensible. To achieve Overlay Extensibility, OAGIS relies on three Schema mechanisms: (1) Schema's incorporation of XML Namespaces; (2) complexType derivation by extension; (3) Substitution groups... The OAGIS 8 project addresses an extremely complex XML model, certainly pushing the boundaries of Schema. Complex, creative solutions -- especially those that appear simple to users of the model while hiding significant complexity faced by the developers of the model -- were often called for. To ensure Schema's future success, the Schema Working Group must continue to attend to the practical necessities of Schema use..." See "Open Applications Group."
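The three Schema mechanisms listed above combine roughly as follows. This fragment is a hypothetical overlay, with invented namespaces and type names rather than anything taken from OAGIS itself: a vertical-industry layer imports the core namespace, derives a specialized BOD type by extension, and declares its element substitutable for the core one.

```xml
<xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema"
            xmlns:core="urn:example:oagis-core"
            xmlns:auto="urn:example:oagis-automotive"
            targetNamespace="urn:example:oagis-automotive">
  <!-- (1) The overlay lives in its own namespace and imports the core layer. -->
  <xsd:import namespace="urn:example:oagis-core" schemaLocation="core.xsd"/>

  <!-- (2) Derive a specialized type from the core BOD type by extension. -->
  <xsd:complexType name="AutomotivePurchaseOrderType">
    <xsd:complexContent>
      <xsd:extension base="core:PurchaseOrderType">
        <xsd:sequence>
          <xsd:element name="VIN" type="xsd:string"/>
        </xsd:sequence>
      </xsd:extension>
    </xsd:complexContent>
  </xsd:complexType>

  <!-- (3) A substitution group lets the new element appear anywhere
       the core element is allowed. -->
  <xsd:element name="AutomotivePurchaseOrder"
               type="auto:AutomotivePurchaseOrderType"
               substitutionGroup="core:PurchaseOrder"/>
</xsd:schema>
```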

  • [October 08, 2002] "ITxpo: Gartner Grades The Web Services Standards." By Tom Sullivan. In InfoWorld (October 08, 2002). "Gartner analyst Larry Perlstein on Tuesday issued something of a report card on the forthcoming set of core Web services standards here at Gartner's Symposium/ITxpo conference. Perlstein's grading covered what forms the core low-level de facto set of protocols for Web services -- SOAP, WSDL, and UDDI -- as well as some of the protocols that are emerging higher up the stack, including BPEL4WS (Business Process Execution Language for Web Services). Perlstein also graded two of the so-called standards bodies involved in Web services, the WS-I (Web Services Interoperability Organization) being led by Microsoft, IBM, and BEA, and the Liberty Alliance, spearheaded by Sun Microsystems. Beginning with SOAP, which received a grade of Strong Positive, the highest mark Perlstein handed out, he said that its strengths include broad vendor support, broad tool support, and relative ease of use. On the other hand, SOAP is still a specification, and as such has lots of security holes and scalability issues... He added that the growing complexity may raise concerns in the future. But by 2007, SOAP will be used in approximately 60 percent of all new development projects. Perlstein gave WSDL, the specification for describing Web services, a Positive rating, for some of the same reasons as SOAP, most notably broad vendor support and growing tool support, as well as the fact that it is implementation-independent... UDDI received the lowest mark of the three low-level specifications, a grade of Promising, and Perlstein cited limited adoption and tools support as well as the specification not being in W3C as the reasons... Higher up the Web services stack, Perlstein gave BPEL4WS a Promising rating, and called it the 'worst-named specification.' 
BPEL4WS' strengths include the combination of technologies it inherited when subsuming Microsoft's XLang and IBM's WSFL (Web Services Flow Language) and the fact that it standardizes process logic and interaction. In 2005, Gartner predicts that companies will be dealing with multiple implementations. Perlstein also graded the Liberty Alliance and WS-I, giving a Promising and Positive mark, respectively. He said Liberty's strengths are Sun's leadership, the open alternative to Passport, and the idea that multiple IDs increase anonymity. Its weaknesses are that it competes with Passport, is still a specification, and that it has not incorporated with WS-I. WS-I's strengths are a solid charter and deliverables, as well as the vendors' ability to speed adoption of the specification..."

  • [October 08, 2002] "Shootout At The Transaction Corral: BTP Versus WS-T." By Roger Sessions (Austin, Texas). In ObjectWatch Newsletter Number 41 (October 3, 2002). "... The joint IBM, BEA, Microsoft standard is called WS-T, which stands for Web Service-Transactions. The other standard is an effort of the Business Transactions Technical Committee of OASIS (Organization for the Advancement of Structured Information Standards). The contributors to this standard are Sun, Oracle, BEA, HP, and assorted riff-raff. This standard is known as BTP, which stands for Business Transaction Protocol... Both standards agree on two things. First, they both agree that they are focusing on transactions that span web services (what the cognoscenti would call software fortresses). Second, they both agree on the term transactions to describe the collection of work that must be coordinated across web services, although they do not agree on exactly what a transaction is... BTP is older than WS-T, but only by two months. BTP came out June 3, 2002. WS-T came out August 2, 2002. The failure of OASIS to get backing from either IBM or Microsoft and apparently only lukewarm backing from BEA must have been somewhat demoralizing, given that these three companies control virtually every business transaction processed in the world today. Then, when these three companies came out with a competing standard just two months later, the impact could only have been devastating to OASIS. These two standards leave you, the enterprise architect, in a rather difficult situation. BTP and WS-T are very different standards that do very similar types of work. Realistically, you can design your web service to be compatible with one or the other, but not both. Which should you choose? Which one is most likely to become the most widely adopted? ... Given all of my criticisms of WS-T, you might think that I favor the BTP standard. To some extent, this is true. 
BTP starts, for example, with a better understanding of the underlying problem that needs to be solved. BTP assumes that the transactions one wants to coordinate across web services are very different in nature than the all or nothing transactions that are the focus of WS-T's ATs. This, in itself, is a major improvement. BTP assumes, correctly, that the coordination needed between different web services (or what I would call software fortresses) is more of a contract coordination than a tightly coupled transaction coordination. And yet, even BTP has problems. Its biggest problem is, like WS-T, complexity. While it does not suffer from a confusing and useless inheritance model, it does include two different models for doing essentially the same thing with no reasonable explanation of why both are needed or even how one would choose between the two. BTP has both an 'atomic transaction' model and a 'cohesive transaction' model. The BTP atomic transaction model only sounds like the WS-T's atomic transactions (ATs). In fact, they are entirely different. BTP atomic transactions are just groupings of work that will either all complete or all fail/compensate. In this regard, they are much like WS-T's business activities (BA). My breakfast is a good example of a BTP atomic transaction..." See: (1) "Web Services Specifications for Business Transactions and Process Automation"; (2) "OASIS Business Transactions."

  • [October 07, 2002] "Perspective: The Patent Threat to the Web." By Bruce Perens. In CNet News.com (October 07, 2002). "The penultimate step in a yearlong battle over patents on Internet standards came last week, when the World Wide Web Consortium Patent Policy Board voted to recommend a royalty-free policy. The World Wide Web Consortium (W3C) draft recommendation has not yet been published. After public comments are solicited, the draft will be presented to the organization's director, Tim Berners-Lee, and an advisory board representing all of the consortium members. Large patent-holding companies may present a dissenting minority report, but Berners-Lee and the board are likely to approve the royalty-free proposal. As one of the three representatives of open source and free software on the patent policy board, I welcome the new recommendation. Developers of the Apache Web server, the Linux kernel and GNU system, and other popular free software will continue to be able to implement W3C standards in competition with proprietary software. Had the decision gone for so-called 'RAND' patents -- licensed with 'reasonable and non-discriminatory terms,' but sometimes requiring royalty payments -- the effect would have been to create a tollbooth on the Internet, owned by the largest corporations, collecting a fee for the right to implement open standards. Open-source developers, who do not collect royalties -- and thus cannot afford to pay them -- would have been locked out entirely. Smaller companies that develop proprietary software would have been at a disadvantage, compared with the largest corporations, which cross-license their patent portfolios to each other and thus would not be burdened by royalty payments. 
A royalty-free policy doesn't assure that standards won't be covered by patents, because a patent may become known after a standard has been adopted -- especially a patent applied for in the United States, whose government is almost unique in allowing patent applications to remain secret for an extended period. Under the new policy, the W3C would probably withdraw a standard so affected, and re-engineer it to avoid the patent... A royalty-free policy also may scare the largest companies away from participation in standards working groups, for fear that they will lose lucrative licensing rights. Working groups that create new W3C standards will require their members to file a declaration granting royalty-free rights to anyone for 'essential claims' -- patent claims that would unavoidably be infringed while implementing a standard. This will prevent 'patent farming,' the practice among patent holders of deliberately embedding their patents in new standards as a revenue-generating strategy. However, the grant of royalty-free rights does not apply to any use of the patent other than to implement the standard. Thus, patent holders will still have lots of opportunities to sell licenses to developers who wish to practice their patents for any other purpose... The W3C's decision will resolve only a single battle in a much broader war. A similar royalty-free policy must now be enacted by the Internet Engineering Task Force (IETF) and many other organizations. Some standards bodies may decide to buck the trend and act as playgrounds for large patent holders. Those organizations will argue that by allowing patent royalties, they will always be able to choose the best algorithm for any job. It will then fall to the market to decide which organizations it will follow. This battle must also be taken to the various governments and treaty organizations that produce bad law permitting the patenting of software and business systems, and continue to do so..." 
See: (1) "Summary of 30 September - 1 October 2002 Patent Policy Working Group meeting"; (2) "W3C Patent Policy Working Group Royalty-Free Patent Policy"; (3) general references in "Patents and Open Standards."

  • [October 07, 2002] "Qwest Touts VoiceXML." By Ann Bednarz. In Network World (October 07, 2002). "Qwest has launched a development portal designed to help companies roll out voice-enabled applications easily and inexpensively. Called Qwest Development Network, the portal supplies subscribing companies with development tools, technical support and access to network services required to design, test and launch applications with interactive voice response and speech-recognition features. The portal is tied to Qwest Web Contact Center, a Web-based platform that routes and processes voice applications. By reserving voice ports on the Qwest gear, companies can avoid having to buy, deploy and maintain their own interactive voice response (IVR) hardware, says Alex Danyluk, senior director of Qwest Solutions. Development Network is based on VoiceXML (VXML), an extension to the XML document-formatting standard that streamlines development of voice-driven applications for retrieving Web content. VXML lets users navigate Web content via telephone commands. In the same way that customers use a Web browser to access data contained in corporate directories and databases, with a VXML-based voice application, callers can retrieve data from the same sources via spoken commands or keypad entries. In the Qwest setup, the service provider takes responsibility for the heavy lifting. Qwest's infrastructure supports touch-tone detection and speech-recognition technologies from Nuance and Speechworks, and it integrates with advanced call routers such as Genesys and Cisco's ICM Computer Telephony Integration platforms. It acts as a middleman between customers and the company's Web server, translating a customer's spoken query into text, forwarding it to a corporate Web server, then converting the Web server's reply from text to speech for delivery over the telephone..." 
See: (1) the announcement: "Qwest Communications Launches Service To Help Customers Design Voice-Enabled Customer Service Applications."; (2) "VoiceXML Forum."
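A minimal VoiceXML document of the kind such a platform serves might look like the following. This is a generic illustration of the dialog markup, not Qwest's actual content; the URL and field name are invented:

```xml
<?xml version="1.0"?>
<vxml version="2.0" xmlns="http://www.w3.org/2001/vxml">
  <form id="lookup">
    <field name="account">
      <prompt>Please say or key in your account number.</prompt>
      <filled>
        <!-- Hand the recognized value to a server-side script,
             which replies with more VoiceXML to be read aloud. -->
        <submit next="http://example.com/balance" namelist="account"/>
      </filled>
    </field>
  </form>
</vxml>
```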

  • [October 07, 2002] "Document Object Model (DOM) Level 2 HTML Specification Version 1.0." W3C Candidate Recommendation 07-October-2002. Edited by Johnny Stenback (Netscape), Philippe Le Hégaret (W3C), and Arnaud Le Hors (W3C and IBM). Latest version URL: http://www.w3.org/TR/DOM-Level-2-HTML. The specification "defines the Document Object Model Level 2 HTML, a platform- and language-neutral interface that allows programs and scripts to dynamically access and update the content and structure of HTML 4.01 and XHTML 1.0 documents. The Document Object Model Level 2 HTML builds on the Document Object Model Level 2 Core and is not backward compatible with DOM Level 1 HTML. The goals of the HTML-specific DOM API are: (1) to specialize and add functionality that relates specifically to HTML documents and elements. (2) to address issues of backwards compatibility with the DOM Level 0. (3) to provide convenience mechanisms, where appropriate, for common and frequent operations on HTML documents. The key difference between the core DOM and the HTML application of DOM is that the HTML Document Object Model exposes a number of convenience methods and properties that are consistent with the existing models and are more appropriate to script writers. In many cases, these enhancements are not applicable to a general DOM because they rely on the presence of a predefined DTD. The transitional or frameset DTD for HTML 4.01, or the XHTML 1.0 DTDs are assumed. Interoperability between implementations is only guaranteed for elements and attributes that are specified in the HTML 4.01 and XHTML 1.0 DTDs..." Also in PDF format. See: (1) W3C Document Object Model (DOM); (2) local references in "W3C Document Object Model (DOM)."

  • [October 07, 2002] "Enabling Trusted Web Services. Embed Trust and Security Into the New Infrastructure." By Phillip Hallam-Baker. In Web Services Journal Volume 2, Issue 10 (October 2002), page 5. "Web services are demonstrating their value and exhibiting the potential to substantially enhance enterprise productivity and reduce operating costs. But they will never reach their full potential without two things: trust and security. That's because Web services are based on open, dynamic exchange of valuable data and services. But for everything to work the way it's intended, those deploying Web services must be able to ensure that the data or services being exchanged are kept confidential, secure, and reliable. To deploy trusted Web services, you really need five things: (1) High availability: The Web services must be easy to find using public or private directories. (2) Privacy: Communications absolutely must be safe from eavesdroppers. (3) Data integrity: Data exchanged by Web services must be safe while in transit. (4) Authentication: Web services must positively identify the services with which they communicate. (5) Authorization: Web services must intelligently restrict access to sensitive data and functions... There are a number of standards and specifications floating about right now that attempt to address each of these specific areas. Most notably, VeriSign, Microsoft, and IBM recently co-authored a spec called WS-Security that attempts to add a layer of security to SOAP messages. WS-Security will serve as the foundation for a number of subsequent specifications the three companies hope to sponsor, including WS-Policy, WS-Trust, WS-Privacy, WS-Secure Conversation, WS-Federation, and WS-Authorization. Some of these names may change, but this roadmap does show a strategic approach to building out the standards and technology for enabling trusted Web services..." [alt URL]

  • [October 07, 2002] "Web Services Infrastructure. Building Transactional Web Services with OASIS BTP." By Jim Webber (Arjuna Laboratory, Hewlett-Packard). In Web Services Journal Volume 2, Issue 10 (October 2002), pages 50-54. "It's a fact: Web services have started to mature. Those emergent standards that once held so much promise are now actually starting to deliver useful implementations. With the basic Web services plumbing mastered, we're starting to see more advanced infrastructure, which enables these second-generation Web services to focus on complex interactions over the Internet. This article, the first of a two-part series, covers one such aspect of the second-generation infrastructure for Web services: transactions... The OASIS Business Transactions Protocol, or BTP, has become the prominent standard for Web services transactions. BTP is the product of just over a year's work by vendors such as HP, BEA, and Oracle, and has resulted in the development of a transaction model particularly suited to loosely coupled systems like Web services. In this article, we're going to look at how BTP fits into the whole Web services architecture, and how we can use one of the vendor toolkits (we'll use HP's toolkit, but the underlying principles apply to other vendors' software) to build and consume transaction-aware Web services. But before we do, let's review the architecture in the context of a simple transactional scenario. The diagram shown in Figure 1 is similar to a typical high-level Web services architecture. The only differences here are that one service, the transaction manager, has been singled out as being distinct from the other Web services (which we assume are responsible for some aspects of a business process), and the fact that we've chosen to identify two distinct categories of messages: control messages (which are used to control transactions) and application messages (which propagate application data around the system)..." 
See also the source code listings. See: OASIS Business Transactions TC website; local references in "OASIS Business Transactions Technical Committee."
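The control-message/application-message split described above implies that a transaction context travels alongside ordinary application traffic. As a heavily simplified, purely illustrative sketch (element names and the namespace URN are from memory of the BTP 1.0 schema and should be checked against the specification; the addresses are invented), the context a transaction manager propagates looks something like:

```xml
<btp:context xmlns:btp="urn:oasis:names:tc:BTP:1.0:core">
  <!-- Where participants should send control messages (enrol, prepare, etc.) -->
  <btp:superior-address>
    <btp:binding>soap-http-1</btp:binding>
    <btp:binding-address>http://example.com/transaction-manager</btp:binding-address>
  </btp:superior-address>
  <btp:superior-identifier>...</btp:superior-identifier>
  <!-- BTP distinguishes all-or-nothing "atoms" from selectable "cohesions" -->
  <btp:superior-type>atom</btp:superior-type>
</btp:context>
```

In SOAP bindings this context is typically carried as a header block on application messages, which is what lets ordinary Web service calls become transaction-aware without changing their bodies.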

  • [October 07, 2002] "Designing Web Services with XML Signatures." By Mark Young (Kamiak Corp). In Web Services Journal Volume 2, Issue 10 (October 2002), pages 10-12. [Note: Kamiak supports the Omniopera WSDL and Schema Editor, reviewed in XML Journal.] "XML signatures apply digital signatures to XML documents. Digital signatures let parties that exchange data ensure the identity of the sender and the integrity of the data. This last item is a benefit that physical signatures can't provide. Digital signatures don't have the legal status that physical signatures have, at least not yet. Over the last few years this has been changing, as the federal government and many state governments have moved to accept digital signatures as legally binding for some applications, where they identify the sender and provide nonrepudiation (the signer can't deny ever having signed). Since Web services that use SOAP exchange XML documents, they can include XML signatures. Web service developers who produce Web services with XML signatures will be able to realize the benefits of those signatures, and may be able to use them in place of physical signatures where the legal community has agreed they are binding. Putting XML signatures into a Web service, however, is not trivial. In this article I'll provide some background on XML signatures, and then explore how you can incorporate them into your Web services. I'll develop an example Web service, written in Java for Apache Axis, that uses an XML signature, to illustrate how it can be done..." See: (1) W3C XML Digital Signatures Activity Statement; (2) local references in "XML Digital Signature (Signed XML - IETF/W3C)."
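The ds:Signature structure the article builds on has a fixed skeleton defined by the XML-Signature recommendation. A minimal enveloped-signature sketch (digest, signature, and key values elided; the Algorithm URIs are the standard XML-DSig identifiers):

```xml
<Signature xmlns="http://www.w3.org/2000/09/xmldsig#">
  <SignedInfo>
    <CanonicalizationMethod Algorithm="http://www.w3.org/TR/2001/REC-xml-c14n-20010315"/>
    <SignatureMethod Algorithm="http://www.w3.org/2000/09/xmldsig#rsa-sha1"/>
    <!-- URI="" means "the document containing this signature" -->
    <Reference URI="">
      <Transforms>
        <!-- Excludes the Signature element itself from the digest -->
        <Transform Algorithm="http://www.w3.org/2000/09/xmldsig#enveloped-signature"/>
      </Transforms>
      <DigestMethod Algorithm="http://www.w3.org/2000/09/xmldsig#sha1"/>
      <DigestValue>...</DigestValue>
    </Reference>
  </SignedInfo>
  <SignatureValue>...</SignatureValue>
  <KeyInfo>...</KeyInfo>
</Signature>
```

Only SignedInfo is actually signed; the Reference digests bind the signed data to it, which is why canonicalization and transforms matter so much in practice.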

  • [October 07, 2002] "Business Process with BPEL4WS: Learning BPEL4WS, Part 3. Activities and the in-memory model." By Matthew Duftler and Rania Khalaf (IBM). From IBM developerWorks, Web services. October 2002. Series: "Business Process with BPEL4WS." ['The recently released Business Process Execution Language for Web Services (BPEL4WS) specification is positioned to become the Web services standard for composition. This series of articles aims to give readers an understanding of the different components of the language, and teach them how to create their own complete processes. The previous parts of the series gave an overview of the language, and took readers through creating their first simple process. This part will cover each of the activities in more detail. We will also cover how the various BPEL4WS constructs may be represented and manipulated in memory.'] "Now that we've gone over the language basics (Part 1), and we have created a simple example (Part 2), let's go over how to use each of the activities in more detail. In this article, we will provide detailed descriptions of each of the BPEL4WS activities. We will also describe the in-memory representation employed by BPWS4J to represent BPEL4WS processes, and give an example illustrating the model's use... Basic activities are the simplest form of interaction with the outside world. They are non-sequenced and individual steps that interact with a service, manipulate the passing data, or handle exceptions. There are three activities a process can use for interacting with the outside world: <invoke>, <reply>, and <receive>. As we saw in the previous articles, the interactions occur with the partners of the process using these three activities. By specifying a portType, operation, and partner, each of these activities identifies the Web services call it belongs to. The <invoke> activity is used by a process to make invocations to Web services provided by partners. 
In addition to the portType, partner, and operation, the invoke specifies input and output containers, for the input and output of the operation being invoked. An invocation can be either synchronous (request/response) or asynchronous (one-way). In the latter case, only an input container is required. A business process provides services to its partners through a pair of <receive> and <reply> activities. The receive represents the input of a WSDL operation provided by the process. If the process needs to send back a reply to the partner who sent the message, then a reply activity is necessary. Multiple reply activities may be defined in the process to answer that partner's call; however, only one matching <reply> may become active at any one time. The matching of the appropriate reply activity is done at runtime, when the process looks for such an activity that is ready to run and has the same portType, operation, and partner as the <receive>..." See: "Business Process Execution Language for Web Services (BPEL4WS)."
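The three interaction activities described above can be sketched in BPEL4WS 1.0 syntax as follows (the partner, portType, operation, and container names are invented for illustration; note that BPEL4WS 1.0 uses "container" where later revisions use "variable"):

```xml
<!-- Accept a WSDL operation invocation from a partner;
     createInstance starts a new process instance -->
<receive partner="customer" portType="lns:purchaseOrderPT"
         operation="sendPurchaseOrder" container="PO"
         createInstance="yes"/>

<!-- Synchronous (request/response) invocation of a partner's service;
     an asynchronous one-way invoke would omit outputContainer -->
<invoke partner="shippingProvider" portType="lns:shippingPT"
        operation="requestShipping"
        inputContainer="shippingRequest"
        outputContainer="shippingInfo"/>

<!-- Answer the earlier receive: same partner, portType, and operation -->
<reply partner="customer" portType="lns:purchaseOrderPT"
       operation="sendPurchaseOrder" container="invoice"/>
```

The matching rule described in the article is visible here: the reply is paired with its receive at runtime by the shared partner/portType/operation triple.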

  • [October 07, 2002] "Web Services Security Spec Threatened." By Darryl K. Taft. In eWEEK (October 07, 2002). "...Microsoft Corp. and IBM, which, along with VeriSign Inc., published the original Web Services-Security specification, are now in two camps that have contrasting views over what should be done with the specification, also known as WS-Security. The specification, which came under the control of OASIS, or Organization for the Advancement of Structured Information Standards, in June, defines a set of SOAP (Simple Object Access Protocol) message headers, which are designed to ensure Web services application integrity. Microsoft, of Redmond, Wash., and companies such as Iona Technologies Inc., of Cambridge, Mass., which are members of OASIS' WS-Security Technical Committee, want to push the specification through as is. They contend it is complete enough to give users the security they need now for Web services and can be improved later. However, officials at IBM, Sun Microsystems Inc., Commerce One Inc., Entrust Inc. and Cisco Systems Inc., among others -- and also part of the technical committee -- said they believe more needs to be added to the specification. A short list of additional features includes some form of extensions for WSDL (Web Services Description Language) that would enable developers to express how to control the level of encryption, the type of encryption and what gets encrypted. This faction is proposing a Quality of Protection working group to investigate what other additions the specification may need before being released..." See: (1) OASIS Web Services Security TC; (2) "Web Services Security Specification (WS-Security)."

  • [October 07, 2002] "W3C Proposes XML Encryption Methods." By Paul Festa. In ZDNet News (October 07, 2002). "The Web's leading standards group proposed two recommendations for encrypting XML data and documents, a key development in the organization's push to standardize technologies crucial to Web services. The World Wide Web Consortium (W3C) released proposed recommendations for XML Encryption Syntax and Processing and Decryption Transform for XML Signature. Together, the protocols will let Web sites and services send and receive sensitive data confidentially. While methods already exist for encrypting XML documents, the W3C's proposed recommendations will make it possible to encrypt selected sections or elements of a document -- for instance, a credit card number entered in an XML form... The Decryption Transform recommendation provides a way of determining what parts of a document were encrypted or decrypted at the time a party signed it. The proposed recommendation is crucial to letting different parties authenticate discrete sections of a document at different times. For example, a seller might sign the part listing an item and its price, while a buyer later would sign an encrypted credit card number. The Decryption Transform recommendation will let applications 'roll back' the changing document to the condition in which it was signed. The W3C's encryption work comes as part of a larger push to publish standards relevant to the Web services trend. The consortium earlier this year weathered criticism that it was missing the boat on Web services but since has published a wide array of Web services-related drafts..." See references in the news item "W3C Specifications for XML Encryption Released as Proposed Recommendations."
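The element-level granularity described above looks roughly like this: the sensitive element (say, a credit card number in an order document) is replaced in place by an EncryptedData element, leaving the rest of the document readable. The algorithm shown is one plausible choice; key details and cipher text are elided:

```xml
<Order>
  <Item>Widget</Item>
  <!-- Only this element was encrypted; Type records that an
       XML element, not arbitrary content, is inside -->
  <EncryptedData xmlns="http://www.w3.org/2001/04/xmlenc#"
                 Type="http://www.w3.org/2001/04/xmlenc#Element">
    <EncryptionMethod Algorithm="http://www.w3.org/2001/04/xmlenc#tripledes-cbc"/>
    <ds:KeyInfo xmlns:ds="http://www.w3.org/2000/09/xmldsig#">
      <ds:KeyName>...</ds:KeyName>
    </ds:KeyInfo>
    <CipherData>
      <CipherValue>...</CipherValue>
    </CipherData>
  </EncryptedData>
</Order>
```

This is also what makes the companion Decryption Transform necessary: a verifier must know which EncryptedData elements to decrypt (or leave alone) before re-checking a signature computed over the earlier form of the document.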

  • [October 05, 2002] "The Talaris Services Business Language: A Case Study on Developing XML Vocabularies Using the Universal Business Language." By Calvin Smith, Patrick Garvey, and Robert Glushko (School of Information Management & Systems, University of California, Berkeley). 27-September-2002. Revision 4. 16 pages. Published as a case study with the UBL TC Reports from the meeting in Burlington, MA, USA, 1-4 October 2002. UBL Library v. 0.65 Review Package and Supplementary Documents. "The development of an XML vocabulary is a complex undertaking. This paper outlines the methodology used in the development of the Talaris Services Business Language (SBL), an XML vocabulary for web-based services procurement developed using the reusable semantic components of the Universal Business Language (UBL). UBL is still an incomplete standard, and the complete UBL model is not yet available in a readily accessible format; many of the decisions we made reflected this reality. Lastly, we discuss the design and implementation challenges we encountered and propose rules and guidelines for similar projects... The Talaris Services Business Language (SBL) is a library of XML document schemas designed to enable service procurement transactions. Companies engaged in business-to-business transactions exchange information to complete a transaction, such as the sale of a good, often in the form of a request document and a reply document. This information is typically entered through a web form, put into a batch file, or communicated by invoking an API. An increasingly popular form of communication and online business is through XML document exchange... XML's flexibility to describe documents of arbitrary format is its greatest strength, but it is also its greatest weakness. Since XML has no fixed semantics, it can be used to describe anything. 
But this means that there is no standard way to describe anything, and different vocabularies may define the same objects with different elements... A need perceived by Talaris for a standardized library of components designed for the electronic procurement of services motivated the development of SBL. In order to remain aligned with the emerging UBL standard, we chose to build SBL using UBL types. Initial SBL design work focused on package shipment and web conferencing service verticals, both of which are offered through the World Wide Web by providers such as FedEx and WebEx. In addition to components specific to these verticals, we constructed core components that could be used for services procurement in other verticals..." See: "Universal Business Language (UBL)." [source .DOC]

  • [October 05, 2002] "Voice Biometrics and Application Security. Identification, Verification, and Classification." By Moshe Yudkowsky (WWW). In Dr Dobb's Journal [DDJ] (November 2002) #342, pages 16-22. Feature Article. ['Voice-based biometric security must support identification, verification, and classification. Moshe presents a verification system in which users' voice models are stored in a database on a VoiceXML server.'] "Voice biometrics are an excellent option for application security. Voice biometrics, which measure the user's voice, require only a microphone -- a robust piece of equipment as close as the nearest telephone. In this article, I prototype an application that uses a telephone call to verify identity using freely available voice biometric resources that have simple APIs. Furthermore, the prototype can be easily integrated with Internet-capable applications... Voice biometrics provide three different services: identification, verification, and classification. Speaker verification authenticates a claim of identity, similar to matching a person's face to the photo on their badge. Speaker identification selects the identity of a speaker out of a group of possible candidates, similar to finding a person's face in a group photograph. Speaker classification determines age, gender, and other characteristics. Here, I'll focus on speaker verification resources ('verifiers'). Older verifiers used simple voiceprints, which are essentially verbal passwords. During verification, the resource matches a user's current utterance against a stored voiceprint. Modern verifiers create a model of a user's voice and can match against any phrase the user utters. This is a terrific advantage. First, ordinary dialogue can be used for verification, so an explicit verification dialogue may be unnecessary. Second, applications can challenge users to speak random phrases, which make attacks with stolen speech extremely difficult... 
The prototype I present uses a telephony server to connect to the telephone network, a speech-technology server, and an application server to execute my code and control the other two servers... For the telephony server, speech-technology resource server, and application server, I use BeVocal's free developer hosting. BeVocal hosts VoiceXML-based applications. VoiceXML is an open specification from the W3C's 'voice browser' working group. XML-based VoiceXML lets you write scripts with dialogues that use spoken or DTMF input, and text-to-speech or prerecorded audio for output. My scripts reside on the Internet and are fetched by the VoiceXML server via HTTP. Since the VoiceXML specification does not define a voice biometrics API, I used BeVocal's extensions to VoiceXML. Another company that offers voice biometrics hosting is Voxeo; Voxeo uses a different API. Voxeo lets you send tokens through HTTP to initiate calls from the VoiceXML server to users, which is convenient for web-based applications -- not to mention more secure, as the application can easily restrict the calls to predefined telephone numbers. Both BeVocal and Voxeo offer free technical support. If your application is voice-only and over the phone, adding speaker verification is straightforward. But any Internet-capable application can add VoiceXML... Biometrics in general, and speech technologies in particular, are imperfect and have a unique capacity for abuse: Voices, faces, and other characteristics can be scanned without knowledge or consent. Still, knowing 'something you are' is a powerful security tool when coupled with 'something you have' and 'something you know'." Note: The W3C Voice Browser Working Group was recently [25 September 2002] rechartered as a royalty free group operating under W3C's [then] Current Patent Practice. See "VoiceXML Forum."
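A skeletal standard VoiceXML 2.0 dialogue of the kind the article describes might look as follows (the form name, field, and URL are invented; the speaker-verification step itself requires vendor extensions, since BeVocal and Voxeo each define their own biometrics API):

```xml
<?xml version="1.0"?>
<vxml version="2.0">
  <form id="login">
    <!-- Collect a spoken or DTMF account number -->
    <field name="account" type="digits">
      <prompt>Please say or enter your account number.</prompt>
    </field>
    <!-- A vendor-specific verification element would go here;
         standard VoiceXML has no voice-biometrics tag -->
    <block>
      <submit next="http://example.com/verify" namelist="account"/>
    </block>
  </form>
</vxml>
```

Because the script is fetched over HTTP and posts results back the same way, an ordinary web application can drive the whole phone-based verification exchange.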

  • [October 05, 2002] "Web Services Security." By Andy Yang. In EAI Journal Volume 4, Number 9 (September 2002), pages 33-37. ['Unless you have been hiding in a cave somewhere for the past two years, you will know that XML-based Web services are poised to change the way IT is developed and deployed. But it may have an Achilles heel: security.'] "... XML Web Services interfaces are XML-based and loosely coupled. XML and SOAP let any systems communicate with each other, whether it's an Office XP desktop application or mainframe system. Over time, as there's pressure to automate business functions, there'll be a need to integrate additional diverse systems as part of a broader Web service environment. This 'ecosystem' has some or all of these characteristics: (1) Decentralized in architecture and administration; (2) Heterogeneous in implementation technologies; (3) Connections across multiple departments and enterprises; (4) Peer-based architecture; (5) Open to the public Internet. Any of these characteristics presents challenges to the overall system security: (1) How do you enforce a security policy across an entire environment with multiple heterogeneous systems?; (2) How do you ensure that security policies are enforced, particularly with 'desktop administrators?'; (3) How do you Web-enable a legacy application that was never designed to be exposed to the public Internet?; (4) How can you monitor and audit activity and administrate access across multiple heterogeneous systems?; (5) How do you protect an interface implemented using new technology that has much more functionality exposed? Just as with standard Web-based traffic, authentication, access control, encryption, and data integrity play an important role in providing basic levels of security for communication with Web services..."
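The WS-Security approach mentioned above attaches security claims to the SOAP message itself rather than to the transport. A minimal sketch, assuming the wsse namespace URI from a 2002 draft (the URI changed between drafts, so check the version you target; the username and body are placeholders):

```xml
<S:Envelope xmlns:S="http://schemas.xmlsoap.org/soap/envelope/">
  <S:Header>
    <wsse:Security xmlns:wsse="http://schemas.xmlsoap.org/ws/2002/07/secext">
      <wsse:UsernameToken>
        <wsse:Username>zoe</wsse:Username>
        <wsse:Password Type="wsse:PasswordDigest">...</wsse:Password>
      </wsse:UsernameToken>
      <!-- ds:Signature and xenc:EncryptedKey elements may also
           appear here to sign or encrypt parts of the Body -->
    </wsse:Security>
  </S:Header>
  <S:Body>...</S:Body>
</S:Envelope>
```

Message-level headers like this survive intermediaries that transport-level SSL cannot, which is exactly the multi-hop, heterogeneous 'ecosystem' problem the article raises.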

  • [October 05, 2002] "Remember the Metadata." By David Linthicum. In EAI Journal Volume 4, Number 9 (September 2002), pages 8-10. ['Despite the focus on Web services and process, most application integration projects exist at the data level: It's less invasive, less expensive, and less complicated. But you must address the issue of metadata.'] "When dealing with information-oriented problem domains, we must address the issue of metadata (i.e., data about data). We've dealt with metadata for years in traditional database design and most recently, in data warehousing. So the techniques and technologies are well-understood. What's not well-understood is the proper application of metadata to application integration. That's the focus of this column. You can view source and target systems within application integration problem domains as information repositories, no matter if they're packaged applications, incoming eXtensible Markup Language (XML), databases, or vertical subsystems addressing requirements such as the Healthcare Information Portability and Accountability Act (HIPAA). Each source or target system has its own data and application semantics local to that system. So, we know data and application semantics differ from system to system and must be dealt with as independent information repositories. However, they must also be abstracted to a single model to use for purposes of integration, or a complete structure of all data residing within a company, including its physical and logical models -- independent of the source or target systems where the information is native. There's a larger opportunity here. Since application integration technology typically touches many systems, we have the infrastructure to create a common metadata model for the enterprise -- and for a B2B problem domain, too. 
Where databases typically touched just a few systems, application integration could be the ultimate mechanism we employ to control and manage information integration and metadata enterprisewide. The clear benefits include: (1) Active data dictionaries that link back to physical metadata repositories in many back-end systems; (2) Better change control; (3) Less confusion about the meaning of data and its proper usage; (4) Better deployment and management of vertical standards and metadata definitions..."

  • [October 04, 2002] "Feature: Who, What, Where? The challenges of automating identity management and provisioning run the gamut from architecture to standards." By David L. Margulius. In InfoWorld (October 04, 2002). "With the proliferation of Web-based applications, extranets, self-service portals, and heterogeneous enterprise systems, the development of automated identity management and provisioning systems is becoming a high priority. Almost everything's an identity-based service these days, so creating, modifying, managing, and terminating identity-based access to multiple systems -- for multitudes of employees, customers, and business partners -- has become an overwhelming task... Several standards efforts are underway to help make identity management and provisioning across diverse systems more straightforward. SPML (Service Provisioning Markup Language) is a proposed XML standard for provisioning (creating, modifying, and deleting accounts) across systems. SAML (Security Assertion Markup Language) is a session-level request-response model for exchanging security assertions (authentications and authorizations) across corporate boundaries with other trusted security engines. In the J2EE world, Java Specification Request 115 is a proposal for how J2EE containers would support authorization and integrate with security engines, although vendors of those engines claim that the application server vendors don't really want to play ball with them. On the Web services front, Microsoft, VeriSign, and IBM have introduced WS-Security, and have accepted the use of SAML within WS-Security envelopes. An OASIS committee is trying to extend WSDL to include information about users and roles, which could help manage and federate identity across portals. And so it goes. Then there's Microsoft's .Net/Passport and the Liberty Alliance. 
Originally conceived as a consumer authentication system (as opposed to enterprise authorization), both architectures are supposedly moving toward a federated model that will facilitate identity management, if not provisioning, across extended enterprises. The Liberty Alliance has published its phase one specifications, heavily based on SAML (Security Assertion Markup Language), which allow multiple operators to operate federated Passport-like services, cooperating with each other to allow cross-system authentication. For its phase two, Liberty is looking to create a higher-level framework that goes beyond SAML to define how security policies can be expressed between companies, how trust relationships can be established... In the meantime, Microsoft, which has been operating its Passport system as a prototype for its .Net authentication architecture, has yet to define how its technology, fundamentally based on Kerberos and built into Windows and .Net server, will interoperate with either SAML or Liberty. "I think it's gonna be a little tough," says Oblix CTO Mulchandani. "There's no representation for a Kerberos ticket in XML today." So how will Liberty and .Net/Passport get integrated into the emerging enterprise identity and provisioning architectures? It's anyone's guess..."
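The security assertions SAML exchanges are themselves XML documents. A sketch of a SAML 1.0 authentication assertion of the kind Liberty's phase-one federation builds on (issuer, subject, timestamps, and the assertion ID are invented placeholders):

```xml
<saml:Assertion xmlns:saml="urn:oasis:names:tc:SAML:1.0:assertion"
                MajorVersion="1" MinorVersion="0"
                AssertionID="..."
                Issuer="https://idp.example.com"
                IssueInstant="2002-10-04T09:00:00Z">
  <!-- Asserts that the named subject authenticated, how, and when;
       the relying party trusts the issuer rather than re-authenticating -->
  <saml:AuthenticationStatement
      AuthenticationMethod="urn:oasis:names:tc:SAML:1.0:am:password"
      AuthenticationInstant="2002-10-04T09:00:00Z">
    <saml:Subject>
      <saml:NameIdentifier>jsmith</saml:NameIdentifier>
    </saml:Subject>
  </saml:AuthenticationStatement>
</saml:Assertion>
```

This also makes the Kerberos remark in the article concrete: a Kerberos ticket is an opaque binary credential, so until a mapping into assertions like this is defined, .Net/Passport and SAML-based federation have no common currency.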

  • [October 04, 2002] "Hartford Accepts Exposure Schedules in XML Transaction Based on ACORD XML Large Commercial Standards." From Insurance Journal News (October 4, 2002). "The Hartford Financial Services Group has eliminated the need to rekey large amounts of property, auto and workers' compensation data as part of the application and rating process for middle market business. The Hartford is the first insurer to successfully accept exposure schedules in an XML transaction based on the ACORD XML large commercial standards. The Hartford is now also able to populate these XML transactions with data extracted from Excel spreadsheets, or directly accept XML streams from partners. Agents don't need to change their systems to take advantage of this capability. 'Insurers have made great strides in enhancing workflow for personal and small commercial insurance, but despite agent/broker requests they have not found a way to address the complexities of middle market policy applications,' Gary Thompson, senior vice president, Middle Market and Specialty Practices, at The Hartford, said. 'We have developed an infrastructure that will allow us to bring the benefits of automation to this business sector in a non-proprietary way, and we're now beginning to use it.' The Hartford is now conducting a pilot to take in auto submissions in an Excel spreadsheet, convert them to XML streams, and push them to the company's internal commercial lines rating and policy writing system..." See "ACORD - XML for the Insurance Industry."

  • [October 03, 2002] "Standards Patent Strife Spreads." By Paul Festa. In CNET News.com (October 03, 2002). "The desire to strip patented technologies from Internet standards appears to be contagious. Buoyed by signs of success within the World Wide Web Consortium (W3C), opponents of royalty-encumbered standards are pushing to change the policy within the Internet Engineering Task Force (IETF). With efforts to add an exception to the W3C's royalty-free policy at an apparent standstill, advocates see in a newly formed working group the opportunity to prevent the incorporation of patented technologies in IETF drafts. The push to spread the royalty-free religion to the IETF has cheered engineers and standards advocates who oppose the adoption of patented technologies. 'I'm hoping that the IETF follows the W3C's lead,' said Carl Cargill, director of standards for Sun Microsystems. 'You can't have a network with a lot of royalties. That has an enormously chilling effect on invention and use, because if you don't know how much you're going to be charged, you'll probably use something else.' The IETF formed its new intellectual property rights (IPR) working group following a contentious meeting in Yokohama, Japan, this summer in which participants questioned the IETF's current policy that neither restricts nor discourages the use of patented technology in standards. The meeting was called with the specific goal of clarifying certain aspects of the IETF's current policy, rather than changing it to reflect a royalty-free bias or restriction. But anti-patent speakers at the meeting were vocal in their support of the fundamental shift away from patented contributions... 
The effort to modify the W3C policy has withered in the heat of public protest, primarily by individual engineers and those representing smaller companies -- whose patent portfolios cannot compete with the likes of Microsoft and IBM -- but also by large companies like Sun that object in principle to patents in standards. The IETF's current policy offers no particular guidance to working groups contemplating the implementation of patented technologies. Instead, it mandates that members disclose whether technologies they submit for inclusion in IETF drafts are 'encumbered' by patents. The new working group's primary task is to eliminate confusion over the disclosure requirement... 'There's a substantial part of the community that doesn't like patents at all and wants to avoid using them at all, to go back to the original standard,' said Steve Bellovin, a researcher at AT&T Labs Research in Florham Park, N.J., and co-chair of the IETF's IPR working group. 'If the WG determines that there should be a change, we'll recharter and try to broaden our membership -- this is an issue that affects the whole IETF, so we need very broad consensus'..." See the IETF Intellectual Property Rights WG Charter and IPR-WG Discussion Archive. General references in "Patents and Open Standards."

  • [October 03, 2002] "Considerations for Intellectual Property Rights in IETF Standards." By Maximilian Riegel. IETF IPR Working Group, Internet-Draft. Reference: 'draft-riegel-ipr-practices-01.txt'. July 22, 2002, expires January 20, 2003. "Intellectual Property Rights (IPR), in particular patents are important for standards development organizations (SDOs). A patent grants a monopoly to the owner to control the usage of the covered technology. If such a technology is becoming part of a standard the patent owner can arbitrarily restrict the usage of the standard. Therefore all SDOs are concerned about patents in their standards. This memo focuses on practices the IETF may apply to cope with patent rights in its standards. The proposal is based on the idea of open standards and is fully aligned to the current IPR rules of the IETF. The intention is to retain the mind of the IETF while enabling the consideration of IPR covered proposals in a defined way... Open, license-free standards have shown incredible benefits for the global economy. (1) A huge community inside and outside the IETF is supporting the idea of open source systems and open source software. They are all needing license-free standards to be able to implement them in the open source environment. (2) IETF standards specifying protocols aimed for general use in the Internet should be kept license-free to support the fast and broad adoption in the market. (3) The IETF should stay with the goal to develop unencumbered standards. It might be helpful to provide a more accessible database of IPR declarations than the current list on the IETF web-site to support implementers as well as working groups. (4) In cases where an encumbered solution might provide additional benefits for particular applications, the standard should be split up in the license-free mandatory part and optional enhancements probably covered by IPR..." 
See related information in the IETF Intellectual Property Rights WG Charter and IPR-WG Discussion Archive. General references in "Patents and Open Standards." [cache]

  • [October 04, 2002] "The XML Culture Clash. [Book Review]." [Reviewed] by Michael J. Lutz (Rochester Institute of Technology, Rochester, NY). In IEEE Computer Volume 35, Number 10 (October 2002), page 72. Review of System Architecture with XML by Berthold Daum and Udo Merten [San Francisco: Morgan Kaufmann Publishers, June 2002. 350 pages. ISBN 1-55860-745-5]. "... According to the authors, XML's adoption has caused a cultural clash in which document developers try to understand what a transaction is, database analysts struggle with a relational model that doesn't fit anymore, and Web designers ponder the intricacies of schemata and rule-based transformations. To rise above the confusion, the authors assert, we must understand the different semantic structures that lie beneath the XML standards and learn to model them effectively. The book describes how conceptual modeling can create scenarios consisting of business objects, relationships, processes, and transactions in a document-centric way..." Disclaimer: I have not seen this book, but the authors appear to be making a point that deserves to be repeated, insufferably, until we stop seeing (a) published XML [Xtreme Marketing Language] DTDs/schemas that represent little more than marketing scams, (b) DTDs/schemas that are destined for the basement archive because they have no reasonable hope of contributing to any broad-scale interoperable computing solution. Daum and Merten apparently observe what others have observed: that "XML" makes it easy -- deceptively, seductively easy -- to create something that appears to support interop because it declares a formal, reviewable transfer syntax in a notation that even marketing droids can understand. But of course, XML is fundamentally unrelated and indifferent to the "real" interoperability challenge: ontologies, data semantics, business process. General references at "Conceptual Modeling and Markup Languages."

  • [October 04, 2002] "AOLTW Enforces Patents With Liberty Single Sign-On." By Matt Berger. In InfoWorld (October 04, 2002). "The single sign-on authentication technology under development by the Liberty Alliance Project could be bound by intellectual property restraints, despite a pledge from project founders who have said the technology will be open and royalty-free. AOL Time Warner (AOLTW), one of the members of the 120-company consortium, has claimed with the release of version 1.0 of the Liberty Alliance specification that technology it contributed to the project is patented and may be subject to special licensing requirements. From the beginning, founding members of the project have vowed to deliver a completely open and royalty-free technology that would allow compatibility between single sign-on authentication systems from a variety of vendors and Web site operators. Sun Microsystems, which spearheaded the consortium, has been the most vocal advocate of this, arguing that a royalty-free specification is vital in order to provide an alternative to Microsoft's Passport authentication technology. 'Certainly Sun's position is that any of the critical infrastructure for the Web should be available on a royalty-free basis,' said Bill Smith, director of Liberty Alliance technology at Sun. 'It's why the Internet has got off the ground. If we start seeing tollbooths and barriers put up we're going to see an impediment to growth.' ... AOLTW spokesman Andrew Weinstein said that while the company has claimed rights to certain technology in the specification, users will have free access to the specification in the first release. AOLTW has yet to decide how it will license its technology in future versions of the specification... 
Sun's Smith said that Sun will give free access to the intellectual property it contributed to the project with only one condition: Any companies that charge royalty fees for their contributions will have to pay Sun for its intellectual property. That means, if AOLTW is the only company to charge royalties it will also be the only company that has to pay royalties to Sun. The patents that AOLTW has staked claims to were acquired from Netscape and are common Internet technologies, such as those used to enable e-commerce, so-called 'cookies', and the security technology SSL (secure sockets layer), according to patent descriptions on file with the U.S. Patent and Trademark Office. Because these patents are so widely used on the Internet, few expect AOLTW to charge royalty fees to companies that implement the Liberty specification, said Michael Barrett, president of the Liberty Alliance board, and vice president for Internet strategy at American Express. 'If they choose to license (their patents) they could hold half the Internet for ransom,' he said..." See: (1) "Liberty Alliance Specifications for Federated Network Identification and Authorization"; (2) "Patents and Open Standards."

  • [October 03, 2002] "Working with a Metaschema." By Will Provost. From XML.com (October 02, 2002). ['Document schemas can be useful for a lot more than their primary purpose of validating document instances. For some time now, it has been popular to use a schema to construct parts of a processing application. Our main feature this week, the latest in Will Provost's XML Schema Clinic series, focuses on how schemas can be used in application construction. In particular, Will looks at how the schema for W3C XML Schemas themselves can be adapted to produce "metaschemas," allowing your applications to either restrict or extend the functionality of W3C XML Schema.'] "W3C XML Schema [here: WXS] provides a structural template that describes in detail each type and relationship: just the information an application would need, say, to build a new instance document from a data stream or to create an intuitive GUI for data entry. Given the tremendous complexity of WXS, however, applications which consume schemas face a daunting processing challenge. Often the full power of the language is neither needed nor wanted, as modeling requirements may be relatively simple, and developers don't want to be responsible for every possible wrinkle in a schema. If only we could constrain a candidate schema to use just a subset of the full WXS vocabulary... Oh, wait, we can. 'WXS vocabulary' is the tip-off: a schema is just an XML document, after all, and it can be validated like any other. What we need, in other words, is a schema for our schema. In this article we'll investigate the uses of metaschemas and the techniques for creating them. This will bring us in close contact with the existing WXS metamodel, an interesting study in and of itself. We'll consider several strategies for bending this metamodel to our application's purposes, and we'll see which strategies best suit which requirements. 
To tip the hand a bit, the prize will go to the WXS redefine component as a way of redefining parts of the WXS metamodel itself... This system isn't perfect. There are many ways in which I'd like to leverage the WXS metamodel that are either closed to me or just too complicated to be worth the trouble. This isn't a shortcoming in WXS, as I see it; if the type model were as pliable as I'd like it to be, it just wouldn't be W3C XML Schema and wouldn't have the tremendous descriptive power and precision that I also want. Where they are feasible, redefinitions of schema components offer an elegant way to tailor the WXS model to the needs of an application. XPath/XSLT validation can provide another option, but it's important to see past logistics and remember that the WXS metamodel is as stiff as it is for a reason. If you find yourself demanding features in your application's candidate schema that make them malformed under WXS proper, or changing so many things that the metamodel is unrecognizable, you should probably be building your metamodel from scratch or working from a different starting point." See also: (1) XML Schema: The W3C's Object-Oriented Descriptions for XML, by Eric van der Vlist; (2) "XML Schemas."
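Provost's starting point -- "a schema is just an XML document, after all, and it can be validated like any other" -- is easy to demonstrate. The sketch below, in Python with only the standard library, parses a tiny (invented) W3C XML Schema document as an ordinary XML instance and applies a simple application-level policy check of the kind a metaschema would formalize; the schema content and the policy are illustrative assumptions, not part of the article.

```python
import xml.etree.ElementTree as ET

XSD = "http://www.w3.org/2001/XMLSchema"

# A tiny, hypothetical W3C XML Schema document -- itself just XML.
schema_doc = """\
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:element name="book" type="xs:string"/>
  <xs:element name="author" type="xs:string"/>
  <xs:complexType name="libraryType"/>
</xs:schema>"""

root = ET.fromstring(schema_doc)

# Treat the schema as an ordinary instance document and query it.
elements = root.findall(f"{{{XSD}}}element")
names = [e.get("name") for e in elements]
print(names)  # ['book', 'author']

# A crude stand-in for a metaschema constraint: an application that
# cannot handle complex types could reject this schema up front.
uses_complex_types = root.find(f"{{{XSD}}}complexType") is not None
print(uses_complex_types)  # True
```

A real metaschema would express such constraints in WXS itself (e.g., via redefined components, as the article goes on to recommend), so that candidate schemas can be validated with an ordinary schema validator rather than hand-written checks.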

  • [October 03, 2002] "TAG Rejects HLink." By Kendall Grant Clark. From XML.com (October 02, 2002). ['Controversy has been brewing over the HLink linking specification from the W3C's XHTML Working Group. This week in the XML-Deviant column Kendall Clark reports on the debate surrounding the W3C Technical Architecture Group's rejection of HLink in favor of XLink.'] "TAG, the W3C's Technical Architecture Group, announced recently that XLink 'should be used for hypertext references in user-interface oriented applications', and that 'it is the unanimous opinion of the TAG that XLink should be used for hypertext references in XHTML 2.0.' The announcement seemed to directly repudiate HLink, a new linking solution for XHTML. Steven Pemberton, chair of the HTML WG, suggested that HLink and XLink were two ways of doing the same thing: 'The idea is that you can define linking on markup languages using XLink concepts, and you could even define XLink itself using HLink. [HLink] is not a divergence from XLink, but an enrichment...' The TAG suggested, further, that, 'given that MathML and SVG already use XLink for hypertext references, that would seem to be precedent for using XLink in XHTML.' To which Pemberton responded: the 'fact that SMIL doesn't use XLink would seem to be a precedent for not using XLink in XHTML.' In response to TAG's claim that creating HLink falls outside XHTML 2.0's charter -- one 'design goal' of which is 'to use generic XML technologies as much as possible' -- Pemberton says that ''as much as possible' was added exactly to cover XLink, because it prevents us from doing the things we want to be able to do.' The first public XHTML 2.0 Working Draft, released in early August, explicitly refused to use XLink to provide hypertext linking. That refusal was foreshadowed as early as the XLink Final Call when members of the HTML WG suggested that XLink had failed to meet some of its requirements, particularly those most related to HTML linking semantics. 
The TAG's repudiation touched off a wide-ranging conversation, including both W3C-procedural and purely technical issues, in the XML developer community. The publication of TAG's finding followed very quickly on the heels of the HTML Working Group's public release of HLink..." See "XML Linking Language."

  • [October 03, 2002] "Duplicate and Empty Elements." By Bob DuCharme. From XML.com (October 02, 2002). ['Bob DuCharme's "Transforming XML" column this month continues to guide us through our education in XSLT. In "Duplicate and Empty Elements" Bob demonstrates how to detect, delete, and create duplicate and empty elements in source and result trees.'] "Any manipulation of XML documents, whether with XSLT or not, often involves cleaning up the documents. Perhaps some company sends XML data to your company, and while it may be valid XML, it still needs to be beaten into shape a bit before your systems can use it. Dealing with duplicate and empty elements is a typical task of a cleanup process. Through no fault of XSLT, finding them can be a little trickier than you might at first think, but it's not too bad, and XSLT includes several features to make these cleanup tasks go more easily... When we copy a source tree to a result tree, the task of deleting duplicate elements from the copy sounds simple: don't copy an element from the source tree to the result tree if an identical one has already been copied. But what do we mean by 'identical'? According to the XPath specification, two node sets are considered equal if their string values are equal. The string values are the concatenation of any text node descendants of the elements. Because text nodes store the character data contents of elements -- that is, the part between start- and end-tags -- two elements with different attributes or with different values in the same attribute are still considered equal, because attributes aren't taken into account in the XPath version of element equality. So, if you only want to compare element content when determining which elements are duplicates, an equals sign will do, but if you want to consider attribute values, you have to explicitly say so..." For related resources, see "Extensible Stylesheet Language (XSL/XSLT)."
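DuCharme's distinction -- string-value equality ignores attributes, so attribute-aware deduplication must be asked for explicitly -- can be reproduced outside XSLT as well. A minimal Python sketch of the two comparison policies (the sample document is invented; this illustrates the concept, not DuCharme's actual stylesheets):

```python
import xml.etree.ElementTree as ET

doc = ET.fromstring(
    '<root>'
    '<item id="a">widget</item>'
    '<item id="b">widget</item>'
    '<item id="a">widget</item>'
    '</root>'
)

def string_value(elem):
    # XPath string value of an element: the concatenation of all
    # descendant text nodes.
    return "".join(elem.itertext())

# Content-only comparison (what XPath's "=" gives you on elements):
# all three items look identical, so only one survives.
seen, content_dedup = set(), []
for item in doc:
    key = string_value(item)
    if key not in seen:
        seen.add(key)
        content_dedup.append(item)
print(len(content_dedup))  # 1

# Attribute-aware comparison must be stated explicitly, as in XSLT:
# items with id="a" and id="b" are now distinct.
seen, full_dedup = set(), []
for item in doc:
    key = (string_value(item), tuple(sorted(item.attrib.items())))
    if key not in seen:
        seen.add(key)
        full_dedup.append(item)
print(len(full_dedup))  # 2
```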

  • [October 03, 2002] "TAG's Iron Fist." By Edd Dumbill. From XML.com (October 02, 2002). <taglines/> Column. ['The W3C's Technical Architecture Group's condemnation of HLink has met with an angry response. Edd Dumbill says that the TAG's approach isn't good for the web or for the W3C. He muses on the current effectiveness of the TAG in the context of the coherence of W3C specifications as part of the Web's architecture and provides observations on one of the most interesting debates this year.'] "The Web and XML community were rocked recently by an uncompromising edict issued by the TAG, the W3C's Technical Architecture Group. Charged with being the guardians, documenters, and devisers of the Web's architecture, the TAG is composed of the W3C's brightest and best, chaired by W3C Director Tim Berners-Lee. The usual way the TAG reports decisions on points of architecture, called 'Findings', is by publishing them on the W3C site. However, following a face-to-face TAG meeting on 24 September, TAG member Norm Walsh sent a message entitled 'TAG Comments on XHTML 2.0 and HLink' to the www-tag mailing list [offering] a 'unanimous opinion' from the TAG that the XHTML Working Group should be made to abandon their HLink linking strategy in favor of XLink... The TAG must try harder to build genuine consensus, preferably as early as possible during a specification's lifetime, if it is to remain relevant to participants in the W3C process..."

  • [October 03, 2002] "How NTT Safely Adopted an XML Framework." By Shelley Doll. In Builder.com (October 02, 2002). "Nippon Telegraph and Telephone Corporation (NTT Group), one of Japan's foremost telecommunications companies, was looking for a way to keep up with market economic conditions and put some distance between itself and the competition. Client and vendor relationships needed more efficient communication and integration capabilities, and implementing a new technology strategy was imminent. XML looked like the perfect solution, with only one problem -- XML standards haven't been finalized yet. NTT looked to its research and development division, NTT Information Platform Laboratories, to find and test a solution... Eiji Yana, senior research engineer at NTT, recognized that NTT would undoubtedly be early adopters of XML and that moving to a technology before standards solidified carried heavy risks... On the other hand, the team's research discovered a third-party vendor, Kinzan, which provides an XML-standards-based application framework called Adaptive Web Services Suite (AWSS). This framework can be used to deploy prebuilt and custom applications, called components, and to facilitate interoperability and integration. By deploying this third-party solution, the risks involved in working with unstable standards would be deferred to the provider. Kinzan closely monitors XML developments and is involved in many of the standard's efforts. Kinzan is well positioned to negotiate the timing and appropriateness of XML deployments, removing those risks from NTT's development teams. Additionally, with Kinzan Application Framework, functionality has already been written that facilitates integration of custom development and ties components together, providing messaging, security, and presentation..."

  • [October 03, 2002] "Free WS-I Interoperability Tests Due in 2002." By Paul Krill. In InfoWorld (October 03, 2002). "Interoperability tests for Web services, to gauge the compatibility of different implementations, will be available for free from the Web Services Interoperability (WS-I) organization by the end of this year, said an Oracle official Thursday. Specifically, WS-I will provide tools to test the conformity of Web services to basic profiles, said Rob Cheng, Oracle senior product manager of Oracle9i Application Server and co-chairman of the WS-I marketing communications committee. Cheng spoke of the WS-I plan during the Web Services Edge 2002 conference here on Thursday. If a Web service passes the WS-I test, 'It means it's compatible with any other Web service,' conforming to the profile, which will feature specifications such as UDDI and SOAP, Cheng said. Also to be released by WS-I is a set of best practices for interoperability and use case scenarios, Cheng said. Supply chain is one specific sample application to be released by WS-I, to test for attributes such as asynchronous communications between a manufacturer and a warehouse, said Cheng. Users can download the tools and test applications, according to Cheng... WS-I does expect users who have passed the compatibility tests to post results in a public place, he said. The only time users will have to sign an agreement to use the tests is if they want to integrate them into an interactive development environment, Cheng said..." See: "Web Services Interoperability Organization (WS-I)."

  • [October 03, 2002] "XML Accessibility Guidelines." W3C Working Draft 3-October-2002. Edited by Daniel Dardailler (W3C), Sean B. Palmer, and Charles McCathieNevile (W3C). Version URL: http://www.w3.org/TR/2002/WD-xag-20021003. Latest Version URL: http://www.w3.org/TR/xag. Abstract: "This document provides guidelines for designing Extensible Markup Language (XML) applications that lower barriers to Web accessibility for people with disabilities (visual, hearing, physical, cognitive, and neurological). XML, used to design applications such as XHTML, SMIL, and SVG, provides no intrinsic guarantee of the accessibility of those applications. This document explains how to include features in XML applications that promote accessibility." From the Status statement: "The document is a Working Draft of the XML Accessibility Guidelines made available by the Protocols and Formats Working Group (PFWG). The PFWG operates as part of the WAI [Web Accessibility Initiative] Technical Activity." Introduction: "This document specifies requirements that, if satisfied by designers of XML applications, will lower barriers to accessibility. The introduction provides context for understanding the requirements listed in section 2. Section 2 explains four general principles of accessible design, called 'guidelines'. Each guideline consists of a list of requirements, called 'checkpoints', which must be satisfied in order to conform to this document. Section 3 explains how to make claims that XML Applications satisfy the requirements of section 2. An appendix lists all the checkpoints for convenient reference (e.g., as a tool for application developers to evaluate software for conformance)..."

  • [October 03, 2002] "Reports Made Easy With JasperReports." By Erik Swenson (Open Source Software Solutions). In JavaWorld (September 2002). ['JasperReports, a popular, full-featured open source report-generating library, uses XML report templates to generate reports you can display on the screen, send to a printer, or save as a PDF document. In this inaugural Open Source Profile column, Erik Swenson introduces the JasperReports library and explains how to integrate JasperReports into your applications.'] "Generating reports is a common, if not always glamorous, task for programmers. In the past, report generation has largely been the domain of large commercial products such as Crystal Reports. Today, the open source JasperReports report generating library gives Java developers a viable alternative to commercial software. JasperReports provides the necessary features to generate dynamic reports, including data retrieval using JDBC (Java Database Connectivity), as well as support for parameters, expressions, variables, and groups. JasperReports also includes advanced features, such as custom data sources, scriptlets, and subreports. All in all, JasperReports combines good features, maturity, community participation, and, best of all, it's free... JasperReports is a powerful report-generating tool that has the ability to deliver rich content onto the screen, to the printer or into PDF, HTML and XML files. It is entirely written in Java and can be used in a variety of Java enabled applications to generate dynamic content. Its main purpose is to help creating page oriented, ready to print documents in a simple and flexible manner. JasperReports organizes data retrieved from a relational database through JDBC according to the report design defined in an XML file. In order to fill a report with data, the report design must be compiled first..." 
See: (1) JasperReports Sourceforge project; (2) JasperEdit [Java/Swing based editor for the JasperReports reporting library; allows users to edit, compile, preview, and view XML JasperDesign files, and run compiled JasperReports; also includes a wizard for creating database driven reports]; (3) JEEZ Report design tools for Eclipse [Eclipse plugins that make it easier for Java developers to create a JasperReports XML file]; (4) JasperReports Tutorial.

  • [October 03, 2002] "Developing Web Services, Part 3: SOAP Interoperability." By Bilal Siddiqui (CEO, WAP Monster). From IBM developerWorks, Web services. September 2002. ['In this article, Bilal will start with a discussion of the evolution of SOAP, present a list of major SOAP interoperability issues and their details, and create a guideline to develop better interoperable Web services. Bilal will also cover the details of using datatypes in SOAP.'] "This article is the third in a series of four articles aimed at introducing the process of creating, describing and publishing Web services. In the first part, I explained how to describe a Web service with the help of WSDL authoring examples. In the second part, I discussed the architecture of SOAP and its semantics. In this installment, I'll take a look at the interoperability issues related to SOAP. The Web services model divides the entire B2B spectrum into three steps or domains: describe a service, bind it with a concrete implementation, and publish it over a registry. These three steps are independent of one another and give rise to separate interoperability issues. WSDL (Web Services Description Language) constitutes the description domain, SOAP covers the binding domain, and Universal Description, Discovery and Integration (UDDI) registries cover the publishing domain. My discussion in this part will be limited to the interoperability issues related to the binding domain. I will start by describing the evolution of SOAP, followed by the actual SOAP interoperability issues... Interoperability problems can surface due to an engine's inability to process the mustUnderstand SOAP headers. SOAP functionality can be extended through custom headers. This extensibility also opens the route for proprietary and non-standard solutions which may ultimately create interoperability issues. 
SOAP allows the use of multiple return parameters which are not supported by some implementations (for example, Apache SOAP 2.2). Encoding schema describes the rules that are followed to serialize and de-serialize data to and from XML. Unfortunately SOAP 1.1 does not specify a default encoding for SOAP messages although it does propose one. SOAP implementers are not bound by the specification to implement the specification-defined encoding in order to conform. The lack of a default encoding schema can lead to interoperability problems... I can offer a little advice for developers who are planning to build Web services on present SOAP implementations or who are on their way to writing their own SOAP servers and clients. (1) If you are planning to work with SOAP 1.1 implementations, keep the SOAPAction header quoted in your requests and be sure it conforms to the service's WSDL file. (2) Adopt the 2001 namespace URI for XML Schema as it is the de facto standard now. (3) Comply with the encoding defined by SOAP 1.1 and ensure that data that you send and receive is really what it should be. (4) Your method element names should conform to the WSDL description. Provide datatype information on the wire in SOAP messages." See also part 1 "Introduction to Web services" and part 2 "Simple Object Access Protocol (SOAP)." Also in PDF format. See "Simple Object Access Protocol (SOAP)."
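Siddiqui's first two recommendations -- quote the SOAPAction header and use the 2001 XML Schema namespace URIs -- are visible in a minimal request. The Python sketch below constructs (without sending) such a SOAP 1.1 request; the endpoint, method name, and URN are invented for illustration:

```python
# Construct -- without sending -- a minimal SOAP 1.1 request that
# follows the article's advice: a quoted SOAPAction header, the 2001
# XML Schema namespaces, and datatype information on the wire.
# The method name and urn:example-quotes URN are hypothetical.
envelope = """<?xml version="1.0"?>
<soap:Envelope
    xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/"
    xmlns:xsd="http://www.w3.org/2001/XMLSchema"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <soap:Body>
    <getQuote xmlns="urn:example-quotes">
      <!-- datatype carried on the wire, per recommendation (4) -->
      <symbol xsi:type="xsd:string">IBM</symbol>
    </getQuote>
  </soap:Body>
</soap:Envelope>"""

headers = {
    "Content-Type": "text/xml; charset=utf-8",
    # Recommendation (1): SOAP 1.1 implementations commonly expect
    # the SOAPAction value to be quoted.
    "SOAPAction": '"urn:example-quotes#getQuote"',
}

print(headers["SOAPAction"])  # "urn:example-quotes#getQuote"
```

The method element name (`getQuote`) would, per recommendation (4), have to match the operation declared in the service's WSDL description.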

  • [October 02, 2002] "WebWare Deals with Interwoven, Shows Fruits of WebDAV Support." By Patricia Evans. In The Seybold Report Volume 2, Number 12 (September 30, 2002). ISSN: 1533-9211. ['Interwoven will integrate WebWare's Mambo into its TeamSite package. Meanwhile, WebWare's WebDAV support is starting to pay off.'] "This month, WebWare announced an integration deal with content-management vendor Interwoven. It also debuted enhancements to its Mambo asset management system... According to WebWare, the integration will enable Interwoven clients to take advantage of features such as automatic asset versioning, tracking and routing of assets and will offer better support for video and audio assets... WebDAV support: it was at Seybold San Francisco last year that WebWare first declared support for WebDAV and for Adobe's newly announced XMP metadata-sharing platform. But Adobe applications (to say nothing of third-party products) that produced XMP proved to be a little farther down the development road. Not until early this year did Adobe release XMP- and WebDAV-capable versions of Illustrator, GoLive, InDesign and Photoshop. WebDAV-enabled software allows users to access the Mambo repository directly from the application. For example, a Photoshop user could retrieve an asset, edit it and return the updated version to the Mambo repository without leaving Photoshop. WebWare says it currently supports eight WebDAV-enabled applications." See: (1) "WEBDAV (Extensions for Distributed Authoring and Versioning on the World Wide Web)"; (2) "Extensible Metadata Platform (XMP)."
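The repository access described here works because WebDAV layers new methods (PROPFIND, PROPPATCH, LOCK, and others) onto plain HTTP, so any DAV-aware application can talk to the store. A sketch of what such a request looks like on the wire, built in Python without being sent; the host and asset path are invented:

```python
# Build -- without sending -- a WebDAV PROPFIND request asking for an
# asset's properties. The DAV: namespace and Depth header come from
# the WebDAV specification; host and path are hypothetical.
body = """<?xml version="1.0" encoding="utf-8"?>
<D:propfind xmlns:D="DAV:">
  <D:allprop/>
</D:propfind>"""

request = (
    "PROPFIND /repository/assets/logo.eps HTTP/1.1\r\n"
    "Host: mambo.example.com\r\n"
    "Depth: 0\r\n"          # properties of this resource only
    "Content-Type: application/xml\r\n"
    f"Content-Length: {len(body.encode('utf-8'))}\r\n"
    "\r\n"
    + body
)

print(request.splitlines()[0])  # PROPFIND /repository/assets/logo.eps HTTP/1.1
```

Because this is ordinary HTTP underneath, applications like Photoshop can check assets in and out of a repository such as Mambo with no protocol machinery beyond an HTTP client.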

  • [October 02, 2002] "XML Powers Document Builders." By Peter Coffee. In eWEEK (September 30, 2002). "With last week's shipment of Corel Corp.'s Ventura 10, the polished production of structured documents from XML data sources becomes a two-horse race. Adobe Systems Inc.'s $799 FrameMaker 7.0, released in May, was quicker out of the gate and can run on more courses, with versions for Mac OS and Unix as well as Windows 98 or later. However, the $699 Ventura 10 has integrated task-automation tools that will often get it across the wire first when running on its home turf of Windows 2000 or Windows XP. These products differ greatly in their view of how XML capabilities are best integrated with their core functions. FrameMaker might seem to have the advantage, with import and export options for XML, while Ventura offers only XML import. In most shops, however, we believe that Ventura will have the edge in practice, with its powerful and intuitive XML mapping editor that enables structural translation and conditional formatting of incoming content based on XML tags and rules. FrameMaker's XML interaction depends more on outside help. The product's manual instructed us, 'You can open and work with any structured file ... as long as the file has an associated application. ... A developer typically sets up this application for you.' Without that developer support, the FrameMaker manual warned, 'You may not be able to save ... to the structured format ... because some mapping information may be unavailable.' ... Adobe's claim of 'XML roundtripping' is, therefore, somewhat exaggerated. We recommend Ventura to smaller departments that don't have developers on call and that are interested mainly in automating the flow of data into documents -- with the caveat, noted earlier, that Ventura is offered only for Windows 2000 and Windows XP... 
Adobe's FrameMaker 7.0 combines proven capability in document production with new facilities for consuming XML-formatted data and for producing device-neutral output that's suitable for handhelds and Web clients as well as print. Its multiplatform flexibility suits the needs of the graphics and engineering environments that are most likely to be using workstation operating systems other than Windows, and it also maintains support for the many end users still running Windows 98... Corel's Ventura 10 asks and answers the question, 'How can a high-end document production tool become a more productive component of enterprise IT?' Its integrated XML mapping editor streamlines the input side of the process, while PDF publishing capabilities (although not as device-neutral as those of Adobe's FrameMaker) add value to the output. Limitation to Windows 2000 and XP may discourage its adoption by many graphics professionals and by end users still on Windows 98, but it may attract more interest as a complement to Microsoft Word in many department- and enterprise-level tasks..."

  • [October 02, 2002] "Roots of the REST/SOAP Debate." By Paul Prescod (ConstantRevolution Consulting). Paper presented at the Extreme Markup Languages Conference 2002 in Montréal. "In order to communicate over networks we need standardized data formats and protocols. But how do we move forward toward this goal? One popular debate centers around the best way to define new data formats. XML dominates this area and so the primary question left is how and whether to use schemas and if so, what schema language to use. This paper will address a different question: How will we standardize the protocols used to transport the XML documents? The paper describes three different strategies and attempts to summarize their strengths and weaknesses from an admittedly partisan point of view." Discusses: The Custom Protocols Approach, The Framework Approach, and The Horizontal Protocol Approach.

  • [October 02, 2002] "Model-Driven Development of Dynamic Web Applications." By Hideki Tai, Takashi Nerome, Mari Abe, and Masahiro Hori (IBM, Tokyo Research Laboratory). Paper presented at the Extreme Markup Languages Conference 2002 in Montréal. "The World-Wide Web is increasingly used as a means for accessing on-line applications, although the Web was initially created to access documents. In contrast to data-intensive Web sites, dynamic Web applications are Web-based business applications that execute business logic for changing the business state captured by the system. Due to the central roles of business logic, it is important for the dynamic Web applications to be modeled focusing not only on the client-side presentation, but also on the server-side processing such as the execution of business logic and page-flow control. This paper proposes a Web Application Descriptor (WAD) for modeling dynamic Web applications, and presents a prototype of the Web Application Development Support Tooling (WAST), which is an [open tool] environment for developing Web applications based on the WAD model. WAD specifies the navigational structure of Web applications, and prescribes the behavior of Web applications in terms of logical pages and actions used for directing the page flow. WAD allows modeling such page navigation flow at a description level independent of both the content view and the runtime application framework, so that it can be exploited for the conceptual design and execution of dynamic Web applications... In order to reduce the complexity of interaction between the server pages, it was proposed to decouple the view components (i.e., server-side pages) from the server-side engine (e.g., Servlet), and place responsibility for page-flow control in a central controller. This design is known as the Front Controller pattern, which centralizes control providing a central place to handle system services and business logic across multiple requests. 
As an architectural pattern, this design constitutes a Model-View-Controller (MVC) pattern for dynamic Web applications, where the model encapsulates the application state, the view provides the presentation of the model, and the controller handles the incoming requests. A number of runtime frameworks based on the MVC are emerging as means for the development of dynamic Web applications... Note that although the [here-discussed] WAD model and WAST workbench rely on the MVC architecture, they are independent of any particular implementation of runtime frameworks. The WAST workbench is realized as a tool framework that can be customized for different MVC-based runtime frameworks... we briefly introduce the WAD model focusing on the navigation flow modeling, and give an example of a simple shopping application described with WAD... the WAST workbench provides a visual editor for authoring the navigational structure of a Web application, as well as menu commands for code generation. Since the WAST workbench currently supports code generation for the Apache Struts framework, examples of the generated artifacts are used as deliverables for the Struts framework. Finally, we compare our approach to the related work on Web application modeling..."
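The Front Controller idea the paper builds on -- a single entry point that maps incoming requests to actions, each of which runs business logic and names the logical page to render next -- can be sketched in a few lines. All action and view names below are invented for illustration (the paper's own WAD/WAST tooling generates Struts artifacts, not this):

```python
# A minimal Front Controller sketch: one dispatcher routes every
# request; handlers run (stubbed) business logic and return the
# logical view to render next. Names are illustrative only.

def show_cart(params):
    return "cart_view"

def add_item(params):
    # ...real business logic would update the model (cart) here...
    return "cart_view" if params.get("item") else "error_view"

ACTIONS = {
    "cart.show": show_cart,
    "cart.add": add_item,
}

def front_controller(action, params):
    """Single entry point: look up the action, run it, return a view."""
    handler = ACTIONS.get(action)
    if handler is None:
        return "not_found_view"
    return handler(params)

print(front_controller("cart.add", {"item": "book"}))  # cart_view
print(front_controller("no.such.action", {}))          # not_found_view
```

Centralizing dispatch like this is exactly what lets a model such as WAD describe page flow declaratively: the action-to-view table, not the individual pages, carries the navigation structure.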

  • [October 02, 2002] "Topic Maps for Managing Classification Guidance." By Vinh Lê (U. S. Department of Energy, Office of Security, Information Classification and Control Policy -SO-12) and James David Mason (Y-12 National Security Complex). Paper presented at the Extreme Markup Languages Conference 2002 in Montréal. "The Department of Energy (DOE) and its predecessor agencies have had a long history of developing and handling sensitive, classified information. Since the Manhattan Project, identification of classified information has been supported by principles and rules, called 'classification topics,' in guidance documents that are published for use by authorized classifiers across the DOE complex. Managing DOE classification guidance adds to the usual problems of document management. Coordination of multiple dependencies among master and derived documents and classification topics is needed to ensure consistency. The Topic Maps technique is being applied to organize classification guidance topics according to unique subjects so that duplicate topics, inconsistent topics, and gaps in classification guidance become obvious for corrective actions. Once topics are logically organized and linked, then when a change in a guidance topic is proposed, system users will know which related topics need a review or change. Development of an overall guidance-management system also involves creation of a new XML-based publishing system to replace the diverse and often inadequate tools used in the past. A document-management system to support the publishing system will require integration with the guidance-management system in addition to the conventional tools for revision control and file management. We have begun to construct the topic maps for guidance management and are in the process of refining a design of topic maps to manage the publishing process... 
We have been able to demonstrate the map, and we already have feedback from the user community about things they would like to see presented differently. We are now designing new HTML and JSP pages to replace the generic interface. As part of the Ferret project, Y-12 developed a simple editor for knowledge bases. Because of the similarity between the Ferret knowledge base structure and this topic map, we are modifying the editor to write out topic-map components. The next step will probably be to implement a topic-map editor, since the files in question are not conducive to easy editing in a conventional XML editor. In the next year, besides the considerable redesign of the interface, we expect to start on the implementation of the content-management system and the extensions to both the topic map and the interface to support it..." See: "(XML) Topic Maps."
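The organizing move at the heart of the paper -- key every guidance topic by its unique subject, so that duplicates and gaps surface mechanically -- reduces to a simple index. A Python sketch of that idea (the subjects and guide references are invented; a real topic map would also carry associations and scopes):

```python
# Sketch of subject-based topic merging: index classification-guidance
# topics by subject so duplicates across guides become obvious.
# Subjects and guide citations are invented for illustration.
from collections import defaultdict

topics = [
    ("subject-alpha", "Guide A, topic 3.1"),
    ("subject-beta",  "Guide A, topic 3.2"),
    ("subject-alpha", "Guide B, topic 1.4"),  # same subject, second guide
]

by_subject = defaultdict(list)
for subject, source in topics:
    by_subject[subject].append(source)

# Any subject with more than one source is a candidate duplicate
# (or inconsistency) needing review when either guide changes.
duplicates = {s: srcs for s, srcs in by_subject.items() if len(srcs) > 1}
print(duplicates)  # {'subject-alpha': ['Guide A, topic 3.1', 'Guide B, topic 1.4']}
```

This is also what makes change propagation tractable: when a topic under `subject-alpha` is revised, the index immediately names every other guidance location sharing that subject.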

  • [October 02, 2002] "Core Range Algebra: Toward a Formal Model of Markup." By Gavin Thomas Nicol (Red Bridge Interactive). Paper presented at the Extreme Markup Languages 2002 conference in Montréal. "There have been a number of formal models proposed for XML. Almost all of these models focus on treating XML as semistructured data, typically as edge-labeled graphs, or as a tree of nodes. While this approach is fine for more traditional database applications, or situations where structure is of paramount importance, it fails to deal with the issues found in the use of XML for text. In particular, unless special functions are introduced, queries that involve text spanning node boundaries are not closed over a set of nodes, and likewise, the returned sets of nodes are not necessarily well-formed XML documents. This paper presents an algebra for data that, when applied to XML, is not only closed over the entire set of operations permissible in more traditional XML query languages, but over the operations that involve text and XML fragments that are not in themselves well-formed... Core Range Algebra, as defined in the paper, is a simple algebra that operates over sequences, ranges, and sequences of ranges. The intent of Core Range Algebra is to provide a basis for manipulating sequences of data, especially text, as a sequence of items, and to provide means for layering higher-level interpretations over that substrate such that operations at the higher level can be defined in terms of the underlying model. Future Work: What is woefully missing in the existing specification is a means for constructing sequences of ranges from an underlying sequence. Given the Kleene closure, we have a formal basis for creating ranges based on regular expressions. 
This will not only provide a powerful means for creating ranges over unstructured texts, but will also give a means for inferring structurally recursive data structures, or for inferring higher-level structures over previously inferred structures. One obvious further extension is to expand the Core Range Algebra to fully cover XML. An approach much like REX or Regular Fragmentation can be used to build the structures of XML, from the underlying sequences, and with additional functions for range construction such that elements, attributes, etc., can be constructed, a full query language can be defined over the Core Range Algebra defined above..." See the 2002-10-02 news item "New Website for Layered Markup and Annotation Language (LMNL)."
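The paper's algebra is not reproduced here, but its core notion, constructing a sequence of ranges over an underlying sequence from a regular expression, maps directly onto regex match spans. A minimal sketch (my illustration, not Nicol's formalism):

```python
import re

# A "range" here is a (start, end) offset pair into an underlying
# character sequence; a regular expression induces a sequence of ranges.
def ranges(pattern, text):
    """Build a sequence of ranges over `text` from a regex, in the
    spirit of constructing ranges via the Kleene closure."""
    return [m.span() for m in re.finditer(pattern, text)]

text = "to be or not to be"
word_ranges = ranges(r"\bbe\b", text)
print(word_ranges)                          # [(3, 5), (16, 18)]
# Slicing the underlying sequence with any range yields a subsequence,
# so range operations stay closed even when spans would cross element
# boundaries in a tree model.
print([text[s:e] for s, e in word_ranges])  # ['be', 'be']
```

The point of the closure property is visible here: a range that straddles what would be a node boundary in a tree model is still a perfectly good value in the range algebra.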

  • [October 02, 2002] "Just-In-Time-Trees (JITTs): Next Step in the Evolution of Markup?" By Patrick Durusau (Society of Biblical Literature) and Matthew Brook O'Donnell (OpenText.org). Paper presented at the Extreme Markup Languages 2002 conference in Montréal. "Recording multiple possible encodings for a text has been treated as a problem of syntax (CONCUR, milestones, stand-off markup, Bottom-Up Virtual Hierarchies, Layered Markup and Annotation Language, MECS/TexMECS, etc.) and as a problem of parsing (Earley's algorithm, active chart parsing, tree-adjoining grammars). While offering various advantages and shortcomings, these efforts fall short of isolating the fundamental difficulty that gives rise to the problem. The simple tree model, which XML enforces, is a symptom and not the cause of this problem. Descriptive markup divides texts into content and markup. Or as previously stated by the authors, 'Markup is metadata about #PCDATA.' The fundamental problem is that all prior methods treat markup as static trees of metadata about #PCDATA. If that changes to: Trees are assertions about metadata, the fundamental difficulty of representing multiple trees in a single document instance resolves into a processing issue... The power of descriptive markup lies in its separation of the structure of a text from processing of the text. That separation allows the structure of the text to be entered only once, despite numerous or perhaps conflicting decisions about the ultimate processing of the text. The solution to representing multiple structures in a given text is to separate the assertion of trees from the markup in a text. SGML and XML allow only static trees to be asserted using markup. This results in the single tree (more than one with CONCUR, but the results are the same), which has been challenged in a variety of forums, but the problem is not the single tree but that all the trees are static. 
JITTs relies upon the assertion of a tree based upon markup in a text to construct a tree (one of many possible trees). To demonstrate this assertion process, the JITTs model is used to construct multiple trees from a single document containing markup. The markup selected for the tree must obey all the usual rules for tree construction, and markup that is not selected is simply ignored. A traditional example of overlapping markup from the TEI Guidelines is presented and solved using the JITTs model described in more detail below. Using the concept of tree assertion, a processing model is proposed for documents containing markup. Finally, the broader implications of the tree assertion model for markup are discussed..." Note from Patrick Durusau: "Along with the paper, you will find an implementation of the JITTs approach using XSLT and the latest version of Saxon (7.2 or later). This method asserts trees while processing markup. Markup that is not part of the asserted tree is ignored. Multiple or overlapping trees may be asserted about a single text. Asserted trees are valid XML trees. Additional sample texts, implementations and papers will be posted at this webpage. Anyone interested in modifying an XML parser to use the JITTs approach, please contact the authors. Contributions of texts, software, comments and suggestions are welcome." See "Markup Languages and (Non-) Hierarchies."
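The authors' implementation uses XSLT and Saxon; the essence of tree assertion, keep only one layer's tags and ignore the rest, can be sketched independently of it. In this illustrative toy (not the JITTs code), one token stream carries two overlapping hierarchies, a page/line layer and a sentence layer, and each assertion yields its own well-formed tree:

```python
# Minimal sketch of tree assertion: build a tree from a token stream,
# honoring only tags in the selected layer and ignoring all others.
def assert_tree(tokens, layer):
    root = ("ROOT", [])
    stack = [root]
    for kind, value in tokens:
        if kind == "text":
            stack[-1][1].append(value)
        elif kind == "open" and value in layer:
            node = (value, [])
            stack[-1][1].append(node)
            stack.append(node)
        elif kind == "close" and value in layer:
            stack.pop()
        # tags outside the selected layer are simply ignored
    return root

# A sentence ("s") that crosses a line boundary -- not well-formed as
# one XML tree, but fine as two separately asserted trees:
tokens = [
    ("open", "page"), ("open", "s"), ("open", "line"), ("text", "He saw"),
    ("close", "line"), ("open", "line"), ("text", "the dog"),
    ("close", "s"), ("close", "line"), ("close", "page"),
]
print(assert_tree(tokens, {"page", "line"}))
print(assert_tree(tokens, {"s"}))
```

Each call constructs a different valid tree over the same text, which is exactly the "one of many possible trees" behavior the abstract describes.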

  • [October 02, 2002] "RSA Helps To Secure Web Services." By Richard Karpinski. In InternetWeek (October 01, 2002). "RSA Security on Tuesday said it has begun shipping a new software development kit (SDK) for adding XML-based digital signatures to Web services. The SDK, dubbed RSA BSAFE SecureXML-C, can help assure the authenticity and integrity of a Web services message. That's important because the core Web services protocols, such as SOAP, lack built-in security and authentication. And since they are delivered in the clear over HTTP, they are open to interception or tampering.... RSA BSAFE SecurXML software can also play a role in platforms for federated identity management, such as the Liberty Alliance Project, by signing SAML assertions and thus providing single sign-on capabilities..." See: (1) the announcement: "RSA Security Helps Customers Develop and Implement an Effective Security Strategy for Web Services. Newly released RSA BSAFE SecurXML software enables developers to build XML-based digital signing applications within a Web services environment."; (2) "Security Assertion Markup Language (SAML)."

  • [October 02, 2002] "At the Wireless Edge." By Glenn Fleishman. In InfoWorld (September 27, 2002). ['The use of XML-based protocols to transmit configuration and other information using commodity access points, peer-to-peer access points, or even dumb radios across large networks might obviate the need for central controllers and offer more flexibility and a lower cost for deploying newer network standards as they become available.'] "As wireless LANs grow in size and complexity, and IT managers wrestle with the twin burdens of staff time and security when configuring and maintaining wireless APs (access points) across their enterprises, a previously unnecessary category of network management solution has started to proliferate: the central WLAN controller. At the same time, wireless vendors are exploring the use of XML-based protocols, peer-to-peer communications, and even radio itself to ease WLAN management. The first two of these approaches address current limitations in central controllers, whereas the third takes central control to the extreme. Sputnik's Central Control and the Sputnik APs interact using the open-source Jabber IM protocol. Using XML streams, the controller and APs exchange authenticated, encrypted messages that allow the controller to sense the presence of newly installed access points, pass them configuration info, and detect problems on the WLAN. Proxim and Cisco are also looking at including XML communication protocols in future versions of their controllers. According to Sputnik, SOAP and other XML-based protocols could be embedded into these streams, allowing any kind of XML document to be sent. For example, a network management app could leverage XML to allow administrators to update user authentication information to a set of access points. 
The application could add a user's information, including his or her group memberships, to an internal database or management system, and then use SOAP to communicate relevant details to the central controller, which would in turn push out revised authentication lists and other details. Individual access points supporting XML communication would also allow custom controller software to be written from scratch based on an organization's specific needs, as well as allow the deployment of heterogeneous access points. Today, SNMP's limitations make it less than ideal for this kind of control and communication..."
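The kind of configuration push described above is easy to picture as a small XML document. The message below is hypothetical: the element and attribute names are invented for illustration and do not come from Sputnik, Proxim, or Cisco:

```python
# Hypothetical sketch of an XML configuration message a WLAN controller
# might push to an access point over an XML stream. All element and
# attribute names here are invented for illustration.
import xml.etree.ElementTree as ET

def config_message(ap_id, ssid, users):
    msg = ET.Element("ap-config", id=ap_id)
    ET.SubElement(msg, "ssid").text = ssid
    auth = ET.SubElement(msg, "auth-list")
    for user in users:
        ET.SubElement(auth, "user", name=user)
    return ET.tostring(msg, encoding="unicode")

msg = config_message("ap-17", "corp-net", ["alice", "bob"])
print(msg)
```

Because the payload is plain XML, the same stream could just as easily carry a SOAP envelope, which is the flexibility the article attributes to this approach.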

  • [October 01, 2002] "Excelon Bulks up XML Support." By Darryl K. Taft. In eWEEK (September 30, 2002). "Excelon Corp. on Monday announced greater XML and Microsoft Corp. .Net support in new versions of its XML database engine and integrated development environment. Excelon, of Burlington, Mass., announced the availability of its Extensible Information Server (XIS) Version 3.12. XIS 3.12 adds support for Microsoft Corp.'s .Net interoperability, including interoperability between the .Net and Java 2 Enterprise Edition platform, said Coco Jaenicke, XML evangelist at Excelon. XIS 3.12 also includes support for XQuery -- the XML query language -- including XQuery use cases for better access to metadata in XML business documents... Excelon on Monday also unveiled Stylus Studio Version 4.5, an update of its XML IDE for building applications based on XML and the XSLT (eXtensible Stylesheet Language Transformations) standards. Stylus Studio 4.5 provides automatic generation of XSLT style sheets from HTML pages, Jaenicke said. In addition, the product features enhanced debugging, including new support for debugging .Net XSLT processors and Saxon XSLT processors..." See: (1) "eXcelon Announces Stylus Studio 4.5. Latest Release of Industry's Most Comprehensive XML/XSLT Development Tools Enable Teams to Rapidly Build and Debug Applications Bridging J2EE and .NET Environments."; (2) "eXcelon Announces Latest Release of eXtensible Information Server (XIS). New Version of Native XML Database Enables Application Interoperability across .NET and J2EE Environments, Adds XQuery and Verity Text Searching Support."; (3) "XML and Databases."

  • [October 01, 2002] "Case-insensitive enumerations. A simple, automated solution for allowing both upper- and lowercase letters." By Doug Tidwell (Senior Programmer, IBM). From IBM developerWorks, XML zone. October 2002. ['IBM's own XML ace Doug Tidwell offers one curious reader an automated solution for defining a case-insensitive enumeration that's straightforward, standards-compliant, and requires little work on the developer's part.'] Q: Is there any way to do case-insensitive enumerations in XML Schemas? If the valid values for an element are "red," "blue," and "green," we'd like to let our users use any combination of upper- and lowercase letters for those values. We can't find any way in the XML Schema spec that we can define an enumeration that is case-insensitive. Can you help us? A: ...good news and bad news. The bad news is that you can't do what you want with XML Schema; the good news is that we have an automated solution that's standards-compliant, fairly simple, and shouldn't require any work on your part... Our solution is relatively simple, works automatically, and is based on XML standards... The article includes source code and samples.
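Tidwell's automated solution lives in the article's downloadable code and is not reproduced here. One common manual workaround, offered as background rather than as the author's method, exploits the fact that XML Schema regular expressions have no case-insensitive flag: you can generate a pattern facet that spells out both cases of each letter. A sketch of that generation step:

```python
# Generate an XML Schema pattern-facet regex that accepts any casing of
# the enumerated values (XSD regexes have no case-insensitive flag, so
# each letter is expanded to a two-character class).
def ci_pattern(values):
    def ci(word):
        return "".join(
            f"[{c.lower()}{c.upper()}]" if c.isalpha() else c for c in word
        )
    return "|".join(ci(v) for v in values)

print(ci_pattern(["red", "blue", "green"]))
# [rR][eE][dD]|[bB][lL][uU][eE]|[gG][rR][eE][eE][nN]
```

The generated string would go in `<xs:pattern value="..."/>` in place of the enumeration facets; the trade-off is that schema-aware tools no longer see a true enumeration.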

  • [October 01, 2002] "Guide to the XML Schema Specifications." Techquila's Topicmap-powered browser for the W3C XML Schema Specifications. By Kal Ahmed (Techquila). October 01, 2002. "The W3C XML Schema standards are often accused of being over-complex and difficult to read. In an attempt to assist those trying to find their way around the W3C specifications, I have created a multi-modal topic map of the specifications. In this topic map you will find indexes of the terms used by the specifications and the main concepts of XML Schema. The topic maps are primarily created automatically using MDF to process the XML Schema specifications and the schema for XML Schema. The topic maps are then integrated by merging them with a hand-crafted topic map created using TMTab. A static HTML site has been created from the topic maps and can be browsed [online]. The application that produces this HTML output is based on TM4J and Jakarta Velocity. For more information about the creation and publication of topic maps using open-source and free software; or to get the topic map files themselves, please contact Kal Ahmed directly..." [From the posting: "I've spent a little time creating topic maps from the XML Schema family of specifications. The HTML-ized result is now [online]... This is a first stab at trying to topicmap the specs so comments on both presentation and content would be very welcome. For the topic map nerds, the XTM files are available; send me an email if you would like them. My thanks to Jeni Tennison who provided some really helpful hints in getting me started on this project (though all the mistakes are mine!)."] For schema description and references, see "XML Schemas"; see also "(XML) Topic Maps."

  • [October 01, 2002] "XML Glossary: XML Acronyms for Java Developers." By Joe Walker. In JavaWorld (September 2002). "With XML evolving at a rapid pace, many developers get lost in a sea of acronyms. And since XML is fast becoming the way to exchange data, chances are you will need first-hand XML knowledge. Keeping up with XML means knowing when XForms, WSDL (Web Services Description Language), or RDF (Resource Description Framework) can help, or whether to use FO (Formatting Objects) or SVG (Scalable Vector Graphics), and if XQuery is worth considering. To help you better understand and use XML, Joe Walker defines many XML technologies crucial to Java developers... The glossary covers 34 different XML technologies with enough detail to help you know which technologies relate to any given problem, as well as how they interconnect. More importantly, by reading this glossary, you will better understand how to use them..."

  • [October 01, 2002] "XML, The Model Driven Architecture (MDA), and RDF." By Uche Ogbuji (Fourthought, Inc). Paper presented at XML Europe 2002 (Barcelona, Spain). "These are exciting times for application development: in many ways there is a fundamental shift in how developers perceive applications. And XML is one of the main axes of this re-conception. XML promises to give a powerful framework for dealing with semi-structured metadata in applications. At the same time, the Object Management Group (OMG) is shifting its world view to a focus on the abstract models that underlie computer applications. This new approach to development, the Model-Driven Architecture (MDA), promises to build a bridge from successful design technologies such as UML to the very concepts that underpin the uses we have for computers. The Resource Description Framework (RDF) is another technology positioned to change the way concepts are synthesized into applications, both in localized data-driven systems and in the future Semantic Web. This presentation will discuss how these three technologies come together to provide powerful patterns for solid and maintainable software design... The OMG acknowledges that this is an age where systems integration often occurs at Internet scope, and where 'virtual enterprises' often encompass the roles of multiple development groups with completely different chains of responsibility (for instance, each partner in a commerce supply-chain has to synchronize certain concepts according to contract, but develops separate software for doing this). Building on lessons from the pioneers of development within this expanded horizon, the OMG has developed the Model-Driven Architecture (MDA) as the method for projects in this new generation. The MDA separates the fundamental understanding of concepts in development projects from the representation of these concepts for particular implementation technologies. 
This means even separating this abstract modeling layer from the object-oriented orthodoxy, which has been the bedrock of the OMG in the first place. Basically, the OMG is admitting that no model tainted with implementation considerations can survive the disjoint between multiple development groups, or different processing environments, as are common in Internet integration and establishing vertical alignment. The UML, despite its most common use in object-oriented modeling, is designed to be a much broader modeling language, and has been put to the test in at least one very prominent Internet-scale model that requires far more sophistication than mere objects provide. The Common Information Model (CIM), which models concepts related to the management of technological assets, is expressed in a UML that would probably be entirely alien to most developers, and this flexibility of UML is what the OMG uses to justify its being the language of the most abstract modeling under the MDA. This paper will take a brief look at how XML, RDF and MDA come together to revolutionize modeling for software development..." Also available in PDF format. See: "OMG Model Driven Architecture (MDA)."

  • [October 01, 2002] "Tip: Multi-pass XSLT. Use the node-set extension to break down the XSLT operation into two or more passes." By Uche Ogbuji (Principal Consultant, Fourthought, Inc). From IBM developerWorks, XML zone. September 2002. ['Transforms can often be made cleaner and clearer if executed in phases or passes. First some intermediate output is produced, and then this is further transformed into a final output form. There can even be more than one intermediate form. In this tip, Uche Ogbuji discusses ways of breaking down XSLT operation into two or more clear passes of transformation using the common node-set extension.'] "UNIX users are very familiar with the idea of pipes -- mechanisms that direct the output of one program so that it becomes the input for another. Pipes are behind perhaps the first major examples of modularizing loosely-coupled code. Each UNIX command is very simple and targeted; complex actions are produced by stringing them together. The processing of XML using XSLT has much to gain from this same sort of modularization. You can improve code simplicity and reuse by breaking the transform into a set of separate phases or passes. Unfortunately, in pure XSLT 1.0, most of the commands for handling input of a transform are forbidden from use on output. This restriction has been removed in XSLT 2.0, and even in XSLT 1.0 (which has many more years of life) you can remove the restriction using an extension function that is usually provided by XSLT vendors... This example is an extreme simplification of a situation I encountered in a real project. I needed to reuse many DocBook processing templates on an XHTML source. By transforming XHTML content to DocBook in the first pass, and then re-using standard DocBook templates in subsequent passes, I saved a huge amount of work and debugging. The idea of multi-pass XSLT is even more general than this. 
In addition to promoting code reuse, it can also break complex transforms into chunks that are easy to understand. The next time you are faced with a complex problem in XSLT, determine whether it could be simplified or modularized as a series of piped operations..." Article also in PDF format. For related resources, see "Extensible Stylesheet Language (XSL/XSLT)."
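Ogbuji's tip is about XSLT specifically, but the pipeline structure it recommends transposes cleanly to any transform toolkit. As an illustration only (the article's own examples use XSLT and the vendor `node-set()` extension), here is the two-pass shape in Python's ElementTree, where pass 1 maps an XHTML-ish source to an intermediate vocabulary and pass 2 consumes only that intermediate form:

```python
# Two-pass transform pipeline: pass 1 normalizes the vocabulary, pass 2
# renders it; each pass stays small enough to understand and reuse alone.
import xml.etree.ElementTree as ET

def pass1_xhtml_to_doc(root):
    """First pass: map XHTML-ish <p> elements onto an intermediate form."""
    out = ET.Element("doc")
    for p in root.iter("p"):
        ET.SubElement(out, "para").text = p.text
    return out

def pass2_doc_to_text(root):
    """Second pass: consume only the intermediate vocabulary."""
    return "\n".join(para.text for para in root.iter("para"))

src = ET.fromstring("<html><body><p>one</p><p>two</p></body></html>")
intermediate = pass1_xhtml_to_doc(src)  # plays the role of node-set() output
print(pass2_doc_to_text(intermediate))
```

In XSLT 1.0 the `intermediate` value would be a result tree fragment, and converting it back into a processable node set is exactly what the `node-set()` extension the tip discusses is for.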

  • [October 01, 2002] "URN Namespace for NewsML Resources." By David Allen (IPTC). IETF Network Working Group, Internet-Draft. Reference: 'draft-allen-newsml-urn-rfc3085bis-00.txt'. September 24, 2002; expires February 25, 2003. "This document describes a URN (Uniform Resource Name) namespace for identifying NewsML NewsItems and NewsML related XML Schemas. A NewsItem is an information resource that is expressible as a NewsML element within a NewsML document conforming to the NewsML Document Type Declaration (DTD) as defined by the International Press Telecommunications Council (IPTC)... NewsML is an XML format for packaging multimedia news resources. It has been created under the auspices of the International Press Telecommunications Council (IPTC), and version 1.0 was approved by the IPTC on 6 October 2000. The same logical NewsItem may exist in multiple physical locations. The NewsML specification allows NewsItems to have multiple URLs, but only a single URN. It is the latter which then uniquely names the resource. This document obsoletes RFC 3085... This namespace specification is for a formal namespace. Reasons for Issue of New RFC: RFC 3085 issued in March 2001 specified the URN to be used when identifying individual NewsML NewsItems. Since that time the use of XML Schemas has become more common and it is now necessary to provide a modified form of the original newsml URN to enable individual Schemas to be properly identified. This new RFC seeks to address this issue by (1) Introducing a new alternative form of the newsml URN definition; (2) Updating the original definition to meet the new RFC template recently issued by the IETF (RFC YYYY)..." See "NewsML."
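The RFC 3085 NewsItem URN has roughly the shape `urn:newsml:<provider>:<date>:<item-id>:<revision>`; the simplified regex below is my sketch of that shape, not the draft's full ABNF, and the provider and item names in the example are invented:

```python
import re

# Rough shape of an RFC 3085 NewsML URN (a simplified sketch, not the
# full grammar from the draft): urn:newsml:provider:date:item:revision
NEWSML_URN = re.compile(
    r"urn:newsml:(?P<provider>[^:]+):(?P<date>\d{8}):"
    r"(?P<item>[^:]+):(?P<revision>\d+)$"
)

m = NEWSML_URN.match("urn:newsml:example.com:20021001:sample-item:1")
print(m.group("provider"), m.group("date"))  # example.com 20021001
```

A registry or content-management system can use such a check to distinguish the one canonical URN for a NewsItem from the many URLs at which copies of it may live.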

  • [October 01, 2002] "Componetizer: A Tool for Extracting and Documenting XML Schema Components." By Marcel Jemio and Alan Kotok (Data Interchange Standards Association - DISA). DISA Technical Note. September 2002. ['DISA's first technical note describes the Componetizer, a tool developed by DISA to extract data items from XML Schemas and array them for visual display in tables or for further processing in databases.'] "The Componetizer solves the problem of identifying and extracting individual data items or components from the electronic rules for structuring XML documents, called XML Schemas. Since business processes can easily become large and detailed, the schemas representing those processes can also become lengthy and complex... For business applications, XML Schema opens up a wide range of options. It makes XML more adaptable to business databases that use relational or object-oriented structures, as well as offering the ability to support more types of data, including those defined by the users themselves. But all of these features come with a cost, namely more complexity. And while XML Schema offers a more powerful set of tools for business, using those tools requires more intensive hands-on management by implementers and developers. One of the more demanding jobs in writing XML schemas is preparing the associated documentation that implementers and developers need to understand their structure and contents. While tools exist to help develop schemas, few if any tools are available to extract the individual data items from schemas and present them in an easy-to-read and understandable form. XML editors have features that generate documentation directly from the schema files. While the documentation produced by commercial editors is often comprehensive, it can also become voluminous and contain much more than a simple table of components. 
The development of XML vocabularies, a job normally undertaken as a collaborative process, requires a way of capturing and documenting components from multiple schemas in a direct and simple way. That is the function provided by the Componetizer..." See also the announcement: "Data Interchange Standards Association Develops New Software Tool to Ease XML Schema Documentation." General references in "XML Schemas."
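The Componetizer itself is DISA's tool and is not shown here, but the basic extraction step it performs, pulling named components out of a schema for tabular display, can be sketched with the standard library. The schema fragment below is invented for illustration:

```python
# A minimal sketch of "componentizing" a schema: list every named
# declaration (elements, types, attributes, ...) with its component kind.
import xml.etree.ElementTree as ET

XSD = "{http://www.w3.org/2001/XMLSchema}"

def components(schema_text):
    root = ET.fromstring(schema_text)
    rows = []
    for el in root.iter():
        if el.tag.startswith(XSD) and "name" in el.attrib:
            rows.append((el.tag[len(XSD):], el.attrib["name"]))
    return rows

schema = """<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:element name="invoice" type="InvoiceType"/>
  <xs:complexType name="InvoiceType">
    <xs:sequence><xs:element name="total" type="xs:decimal"/></xs:sequence>
  </xs:complexType>
</xs:schema>"""

for kind, name in components(schema):
    print(kind, name)
# prints: element invoice / complexType InvoiceType / element total
```

Rows in this shape drop straight into an HTML table or a database, which is the kind of output the technical note describes.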

Earlier XML Articles


Document URI: http://xml.coverpages.org/xmlPapers200210.html  —  Legal stuff
Robin Cover, Editor: robin@oasis-open.org