The OASIS Cover Pages: The Online Resource for Markup Language Technologies
Last modified: December 31, 2002
XML Articles and Papers December 2002

XML General Articles and Papers: Surveys, Overviews, Presentations, Introductions, Announcements

December 2002

  • [December 31, 2002] "XBRL International in 2002 and Beyond." By Walter Hamscher (for the XBRL International Steering Committee). 7 pages. Cross-posted 2002-12-31 with the following: "Our mission at XBRL International 'Transforming Business Reporting' took on new urgency and importance in 2002. High profile financial abuses in the United States and elsewhere brought unprecedented international focus on accounting practices and the need for greater transparency in financial reporting. The XBRL International consortium represents over 170 organisations around the world committed to transparency and the efficient transfer of information. In that spirit, the attached paper describes our successes over the past year, and also the challenging tasks facing us that we must execute on successfully to maintain our tremendous momentum..." The Australian Prudential Regulation Authority (APRA) is one of two XBRL "live applications, using XBRL Version 1 in its system for collecting data from pension funds." See the APRA XBRL v2 Taxonomy Files and Statistics Project - Data Submission - XBRL. Wacoal (a leader in designer intimate apparel in the United States, Japan, Asia, and Europe) demonstrated "the first live production application of XBRL GL; they use XBRL GL, the Journal Taxonomy, for exchanging financial data among their internal reporting and business systems including purchasing, payroll, sales management, inventory management, and workflow systems." UK Inland Revenue, Edgar Online, the US Federal Deposit Insurance Corporation (FDIC), Sumitomo Mitsui Banking Corporation, the Tokyo Stock Exchange, and the National Tax Agency of Japan are now committed to the use of XBRL Version 2 in 2003. From the XBRL Home Page: "XBRL is an XML-based, royalty-free, and open standard... 
delivering benefits to investors, accountants, regulators, executives, business and financial analysts, and information providers...Extensible Business Reporting Language (XBRL) brings the publication, exchange, and analysis of the complex financial information in corporate business reports into the dynamic and interactive realm of the internet. XBRL provides a common platform for critical business reporting processes and improves the reliability and ease of communicating financial data among users internal and external to the reporting enterprise." See also the recent XBRL Progress Report. General references in "Extensible Business Reporting Language (XBRL)."
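The core XBRL idea described above, tagging each financial fact with a taxonomy concept and tying it to a reporting context, can be sketched in a few lines. This is an illustrative shape only: the namespaces, the concept name `Revenue`, and the entity identifier are invented for the example, not taken from any real XBRL taxonomy.

```python
import xml.etree.ElementTree as ET

# Hypothetical namespaces -- real XBRL instances use the xbrl.org namespaces
# and concepts defined in a published taxonomy.
XBRLI = "http://www.example.org/xbrl/instance"
CO = "http://www.example.org/taxonomy/demo"
ET.register_namespace("xbrli", XBRLI)
ET.register_namespace("co", CO)

root = ET.Element(f"{{{XBRLI}}}xbrl")

# A context pins down who is reporting and for what period.
context = ET.SubElement(root, f"{{{XBRLI}}}context", id="FY2002")
entity = ET.SubElement(context, f"{{{XBRLI}}}entity")
ET.SubElement(entity, f"{{{XBRLI}}}identifier",
              scheme="http://www.example.org/ids").text = "ExampleCo"

# A "fact": a taxonomy concept carrying a value, bound to that context.
fact = ET.SubElement(root, f"{{{CO}}}Revenue", contextRef="FY2002", unitRef="USD")
fact.text = "1000000"

xml_text = ET.tostring(root, encoding="unicode")
print(xml_text)
```

The point of the structure is that a regulator such as APRA can consume the value of `Revenue` mechanically, because the concept and its context are machine-identifiable rather than buried in a hard-copy report.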

  • [December 31, 2002] "Introduction to SALT. Unleashing the Potential." By Hitesh Seth (ikigo, Inc.). In XML Journal Volume 3, Issue 12 (December 2002). ['Speech Application Language Tags (SALT) is a set of XML-based tags that can be added to existing Web-based applications, enhancing the user interface through interactive speech recognition. In addition, SALT can be used to extend Web-based applications to the telephony world, thereby providing an opportunity to unleash the potential of a huge user community, users of normal touch-tone telephones. SALTforum, an organization founded by Microsoft, Cisco, SpeechWorks, Philips, Comverse, and Intel, has spearheaded development of the SALT specification, now in its 1.0 release.'] "... Multimodality means that we can utilize more than one mode of user interface with the application, something like our normal human communications with each other. For instance, consider an application that allows us to get driving directions - while it's typically easier to speak the start and destination addresses (or even better, shortcuts like 'my home,' 'my office,' 'my doctor's office,' based on my previously established profile), the turn-by-turn and overall directions are typically best viewed through a map, something similar to what we're used to seeing on MapQuest. In essence, a multimodal application, when executed on a desktop device, would be an application very similar to MapQuest but would allow the user to talk/listen to the system for parts of the application's input/output as well - for example, the starting and destination addresses... SALT has been built on the technology required to allow applications built using SALT to be deployed in a telephony and/or multimodal context... It's clear that SALT and VoiceXML have utilized the Web application delivery model as an open platform for delivering telephony applications. 
However, VoiceXML and SALT have different technical goals - whereas VoiceXML tends to focus on development of telephony-based applications (applications used through a phone), SALT focuses on adding speech interaction and telephony integration to existing Web-based applications, enabling true multimodality. In this case, I would also like to highlight the development of another upcoming standardization initiative called X+V (which stands for XHTML+Voice), an effort to combine VoiceXML with XHTML to develop multimodal applications. Another difference between SALT and VoiceXML is the overall approach that has been utilized to develop applications. Whereas VoiceXML is pretty much declarative in nature, utilizing its extensive set of tags, SALT is very procedural and script oriented, having a very small set of core tags. Also, it's important to understand that SALT actually utilizes key components of the standardization effort carried out at the W3C Voice Browser Activity, including the XML-based Grammar Specification and the XML-based Speech Synthesis Markup Language. Both these specifications have been used by the VoiceXML 2.0 specification as well..." See: "Speech Application Language Tags (SALT)." [alt URL]
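The multimodal pattern the article describes, a visual form field that can be filled either by typing or by speech, can be sketched as a page fragment in the SALT style. The element names (prompt, listen, grammar, bind) follow the SALT 1.0 vocabulary, but the namespace URI and attribute details here are illustrative assumptions, not copied from the specification; the snippet is parsed with Python only to show the structure.

```python
import xml.etree.ElementTree as ET

# Schematic SALT-style page: a text input plus speech tags. The <bind>
# element routes the recognition result into the visual field -- that
# routing is what makes the page multimodal (type it or say it).
page = """
<page xmlns:salt="urn:example:salt">
  <input id="origin" type="text"/>
  <salt:prompt id="askOrigin">Where would you like to start from?</salt:prompt>
  <salt:listen id="recoOrigin">
    <salt:grammar src="addresses.grxml"/>
    <salt:bind targetelement="origin" value="//address"/>
  </salt:listen>
</page>
"""
doc = ET.fromstring(page)
ns = {"salt": "urn:example:salt"}
bind = doc.find(".//salt:listen/salt:bind", ns)
print(bind.get("targetelement"))
```

On a touch-tone phone the same tags drive a speech-only dialog; on a desktop browser they merely augment the HTML form, which is the deployment flexibility the article credits to SALT.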

  • [December 31, 2002] "Digital Signatures and Encryption." By Ari Kermaier and Joe Morgan (Phaos Technology Corp). In XML Journal Volume 3, Issue 12 (December 2002). ['While the flexibility, extensibility, and simplicity of XML have made life easier for a lot of business application developers, the core XML specification doesn't address security in any way. The openness of XML doesn't make security any easier, and developers who leverage the benefits of XML are still left with the daunting task of finding a way to secure their data. While data security can have various meanings in different situations, the fundamental ideas that concern us here are the integrity and confidentiality of the data.'] "Our purchase order example provides a basic introduction to the capabilities of XML signatures and encryption to secure XML communication. The specifications' richness and flexibility allow them to be adapted for virtually any XML security need, from database encryption to secure workflow applications to the exchange of authorization credentials. The XML signature and encryption specifications have become the foundation of a number of other valuable XML security protocols, including: (1) XKMS: The W3C's XML Key Management Specification provides an XML Web services framework for relieving client software of the complexities of traditional PKI; (2) SAML: The OASIS consortium's Security Assertion Markup Language uses XML signatures and encryption to protect the exchange of security credentials for single sign-on and other applications; (3) SOAP-SEC: The SOAP Security Extensions specification details how XML signatures can be used to secure SOAP messages. Securing your data can be a complex undertaking. The example we have provided here is intentionally simplified, but it demonstrates the potential of the XML signature and encryption standards, and the ease with which these standards can be utilized with the Phaos XML Security Suite..." 
See the XML Security Specifications list. [alt URL]
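The integrity half of the purchase-order scenario rests on the digest step inside an XML signature's Reference processing. The sketch below is deliberately simplified, as the article's own example is: it substitutes a plain re-serialization for the required canonicalization (C14N) pass, so it is not interoperable XML-DSig, but it shows why any change to the signed content is detectable.

```python
import base64
import hashlib
import xml.etree.ElementTree as ET

def digest_value(xml_text: str) -> str:
    """Compute a base64 DigestValue over a (pseudo-)canonical form."""
    element = ET.fromstring(xml_text)
    canonical = ET.tostring(element)           # stand-in for real C14N
    digest = hashlib.sha1(canonical).digest()  # SHA-1, as in early XML-DSig
    return base64.b64encode(digest).decode("ascii")

po = "<PurchaseOrder><Item sku='123'>Widget</Item></PurchaseOrder>"
dv = digest_value(po)
print(dv)

# Tampering with the signed content changes the DigestValue, so signature
# verification fails.
tampered = "<PurchaseOrder><Item sku='123'>Gadget</Item></PurchaseOrder>"
print(dv == digest_value(tampered))
```

In a real signature the DigestValue is itself covered by the cryptographic signature over SignedInfo, so an attacker can forge neither the content nor the digest.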

  • [December 31, 2002] "EPA Simplifies Multistate Data Exchange with XML. A Pilot Project" By Casey Hibbard. In XML Journal Volume 3, Issue 12 (December 2002). ['A massive government agency isn't the first place you'd think of to find leading-edge use of XML. But then that's just the type of setting where the emerging standard for cross-platform data exchange is needed most.'] "In each of the 50 [US] states, there is at least one environmental agency responsible for enforcing state and U.S. Environmental Protection Agency regulations. From the corner gas station to the local manufacturing plant, thousands of businesses and organizations nationwide are responsible for meeting strict environmental guidelines and reporting their compliance regularly to these state agencies. On a monthly, bimonthly, or quarterly basis, they send data in hard-copy reports with details such as their air emissions, pollutants discharged into bodies of water, hazardous waste disposal - any activities that could affect environmental integrity... A number of states and the EPA began envisioning a process whereby information from diverse platforms could be easily transferred through a network in a common XML format. States or the EPA could simply enter a query or make a request over the network and immediately retrieve their desired information, a process that would eliminate the need for staff to manually rekey or translate data into other formats. The primary environmental agencies for Nebraska, Utah, Delaware, and New Hampshire; the EPA; and an EPA contractor began exploring ways to simplify their data exchange. The group knew from studying industry trends that other organizations and the private sector were employing XML to accomplish cross-system data exchange... XAware offered ready-made, out-of-the-box data integration middleware. The XAware XA-Suite, a development environment and virtual XML server, provided an easy way to create bidirectional data exchange among diverse systems. 
Essentially, the technology aggregates data from multiple back-end systems into a single XML view that all systems can read..." See: (1) "Network Node Pilot Project Beta Phase: Report on Project Results and Next Steps."; (2) "EPA's Central Data Exchange (CDX) Supports XML-Based Reporting"; (3) general references in "Environmental Protection Agency (EPA) Central Data Exchange (CDX)." [alt URL]
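The "single XML view over multiple back-end systems" idea can be sketched directly: two state systems report the same kind of data in different shapes, and a thin aggregation layer normalizes both into one common document. Element and attribute names below are invented for illustration; they are not the EPA network's actual schema.

```python
import xml.etree.ElementTree as ET

# Two hypothetical state feeds with different structures for the same data.
nebraska = ET.fromstring(
    "<report state='NE'><emission facility='PlantA' tons='12.5'/></report>")
utah = ET.fromstring(
    "<facilities state='UT'><site name='PlantB' air_tons='7.1'/></facilities>")

# Aggregate both into one common XML view that any node on the network
# can query, instead of rekeying or translating formats by hand.
view = ET.Element("exchange")
for em in nebraska.iter("emission"):
    ET.SubElement(view, "emission", state="NE",
                  facility=em.get("facility"), tons=em.get("tons"))
for site in utah.iter("site"):
    ET.SubElement(view, "emission", state="UT",
                  facility=site.get("name"), tons=site.get("air_tons"))

print(ET.tostring(view, encoding="unicode"))
```

The payoff is that a query written against the common `emission` shape works no matter which back-end system originally held the data.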

  • [December 31, 2002] "XML Authoring and Editing Tools. Characteristics and Limitations." By Neil Bradley (Rubus). In XML Journal Volume 3, Issue 12 (December 2002). ['In this article, you'll find an in-depth exploration of six distinct categories of XML document editors. With an appreciation of the characteristics and limitations of the tools available, you'll be able to choose those that best fit your needs.'] "I'll explore the following categories of XML editors: (1) Form-based editors; (2) Word processors; (3) Text editors; (4) Word processors with style mappings; (5) Word processors with XML enhancements; (6) XML word processors. In assessing the relative merits of each category I consider efficiency, ease of use, cost of ownership, and ability to access document model rules (defined in a DTD or schema definition) in order to control or guide the authoring process..." See: (1) Editing and Composition Tools in the list of Free XML Tools and Software (maintained by Lars Marius Garshol) and (2) a list of reviewed XML editing tools. [alt URL]

  • [December 30, 2002] "Avoiding EDI's Mistakes With Web Services Semantic Interoperability." By David Burdett (Director, Product Management, Web Services, Commerce One). In EAI Journal Volume 4, Number 12 (December 2002), pages 8-11. ['EAI and B2B won't work without semantic interoperability. EDI had the right idea, but allowed too much flexibility and failed to achieve widespread use because it was too expensive to implement. With UBL, the lessons of EDI have at last been learned.'] Article overview: "(1) An analysis of the W3C Semantic Web activity for Web Resources and why it's not ideal technology for EDI/B2B; (2) Why EDI is the best starting point for building an ontology of definitions for business documents in eCommerce; (3) How having a combined ontology definition and a canonical XML representation of that definition makes B2B integration easier; (4) A description of why a single definition for any business document will never work, as you need to account for the different 'contexts' in which a document will be used, e.g., business, locale; (5) A description of why you need a base XML business document that everyone can use and understand combined with 'controlled' extensions to define the additional data needed for different contexts; (6) How UBL is providing a solution to the above, including its current state of development, support, and migration to UN/CEFACT. [The UBL TC is] developing a set of business information entities, which are small components such as name and address, amount, or line item. These are then assembled into business document structures such as an order, invoice, etc. Currently, the PO document has been developed. By the end of the year, change order, simple order response (to accept or reject an order), and a complex order response (which allows responses at the line item level) will be added. The UBL library ultimately will also include invoice, dispatch advice, and goods receipt advice... 
UBL's reach is already broad, with formal liaisons set up with the following associations: ACORD (insurance), ARTS (retail sales), e.centre UK (FMCG), EIDX (electronics), HL7 (healthcare), NACS (convenience stores), RosettaNet (information technology), SWIFT (inter-banking), VCA (prescription eyewear), UN/EDIFACT and X12 (EDI), XBRL (accounting). Although UBL is making great progress, its deliverables aren't yet available for use. Also, the many different data formats used for integration won't go away overnight. This means that the requirement to understand semantics and map between different formats will remain a significant challenge for a while. So what should you do now? As mentioned earlier, point-to-point mapping between different systems simply doesn't scale. After about four implementations, it's quicker, and maintenance costs are lower, to translate the structure of all input documents into a single, standard, intermediate format and then translate that into the output structure... Semantic interoperability is hard, but it's an absolute necessity. EAI and B2B won't work without it. EDI had the right idea, but allowed too much flexibility and failed to achieve widespread use because it was too expensive to implement. The lessons of EDI have been learned. You need to define semantics and the structure of messages. You need to provide controlled flexibility so business documents contain the same structures and definitions for data that applies in all business contexts while allowing data that's specific to a particular business context to be added in a controlled way. UBL is building on this EDI heritage and is bringing EDI into the world of XML. You can learn more online..." See: "Universal Business Language (UBL)."
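The arithmetic behind "point-to-point mapping simply doesn't scale" is worth making explicit: with n distinct document formats, direct translation needs a mapping for every ordered pair, while routing through a single canonical intermediate format (the UBL approach) needs only one mapping in and one mapping out per format.

```python
# Mapping counts for n distinct document formats.
def point_to_point(n: int) -> int:
    """One mapping per ordered pair of formats."""
    return n * (n - 1)

def via_canonical(n: int) -> int:
    """One mapping into and one out of a canonical intermediate format."""
    return 2 * n

for n in (3, 4, 10):
    print(n, point_to_point(n), via_canonical(n))
```

At n = 3 the two approaches tie (6 mappings each); at n = 4 the canonical hub already wins (8 vs. 12), which matches the article's observation that the break-even point comes "after about four implementations," and at n = 10 it is 20 mappings instead of 90.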

  • [December 30, 2002] "Pattern Based Analysis of BPEL4WS." By Petia Wohed (Department of Computer and Systems Sciences, Stockholm University/The Royal Institute of Technology, Sweden), Wil M.P. van der Aalst (Department of Technology Management, Eindhoven University of Technology, The Netherlands), Marlon Dumas (Centre for Information Technology Innovation, Queensland University of Technology, Australia), and Arthur H.M. ter Hofstede (QUT). Technical Report FIT-TR-2002-04, QUT. 20 pages (with 17 references). Table 1 provides a comparison of BPEL4WS against XLANG, WSFL, Staffware, and MQSeries Workflow using both workflow and communication patterns. "Web services composition is an emerging paradigm for enabling application integration within and across organisational boundaries. A landscape of languages and techniques for web services composition has emerged and is continuously being enriched with new proposals from different vendors and coalitions. However, little or no effort has been dedicated to systematically evaluating the capabilities and limitations of these languages and techniques. The work reported in this paper is a first step in this direction. It presents an in-depth analysis of the Business Process Execution Language for Web Services (BPEL4WS). The framework used for this analysis is based on a collection of workflow and communication patterns... The reported analysis is based on a framework composed of a set of patterns: abstracted forms of recurring situations found at various stages of software development. Specifically, the framework brings together a set of workflow patterns documented in 'Workflow Patterns' available via the Workflow Patterns website [QUT Technical Report FIT-TR-2002-2] and a set of communication patterns documented in [Ruh/Maginnis/Brown Enterprise Application Integration: A Wiley Tech Brief, 2001]. 
The workflow patterns (WPs) have been compiled from an analysis of existing workflow languages and they capture typical control flow dependencies encountered in workflow modelling. More than 12 commercial Workflow Management Systems (WFMS), as well as the UML Activity Diagrams, have been evaluated in terms of their support for these patterns. The WPs are arguably suitable for analysing languages for web services composition, since the situations they capture are also relevant in this domain. The Communication Patterns (CPs) on the other hand, are related to the way in which system modules interact in the context of Enterprise Application Integration (EAI). They are structured according to two dichotomies: synchronous vs. asynchronous, and point-to-point vs. multicast. They are arguably suitable for the analysis of the communication modelling abilities of web services composition languages, given the strong overlap between EAI and web services technologies..." See also "Pattern Based Analysis of BPML (and WSCI)" [TR] and "Don't Go With The Flow: Web Services Composition Standards Exposed. Web Services - Been There Done That?" [IEEE Intelligent Systems Jan/Feb 2003]. General references in "Business Process Execution Language for Web Services (BPEL4WS)." [source supplied by the authors]

  • [December 30, 2002] "Pattern Based Analysis of BPML (and WSCI)." By Wil M.P. van der Aalst, Marlon Dumas, Arthur H.M. ter Hofstede, and Petia Wohed. QUT Technical Report FIT-TR-2002-05. Queensland University of Technology, Brisbane, 2002. 27 pages (with 24 references). "Web services composition is an emerging paradigm for enabling application integration within and across organizational boundaries. A landscape of languages and techniques for web services composition has emerged and is continuously being enriched with new proposals from different vendors and coalitions. However, little or no effort has been dedicated to systematically evaluating the capabilities and limitations of these languages and techniques. The work reported in this paper is a step in this direction. It presents an in-depth analysis of the Business Process Modeling Language (BPML). The framework used for this analysis is based on a collection of workflow and communication patterns. In addition to BPML, the framework is also applied to the Web Services Choreography Interface (WSCI). WSCI and BPML have several routing constructs in common but aim at different levels of the web services stack... When comparing BPEL4WS, XLANG, WSFL, BPML and WSCI to contemporary workflow systems on the basis of the patterns discussed in this paper, they are remarkably strong. Note that only a few workflow management systems support Cancel Activity, Cancel Case, Implicit Termination, and Deferred Choice. In addition, workflow management systems typically do not directly support message sending. The trade-off between block-structured languages and graph-based languages is shown [only partly reflected] by Table 6. XLANG, BPML, and WSCI are block-structured languages. WSFL is graph-based. BPEL4WS is a hybrid language in the sense that it combines features from both the block-structured language XLANG and the graph-based language WSFL. 
Nearly all workflow languages are graph-based and emphasize the need of end-users to understand and communicate process models. Therefore, it is remarkable that of the five languages evaluated in Table 6, only WSFL is graph based... All the five languages are textual (XML-based) without any graphical representation. This seems to indicate that communication of the models is not considered a requirement. In this context, we refer to the BPMI initiative toward a Business Process Modeling Notation (BPMN). BPMN is intended as a graphical language that can be mapped onto languages such as BPML and BPEL4WS. Although not reflected by Table 6, the expressiveness of block-structured languages is limited to 'well-structured' processes where there is a one-to-one correspondence between splits and joins. This forces the designer using a language like BPML to introduce entities of type signal, raise, and synch, which appear to be workarounds to emulate a graph-based language..." See also "Don't Go With The Flow: Web Services Composition Standards Exposed. Web Services - Been There Done That?" in IEEE Intelligent Systems Jan/Feb 2003. General references in "Business Process Modeling Language (BPML)." [cache]
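The 'well-structured' restriction mentioned above, a one-to-one correspondence between splits and joins, behaves like balanced parentheses and can be checked mechanically. The sketch below is a simplified illustration of that idea (block-structured languages enforce the property by construction; graph-based languages can express flows that fail it), not an actual analysis procedure from the paper.

```python
# A process is represented here as a flat token sequence. "split" opens a
# parallel/choice block, "join" closes one; well-structuredness means every
# split has exactly one matching join, properly nested.
def well_structured(tokens):
    depth = 0
    for t in tokens:
        if t == "split":
            depth += 1
        elif t == "join":
            depth -= 1
            if depth < 0:       # a join with no open split
                return False
    return depth == 0           # no split left unclosed

print(well_structured(["split", "task", "join"]))           # balanced
print(well_structured(["split", "split", "task", "join"]))  # unmatched split
```

A graph with two splits converging on a single join fails this test, which is exactly the kind of flow a designer in BPML must emulate with signal/raise/synch workarounds.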

  • [December 30, 2002] "Indexing Open Schemas." By Neal Sample (Department of Computer Science, Stanford University, Stanford, CA, USA) and Moshe Shadmon (RightOrder Incorporated). Paper presented at Net.Object Days 2002 (NODe 2002), October 7-10, 2002, Messekongresszentrum, Erfurt, Germany. 20 pages (with 14 references). "Significant work has been done towards achieving the goal of placing semistructured data on an equal footing with relational data. While much attention has been paid to performance issues, far less work has been done to address one of the fundamental issues of semistructured data: schema evolution. Semistructured indexing and storage solutions tend to end where schema evolution begins. In practice, a real promise of semistructured data management will be realized where schemas evolve and change. In contrast to fixed schemas, we refer to schemas that grow and change as open schemas. This paper addresses the central complications associated with indexing open and evolving schemas: we specify the features and functionality that should be supported in order to handle evolving semistructured data. Specific contributions include a map of the steps for handling open schemas and an index for open schemas... We map out the steps for handling open schemas, covering each of the major issues: key encoding, data interfaces, updating the data, and query support. These steps are generally applicable and may be interpreted broadly for any semistructured data format. We also present an index well suited for use with open schemas, the Index Fabric. The Index Fabric is a self-describing index. It is based on two cardinal advantages: the use of designators to store properties and structure in each object's key, augmented by a compact, high-performance index structure. These features combine to make the Index Fabric an ideal substrate for semistructured and irregular data sources that change over time. 
Previous experiments demonstrated that the Index Fabric is a successful index for semistructured data, and the preliminary experiments presented here demonstrate that the Index Fabric is an effective index for data with an evolving schema..." [cache]
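The designator idea can be sketched concretely: each path step in a semistructured record is encoded as a short designator symbol, so a whole path-plus-value becomes a single searchable string key. The designator assignments and the sorted-list lookup below are illustrative assumptions; the actual Index Fabric layers the encoded keys over a Patricia-trie-like structure rather than a sorted array.

```python
import bisect

# Hypothetical designator table: one symbol per element/attribute name.
# Schema evolution is cheap: a brand-new element type just gets a new
# designator, with no reorganization of existing keys.
designators = {"invoice": "I", "buyer": "B", "name": "N", "seller": "S"}

def encode(path, value):
    """Encode a path like invoice/buyer/name plus its value as one key."""
    return "".join(designators[p] for p in path) + value

keys = sorted([
    encode(["invoice", "buyer", "name"], "ACME"),
    encode(["invoice", "seller", "name"], "Widgets Inc"),
    encode(["invoice", "buyer", "name"], "Zenith"),
])

def lookup_prefix(prefix):
    """All keys under a given encoded path, via one range scan."""
    i = bisect.bisect_left(keys, prefix)
    out = []
    while i < len(keys) and keys[i].startswith(prefix):
        out.append(keys[i])
        i += 1
    return out

# All buyer names: scan the "IBN" prefix.
print(lookup_prefix("IBN"))
```

Because structure lives inside the keys rather than in a fixed table layout, a path query is a prefix scan, and documents with irregular or newly evolved structure coexist in the same index.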

  • [December 30, 2002] "Consistency Checking of Financial Derivatives Transactions." By Daniel Dui (University College London, Department of Computer Science), Wolfgang Emmerich (Zuhlke Engineering Ltd), Christian Nentwich (University College London, Department of Computer Science), and Bryan Thal (UBS Warburg). Paper presented at Net.Object Days 2002 (NODe 2002), October 7-10, 2002, Messekongresszentrum, Erfurt, Germany. 18 pages (with 23 references). Web-Services Track. "Financial institutions are increasingly using XML as a de-facto standard to represent and exchange information about their products and services. Their aim is to process transactions quickly, cost-effectively, and with minimal human intervention. Due to the nature of the financial industry, inconsistencies inevitably appear throughout the lifetime of a financial transaction and their resolution introduces cost and time overheads. We give an overview of requirements for inconsistency detection in our particular domain of interest: the over-the-counter (OTC) financial derivatives sector. We propose a taxonomy for the classes of consistency constraints that occur in this domain and present how xlinkit, a generic technology for managing the consistency of distributed documents, can be used to specify consistency constraints and detect transaction inconsistencies. We present the result of an evaluation where xlinkit has been used to specify the evaluation rules for version 1.0 of the Financial Products Markup Language (FpML)... The success of the evaluation has led the FpML steering committee to consider a proposal to use xlinkit as the standard language to express validation constraints for FpML documents. The high level of abstraction provided by xlinkit makes it particularly suitable. Moreover, FpML will use the xlinkit constraint checker as the reference implementation for an FpML validation engine against which the industry can compare their proprietary implementations. 
Finally, we believe that beyond derivative trading, xlinkit has many potential application areas. Problems similar to the ones described in this paper occur whenever semi-structured information is exchanged between organizations. This is the case in e-commerce and electronic business. For example, we have also investigated the use of xlinkit in procurement processes, such as those standardized by RosettaNet. Moreover, we have successfully applied xlinkit to detecting inconsistencies in software engineering documents..." See "Financial Products Markup Language (FpML)." [cache]
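An xlinkit-style consistency constraint has the shape "for every element of kind X, there must exist a matching element of kind Y." The sketch below checks one such rule over a toy trade document; the rule form and element names are invented for illustration (xlinkit itself expresses rules in XML and reports inconsistencies as hyperlinks between the offending elements rather than a flat list).

```python
import xml.etree.ElementTree as ET

# Hypothetical constraint: every leg's payer must be a declared party.
doc = ET.fromstring("""
<trade>
  <party id="bankA"/>
  <leg payer="bankA"/>
  <leg payer="bankB"/>
</trade>""")

declared = {p.get("id") for p in doc.findall("party")}
violations = [leg.get("payer") for leg in doc.findall("leg")
              if leg.get("payer") not in declared]
print(violations)
```

Running the rule flags the second leg, whose payer "bankB" is never declared; a full FpML rule set is a library of such forall/exists constraints evaluated over the transaction documents.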

  • [December 30, 2002] "A Dependency Markup Language for Web Services." By Robert Tolksdorf (Freie Universität Berlin, AG Netzbasierte Informationssysteme) [WWW]. Paper presented at Net.Object Days 2002 (NODe 2002), October 7-10, 2002, Messekongresszentrum, Erfurt, Germany. 11 pages. Compares WSFL, XLANG, WSCL, DAML-S, ASDL, WSMF, BPSS, BPML. "Current mechanisms for the description of Web Services and their composition are either too coarse (by specifying a functional interface only) or too fine (by specifying a concrete control flow amongst services). We argue that more adequate specifications can be built on the notion of dependency of activities and coordination activities to manage these dependencies. We propose a Dependency Markup Language to capture dependencies amongst activities and generalizations/specializations amongst processes. With that, we can describe composite services at better-suited levels of abstraction and have several options to use such descriptions for service coordination, service discovery and service classification... The DML specification is currently being finalized. On this basis, a coordination environment is to be built and tested. This includes the implementation of a set of coordination mechanisms, their binding to dependencies and the generation of control flows from DML. This also includes the implementation of dependency matchers as described. The dependency ontology has to be enlarged by further studies. It has to be checked how notions of dependencies can be related by specialization to lead to a unified hierarchy. It has to be tested how the mentioned mechanisms for automatic classification of processes can be applied for DML specifications and whether they lead to useful results. Furthermore, the multiparty dependencies, which are not currently considered, have to be explored..." See also the presentation slides and the associated website.
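The core move in the paper, describing a composition by dependencies rather than by a concrete control flow, means any execution order consistent with the dependency graph is acceptable, and a control flow can be generated from it. The sketch below derives one such order with the standard library's topological sorter; the activity names are invented, and this is only an illustration of the idea, not DML itself.

```python
from graphlib import TopologicalSorter  # Python 3.9+

# Dependencies among activities: key depends on each activity in its set.
# This is the abstract specification; no single control flow is prescribed.
deps = {
    "ship":    {"pack", "invoice"},
    "pack":    {"pick"},
    "invoice": {"order"},
    "pick":    {"order"},
    "order":   set(),
}

# One generated control flow: any topological order of the graph.
order = list(TopologicalSorter(deps).static_order())
print(order)
```

Note that "invoice" and "pick"/"pack" are unordered relative to each other, so a coordinator is free to run them in parallel; a concrete-control-flow language would have had to commit to one interleaving up front.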

  • [December 30, 2002] "E-Business Process Modeling: The Next Big Step." By Selim Aissi, Pallavi Malu, and Krishnamurthy Srinivasan (Intel Labs). IEEE Computer Volume 35, Number 5 (May 2002), pages 55-62. ['The authors propose a Process Coordination Framework for Web services and outline the building blocks required for e-business automation. Their framework helps in understanding the roles of various standards and in identifying overlaps, gaps, and opportunities for convergence.'] "The Internet is emerging as the platform for automating both the provider and consumer ends of e-business transactions. In this new model, businesses offer Web services that applications running in other businesses could invoke automatically, building bridges between systems that otherwise would require extensive integration and development efforts. Web services are software components that use standard Internet technologies to interact with one another dynamically. In spite of the significant potential benefits, only a few large businesses have implemented automated e-business so far. Businesses realize that the cost of automating transactions with trading partners is very high. Standards and technologies for modeling e-business processes that use Web services could drive the costs down by achieving automated code generation, reuse, and interoperability -- facilitating communication between business analysts and implementers. Our proposed Process Coordination Framework outlines the building blocks required for Web services-enabled e-business automation. PCF helps in understanding the roles of the various proposed standards with respect to these building blocks and in identifying both overlaps and gaps... Having multiple specifications that overlap and describe similar features can cause interoperability problems. Instead of describing its own transport bindings, CPPA could use WSDL's transport bindings. 
Both XLANG and BPML focus on an XML-based description of private business processes. Because they both are built on WSDL and have relative gaps and strengths, convergence would capitalize on their strengths. WSCL describes the sequence in which operations can be invoked. WSFL specifies public choreography and the combination of multiple Web services into a composite Web service. BPSS describes public choreography and multiparty collaboration. WSCL, WSFL, and BPSS all describe the public choreography of a business process, and they should converge on that feature. In a perfect world, the analyst draws the model, the tools generate the WSDL and WSCL or BPSS, and the implementer uses the graphical model to derive the workflow implementation and automatically generate the CPP, which the trading partners then use to do business. While XML allows interoperability, the proposed automation process will facilitate reuse of the workflows. Web services are advancing platform- and language-independent interoperability. However, their standardized definitions are still evolving. Describing the Web services business process is a key area in the future of software engineering. In addition to discovery and management, a Web services description will lead to solutions that are more flexible, faster to implement, and cheaper to deploy and maintain..."

  • [December 30, 2002] "A Formal Service Specification for the Internet Open Trading Protocol." By Chun Ouyang, Lars Michael Kristensen, and Jonathan Billington (Computer Systems Engineering Centre, School of Electrical and Information Engineering, University of South Australia Mawson Lakes Campus, SA 5095, Australia). Paper presented at the 23rd International Conference on the Applications and Theory of Petri Nets (ICATPN 2002), Adelaide, Australia, June 24-30, 2002. "This paper presents our service specification for the Internet Open Trading Protocol (IOTP) developed using Coloured Petri Nets. To handle IOTP's complexity, we apply a protocol engineering methodology based on Open Systems Interconnection (OSI) principles consisting of five iterative steps: the definition of service primitives and parameters; the creation of an automaton specifying the local service language for each of the four trading roles of IOTP; the development of a CPN model synthesizing the local automata into a specification of the global service capturing the correlations between the service primitives at the distributed trading roles; the generation of the occurrence graph representing the global service language; and lastly a new step, language comparison to ensure the consistency between the specifications of the local service language and the global service language. The outcome is a proposed formal service specification for IOTP..." See: (1) Internet Open Trading Protocol Working Group; (2) "Internet Open Trading Protocol (IOTP)."
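The second step of the methodology, an automaton specifying the local service language for each trading role, can be sketched as a small state machine that accepts exactly the legal sequences of service primitives. The states and primitive names below are invented stand-ins, not the actual IOTP service primitives from the paper.

```python
# Local service-language automaton for one hypothetical trading role.
# A sequence of service primitives is legal iff it drives the automaton
# from the initial state to the accepting state.
transitions = {
    ("idle",        "PurchaseReq"): "negotiating",
    ("negotiating", "OfferResp"):   "paying",
    ("paying",      "PayReceipt"):  "done",
}

def accepts(primitives):
    state = "idle"
    for p in primitives:
        key = (state, p)
        if key not in transitions:
            return False          # primitive not allowed in this state
        state = transitions[key]
    return state == "done"

print(accepts(["PurchaseReq", "OfferResp", "PayReceipt"]))  # legal sequence
print(accepts(["PurchaseReq", "PayReceipt"]))               # out of order
```

The paper's remaining steps then compose such local automata, via a Coloured Petri Net model, into a global service language and check that the two levels agree.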

  • [December 28, 2002] "XML for 2003." By Simon St. Laurent. Published on The O'Reilly Network (December 28, 2002). "2003 is promising to be the most exciting year the XML world has seen since those halcyon days of 1998 and 1999, as substance fills in the space behind all the promises. The biggest upcoming news is likely to grow from Microsoft's recent announcements at the XML 2002 conference. Making it possible to create, process, and analyze XML documents using the Microsoft Office set of tools both opens Office to new possibilities and gives XML developers a whole new set of priorities for the kinds of information they can expect to see. At the same time, it was a delight to see Rick Jelliffe of Topologi showing off the Topologi Markup Editor directly opposite the booth for Microsoft Word. To some users, Topologi's close-to-the-markup editor may seem like a throwback to days they'd rather not experience, but for other users (myself included) this set of precision machine tools for markup is a dream come true. There's room for both kinds of tool in the XML universe, and it seems likely that Topologi's editor will get a lot of use exploring and tweaking documents originally created in Office. On the Web side, SVG and XForms are both worth close examination..." Author's note: "I've put up an 'XML for 2003' piece on my O'Reilly Network blog. XML 2003 got me genuinely excited about XML's future prospects for the first time in a few years. Business prospects have always been okay, but the technology prospects look more compelling to me. Anyone who thinks I've missed something or overestimated something is welcome to complain here or there. The quick list is: (1) Microsoft Office and XML, (2) Topologi, (3) SVG & XForms, (4) RELAX NG & DSDL, (5) XQuery? ... Also, I have a sneaking suspicion that Trang is going to creep up on all of us, and save us..."

  • [December 27, 2002] "Introducing XPath 2.0." By Aaron Skonnard. In MSDN Magazine (January 2003). The XML Files. "Although XPath 1.0 simplified many common programming tasks, developers were left wanting more. The XPath 1.0 specification is limited or confusing in several areas and is in need of a facelift. The W3C has been pushing to add more significant features to the language, mostly in support of the other evolving W3C XML specifications (like XML Schema, XML Query 1.0, and XSLT 2.0)... The most profound change in XPath 2.0 can be found deep inside its data model. The details of the XPath 2.0 data model are found in the XQuery 1.0 and XPath 2.0 Data Model specification, which is shared by XPath 2.0, XSLT 2.0, and XQuery 1.0. The XPath 2.0 data model is based on the Infoset with the necessary extensions to support XML Schema. The data model defines seven node types that make up the structure of an XML document: document (root), element, text, attribute, namespace, processing instruction, and comment nodes. The node types are similar to those found in the XPath 1.0 data model, but in the case of elements and attributes they've been extended to include XML Schema type information after validation occurs. The resulting "typed" data model is referred to as the Post Schema-Validation Infoset (PSVI). In addition to the basic node types, XPath 1.0 defines four basic types to assist in processing text values found in elements or attributes: node-set, string, number, and Boolean... XPath 2.0 is positioned center-stage in the family of XML specifications. It serves as the official addressing language to be used by other specifications, like XQuery 1.0, XSLT 2.0, and potentially many others. XPath 2.0 improves usability while increasing functionality and maintaining backward compatibility as much as possible. It also adds support for XML Schema, which promises added value to developers working in strongly typed worlds.
At this point, it wouldn't be proper to leave out the fact that XPath 2.0 has its fair share of opponents. There are many XML developers (especially those deep in the trenches of XSLT) who are confused by the increased complexity of XPath 2.0. Their biggest complaint has to do with requiring support for XML Schema instead of making it an additional, optional layer. This long-running XPath 2.0 debate has recently boiled over into some pretty heated political discussions. Developers working heavily with databases or Web Services will undoubtedly find the benefit worth the price. But I do feel sympathy for the XSLT wonks who want the new-and-improved XPath without having strong typing forced upon them. How this will ultimately shake out is hard to tell. If you have an opinion on the matter, or would like to share any other comments related to the language, send them to the public W3C XPath/XQuery mailing list..." See XML Path Language (XPath) 2.0 (W3C Working Draft 15-November-2002).
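As a rough illustration (these expressions are invented for this note, not drawn from Skonnard's article), the sequence- and type-aware machinery XPath 2.0 adds over XPath 1.0 looks like this:

```xquery
(: XPath 2.0 added comments, for-expressions, conditionals, and
   schema-type operators -- none of which exist in XPath 1.0. :)
for $i in (1, 2, 3) return $i * 2
if (count(//chapter) > 10) then 'long' else 'short'
//order/@date castable as xs:date
```

Each line is a complete XPath 2.0 expression; the first evaluates to the sequence (2, 4, 6), and the last relies on the XML Schema type system that has proven controversial among XSLT users.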

  • [December 27, 2002] "XML Based Schema Definition for Support of Inter-organizational Workflow." By W.M.P. van der Aalst (College of Business and Administration, University of Colorado; Faculty of Technology and Management, Eindhoven University of Technology) and Akhil Kumar (Database Systems Research Department, Bell Laboratories). October 18, 2002. 39 pages. General Introduction to XRL, with 53 references on workflow. "The full potential of the web as a medium for electronic commerce can be realized only when multiple partners in a supply chain can route information among themselves in a seamless way. Commerce on the Internet is still far from being "friction-free" because business partners cannot exchange information about their business processes in an automated manner. In this paper, we propose the design for an eXchangeable Routing Language (XRL) using XML syntax. XML (Extensible Markup Language) is a means for trading partners to exchange business data electronically. The novel contribution of our work is to show how XML can also be used to describe workflow process schemas to support flexible routing of documents in the Internet environment. The design of XRL is grounded in Petri nets, a well-known formalism. By using this formalism, it is possible to analyze the correctness and performance of workflows described in XRL. Architectures to facilitate inter-operation through loose and tight integration are also discussed. Examples illustrate how this approach can be used for implementing interorganizational electronic commerce applications. As a proof of concept we have also developed XRL/flower, a prototype implementation of a workflow management system based on XRL..." See: (1) "Exchangeable Routing Language (XRL)"; (2) "XML and Petri Nets."
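The paper itself defines the XRL syntax; purely as a hypothetical sketch of the idea (element and attribute names below are illustrative, not quoted from the XRL DTD), an XML-encoded routing schema for an order process might look like:

```xml
<!-- Hypothetical sketch only: names are illustrative, not the actual XRL DTD.
     The point is that sequencing and parallelism are expressed as markup,
     which a Petri-net engine can then translate and analyze. -->
<route name="purchase_order">
  <sequence>
    <task name="send_order"/>
    <task name="receive_confirmation"/>
    <parallel_sync>
      <task name="ship_goods"/>
      <task name="send_invoice"/>
    </parallel_sync>
  </sequence>
</route>
```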

  • [December 27, 2002] "XRL/Flower: Supporting Inter-Organizational Workflows Using XML/Petri-net Technology." By H.M.W. Verbeek, A. Hirnschall, and W.M.P. van der Aalst (Faculty of Technology Management, Eindhoven University of Technology, The Netherlands). Paper presented at the Workshop on "Web Services, e-Business, and the Semantic Web (WES): Foundations, Models, Architecture, Engineering and Applications", held in conjunction with CAiSE 2002 [The Fourteenth International Conference on Advanced Information Systems Engineering, May 27-28, 2002, Toronto, Ontario, Canada]. 16 pages (Workshop proceedings pages 535-552). "In this paper, we present the architecture of XRL/Flower. XRL/Flower is a software tool which benefits from the fact that it is based on both XML and Petri nets. Standard XML tools can be deployed to parse, check, and handle XRL documents. The Petri-net representation allows for a straightforward and succinct implementation of the workflow engine. XRL constructs are automatically translated into Petri-net constructs. On the one hand, this allows for an efficient implementation. On the other hand, the system is easy to extend: to support a new routing primitive, only the translation to the Petri-net engine needs to be added; the engine itself does not need to change. Last, but not least, the Petri-net representation can be analyzed using state-of-the-art analysis techniques and tools." See: (1) "Exchangeable Routing Language (XRL)"; (2) "XML and Petri Nets."

  • [December 26, 2002] "What MDDL Can Do For You." By James Hartley (Chief Technologist, FISD). In Inside Market Data (December 02, 2002). ['As XML takes hold in the financial information industry, Market Data Definition Language (MDDL) is ready to carry the information from producers to consumers -- in a form the sender can easily generate and the recipient can understand.'] "MDDL is the XML specification to enable the interchange of information necessary to account, analyze and trade financial instruments of the world's markets. An MDDL committee -- made up of user firms, vendors and exchanges and coordinated via the Financial Information Services Division (FISD) of the Software & Information Industry Association -- has jointly developed the language. MDDL does not define a protocol per se for communication or transactions, but it does provide a universal way to explicitly identify market data. The specification encompasses all asset classes and supporting data, from equities and bonds to foreign exchange, futures and corporate actions. MDDL 1.0 defined common equities (ordinaries), mutual funds and indices. Version 2.0, available in beta form, adds setup information for bonds. Version 2.1, due the first quarter of 2003, will fill out equities and bonds while adding futures, options, and corporate actions. In addition, a query mechanism -- and a full suite of examples -- will be added over the next quarter. Additional features are added as FISD or MDDL members require them. MDDL focuses on the accuracy of the reporting of market data -- clearly specifying the content, its source and its relationship to other relevant facts. An MDDL document is an encapsulation of market data to describe the current status, valuation or defining parameters of an instrument at a specific instance in its lifetime. With a common encapsulation structure like MDDL, the user firm can have a standards-based infrastructure that is easily expanded.
At least three major global brokerage/investment houses have chosen XML to solve their internal problem. They expect to realize a cost savings over current practices by transforming all incoming data into a common internal XML framework. The new infrastructure permits the dissemination of wrapped existing content while allowing the introduction of new data in a phased approach. The data will be converted and inserted when available. No change to the infrastructure is necessary, and the end user application can be upgraded when convenient..." See: "Market Data Definition Language (MDDL)."

  • [December 26, 2002] "Safeguard Your XML-Based Messages: Create Secure Web Services With Apache XML Security." By Tarak Modi. In Java World (December 20, 2002). With source code. ['Apache XML Security is an open source implementation of the XML Digital Signature specification that allows you to digitally sign your Web service messages. Digital signatures assure your messages' receivers that the messages are really from you. After reading this article, which serves as an introductory tutorial to Apache XML Security, you will be well prepared to start signing your Web services messages.'] "Web services are here to stay, but if you are like most software developers, you worry about the plaintext SOAP (Simple Object Access Protocol) messages being exchanged over the Web. Web services security is a hot topic today because the success of this exciting technology hinges directly upon how secure we can make it. To that end, the World Wide Web Consortium (W3C) has defined the XML Signature and XML Encryption specifications for digitally signing and encrypting XML-based communication messages, such as the SOAP messages used in Web services. Furthermore, companies such as IBM, Microsoft, and VeriSign have partnered to provide additional specifications, such as WS-Security (Web Services Security), that build upon these W3C specifications. And who hasn't heard of the Liberty Alliance Project, a consortium of companies led by Sun Microsystems to provide a standards-based single sign-on solution to Web services? In the midst of all these initiatives lies the Apache XML Security project, an open source project that currently implements the W3C XML Signature specification and will soon support the XML Encryption specification. This article serves as a tutorial to get you up to speed with this outstanding implementation... [the article gives a] whirlwind tour of Apache XML Security and puts it in the context of a real-world example by discussing a use of this library with Axis. 
By using Apache XML Security with Axis to sign SOAP messages, you gain three of the four secure messaging characteristics mentioned above: namely, authentication of origin, nonrepudiation, and message integrity. To gain the fourth characteristic, privacy, you must use XML Encryption. Soon Apache XML Security will support that as well. Also, keep an eye out for announcements related to Java Specification Requests (JSRs) 105 and 106, which the Java Community Process (JCP) is currently working on. The proposed final draft of JSR 105, XML Digital Signature APIs, will define a standard set of APIs for XML digital signature services. And for those worried developers out there, yes, Apache is on JSR 105's expert group. JSR 106, XML Digital Encryption APIs, will define a standard set of APIs for XML digital encryption services..." See Apache XML Security Project. Supported specifications include: (1) XML-Signature Syntax and Processing, (2) Exclusive XML Canonicalization Version 1.0, and (3) XML-Signature XPath Filter 2.0. See also JSR 105 (XML Digital Signature APIs) and JSR 106 (XML Digital Encryption APIs).
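For orientation, the XML-Signature Syntax and Processing specification that Apache XML Security implements defines a ds:Signature element whose general shape (with the base64 digest, signature, and key values elided here) is:

```xml
<ds:Signature xmlns:ds="http://www.w3.org/2000/09/xmldsig#">
  <ds:SignedInfo>
    <!-- What was signed and how: canonicalization, signature algorithm,
         and one Reference per signed piece of content. -->
    <ds:CanonicalizationMethod Algorithm="http://www.w3.org/TR/2001/REC-xml-c14n-20010315"/>
    <ds:SignatureMethod Algorithm="http://www.w3.org/2000/09/xmldsig#rsa-sha1"/>
    <ds:Reference URI="">
      <ds:Transforms>
        <ds:Transform Algorithm="http://www.w3.org/2000/09/xmldsig#enveloped-signature"/>
      </ds:Transforms>
      <ds:DigestMethod Algorithm="http://www.w3.org/2000/09/xmldsig#sha1"/>
      <ds:DigestValue>...</ds:DigestValue>
    </ds:Reference>
  </ds:SignedInfo>
  <ds:SignatureValue>...</ds:SignatureValue>
  <ds:KeyInfo>...</ds:KeyInfo>
</ds:Signature>
```

In the SOAP case this element typically travels in a SOAP header, signing the message body; the receiver recomputes the digests and verifies the signature value to establish origin and integrity.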

  • [December 26, 2002] "A Registry of Assignments Using Ubiquitous Technologies and Careful Policies." By Daniel W. Connolly (World Wide Web Consortium) and Mark Baker. IETF Network Working Group, Internet-Draft. December 20, 2002, expires June 20, 2003. Reference: 'draft-connolly-w3c-accessible-registries-00'. Abstract: "Registries are a critical part of many of the systems which inhabit the Internet today. Unfortunately, not a lot of the assignments within those registries are as accessible as they could be. This document outlines a best current practice approach to improving the accessibility of assignments within registries using ubiquitous technologies and low cost supporting policies." From the Introduction and Proposed Solution: "The success of the World Wide Web project has demonstrated the value of dereferencable URIs, and linking information together using them. Many registries on the Internet today create assignments whose principal identifier is not a URI, preventing the information associated with that assignment from being integrated with Web tools such as search engines, and from being related to other information, for example linking the assignment of the "application/xml" Internet media type to RFC 3023. On occasion, where an assignment has been provided a URI, it is not done so authoritatively, reducing the value of the URI as an identifier, as references to the assignment are split between the URI and the non-URI form. For example, the Internet media type "application/pdf" could also be identified by the URI "", such that new protocols could use the URI instead of "application/pdf". Anybody inspecting a message would be able to dereference the URI to discover that it was in fact a media type, rather than simply a string that happens to resemble one...
As the title of this note intimates, we suggest that a combination of the use of currently ubiquitous technologies, together with the implementation of careful policies, comprises a practical and long-term viable solution to this problem. The basis of our proposed solution is for registries to use dereferencable URIs, in particular http: scheme URIs, as the principal identifier syntax for assignments. When dereferenced, these URIs should return at least some human-processable information describing the assignment: the requestor, the details of the assignment request, URIs of relevant specifications, etc..."

  • [December 26, 2002] "Global XML Standards for Tax Matters Will Benefit Developers." By Charles Abrams and Andrea Di Maio (Gartner Research). Gartner FirstTake Report. Reference: FT-18-9905. 13-December-2002. ['The Internet standards group OASIS has set up a technical committee to develop international XML specifications for taxation matters. This initiative should help developers add value to a wide range of applications.'] "On 9 December 2002, the Organization for the Advancement of Structured Information Standards (OASIS) announced the Tax XML Technical Committee to develop standard vocabularies to cover corporate and personal taxation matters. OASIS said the committee would have the backing of tax authorities in Canada, Germany, the Netherlands, the United Kingdom and the United States as well as leading IT companies, including IBM, Oracle and SAP... Many governments now accept tax filings submitted electronically using government-designed forms and templates. However, there are no standard definitions, formats and schemas that governments and businesses can use when creating, sending, receiving, maintaining, storing and retrieving data in documents relevant to the life cycle of a tax case. Any one of the tax authorities might have developed its own standards, but an international set will save duplicating work. Tax XML standards will ultimately make enterprises more interested in extended tax and financial supply chains, supported by real-time tax information that can be called and managed in real-time government applications.
Thus, all parties should benefit: (1) Having standard definitions for taxation terms that apply across international communities and multiple languages should enable software vendors to harmonize their offerings for many markets; (2) Tax authorities should have more choice of packages; (3) Intermediaries and tax professionals should be able to exchange tax-related data in a more coherent way; (4) Taxation matters in multinational enterprises should become more amenable to computer processing both by enterprises and authorities. The announcement of the technical committee on taxation comes shortly after the formation of a broader, e-Government Technical Committee within OASIS. This committee has the wide responsibility of identifying and organizing plans for new standards. Both committees show that governments have started to realize the potential for standards within the context of delivering Web services. Governments, tax professionals, enterprises and others should watch developments in these committees... by the end of 2007, more than half the tax filings (by revenue) in countries in the Organization for Economic Cooperation and Development will use Tax XML or other XML-based standards..." Also in HTML format. See: (1) details in "OASIS Members Form Tax XML Technical Committee"; (2) general references in "XML Markup Languages for Tax Information."

  • [December 26, 2002] "An Introduction to VRXML." December 17, 2002. By Bryan Palmer (Gemini Systems, Inc). 21 slides. "VRXML is designed to address a major disconnect between billing and reporting. There are Terminology/Vocabulary issues, Customer service issues, and Tri-Party reconciliation issues (vendor/exchange/subscriber -- inaccurate reporting based on time differences; intensive reconciliation problems). Why create yet another Reporting Format? (1) VRXML has the potential to turn over 100 different reporting formats into one standard format. (2) Extensibility: the ability to add additional functionality or modify existing functionality without impacting the existing system functionality..." [adapted] Note: On December 23, 2002 Gemini Systems announced a 'VRXML FastTrack Program' to support "the reporting of inventory and other administrative data between exchanges, vendors and subscribers; the VRXML FastTrack Program is packaged to help organizations quickly realize the benefits and positive ROI of VRXML by adapting their systems and operations to take advantage of this powerful new communications standard." See general references in "Vendor Reporting Extensible Markup Language (VRXML)."

  • [December 26, 2002] "What is RSS?" By Mark Pilgrim. December 18, 2002. ['The aim of "Dive into XML" is to enable programmers to get started working, and having fun, with various areas of XML. In his inaugural column Mark turns his attention to RSS, a metadata syndication format. He presents guidelines and code to get you started working with this widespread XML application.'] "RSS is a format for syndicating news and the content of news-like sites, including major news sites like Wired, news-oriented community sites like Slashdot, and personal weblogs. But it's not just for news. Pretty much anything that can be broken down into discrete items can be syndicated via RSS: the 'recent changes' page of a wiki, a changelog of CVS checkins, even the revision history of a book. Once information about each item is in RSS format, an RSS-aware program can check the feed for changes and react to the changes in an appropriate way. RSS-aware programs called news aggregators are popular in the weblogging community. Many weblogs make content available in RSS. A news aggregator can help you keep up with all your favorite weblogs by checking their RSS feeds and displaying new items from each of them... But coders beware. The name 'RSS' is an umbrella term for a format that spans several different versions of at least two different (but parallel) formats. The original RSS, version 0.90, was designed by Netscape as a format for building portals of headlines to mainstream news sites. It was deemed overly complex for its goals; a simpler version, 0.91, was proposed and subsequently dropped when Netscape lost interest in the portal-making business. But 0.91 was picked up by another vendor, UserLand Software, which intended to use it as the basis of its weblogging products and other web-based writing software.
In the meantime, a third, non-commercial group split off and designed a new format based on what they perceived as the original guiding principles of RSS 0.90 (before it got simplified into 0.91). This format, which is based on RDF, is called RSS 1.0. But UserLand was not involved in designing this new format, and, as an advocate of simplifying 0.90, it was not happy when RSS 1.0 was announced. Instead of accepting RSS 1.0, UserLand continued to evolve the 0.9x branch, through versions 0.92, 0.93, 0.94, and finally 2.0. What a mess... [There are seven] different formats, all called 'RSS'. As a coder of RSS-aware programs, you'll need to be liberal enough to handle all the variations. But as a content producer who wants to make your content available via syndication, which format should you choose?..." See the book Content Syndication with RSS, by Ben Hammersley (ISBN: 0-596-00383-8). See general references in "RDF Site Summary (RSS)."
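For simple item-oriented processing, much of the version mess is invisible. As a minimal sketch (the feed, titles, and URLs below are invented for illustration), Python's standard library is enough to pull item titles out of an RSS 0.9x/2.0 feed; RDF-based RSS 1.0 puts its elements in namespaces, so a truly liberal aggregator needs extra handling:

```python
import xml.etree.ElementTree as ET

# A minimal, hypothetical RSS 2.0 feed; titles and URLs are invented.
FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Weblog</title>
    <link>http://example.org/</link>
    <description>A hypothetical feed.</description>
    <item>
      <title>First post</title>
      <link>http://example.org/2002/12/first</link>
    </item>
    <item>
      <title>Second post</title>
      <link>http://example.org/2002/12/second</link>
    </item>
  </channel>
</rss>"""

def item_titles(feed_xml):
    """Return the title of every <item> in an RSS 0.9x/2.0 feed.

    RSS 1.0 items live in an RDF/RSS namespace, so handling all seven
    'RSS' variants would also require namespace-aware matching.
    """
    root = ET.fromstring(feed_xml)
    return [item.findtext("title") for item in root.iter("item")]

print(item_titles(FEED))  # -> ['First post', 'Second post']
```

An aggregator built on this idea would fetch the feed periodically, compare the items against what it saw last time, and surface only the new ones.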

  • [December 26, 2002] "Reports From XML 2002." By Eric van der Vlist. December 18, 2002. ['Eric van der Vlist has compiled a series of reports from the XML 2002 conference in Baltimore, covering topics from show-stealer Microsoft Office 11 to schemas and literate programming.'] Covers: (1) "Microsoft Office Embraces XML" - "For many participants, the most memorable event of XML 2002 will be Jean Paoli's presentation of Office 11, which promises to deliver easier access to XML for hundreds of millions of workstations..." (2) "OpenOffice: The XML format for the masses" - "Jean Paoli for Microsoft and Daniel Vogelheim for OpenOffice both chose the same title "XML for the masses" for their presentations, a commonality which hides two very different approaches from the editors of two competing office productivity suites..." (3) "ISO DSDL on the move" - "Document Schema Definition Languages (DSDL) is a project of the ISO/IEC JTC 1/SC34/WG1 working group chaired by Charles Goldfarb. After the meetings held in Baltimore during XML 2002, this working group has published a set of recommendations which have all been approved..." (4) "A burst of schemas" - "For different reasons many XML 2002 presentations proposed the use of multiple validations and transformations for advanced needs, rather than using a single schema considered too complex or even impossible to write and maintain..." (5) "Impossible, Except For James Clark" - "The fact that the translation of Relax NG schemas into W3C XML Schema was considered impossible wasn't a good enough reason for James Clark, who presented at XML 2002 the latest progress of Trang, his multi-format schema converter..."

  • [December 26, 2002] "From XML-RPC to SOAP: A Migration Guide." By Rich Salz. December 18, 2002. ['This month's installment from Rich Salz, our resident Web Services commentator, tells users of XML-RPC it's about time they started using SOAP, and provides a migration path to move between the two Web services protocols. As usual, Rich provides plenty of code and examples.'] "As you might expect from the name, XML-RPC is a way of using XML to send classic Remote Procedure Calls (RPC) over the net. XML-RPC's use of XML is very simple. It doesn't use namespaces. It doesn't even use attributes. Poking at the reasons behind technology standards can lead to interesting results. The really good ones, like the first ANSI C specification, include a detailed rationale for key decisions. Most Internet standards don't spend the effort but prefer, instead, to allow an archived mailing list to act as a primary source. And, even then, it can be fun to play standards archaeologist. The primary motivator behind XML-RPC is Dave Winer, owner of a small ISV company, and one of the first webloggers. Winer was part of the initial, self-selected group that created SOAP. According to Don Box's Brief History of SOAP, Winer grew impatient with corporate delays. In July, 1998, Winer posted the specification for XML-RPC. At that point, the Namespaces in XML group had published two drafts; final recommendation publication wouldn't happen until January, 1999. Given the state of flux, and all the churn caused by following XML Schema drafts which Don describes in his article, it isn't surprising that XML-RPC avoids namespaces altogether. It is surprising, however, that XML-RPC doesn't use XML attributes. One might surmise that doing without attributes makes parsing much simpler. XML-RPC is well-suited to simple lexical handling: divide the input stream into 'tags', which are simple words surrounded by angle brackets, and values. In hindsight, however, it makes for strange XML...
With a few simple changes, XML-RPC applications should be able to send and receive SOAP messages to and from well-behaved SOAP applications. Perhaps surprisingly, most of the work is in constraining the SOAP applications to be well-behaved. In the standards world, when a specification has many options and possibilities and you restrict yourself to a conformant subset, that's called a profile. So what we've started work on is a SOAP profile, defined in WSDL, that should make it easy for 'legacy' XML-RPC applications to interoperate..." See: (1) "Simple Object Access Protocol (SOAP)"; (2) "XML-RPC."
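The "no namespaces, no attributes" style is easy to see from Python's standard library, which ships an XML-RPC marshaller; the method name examples.add below is invented for illustration:

```python
import xmlrpc.client

# Marshal a call to a hypothetical method "examples.add" in XML-RPC form:
# nested elements only -- no namespaces and no attributes anywhere.
payload = xmlrpc.client.dumps((40, 2), methodname="examples.add")
print(payload)

# The payload is classic XML-RPC; among other things it contains:
#   <methodName>examples.add</methodName>
#   <value><int>40</int></value>
assert "<methodName>examples.add</methodName>" in payload
assert "<int>40</int>" in payload
```

A well-behaved SOAP rendering of the same call would wrap the parameters in a namespace-qualified soap:Envelope/soap:Body pair, which is exactly the kind of constraint a migration profile pins down.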

  • [December 26, 2002] "Generating XML and HTML Using XQuery." By Per Bothner. December 18, 2002. ['Often perceived mainly as a query language, XQuery can actually be used to generate XML and HTML. Per Bothner provides a worked example, and compares XQuery with XSLT.'] "XQuery is often described as a query language for processing XML databases. But it's also a very nice language for generating XML and HTML, including web pages. In this article we will look at XQuery from this angle. There are many tools for generating web pages. Many of these are based on templates. You write an HTML page, but you can embed within it expressions that get calculated by the web server... In order to demonstrate web page generation with XQuery, I will describe a photo album application. There are lots of such applications around, and while they differ in features, they all have the same basic idea. You throw a bunch of digital images (JPEG files) at the application, and it generates a bunch of web pages. The overview page shows many smaller thumbnail images; if you click on one, you get a bigger version of that image... if you have a task that matches XSLT's strength, by all means use XSLT. However, if you have a task that is a mix of XSLT-style transformation combined with some control logic, consider using XQuery, even for the part of the task that is XSLT-like. The advantage of XSLT over XQuery in applications best suited for XSLT is relatively minor, while the pain of writing more complex logic in XSLT instead of XQuery is considerable. The photo album is an application I first wrote in XSLT, but I was able to easily make significant improvements when I rewrote it in XQuery..." See "XML and Query Languages."
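For flavor, a hypothetical fragment in the spirit of the article's photo album (the variable and element names are invented here, not taken from Bothner's code) shows how XQuery's element constructors mix literal HTML with computed content:

```xquery
(: Hypothetical sketch: build a thumbnail index page.
   $photos is assumed to be a sequence of <photo file="..."> elements,
   each with a <title> child. :)
<html>
  <body>
    <h1>My Album</h1>
    {
      for $p in $photos
      return
        <a href="{concat($p/@file, '.html')}">
          <img src="{concat('thumbs/', $p/@file)}" alt="{string($p/title)}"/>
        </a>
    }
  </body>
</html>
```

The curly braces switch from literal markup back into the expression language, which is where XQuery's control logic (here a FLWOR-style for loop) earns its keep over template-based page generators.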

  • [December 26, 2002] "A Data Model for Strongly Typed XML." By Dare Obasanjo. December 18, 2002. ['Dare Obasanjo has been searching for an appropriate data model for typed XML. In his article he examines the Post Schema-Validation Infoset, but settles on the XPath/XQuery data model as the best candidate for applications that want to use a strongly typed data model for XML.'] "In many XML applications, the producers and consumers of XML documents are aware of the datatypes within those documents. Such applications can benefit from manipulating XML via a data model that presents a strongly typed view of the document. Although a number of abstractions exist for manipulating XML -- the XML DOM, XML Infoset, and XPath data model -- none of these views of XML takes into account usage scenarios involving strongly typed XML. Many developers utilize XML in situations where type information is known at design or compile time, including interacting with relational databases and strongly typed programming languages like Java and C#. Thus, there is a significant proportion of the XML developer community which would benefit from a data model that encouraged looking at XML as typed data. This article is about my search for and discovery of this data model. The W3C XML Information Set recommendation describes an abstract representation of an XML document. The XML Infoset is primarily meant to act as a set of definitions used by XML technologies to formally describe what parts of an XML document they operate on. Several W3C XML technologies are described in terms of the XML Infoset, including SOAP 1.2, XML Schema, and XQuery. An XML document's information set consists of a number of information items. An information item is an abstract representation of a component of an XML document, such as an element, attribute, or processing instruction. Each information item has a set of associated named properties.
Each property is either a collection of related information items or data about the information item; the [children] property of an element information item is an example of the former, while the [base URI] of a document information item is an example of the latter. An XML document's information set must contain a document information item from which all other information items belonging to the document can be accessed. The XML Infoset is a tree-based hierarchical representation of an XML document... The XQuery and XPath 2.0 Data Model is still a working draft; some of its details may change before it becomes a W3C recommendation. However, the core ideas behind the data model which this article explores are unlikely to change. This article is based on the November 15th working draft. The XQuery and XPath 2.0 data model presents itself as a viable data model for processing XML in strongly typed usage scenarios. The loose coupling to the W3C XML Schema type system is especially beneficial because it provides an interoperable set of types, yet does not limit one to solely those types. The XQuery and XPath 2.0 data model stands out as the most credible data model for dealing with XML in strongly typed scenarios. Given that the XQuery and XPath 2.0 data model is based on the XML Infoset, and also builds upon the past experience with XPath 1.0, it's the best candidate for the data model for XML..." For schema description and references, see "XML Schemas."
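A minimal sketch of the typed-view idea (the schema fragment below is hypothetical, invented for this note):

```xml
<!-- Hypothetical schema fragment: after a document is validated against it,
     the post-schema-validation (PSVI-based) data model sees <price> as an
     xs:decimal value, not just raw character data. -->
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:element name="price" type="xs:decimal"/>
</xs:schema>
```

In the XQuery/XPath 2.0 data model, an expression such as //price * 1.1 can then operate on decimal values directly, which is precisely the strongly typed behavior the article is after.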

  • [December 24, 2002] "DALF Guidelines for the Description and Encoding of Modern Correspondence Material." By Edward Vanhoutte and Ron Van den Branden. Version 1.0. Discussion Draft. Centrum voor Teksteditie en Bronnenstudie [Centre for Scholarly Editing and Document Studies], Royal Academy for Dutch Language and Literature, Gent, Belgium. "In view of its assignment to study and valorize the Flemish literary and musical heritage, the Centre for Scholarly Editing and Document Studies (Centrum voor Teksteditie en Bronnenstudie - CTB) has launched the DALF project. DALF is an acronym for "Digital Archive of Letters by Flemish authors and composers from the 19th & 20th century." It is envisioned as a growing textbase of correspondence material which can generate different products for both academia and a wider audience, and thus provide a tool for diverse research disciplines ranging from literary criticism to historical, diachronic, synchronic, and sociolinguistic research. The input of this textbase will consist of the materials produced in separate electronic edition projects. The DALF project can be expected to stimulate new electronic edition projects, as well as the international debate on electronic editions of manuscripts. In order to ensure maximum flexibility and (re)usability of each of the electronic DALF editions, a formal framework is required that can guarantee uniform integration of new projects in the DALF project. Therefore, the project is from the start aimed at adherence to international standards for electronic text encoding. An important formal standard used in the DALF project is XML, which enables the definition of structural text-grammars as Document Type Definitions (DTDs). In the construction of such a DTD suitable for scientific markup of correspondence material, we also tried to align with international efforts to define markup schemes. 
Without going into detail here, the insights and practices presented in international projects like TEI (Text Encoding Initiative), MASTER (Manuscript Access through Standards for Electronic Records), and MEP (Model Editions Partnership) were taken into consideration for the implementation of the following requirements in a DTD for correspondence material..." See: (1) the announcement "DALF Guidelines and DTD Discussion Draft Version."; (2) related references in "Text Encoding Initiative (TEI)."
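The guidelines themselves define the actual markup scheme; purely to illustrate the kind of structural text-grammar such a DTD expresses for a letter, a hypothetical fragment (not the DALF DTD) might read:

```dtd
<!-- Hypothetical sketch only; element names are invented, not DALF's. -->
<!ELEMENT letter     (heading, salute?, body, closer, postscript?)>
<!ATTLIST letter     sender    CDATA #REQUIRED
                     addressee CDATA #REQUIRED>
<!ELEMENT heading    (place?, date)>
<!ELEMENT body       (p+)>
<!ELEMENT closer     (signed)>
<!ELEMENT place      (#PCDATA)>
<!ELEMENT date       (#PCDATA)>
<!ELEMENT salute     (#PCDATA)>
<!ELEMENT p          (#PCDATA)>
<!ELEMENT signed     (#PCDATA)>
<!ELEMENT postscript (#PCDATA)>
```

A content model of this shape is what lets separate edition projects produce letters that integrate uniformly into the shared textbase.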

  • [December 24, 2002] "Challenges and Solutions for Leveraging RIXML. Application of XAware Technology." XAware White Paper. December 17, 2002. 10 pages. Contact: Jojy Mathew (VP, Strategic Development & Product Strategy, XAware, Inc). "As technology flourishes, information overload at buy- and sell-side financial services firms has reached critical levels. Without a standard to classify, sort, filter, manage, and distribute available research data, investment firms cannot operate effectively. RIXML (Research Information Exchange Markup Language) was developed to enable firms to access and use the information they need on a daily basis, to assist their decision-making process, act upon market fluctuations, implement corporate strategies, and communicate with internal and external clients. Wide-scale industry adoption and implementation of the RIXML classification methodology and specification is critical to future success of the standard. Moreover, success is contingent upon firms' ability to rapidly incorporate the classification methodology (e.g., enable their systems to work within the RIXML classifications and effectively 'read' RIXML-tagged external research data), and apply the appropriate tagging to their own internally generated content. By implementing RIXML, firms will benefit from increased efficiency of research distribution, more effective targeting, personalization, and overall usability. XAware has developed integration components which financial services firms, system integrators, and software vendors can leverage to manage and integrate data and technology from multiple vendors. The XAware solution enables common information models leveraging XML-based standards for interoperability, and is an engine for compliance with the RIXML standard. XAware's ready-to-integrate software components can expedite the implementation of RIXML. 
XAware's components incorporate a 'building block' approach, enabling firms to rapidly connect to financial services firms' existing framework, and subsequently integrate more and more data into the RIXML standard. XAware's components are designed to play a central role in RIXML implementation as an object-oriented messaging interface that links distributed management applications..." See: (1) the announcement "XAware Announces RIXML Components for Financial Services Integration. Integration Building Blocks Enable Financial Services Firms, System Integrators and Software Vendors to Rapidly Adopt RIXML."; (2) general references in "Research Information Exchange Markup Language (RIXML)." [cache]

  • [December 23, 2002] "E-Learning Lessons Learned. Standards Reduce Compatibility Problems." By Gail Repsher Emery. In Government Computer News Volume 21, Number 34 (December 16, 2002), pages 29-31. "... The Defense Department-inspired SCORM e-learning specification seems to be taking off... Federal requests for proposals increasingly ask whether electronic-learning products and services comply with the Sharable Content Object Reference Model, which lets content be shared and reused on multiple learning management systems, [IBM's David] Grebow said. 'With SCORM you can input content once and publish it in print, digitally, to a handheld device, to audio,' he said. 'Content becomes usable by many people in many ways and places. That's when e-learning really takes off.' SCORM's specifications were adapted from many sources. The first version of SCORM came out in 2000, and Version 1.3 is in development by the DOD-funded Advanced Distributed Learning Initiative. Version 1.3 will permit reordering of material based on factors such as student performance. DOD established the ADL Initiative in 1997 to standardize government, industry and academic e-learning specs. There are now three ADL Co-Laboratories for cooperative research, development and assessment of new learning technology. The labs are funded for fiscal 2003 with $14 million from DOD, said Bob Wisher, director of the initiative... Most proprietary e-learning systems can't talk to each other. But SCORM buyers needn't worry that their investments in courseware and learning management systems will be incompatible, said Wisher, who works at the ADL Co-Lab in Alexandria, Va. Also, vendors that conform to the spec can reach a larger market, he said. SCORM has increased government business for VCampus Corp. of Reston, Va., company officials said. 'SCORM lets us find other standardized content and grow our library. 
We've gone from a couple hundred courses to thousands,' said Tamer Ali, director of product management for the e-learning application service provider. The company has contracts with the General Services and Social Security administrations and the Veterans Affairs Department. About 25 percent of VCampus' business now is federal, compared with less than 3 percent two years ago, said Ron Freedman, vice president of government and security solutions. Corporate buyers are following the government's lead in requiring SCORM-compliant e-learning technology, others said. 'You may be out of business' if you don't follow SCORM, said Michael Parmentier, a principal of Booz Allen Hamilton Inc. consultants in McLean, Va. Parmentier, former director of readiness and training at the Office of the Secretary of Defense, is a founder of the ADL Initiative... Many agencies want to share content, said Mike Fitzgerald, e-training project manager at the Office of Personnel Management. OPM manages a cross-agency training initiative, the Gov Online Learning Center. Since its July launch, about 40 agencies have signed up to train their employees via the center. More than 30 free courses are available on topics ranging from project management to government ethics. In January, the center will make more courses available on a fee-for-service basis, Fitzgerald said. OPM requires commercial content to conform to SCORM and asks agencies that want to share custom content to do the same, he said..." See the reference following and "Shareable Content Object Reference Model Initiative (SCORM)."

  • [December 23, 2002] "Advanced Distributed Learning Emerging and Enabling Technologies for the Design of Learning Object Repositories Report." Version 1.0. By Dr. Thelma Looms (Principal Learning Technologies Architect, ADL) and Clark Christensen (Lead Software Engineer, ADL). Technical Repository Investigation Report. Released 2002-12-23. Completed November 26, 2002. 66 pages. "The ADL Initiative is preparing for a world where communications networks and personal delivery devices are pervasive and inexpensive, as well as transparent to the users in terms of ease of use, bandwidth and portability. ADL development envisions the creation of learning 'knowledge' libraries, or repositories where learning objects may be accumulated and cataloged for broad distribution and use. These objects must be readily accessible across the World Wide Web or whatever forms our global information network takes in the future. This report focuses on major standards for networked repository architectures and other important infrastructure technologies that may be useful for managing SCORM conformant content. It does not provide an authoritative view, but rather focuses on those elements most likely to be of interest in future development of a SCORM Repository Application Profile..." See especially section 4.2 on IMS Digital Repository Interoperability: "The IMS Global Learning Consortium is a specification authoring organization with headquarters in Burlington, Massachusetts. The IMS Digital Repository Interoperability (IMS DRI) model is the product of the IMS Digital Repository Working Group. The goal of the IMS DRI is to provide repository technology to support the 'presentation, configuration and delivery of learning objects.' The IMS DRI Version 1.0 Public Draft Specification was approved in August of 2002. 
The specification comprises three documents: the IMS Digital Repositories Interoperability Information Model, which defines the information model and describes the core functions; the IMS Digital Repositories Interoperability Best Practice and Implementation Guide; and the IMS Digital Repositories Interoperability XML Binding..." Learning Objects Network IMS DRI Reference Implementation: "The Learning Objects Network (LON) has developed a reference implementation of the IMS DRI core functions. The LON proof-of-concept implementation was evaluated as part of the ARTI in the summer of 2002... the proposed architecture was developed in Python and Java and uses a Native XML database to store SCORM Meta-data. The architecture was designed for the following components: (1) a learning object registry, (2) a learning object repository, (3) an XML meta-data search engine, (4) a client message broker... All the components interact with other components using a set of SOAP web services. The learning objects in the repository are identified using a unique identifier implemented using the IDF's Digital Object Identifier (DOI). The LON Message Manifest Specification [LONM02] describes a messaging API that supports the basic functions for a LON repository. The Message Manifest describes the structure and format of the messages exchanged by services within the repository and registry. Messages are defined for all the functions provided by the client and server components of the system..." See: (1) ADL source .DOC; (2) "Shareable Content Object Reference Model Initiative (SCORM)."
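The SOAP messaging style described above can be sketched with a hypothetical request (the element names, namespace, and DOI below are invented; the actual message structure is defined by the LON Message Manifest Specification): a client asks the repository to resolve a learning object by its DOI.

```xml
<!-- Illustrative only; not the LON Message Manifest wire format. -->
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <retrieveObject xmlns="urn:example:lon-repository">
      <!-- learning objects are identified by DOI in the LON design -->
      <doi>10.9999/example.lesson-042</doi>
    </retrieveObject>
  </soap:Body>
</soap:Envelope>
```

The registry, repository, search engine, and message broker would exchange messages of this general shape for each of the functions the manifest defines.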

  • [December 23, 2002] "Working Group Tests Tools for Web Services." By Patricia Daukantas. In Government Computer News Volume 21, Number 34 (December 16, 2002), pages 1, 9. Includes photo of Robert Haycock. "XML Collaborator isn't yet a saleable product, but it's already part of a pilot. The Extensible Markup Language development tool -- from start-up Blue Oxide Technologies LLC of Charles Town, W.Va. -- is one of six incubator projects fostered by a new CIO Council group. The XML Web Services Working Group is counting on the half-dozen projects to inject Web services into the Office of Management and Budget's 25 e-government initiatives, said Brand L. Niemann, an Environmental Protection Agency computer scientist who chairs the new group. Meanwhile, the Federal Enterprise Architecture Program Management Office expects to release drafts of two technology-related reference models early next year. The draft technical reference model (TRM) and service component reference model (SRM) will probably be launched together next month, acting program manager Robert Haycock said last week at the XML 2002 conference. SRM will describe software components that agencies can quickly assemble into applications, Haycock said. TRM will identify interoperability and reuse technologies -- down to the product level in some cases. The working groups that are drafting the enterprise architecture models still have 'a lot of conceptualizing' to do on a draft data reference model for the architecture, Haycock said. Current plans call for a DRM repository of XML schemas and metadata describing data common to multiple agencies' business processes. 'One thing we're determined not to do is create our own standards,' Haycock said. 'We want to use the commercial standards already out there.' The critical specifications are XML, Universal Description, Discovery and Integration (UDDI), the Simple Object Access Protocol (SOAP), and Web Services Description Language (WSDL)..." 
See the XML Collaborator description from Blue Oxide: "XML Collaborator is a collaborative design and registry platform for XML document structures and XML Web service interfaces. It has been designed from the ground up to be as flexible as possible, and to promote reuse as strongly as possible. XML Collaborator is unique in that it decouples data structures (the shape of the information) from the actual description of the data structures (XML Schemas, WSDL XML Web service interface descriptions, DTDs, and so on). This decoupling increases the potential reuse of the data structures, as well as providing a more intuitive mechanism for the creation of those structures... XML Collaborator has been selected as one of the six incubator pilot projects of the Federal CIO Council's XML Web Services Working Group. Recently formed as part of the Leveraging Technology Subcommittee of the Federal CIO Council's Architecture and Infrastructure Committee, the goal of the XML Web Services Working Group is to 'accelerate the effective and appropriate implementation of XML Web services technology in the federal government.' In pursuit of this goal, XML Collaborator is being used to define, register and publish federal XML Web service definitions and XML Schemas to provide support for various E-Gov initiatives and the other incubator pilot projects. Using XML Collaborator also allows government personnel to gain experience with the emerging 'publish, find, bind' paradigm associated with the service-oriented architecture of XML Web services..." See reference following.

  • [December 23, 2002] "XML Collaborator: XML Design Collaboration and Registry Software." White Paper from Blue Oxide Technologies, LLC. September 2002. 16 pages. "... ISO 11179 provides a mechanism for the registration of individual data elements; ebXML's registry model describes a registration strategy for XML Schema; and the UDDI specification details the registration of Web services. However, these functions do not stand alone -- they are actually interrelated. XML Schemas are built up out of individual data points, and Web services build upon those schemas. In an ideal registry implementation, all of the metadata for data elements, XML Schemas, and Web services should draw from the same registry. Otherwise, data elements and XML Schemas will need to be registered multiple times (either on their own or as part of other structures), leading to potential synchronization errors as a particular enterprise metadata collection effort grows and changes over time. Granular reuse of information design elements: a robust registry will allow specific data elements, structures, datatypes, and enumerations to easily be reused across the enterprise. The registry specifications that exist to date do not directly provide support for the reuse of individual data structures (XML 'elements') and data points (XML 'text elements' or 'attributes') outside of the context of an XML Schema. The registry should allow these information components to be individually tracked and versioned, and any other part of the registry should be able to leverage these components in other structures... In a true enterprise-class solution, there may be thousands of different participants in the information design and registry process. In many cases, there will be a certain level of overlap in the information needs of the various participating entities; on the other hand, many information items will be proprietary to a particular stakeholder. There are three approaches to this problem... 
Collaboration and registry share many of the same design goals. The results of a collaboration process (finalized structures or interfaces) are themselves published as work products in a registry. If a single platform manages both the design of the data structures over time as well as the sharing of those data structures through a registry, then the entire lifecycle of those structures is encompassed by that platform. The atomic level of structure management available through a collaboration platform encourages reuse of those structures, leading to registered structures and interfaces that interoperate with other systems as much as possible. Blue Oxide has created its XML Collaborator product to serve both as a collaboration platform for the design of information structures and as a registry for the sharing of those structures with potential users. The XML Collaborator product has been designed from the ground up to be as flexible as possible, and to promote metadata reuse as strongly as possible... The XML Collaborator platform consists of a core metadata tracking database and a series of XML Web service interfaces to that information. The engine also provides a set of shared registry services that allow XML Collaborator servers registered with one another to share information and reuse components across the Distributed Registry Bus... XML Collaborator ships with a browser-based client that uses the Web service interfaces to interact with the registry. However, the service descriptions for the engine Web services are published -- so other systems could be built that take advantage of the XML Collaborator engine infrastructure but that use some other form of user interface. It will of course also be possible for programmers to interact with an XML Collaborator engine directly through the Web services, allowing more sophisticated behavior such as interfaceless interaction with the repository to be easily implemented..." 
See the preceding reference and the XML Collaborator Datasheet.
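The "register once, reference everywhere" idea behind a unified registry can be sketched in a toy form (every name here is invented; real registries follow ISO 11179, ebXML, or UDDI models): a data element is registered a single time, and a schema and a service both refer to it by key rather than re-registering it.

```python
# Toy sketch of the white paper's point: one registry holding data
# elements, schemas, and services, where higher-level artifacts refer to
# lower-level ones by ID instead of duplicating their definitions.
registry = {"elements": {}, "schemas": {}, "services": {}}

def register(kind, key, definition):
    """Record a definition under its kind and return the key for reuse."""
    registry[kind][key] = definition
    return key

# A data element is registered exactly once...
amt = register("elements", "Amount", {"type": "decimal"})

# ...then reused by reference from a schema and from a service interface.
po = register("schemas", "PurchaseOrder", {"uses": [amt]})
register("services", "SubmitOrder", {"input": po})

# Resolving the service walks down to the single shared element
# definition, so there is nothing to fall out of sync.
schema_key = registry["services"]["SubmitOrder"]["input"]
element_key = registry["schemas"][schema_key]["uses"][0]
shared = registry["elements"][element_key]
```

Because "Amount" exists once, updating it updates every schema and service that references it, which is exactly the synchronization problem the paper attributes to multiple registrations.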

  • [December 23, 2002] "JAXB Revisited." By Daniel F. Savarese. In Java Pro Magazine (February 2003). ['With an updated version and specification draft now available, get reacquainted with JAXB.'] "The JAXB specification has benefited from developer feedback over time, sacrificing some features and adding new ones in the process. Let's see what it looks like today, with a reasonable assurance that the changes between the beta and final release will not be great... As a quick review for the uninitiated, JAXB defines an architecture for binding XML schemata to Java objects. These bindings allow you to unmarshal XML documents into a hierarchy of Java objects and marshal the Java objects into XML with minimal effort. If you work a lot with XML, you know how tedious it can be to write Simple API for XML (SAX) or Document Object Model (DOM) code to convert XML into Java objects that mean something to your program. JAXB generates code automatically so you can go about the business of processing data instead of parsing it. The basic JAXB design remains the same as in the early access release, but the details have changed. You still have to define a schema binding that is compiled by a schema-binding compiler. The schema-binding compiler generates Java interfaces and classes based on the binding. Whereas early versions of JAXB supported only Document Type Definitions (DTDs) and required that a schema binding be defined separately, the latest version of JAXB supports XML schema definitions and allows additional binding declarations to be defined inside of the schema using XML schema annotations. The JAXB specification also allows bindings to be defined in a separate document, but the beta release of the reference implementation supports only schema annotations... The first half of the JAXB architecture is the schema-binding system. 
The second half is the Java API contained in the javax.xml.bind package, with which you make use of the classes generated by the schema compiler. The JAXB APIs have evolved considerably and have arrived at a design that cleanly separates functionality. The javax.xml.bind package defines a set of interfaces and one class, JAXBContext, which acts as an entry point to the rest of the API functionality. The three most important interfaces are Unmarshaller, Marshaller, and Validator. The Unmarshaller interface defines a set of unmarshal() methods for unmarshalling XML data into a Java object. The Marshaller interface defines a set of marshal() methods for marshalling Java objects into XML data. The Validator interface defines methods for validating a Java object hierarchy against the XML schema from which it is derived as well as methods for adding event handlers to monitor and react to the validation process... Turning XML into a Java object is as simple as creating an Unmarshaller and invoking an unmarshal() method. Converting a Java object into XML is as simple as creating a Marshaller and invoking a marshal() method. Validating an object hierarchy before marshalling is as simple as creating a Validator and invoking a validate() method... Despite the existence of other XML marshalling solutions, JAXB is good at what it sets out to do, and as a result you can expect to write far less SAX and DOM code in the future..." See: (1) Java Architecture for XML Binding (JAXB) 1.0 Beta Implementation; (2) JAXB website; (3) "Java Architecture for XML Binding (JAXB)."
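JAXB itself is Java, and its classes are generated from a schema by the binding compiler; the unmarshal/marshal/validate round trip it automates can nonetheless be sketched generically. The following is a hand-rolled Python analogue (the Book class and element names are invented), not JAXB code:

```python
# Hand-rolled analogue of the JAXB round trip: unmarshal XML into an
# object, marshal it back out, and validate the object hierarchy.
# In JAXB the Book class would be generated from the schema; here it is
# written by hand purely for illustration.
import xml.etree.ElementTree as ET
from dataclasses import dataclass

@dataclass
class Book:              # stands in for a schema-derived binding class
    title: str
    pages: int

def unmarshal(xml_text: str) -> Book:
    """XML -> object (what Unmarshaller.unmarshal() does in JAXB)."""
    root = ET.fromstring(xml_text)
    return Book(title=root.findtext("title"),
                pages=int(root.findtext("pages")))

def marshal(book: Book) -> str:
    """Object -> XML (what Marshaller.marshal() does in JAXB)."""
    root = ET.Element("book")
    ET.SubElement(root, "title").text = book.title
    ET.SubElement(root, "pages").text = str(book.pages)
    return ET.tostring(root, encoding="unicode")

def validate(book: Book) -> bool:
    """Check the object against the (here, implied) schema constraints."""
    return bool(book.title) and book.pages > 0

book = unmarshal("<book><title>XML in a Nutshell</title>"
                 "<pages>480</pages></book>")
round_tripped = unmarshal(marshal(book))
```

What JAXB buys you over SAX or DOM is that the unmarshal/marshal plumbing above is generated for every type in the schema, so application code only ever sees typed objects like Book.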

  • [December 21, 2002] "New WS-Security Extensions May Be Too Much, Too Soon." By Ray Wagner and Terry Hicks (Gartner). Gartner FirstTake. FT-19-0712. 20-December-2002. ['A small group of providers in the Web services market announced needed extensions to the Web Services Security (WS-Security) specifications. But these extensions may limit necessary cross-industry support and innovation.'] "On 17 December 2002, BEA Systems, IBM, Microsoft, RSA Security and VeriSign released extensions to the WS-Security specifications that focus on interenterprise business and security policy: WS-Policy (with WS-Policy Attach, WS-Policy Assert), WS-Trust, WS-Secure Conversation, and WS-Security Policy... The new extensions to the original WS-Security specification could provide a basis for strongly securing Web services, especially at the levels of interenterprise trust, security and business policy agreement. However, the timing of these extensions could be a problem. That specification (which is not yet a standard) provided a framework for securing Web Services transactions via granular security mechanisms for message elements, and it received broad industry support. The new extensions deal with more complex issues concerning complete conversations and business policy, where more room exists for disagreement -- and for innovation, perhaps by startups or newer entrants. Although WS-Security's sponsors welcomed industry participation and indicated that their proposals can be extended, the industry could view the new specifications as cutting off innovation early in the process. Furthermore, industry cooperation concerning WS-Security extensions appears incomplete. 
The new extensions may be too ambitious for all participants to agree on, which could endanger the adoption of earlier standards, including the Security Assertion Markup Language (SAML) and the basic WS-Security proposal, being considered by a working group in the Organization for the Advancement of Structured Information Standards, a standards body. The next wave of extensions, including WS-Privacy (notably missing from this round), WS-Authorization and WS-Federation, faces significant challenges in achieving industrywide support. Specifically, industry players view WS-Federation as being in opposition to the Liberty Alliance federation initiative -- a situation that will lead to considerable confusion and could make federation initiatives only marginal to the industry..." See references in the 2002-12-18 news item: "Microsoft and IBM Publish Six New Web Services Security and Policy Specifications." [HTML version]

  • [December 20, 2002] "Web Services Internationalization Usage Scenarios." W3C Working Draft 20-December-2002. First public draft. Edited by Kentaroh Noji (IBM), Martin J. Dürst (W3C), Addison Phillips (webMethods), Takao Suzuki (Microsoft), and Tex Texin (XenCraft). Also available in non-normative XML format. Produced by the Web Services Internationalization Task Force of the W3C Internationalization Working Group, as part of the W3C Internationalization Activity. The document "describes a variety of Web Services internationalization usage scenarios and use cases. Some of the usage scenarios and use cases of this document may be used to generate internationalization requirements for Web Services technologies, and best practices for the internationalization of Web Services... The goal of the Web Services Internationalization Task Force is to ensure that Web Services have robust support for global use, including all of the world's languages and cultures. The goal of this document is to examine the different ways that language, culture, and related issues interact with Web Services architecture and technology. Ultimately this will allow us to develop standards and best practices for those interested in implementing internationalized Web Services. We may also discover latent international considerations in the various Web Services standards and propose solutions to the responsible groups working in these areas. Web Services provide a world-wide distributed environment that uses XML based messaging for access to distributed objects, application integration, data/information exchange, presentation aggregation, and other rich machine-to-machine interaction. The global reach of the Internet requires support for the international creation, publication and discovery of Web Services. 
Although the technologies and protocols used in Web Services (such as HTTP, XML, XML Schema, and so forth) are generally quite mature as 'international-ready' technologies, Web Services may require additional perspective in order to provide the best internationalized performance, because they represent a way of accessing distributed logic via a URI. As a result, this document attempts to describe the different scenarios in which international use of Web Services may require care on the part of the implementer or user, and to demonstrate potential issues with Web Services technology..." See the archive for the public discussion list.
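One recurring scenario of the kind the draft catalogs is carrying language context inside a message so that services and intermediaries can select the right rendering. As a purely illustrative fragment (the element names and namespace are invented; the draft describes scenarios rather than prescribing markup), human-readable payload content can be labeled with the standard `xml:lang` attribute:

```xml
<!-- Hypothetical payload fragment, not taken from the W3C draft. -->
<orderStatus xmlns="urn:example:orders">
  <code>BACKORDERED</code>
  <message xml:lang="en">Your order is on back order.</message>
  <message xml:lang="ja">ご注文は入荷待ちです。</message>
</orderStatus>
```

Keeping a machine-readable code alongside localized messages lets each consumer pick the text matching its locale without re-invoking the service.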

  • [December 20, 2002] "XML For The Masses: An Open Office XML File Format." By Daniel Vogelheim (Software Engineer, Sun Microsystems). Prepared for presentation at the IDEAlliance XML 2002 Conference, December 10, 2002. 18 pages (from the presentation slides). "XML Office File Format is designed to support 'Matching Customer's Requirements' -- in XML, for easy integration and processing of a complete office vocabulary. Requirements for the office file format are that it be: full featured, covering full office productivity space, easy to process and easy to generate; not vendor or application specific. Why a Fixed Office Vocabulary in an XML Office Format: (1) allows traditional usage patterns; (2) mass-market compatible; (3) add XML processing as needed; (4) tools can operate on semantic units; (5) transform into custom vocabularies. Use of Custom Vocabularies: work well in specialized, structured work-flows but requires tooling, extensive preparations...[adapted]" Context is provided in the extended abstract "XML For The Masses - An XML Based File Format for Office Documents": "... In this talk, we examine the design rationale of the XML File Format and present how the use of XML streaming can be used inside applications to simplify document processing. Furthermore, we will introduce some uses of the format outside of its supporting applications. In the format definition, a fundamental decision was made that the format was to be designed: In order to fully realize the XML promise of data exchange it is not sufficient to simply encode existing program structures in XML syntax. Instead, an explicit, reviewed design process was established to ensure that additional benefits could be realized by the format. This common theme led to the definition of three design principles which governed the format definition: (1) Use existing standards: don't reinvent the wheel. 
The use of existing standards is embodied by generous 'borrowing' of elements and structures from e.g., HTML, XSL, CSS, SVG, Dublin Core, XLink, and MathML; (2) Transformability: the format must be usable outside of the office application. A consistent design makes it possible for transformation developers to focus on their area of interest, allowing them to ignore the remainder of the format. Similarly, a limited redundancy between presentation and content allows processing tools to be aware of either aspect. The format features a unique approach for dealing with layout and content of a document, in that both must be contained in an office document to allow faithful output reproduction, but should be separate to allow easy processing and generation. (3) First class XML: all structured content must be accessible through XML structures. All structured information embodied in the document must be accessible as XML elements and attributes, thus making them fully accessible to XSLT and similar XML based tools. No information is stored in 'special' comments or names, and no information is encoded in strings that require elaborate parsing to be understood... This office documents XML format is also used within the application for file format conversion, or "filters" in common parlance. Using XML turns file format filters into XML transformations into or from the office document XML format. This use of a documented, human readable format simplifies both filter development and debugging. The inefficiencies associated with XML processing for large documents can be mitigated by using XML pipelines based on the Simple API for XML, SAX. Storing documents in a transformation friendly XML format allows users to access and manipulate their office documents using standardized tools. Support for attaching arbitrary XML attributes to certain XML elements should foster better integration of office documents into content management systems or custom solutions..." 
Eric van der Vlist noted that "Jean Paoli for Microsoft and Daniel Vogelheim for OpenOffice both chose the same title 'XML for the masses' for their presentations, a commonality which hides two very different approaches from the editors of two competing office productivity suites..." General references in "XML File Formats for Office Documents." [ format == ZIP]
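The SAX-pipeline approach Vogelheim describes, streaming events through filter stages instead of building a full tree, can be sketched briefly. The following is an illustrative Python stage (the element names are invented, and this is not OpenOffice.org code):

```python
# Sketch of a streaming SAX pipeline: parser -> filter stage -> serializer.
# Large documents flow through event by event, so no full tree is built.
import io
import xml.sax
from xml.sax.saxutils import XMLFilterBase, XMLGenerator

class RenameFilter(XMLFilterBase):
    """One pipeline stage: rename elements on the fly as events pass by."""
    def __init__(self, parent, old, new):
        super().__init__(parent)
        self._old, self._new = old, new

    def startElement(self, name, attrs):
        super().startElement(self._new if name == self._old else name, attrs)

    def endElement(self, name):
        super().endElement(self._new if name == self._old else name)

doc = "<doc><para>hello</para></doc>"
out = io.StringIO()
reader = xml.sax.make_parser()
pipeline = RenameFilter(reader, "para", "p")   # stages can be chained
pipeline.setContentHandler(XMLGenerator(out))  # serializer as the sink
pipeline.parse(io.StringIO(doc))
result = out.getvalue()
```

Each additional transformation becomes another XMLFilterBase wrapped around the previous stage, which is the mitigation for large-document overhead that the abstract mentions.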

  • [December 20, 2002] "In the Navy, XML is the Standard." By Darryl K. Taft. In eWEEK (December 19, 2002). "The Department of the Navy has implemented a new policy for using XML throughout the service. The Navy's chief information officer, David Wennergren, signed off on the policy last Friday. The Navy first established a vision for using XML in its systems in March. The Navy's vision document said it would 'fully exploit Extensible Markup Language as an enabling technology to achieve interoperability in support of maritime information superiority'... Michael Jacobs, data architecture project lead in the Navy's chief information office, said: 'The Department of the Navy has been actively developing XML implementation plans since the stand-up of the DONXML Work Group in August 2001.' Jacobs said the Navy's Work Group comprises five action teams: strategic vision and approach, standard implementation, enterprise implementation, marketing and outreach, and integration with existing DON processes. Meanwhile, Jacobs named as some of the key elements of the new XML policy: direction on use of Voluntary Consensus Standards (VCS) from W3C, OASIS, and other internationally recognized standards bodies; direction on active DON participation on VCSs; direction on reuse of XML components and order of precedence for reuse, prior to development of new components..." See details at "DON Policy on the Use of Extensible Markup Language (XML) of December 2002."

  • [December 20, 2002] "Universal Business Language (UBL): Realizing eBusiness XML." By Mark Crawford (Logistics Management Institute; Vice Chair, OASIS UBL TC; Editor, UN/CEFACT Core Components). From a Plenary Presentation at XML 2002. 60 slides. Also in (source) .PPT format. Section 3 describes the UBL Relationship to ebXML Core Components. "UBL is: Jon Bosak's brainchild; An OASIS Technical Committee; An implementation of ebXML Core Components; An XML-based business language standard-in-progress; A cross-sector XML solution; A Non-proprietary solution that is committed to freedom from royalties; A future legal standard for international trade; The ebXML missing link... Benefits of UBL: (1) Transparent and efficient interface naming and design rules; (2) Harmonization and standardization of business objects; (3) Transparent rules for customer specific interface modifications; (4) Plugs directly into existing traditional business practices; (5) Interoperable with existing EDI systems... UBL Development Strategies: (1) Start with the low-hanging fruit - The 20% of documents and business objects actually used by 80% of electronic business partners; (2) Defer the rocket science to later phases - Produce useful, concrete outputs ASAP; (3) Don't start with a blank slate - We are working from xCBL 3.0, but with no expectations of backwards compatibility; (4) Take advantage of domain expertise - Get XML experts and business experts together and form liaisons... UBL Deliverables include (1) Naming and design rules for UBL XML schemas; (2) Library of standard XML business information entities (BIEs); (3) Set of standard XML business documents - purchase order, invoice, shipping notice, price catalogue, etc.; (4) Context methodology to make the standard documents interoperate across industries... Basic UBL Documents include: (1) Procurement - Purchase Order, P.O. Response, P.O. 
Change; (2) Materials management - Advance Ship Notice, Planning Schedule, Goods Receipt; (3) Payment - Commercial Invoice, Remittance Advice; (4) Transport/logistics - Consignment Status Request, Consignment Status Report, Bill of Lading; (5) Catalogs - Price Catalog, Product Catalog; (6) Statistical reports - Accounting Report.... UBL Differentiators: (1) Completely open, public, accountable standards process; (2) Non-proprietary and royalty-free; (3) Based on UN, OASIS, and W3C specifications; (4) Intended for normative status under international law; (5) Designed for B2B; (6) Intended for exchange of legal documents; (7) Human- and machine-readable; (8) Compatible with existing EDI systems..." See also the prepared text from "UBL: The Next Step for Global E-Commerce" in the Conference Proceedings. See: (1) OASIS Universal Business Language (UBL) TC website; (2) general references in "Universal Business Language (UBL)."

  • [December 20, 2002] "Schema Rules for UBL... and Maybe for You." By Eve Maler (Sun Microsystems). Presented at the XML 2002 Conference. 12-December-2002. 46 pages (PDF from slides). "Goals of the presentation are to: (1) Introduce the Universal Business Language and its unique schema requirements and constraints; (2) Describe three major areas of its design, introducing the ebXML Core Components model along the way; (3) Help you decide whether you want to apply any of these design rules to your own project, B2B or otherwise... UBL is an XML-based business language standard being developed at OASIS (though not officially part of ebXML) that: leverages existing EDI and XML B2B concepts and technologies; is applicable across all industry sectors and domains of electronic trade; is modular, reusable, and extensible; is non-proprietary and committed to freedom from royalties; is intended to become a legal standard for international trade... Requirements on schema design were to: (1) Leverage XML technology, but keep it interoperable; (2) Achieve semantic clarity through a binding to the Core Components model; (3) Support contextualization (customization) and reuse; (4) Selectively allow 'outsourcing' to other standard schemas. The special requirement was defined for 'context': Standard business components need to be different in different business contexts (addresses differ in Japan vs. the U.S.; addresses in the auto industry differ from those for other industries; invoice items for shoes need size information; for coffee, grind information). UBL needs this kind of customization without losing interoperability... A constraint on the design rules themselves arose from the UBL Library design, being specified in syntax-neutral form using the Core Components model; a spreadsheet holds the results; to convert this automatically into XML schema form requires hard rules, not just guidelines..." The prose/prepared text for the presentation is available from IDEAlliance. 
Its summary: "The OASIS Universal Business Language (UBL) effort has several interesting goals and constraints that must be taken into account in the structuring of the UBL Library schemas. This paper discusses some of the major rules developed for the design of the schemas: UBL's connection to the UN/CEFACT ebXML Core Component Technical Specification, its choice of options for element and datatype definitions, and its solution for reusable code lists. The common thread in these rules is the need to achieve a solution that is simultaneously intuitive, flexible, interoperable, and based on standardized semantics. At a time when much W3C XML Schema usage is still experimental in nature, particularly in the development of internationally standard XML vocabularies, the UBL Naming and Design Rules subcommittee has delved into many subtle issues involved in the art of 'schemography' [credits Murray Maloney] and we hope they may be helpful to others..." See: (1) published deliverables from the UBL Naming and Design Rules Subcommittee (NDR SC), specifying rules and guidelines for normative-form schema design, instance design, and markup naming; (2) OASIS Universal Business Language (UBL) TC website; (3) general references in "Universal Business Language (UBL)." [PDF source]

  • [December 20, 2002] "W3C, Vendors Dust Down Over WS-Choreography?" By [Gavin Clarke]. In The Register (December 20, 2002). "A New Year stand-off is brewing, after a leading standards group kicked off its ratification procedure for Web services choreography but three major vendors remained aloof from the process, Gavin Clarke writes. The World Wide Web Consortium (W3C) is creating a working group to develop a standard for WS-Choreography, a proposed means to describe the flow of messages exchanged by a Web service in a particular process. Approximately six specifications are floating around the industry, and the W3C said it hopes to consolidate at least three. However, the owners of one major specification -- Business Process Execution Language for Web services (BPEL4WS) -- are keeping mum over which standards body will receive their offering. BPEL4WS is owned by BEA Systems Inc, IBM and Microsoft Corp. It is feared a failure to integrate different choreography specifications will create different implementations and limit web services uptake. Oracle Corp, helping push W3C's efforts, called this a 'worst case scenario'. 'All that does is confuse the market place,' Oracle vice president of standards strategy and architecture Don Deutsch told ComputerWire. Licensing of the vendors' intellectual property (IP), used in web services specifications, is believed to be a potential sticking point in this issue. IP licensing emerged as the web services bogey man in 2002. It is believed vendors who do not license IP under a Royalty Free (RF) mechanism leave the door open to charge royalties at a later date under the Reasonable And Non-Discriminatory (RAND) model... Furthermore, as IBM, Microsoft and VeriSign prepared to go public with WS-Security, one Microsoft W3C representative voiced concern that any public W3C technical discussions would 'create IP risks' for members for any related specifications... 
[IBM's Karla] Norsworthy denied Microsoft and IBM went forum shopping over WS-Security, saying OASIS was a suitable group given its existing work on the related Security Assertion Mark-up Language (SAML). Redwood Shores, California-based Oracle called IP a 'red herring' as vendors instead choose forums on the basis of how much they can control development of a proposed specification. Oracle is a big player behind W3C's WS-Choreography proposed working group. Deutsch said: 'People make decisions for various reasons to do with maintaining control of the specification.' He added, though, W3C is the best place for WS-Choreography because of its use of RF. 'If your objective is to anoint your technology then your decision will be to go somewhere else because you won't want a level playing field,' Deutsch said. 'We believe in encouraging IBM and Microsoft to do the right thing, and bring BPEL4WS to the W3C table,' Deutsch said..." See: (1) "[W3C] Proposal for Web Services Choreography Working Group Charter"; (2) "Business Process Execution Language for Web Services (BPEL4WS)"; (3) "Web Service Choreography Interface (WSCI)"; (4) "Patents and Open Standards."

  • [December 19, 2002] "Running Multiple XSLT Engines with Ant." By Anthony Coates. From XML.com (December 11, 2002). ['Every XSLT engine unfortunately behaves in a slightly different way to the others. This can cause problems when you want your XSLT stylesheets to be as interoperable as possible. Tony Coates writes about how to solve this problem using the build tool Ant; he shows how to use Ant to control the build pipeline for XML and XSLT, and then builds a project that uses multiple XSLT engines.'] "Ant is a build utility produced as part of the Apache Jakarta project. It's broadly equivalent to Unix's make or nmake under Windows. make-like tools work by comparing the date of an output file to the date of the input files required to build it. If any of the input files is newer than the output file, the output file needs to be rebuilt. This is a simple rule, and one that generally produces the right results. Unlike traditional make utilities, Ant is written in Java, so Ant is a good cross-platform solution for controlling automatic file building. That is good news for anyone developing cross-platform XSLT scripts because you only need to target one build environment. Anyone who has tried writing and maintaining equivalent Windows and Unix batch scripts knows how hard it is to get the same behavior across different platforms. So why would you use Ant and XSLT together? If all you are doing is applying a single XSLT stylesheet to a single XML input file, using a single XSLT engine, then there is probably nothing to be gained. However, if [1] you need to apply one or more XSLT scripts to one or more XML input files in some sequence, in order to build your final output file(s); [2] you need to run multiple XSLT engines on the same XML input file(s) as part of your regression or integration testing, then Ant is a good and quick way to implement the workflow you need to transform your input(s) into your output(s)..." See also Ant: The Definitive Guide, by Jesse E. 
Tilly and Eric M. Burke (O'Reilly, May 2002).

  • [December 19, 2002] "A Python & XML Companion." By Uche Ogbuji. From XML.com (December 11, 2002). ['In his monthly Python & XML column, Uche Ogbuji provides an overview and updates to O'Reilly's Python & XML book, and keeps us up to date with the latest developments in the Python XML world.'] "Python & XML, written by Christopher Jones and Fred Drake, Jr. (O'Reilly and Associates, 2002), introduces Python programmers to XML processing. Drake is a core developer of PyXML, and Jones is an experienced developer in Python and XML, as well as an experienced author. As you would expect from such a team, this book is detailed and handy; however, I have a few notes, amplifications, and updates to offer (the book was released in December of 2001) -- all of which are distinct from the errata that the authors maintain. In this article I will provide updates, additional suggestions, and other material to serve as a companion to the book. You don't have to have the book in order to follow along... Python & XML is a very handy book. The examples are especially clear, and in the latter part of the book the authors develop a sample application which uses much of the book's contents very practically. My main complaint is that it covers XML namespaces so sparsely. Namespaces are very hard to avoid these days in XML processing, regardless of what you may think of them. More examples and coverage of where namespaces intersect DOM, XPath, XSLT, and so on would help a lot of readers. I plan to write an article focusing on XML namespaces in Python processing..." See: "XML and Python."
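The namespace handling that the column says deserves fuller coverage can be shown in a few lines; this illustration is mine, not an example from the book, and uses Python's standard ElementTree, which addresses namespaced elements by {namespace-URI}local-name or via an explicit prefix map:

```python
# Two equivalent ways to find a namespaced element with ElementTree.
import xml.etree.ElementTree as ET

DOC = """<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
                  xmlns:dc="http://purl.org/dc/elements/1.1/">
  <rdf:Description>
    <dc:title>Python &amp; XML</dc:title>
  </rdf:Description>
</rdf:RDF>"""

root = ET.fromstring(DOC)
# Fully qualified form: the parser has already expanded dc: to its URI.
title = root.find(".//{http://purl.org/dc/elements/1.1/}title")
# Prefix-map form: the prefixes are local to the query, not to the document.
same = root.find(".//dc:title", {"dc": "http://purl.org/dc/elements/1.1/"})
assert title.text == same.text == "Python & XML"
```

The key point, often missed, is that document prefixes are arbitrary; only the namespace URI identifies the vocabulary.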

  • [December 19, 2002] "Test Frameworks for W3C Technologies." By Dimitris Dimitriadis. From XML.com (December 11, 2002). ['Dimitris Dimitriadis is concerned with interoperability. Dimitris outlines the difficulties inherent in implementing specification-conformant processors, and suggests ways that specifications themselves could be improved.'] "Writing specifications is great. Implementing them is difficult. Specifications, at least those from the W3C, aim at interoperability and are written with wide acceptance in mind. However, they tend to be difficult to implement... There seem to be very good reasons for making it easier to write conformant implementations, which in turn means that testing them must be made easier, too... The W3C has formed the Quality Assurance Working Group. Its goals, among others, are (1) To help the W3C in furthering its aim to lead the web to its full potential, paying special respect to its end goals; (2) To support Working Groups in writing clear, testable specifications of high quality; (3) To support production of Test Suites that are used to check conformance to specifications; (4) To describe the process Working Groups should use when writing specifications and producing test suites. Reading through this list, I realize that it sounds like boring, bureaucratic, and method-centric work. But that is, however, the price that has to be paid to ensure that the goal is met. If the goal is to create a number of unified frameworks to simplify testing of all W3C specifications, and to allow for an easier way to write conformant implementations, then the effort will be proportionally heavy... Among W3C Working Groups there is no official understanding of what a test or a test framework really is. Some Working Groups have produced a series of tests that they use to show that their implementations are conformant. Others have created public projects in which the test suites are produced. 
Except for the fact that it is at the Working Group's discretion to decide how to produce a test suite and how advanced to make it, for an implementor it's not easy to use these test suites, as each one has its own particular setup. In addition, some implementors have reported that they cannot use test suites that have been released for reasons of policy or license. The goal should be to allow as many people as possible to test implementations to assure they are conformant. This should be a basic goal for the W3C and, in particular, for its members. When your client asks how much of your web-services oriented framework follows existing standards you should want and be able to tell the truth. There's much to be gained from using uniform test frameworks. During development, it's easier to track how well your implementation is doing, a uniform test framework means uniform reporting, issue tracking, and the like... There's been a discussion about forming a devoted Test Group within the W3C. Its outreach would be similar to what the W3C TAG is doing in terms of generality, but limited to test issues..."

  • [December 19, 2002] "Automatic Numbering, Part Two." By Bob DuCharme. From XML.com (December 11, 2002). ['Here we have the second part of "Automatic Numbering", part of Bob DuCharme's "Transforming XML" column. This month, Bob shows how to control section numbering and examines an alternative to <xsl:number> for simple numbering tasks.'] "In last month's column, we saw how XSLT's xsl:number element lets you add numbers to your output for numbered lists or chapter and section headers. We looked at some of the attributes of this element that let you control its appearance -- for example, how to output the numbers as Roman numerals, as letters instead of numbers, and how to add leading zeros. We also saw the basics of numbering sections and subsections, but only the basics. This month we'll learn how to gain real control over section numbering, and we'll look at a more efficient alternative to xsl:number that's sometimes better for simple numbering..." For related resources, see "Extensible Stylesheet Language (XSL/XSLT)."
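The hierarchical labeling that xsl:number produces with level="multiple" can be mimicked outside XSLT; this short Python sketch (my analogue, not DuCharme's code) numbers each section with its 1-based position at every ancestor level:

```python
# Label nested <section> elements "1", "1.1", "2", ... from their positions.
import xml.etree.ElementTree as ET

DOC = """<doc>
  <section><title>Intro</title>
    <section><title>Scope</title></section>
  </section>
  <section><title>Body</title></section>
</doc>"""

def number_sections(elem, prefix=()):
    labels = []
    # enumerate() gives the 1-based sibling position, the same count
    # xsl:number derives from a node's preceding siblings.
    for i, child in enumerate(
            (c for c in elem if c.tag == "section"), start=1):
        path = prefix + (str(i),)
        labels.append((".".join(path), child.findtext("title")))
        labels.extend(number_sections(child, path))
    return labels

print(number_sections(ET.fromstring(DOC)))
# prints [('1', 'Intro'), ('1.1', 'Scope'), ('2', 'Body')]
```

The "more efficient alternative" the column discusses plays the same trick inside XSLT, counting positions directly instead of invoking xsl:number.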

  • [December 19, 2002] "The Role of XML in Content Management." By Lauren Wood. In The Gilbane Report Volume 10, Number 8 (October 2002), pages 1-8. ['The management of semi-structured or unstructured data has always depended on markup languages. Before the Web it was SGML or proprietary markup languages, now it is XML. This dependence was mutual: in practice, unstructured information management (mainly for publishing) was the only use of markup. However, the wild success of XML is due to its acceptance as a way to encode and share all kinds of structured and unstructured data, including code. Ironically, advocating XML for content or document management was actually disparaged by many early XML evangelists because they were afraid XML would be seen as being limited to publishing-oriented applications. This was in spite of the fact that, while most XML development was targeting application integration, most deployment was for content applications. Today, you wouldn't implement a content management solution without thinking very carefully about what role XML should play. Should it be used for content, for metadata, for application integration, for information integration? Where in the create/manage/deliver cycle should it be used? Where do Web Services fit in? What about WebDAV? Contributor Lauren Wood looks at how businesses are actually using XML in content management implementations, and how they view XML's role in the future. Lauren's report will provide you with an outline to help you organize your thoughts about the role XML should play in your content management implementation.'] "XML is an extremely flexible technology that can fulfill several roles in any software application. Content management is no exception to this. This survey discusses some of the roles that XML can play in a content management system (CMS) and whether there is much industry support or customer demand for such support. 
On speaking with several people representing companies and customers in this area, I found that there is increasing demand and good support for XML for content; less support or demand for XML metadata support; increasing support but little current demand for Web Services, and some support but more demand for WebDAV... This article will talk about content management or CMS (content management systems) and include all the variations that are appropriate, such as information management, knowledge management, or document management. Yes, these are all different. In terms of where XML can be used, however, they are similar enough to justify lumping them all together under one label. XML use in a CMS can be divided into two main categories: (1) XML for content (including metadata), and (2) XML for plumbing (including Web Services)... Metadata is the connecting tissue for all CMSs. It tells the CMS what the content is, who created it, who may read it, who may change it, where it fits in the workflow, and what sorts of operations may be performed on it. Metadata can do more, however... Metadata can be stored as XML, as indexes in a relational database, or in some CMS-specific storage format. For some purposes, the format it is stored in is irrelevant. Metadata that is more volatile than the underlying content, such as stage of a workflow process, or date the item moved from one stage to the other, is often stored outside of the XML... Another piece of the puzzle that uses XML as plumbing, WebDAV is a relatively unknown specification that enables lightweight content management. It functions as a set of extensions to the web protocol HTTP (unlike Web Services, which can also function via other protocols such as email). These extensions are defined using XML. WebDAV (often called DAV for short) allows for basic CM functionality such as locking and metadata assignment; versioning is still being developed. 
It is not sufficient for a full-blown, all-the-bells-and-whistles CMS, but adequate for a lot of smaller uses where all the features of a large, expensive CMS are not needed. Once versioning has been added to WebDAV so that the basic check-in and check-out is supported, it will do much of what small groups of people need. There appears to be some customer demand for WebDAV in various tools such as XML authoring tool vendors, so that they can implement their own basic CMS..."
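The point that WebDAV's HTTP extensions "are defined using XML" can be made concrete with a sketch of a PROPFIND request body built with Python's standard library. The property choices are illustrative, and only the request is constructed; nothing is sent:

```python
# Build the XML body a WebDAV PROPFIND request would carry; PROPFIND asks
# the server for properties (metadata) of a resource in the DAV: namespace.
import xml.etree.ElementTree as ET

ET.register_namespace("D", "DAV:")
DAV = "{DAV:}"

propfind = ET.Element(DAV + "propfind")
prop = ET.SubElement(propfind, DAV + "prop")
ET.SubElement(prop, DAV + "getlastmodified")   # "who changed it, when"
ET.SubElement(prop, DAV + "lockdiscovery")     # locking state, as described

body = ET.tostring(propfind, encoding="unicode")
headers = {"Depth": "0", "Content-Type": 'application/xml; charset="utf-8"'}
# A client would now issue:  PROPFIND /docs/report.xml HTTP/1.1  with `body`.
```

Locking (LOCK/UNLOCK) and property setting (PROPPATCH) follow the same pattern: an extended HTTP method carrying a small XML document.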

  • [December 19, 2002] "XPointer xpointer() Scheme." W3C Working Draft 19-December-2002. Edited by Steven DeRose (Brown University Scholarly Technology Group; Bible Technologies Group), Eve Maler (Sun Microsystems), and Ron Daniel Jr. (Taxonomy Strategies). A posting from Ron Daniel (Acting Chair, W3C XML Linking Working Group) announces this release of the new W3C Working Draft: "People who are interested in a fully-featured method of pointing into XML documents may wish to check it out..." Abstract: "The XPointer xpointer() scheme is intended to be used with the XPointer Framework to provide a high level of functionality for addressing portions of XML documents. It is based on XPath, and adds the ability to address strings, points, and ranges in accordance with definitions provided in DOM 2: Range... This scheme supports addressing into the internal structures of XML documents and external parsed entities. It allows for examination of a document's hierarchical structure and choice of portions based on various properties, such as element types, attribute values, character content, and relative position. In particular, it provides for specific reference to elements, character strings, and other XML information, whether or not they bear an explicit ID attribute..." Status: "This specification is one of four into which the prior XPointer specification has been divided. This version addresses comments received on the XPointer Candidate Recommendation which were relevant to the xpointer() scheme it defines. Except for responding to the relevant Last Call comments, and incorporating non-substantive editorial improvements, this document is substantially identical to that part of the Last Call XPointer specification which is not covered by XPointer Framework, XPointer xmlns() Scheme, and XPointer element() Scheme." See the comments archive and the local references for XPointer.
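The xpointer() scheme layers strings, points, and ranges on top of plain XPath addressing. That underlying XPath layer (selection by element type, attribute value, and relative position) can be illustrated with the XPath subset in Python's standard ElementTree; the document here is a made-up example, not from the spec:

```python
# Address parts of a document by element type, attribute value, and position,
# the kind of selection the xpointer() scheme inherits from XPath.
import xml.etree.ElementTree as ET

DOC = """<doc>
  <chapter id="intro"><para>one</para><para>two</para></chapter>
  <chapter id="body"><para>three</para></chapter>
</doc>"""

root = ET.fromstring(DOC)
# By attribute value: what an ID-based pointer resolves to.
intro = root.find(".//chapter[@id='intro']")
# By relative position within the selection (1-based, as in XPath).
second_para = intro.find("para[2]")
assert second_para.text == "two"
```

What xpointer() adds beyond this, per the draft, is addressing of sub-element content: character strings, points between nodes, and DOM 2 ranges spanning them.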

  • [December 19, 2002] "Web Services Giants Propose Specifications For Security, Policy." By Mitch Wagner. In InternetWeek TechWeb (December 18, 2002). "Leading Web services vendors including Microsoft, IBM, and BEA Systems on Wednesday introduced a set of proposed standards for security and policy for Web services. The companies, also including RSA Security, SAP AG and VeriSign, introduced six new specifications built on SOAP. WS-Trust describes a framework for managing trust relationships between enterprises. WS-SecureConversation describes technology for setting the context for exchanging multiple messages without having to re-authenticate each time. WS-SecurityPolicy provides standards for setting security policies for services. These standards were authored by IBM, Microsoft, RSA, and VeriSign. Additionally, WS-Policy sets specifications for senders and receivers of Web services to communicate requirements and capabilities to search for and discover information needed to access the service. WS-PolicyAttachment provides specifications for attaching requirement and capability statements to Web services, and WS-PolicyAssertions describes policies that can be affiliated with a service... Important specifications still being developed include WS-Federation, which provides means of describing the trust relationship between organizations, and WS-Privacy, which sets specifications for privacy policies, said Gerry Gebel, analyst with the Burton Group... 'The way the authors are going about it, their plans are to submit to a standards organization where Sun, Oracle, Entrust and everyone who's now out of the picture can work on standardizing the specification in a more open and organized framework,' Gebel said..." See details in the 2002-12-18 news item: "Microsoft and IBM Publish Six New Web Services Security and Policy Specifications."

  • [December 19, 2002] "Using XSL Formatting Objects for Production-Quality Document Printing." By W. Eliot Kimber (ISOGEN International, LLC). Presentation given at the IDEAlliance XML 2002 Conference, Baltimore, December 2002. 16 pages. ['PDF paper version produced from XML source via XSL, Saxon and Antenna House's XSL Formatter product.'] Abstract: "The XSL Formatting Objects specification has been a published recommendation for over a year. During that time a number of commercial XSL-FO implementations have become available that make it possible to use XSL-FO for true production-quality creation of printed documents. While there are functionality limitations in XSL 1.0 that limit the types of page layouts that can be created, the page layouts that are supported are sufficient for most technical documentation, such as user manuals, maintenance manuals, and so on. This paper evaluates the XSL 1.0 specification and the currently-available implementations against the print requirements typical of a variety of document types, including technical documents, magazines, and newspapers. We then report our experience in using XSL-FO with commercial tools to produce hardware user manuals for a line of consumer computer peripherals. We discuss the XSLT and XSL issues, as well as XSL-FO extensions that may be required to satisfy typical print production requirements. Finally, we provide a set of recommendations, based on the current state of the XSL specification and the current state of tools, as to when the use of XSL-FO is appropriate and which XSL-FO implementations are best suited to which tasks or disallowed by certain sets of requirements." Summary: "Our experience to date with using XSL-FO for production of high-quality printed documents, primarily technical manuals, has been tremendously positive. At almost every step of the process of implementing FO-based publishing solutions the task has been easier than we expected. 
We have been pleasantly surprised by how much easier it is to create and maintain FO style sheets using XSLT as the transformation technology than any other publishing technology we have used in the past. We have been impressed by the quality of the FO implementations and the responsiveness of vendors to problem reports and feature requests." Note the update provided in the posting "Production Quality XSL-FO": "This paper reflects the state of the world as of late October 2002. Since I wrote it, the following FO implementations have been released or announced: (1) Adobe Document Server. Includes an FO-to-PDF component based on top of Framemaker. Doesn't implement as many features as XEP or XSL Formatter, but lets you use Frame-specific functionality from the FO, such as Frame templates. Provides some extensions, including a revision bar mechanism. (2) SUN's xmlroff open-source FO implementation. Written in C. Provides internationalization features, including support for right-to-left writing modes. Feature support is 'almost basic conformance'. Should be released Real Soon Now (just waiting for final approvals). (3) 3B2's to-be-named 'free or almost-free' FO implementation, built on top of the 3B2 formatting engine. (4) Antenna House announced Unix and Linux support in 1Q2003 (they are doing final release testing now). (5) IBM's XFC FO implementation was released as part of a larger AFP product. IBM assures me that it is under active development... It was standing room only by the time I was done presenting my talk, which is pretty good considering I was up against Norm Walsh. From the questions I got asked, it was clear that a lot of enterprises are starting to investigate the use of XSL-FO to do sophisticated page composition..." See the source at IDEAlliance and other ISOGEN white papers. Related resources: (1) "What Is XSL-FO? When Should I Use It? 
[Tech Watch]," by Stephen Deach; (2) "XSL/XSLT Software Support"; (3) "Extensible Stylesheet Language (XSL/XSLT)." [cache]

  • [December 19, 2002] "Navy XML Policy Signed." By Matthew French. In Federal Computer Week (December 18, 2002). "Navy chief information officer David Wennergren has signed the Navy's Extensible Markup Language policy, setting the standard for how XML will be used within the service. XML facilitates information exchange among applications and systems because it enables agencies to tag data and documents. 'Interoperability is a cornerstone of [the Navy Department's] efforts to strengthen its independent operations and, subsequently, improve the warfighter's ability to find, retrieve, process and exchange information,' Wennergren said in a December 13, 2002 statement to Navy commanders. 'The department, like many government and private-sector organizations, has increasingly looked to XML technology to meet its data-sharing needs.' The policy's overall goals are to promote XML as a technology to help achieve interoperability throughout the Navy and serve as a guideline to support interoperability among the Navy and other DOD components. Michael Jacobs, chairman of the Navy's XML Working Group, said five teams each would have a different responsibility in determining the best way to begin implementing the policy. However, a timeline has yet to be established for when XML deployment across the Navy would be completed. The Navy's XML standard, which also applies to the Marine Corps, already has received high marks from other government XML leaders. 'I read their policy document and found it to be excellent and comprehensive -- the best I have seen in the federal government, or anywhere for that matter,' said Brand Niemann, a computer scientist and XML Web services solutions architect with the Environmental Protection Agency. Niemann also heads the XML Web Services Working Group established by the CIO Council..." See pertinent references in the article abstract from December 9, 2002: "Navy Preps XML Policy: Policy Seeks to Drive Data Interoperability," by Matthew French.

  • [December 19, 2002] "DON Policy on the Use of Extensible Markup Language (XML) of December 2002." December 13, 2002. 19 pages. "Since the Interim DON XML Policy was issued last fall, the Department has created a comprehensive governance structure for its XML efforts and set a strong example for DON partners that are implementing the technology..." For context, see (1) the previous bibliographic reference; (2) "DON XML WG XML Developer's Guide" [Version 1.1, from the DON (US Department of the Navy) XML Working Group, edited by Brian Hopkins; May 2002. 97 pages]; (3) "U.S. Federal CIO Council XML Working Group Issues XML Developer's Guide"; (4) Navy XML Quickplace. PDF source: Navy XML Quickplace; follow the links from "don policy/guidance" to "DON XML Policy"

  • [December 19, 2002] "What Do You Get when You Cross Excel and XML?" By [Seybold Bulletin Staff]. In The Bulletin: Seybold News and Views On Electronic Publishing Volume 8, Numbers 12 & 13 (December 18, 2002). "An odd but intriguing new type of software appeared at XML 2002 last week in Baltimore: an XML editing and processing environment with a spreadsheet interface. Celebrate Software's Celebrate/XML Designer (and its companion server product) makes it relatively straightforward to add various kinds of processing to an XML data stream -- most typically, a stream of transactions or similar business records represented as XML. The Designer screen is divided into three areas. The XML source file is at the left, arranged in two columns. In the first column are the XML tags, indented to show their hierarchical relationship. In the second column is the data. At the right, again in two columns, is the 'template' area. Here, additional tags can be created and their contents specified. These new values can be calculated, spreadsheet-fashion, using a broad selection of functions. The operands can be selected in point-and-click style, as with a spreadsheet. But instead of showing up as cell locations in row-and-column notation, they show up in XPath notation. The bottom area of the screen is the 'override' area. It is used to specify how results will be formatted and which data will be omitted from the final document. The result of processing is, of course, a new XML file... Most XML editors on the market today are oriented toward working with text documents. They don't help much in applications where the whole point is processing EDI-style transactions. For those applications, a professional programmer is necessary for even the simplest processing and reporting. Celebrate's products open up the world of XML-based transactions to the non-programmer, in the same way that the spreadsheet opened up database records to the non-programmer two decades ago..." 
See the product description and demos.
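The spreadsheet analogy can be sketched in a few lines: treat each record in the XML stream as a row, select operands by path, and add new computed tags. A minimal illustration using Python's standard library (the element names `orders`, `order`, `price`, `qty`, and `total` are invented for this sketch, not taken from Celebrate's product):

```python
import xml.etree.ElementTree as ET

# A stream of transaction records, as in the Celebrate/XML scenario.
source = """
<orders>
  <order><price>10.50</price><qty>3</qty></order>
  <order><price>2.00</price><qty>5</qty></order>
</orders>
"""

root = ET.fromstring(source)
for order in root.findall("order"):            # each record is a "row"
    price = float(order.findtext("price"))     # operand selected by path
    qty = int(order.findtext("qty"))
    total = ET.SubElement(order, "total")      # new computed tag
    total.text = f"{price * qty:.2f}"          # spreadsheet-style formula

print(ET.tostring(root, encoding="unicode"))
```

The point of the product, as described above, is that a non-programmer builds this computation by pointing and clicking, with the operand paths filled in automatically.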

  • [December 19, 2002] "Business Process with BPEL4WS: Learning BPEL4WS, Part 4. Creating Processes With the BPWS4J Editor." By Nirmal Mukhi (Software Engineer, IBM). From IBM developerWorks, Web services. November 2002. ['BPWS4J is an implementation of the BPEL4WS specification that includes a run-time engine and an editor (which is an Eclipse plugin) for creating BPEL4WS processes. In this article Nirmal describes design approaches to creating BPEL4WS processes, and how the BPWS4J editor is used to create, modify, and validate such processes.'] "The Business Processes in Web Services for Java (BPWS4J) software package offers an implementation of a run-time engine that supports the Business Process Execution Language for Web Services (BPEL4WS) specification, which defines workflow and business processes for Web services. This package was released on alphaWorks in August to provide a run-time engine and a workflow editing tool for creating flows. This article focuses on an implementation exercise of a simple echo process flow, where the process receives a message and reflects an identical copy of it back to the sender... A complete, deployable unit for the BPWS4J runtime consists of: (1) the BPEL4WS file describing the process; (2) a WSDL file describing the messages, operations, port types and other information (service link types, correlation properties, etc.) that are referenced by the process definition (known as the process WSDL); and (3) the WSDL definitions for each partner involved in the process, unless the process does not make use of any WSDL operations provided by the partner. Of these, the BPWS4J editor allows you to create the BPEL4WS file describing the process. To edit WSDL files required for deploying a BPEL4WS process, you can use other tools. However, these will not support non-standard WSDL extensions that have been proposed within the BPEL4WS specification, such as service links, correlation properties, etc. 
Those definitions will have to be added in by hand; the samples in BPWS4J should serve as a sufficient guide for this. I will explain what the BPWS4J editor looks like and how it functions while developing the echo process example..." For other installments in the series, see the Summary. Also in PDF format. See: "Business Process Execution Language for Web Services (BPEL4WS)."
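For orientation, the echo process described above might be sketched in BPEL4WS 1.0 markup roughly as follows. This is a hypothetical fragment, not one of the BPWS4J samples; the partner name, port type, message type, and target namespace are invented for illustration:

```xml
<process name="echo"
         targetNamespace="urn:example:echo"
         xmlns="http://schemas.xmlsoap.org/ws/2002/07/business-process/">
  <containers>
    <container name="request" messageType="tns:echoMessage"/>
  </containers>
  <sequence>
    <!-- Receive a message, then reply with an identical copy of it -->
    <receive partner="caller" portType="tns:echoPT" operation="echo"
             container="request" createInstance="yes"/>
    <reply partner="caller" portType="tns:echoPT" operation="echo"
           container="request"/>
  </sequence>
</process>
```

Replying out of the same container that was filled by the receive is what makes the echoed message an identical copy.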

  • [December 18, 2002] "Web Services Security: Moving Up the Stack. New Specifications Improve the WS-Security Model." By Maryann Hondo (Senior Technical Staff Member, IBM), David Melgar (Advisory software engineer, IBM), and Anthony Nadalin (Lead Architect, IBM). From IBM developerWorks, Web services. December 2002. "In April, IBM, Microsoft, and VeriSign jointly published a specification for Web Services Security (WS-Security) that provides a set of mechanisms to help developers of Web services secure SOAP message exchanges. This specification has been accepted by OASIS and a new Web Services Technical Committee (the WSS TC) has been formed to move WS-Security to an open standard. The WS-Security specification has been explained in some detail in an earlier paper, Security in a Web Services World: A Proposed Architecture and Roadmap. Additionally in April, IBM and Microsoft provided a roadmap document that included a conceptual stack identifying additional elements that are important to building security into Web services. The focus of this announcement is the delivery of three more parts of the roadmap; two elements in the policy layer and one in the federation layer [see figure #1 'The evolving WS-Security Roadmap']... The policy element in the Roadmap labeled WS-Policy has been further refined to include four documents: (1) A Policy Framework (WS-Policy) document that defines a grammar for expressing Web services policies; (2) A Policy Attachment (WS-Policy-Attachment) document that defines how to attach these policies to Web services; (3) A set of general policy assertions (WS-Policy-Assertions); (4) A set of security policy assertions (WS-Security Policy). The Policy Framework is designed to allow extensibility. Policy is a broad term and encompasses a range of disciplines like security, reliability, transactions, privacy, etc. Similarly, the ability to express policies is not limited to the expression of general policies or security policies. 
The intent is for the basic policy framework to accommodate the expression of domain specific policy languages in a way that leverages different domain knowledge within a consistent Web Services Framework. WS-PolicyAttachment offers several ways to advertise policy assertions with Web services. It builds on the existing WSDL and UDDI specifications but also supports extensibility. There will be domain-specific policies (for example, security policy) along with common policies for Web services. A common example is a requesting service that may look for a service provider that offers processing in a particular human language (for example, German). The requesting service thus applies a policy assertion, that is, the need for German language support. The provider could also make this assertion by advertising it can offer its service in German. The WS-Policy Assertions Language offers this type of common policy expression. It defines a generic set of policy assertions for Web services... WS-Trust starts the work of defining trust relationships by defining a set of interfaces that a secure token service may provide for the issuance, exchange, and validation of security tokens... WS-SecureConversation builds on this concept of trust based on security tokens. It describes how security tokens can be used within the context of policy-defined trust relationships to allow multiple service requesters and service providers to securely exchange information over the duration of a conversation. While WS-Trust defines the behavior of overall trust relationships, WS-SecureConversation focuses on defining a security context (security token) for secure communications..." See references in the 2002-12-18 news item: "Microsoft and IBM Publish Six New Web Services Security and Policy Specifications."
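The German-language example might, under the published WS-PolicyAssertions draft, look roughly like this. This is a sketch: the namespace URI reflects the December 2002 publication, and the usage qualifier shown is an assumption:

```xml
<wsp:Policy xmlns:wsp="http://schemas.xmlsoap.org/ws/2002/12/policy">
  <!-- The provider asserts that its service is offered in German -->
  <wsp:Language wsp:Usage="wsp:Required" Language="de"/>
</wsp:Policy>
```

A requester looking for German-language support would match its own assertion against policies advertised in this form.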

  • [December 18, 2002] "Vendors Team Up On New Web Services Security Specs. Microsoft, IBM, BEA Among Authors of Proposed Standards." By Elizabeth Montalbano. In CRN (December 18, 2002). "Microsoft, IBM, BEA Systems, RSA Security, VeriSign and SAP Wednesday published a new set of advance specifications that will let companies share information securely between applications in their own systems, as well as with other enterprises, the companies said in a press release. The new specs fall into two categories, [the first being] technical issues surrounding security and business policy implementation. WS-Trust, WS-SecureConversation and WS-SecurityPolicy, co-authored by IBM, Microsoft, RSA and VeriSign, fall under the first category, according to the companies. WS-Trust is a description for managing, establishing and assessing trust relationships between parties exchanging information via Web services. WS-SecureConversation describes a framework to establish security around multiple messages between organizations. And WS-SecurityPolicy outlines general security policies of Web services. WS-Policy, WS-PolicyAttachment and WS-PolicyAssertions are proposed standards around implementing business policies for Web services, according to the companies. BEA, IBM, Microsoft and SAP co-authored these three specs... Edward Cobb, BEA's vice president of architecture and standards and a co-author of the WS-Policy spec, said that the new proposed standards show that vendors are committed to hastening Web services adoption by offering standards to solve security issues. '[The specs] promote a common industry goal to help speed the adoption of Web services by delivering secure, reliable interoperability guidelines that span platforms, applications and programming languages,' Cobb said..." Details and references in the 2002-12-18 news item: "Microsoft and IBM Publish Six New Web Services Security and Policy Specifications."

  • [December 18, 2002] "Group Forms to Study Tax-Related XML Standards." By Patricia Daukantas. In Government Computer News (December 18, 2002). "The IRS and tax officials from other countries have formed a technical committee to devise an international open standard for exchanging tax data via Extensible Markup Language. The Tax XML Technical Committee sprang from growing requests to improve methods for countries to exchange tax data electronically, said Gregory Carson, director of electronic tax administration modernization for the IRS' Wage and Investment Operating Division. Carson is serving as interim chairman of the fledgling group. The technical committee was formed under the auspices of an electronic-business standards consortium, the Organization for the Advancement of Structured Information Standards of Billerica, Mass. As interest in XML and related Web-services standards has grown, OASIS has been organizing new committees to meet the demand, including one on e-government standards. For several years IRS officials have worked with the Federation of Tax Administrators, a group representing revenue officials in the 50 states and several other jurisdictions, and with the electronic data interchange group of the American National Standards Institute, Carson said. Much of the data IRS exchanges with its foreign counterparts still gets shipped on paper or magnetic media, instead of over networks. In response to that, the IRS invited about 30 participants to a three-day meeting last spring in Williamsburg, VA., Carson said. The meeting encouraged attendees to work toward establishing a formal organization to study data-exchange standards for tax agencies. The Tax XML Technical Committee attracted about 50 participants, both in person and via teleconference, to its first meeting last week at the XML 2002 conference in Baltimore, Carson said..." 
See: (1) details in "OASIS Members Form Tax XML Technical Committee"; (2) general references in "XML Markup Languages for Tax Information."

  • [December 18, 2002] "Web Services for Remote Portals (WSRP) Whitepaper." By Thomas Schaeck (IBM; Chair, OASIS WSRP Technical Committee). 22-September-2002. 18 pages. "Integration of content and application into portals has been a task requiring significant custom programming effort. Typically, portal vendors or organizations running portals had to write special adapters to allow portals to communicate with applications and content providers to accommodate the variety of different interfaces and protocols those providers used. The OASIS Web Services for Remote Portals (WSRP) standard simplifies integration of remote applications and content into portals to the degree where portal administrators can pick from a rich choice of remote content and applications and integrate them into their portals simply with a few mouse clicks, without programming effort. As a result, WSRP becomes the means for content and application providers to provide their services to organizations running portals in a very easily consumable form. To achieve this, the WSRP standard defines pluggable, user-facing, interactive web services with a common, well-defined interface and protocol for processing user interactions and providing presentation fragments suited for mediation and aggregation by portals as well as conventions for publishing, finding and binding such services. By virtue of the common, well-defined WSRP interfaces, all web services that implement WSRP plug into all WSRP compliant portals without requiring any service specific adapters -- a single, generic adapter on the portal side is sufficient to integrate any WSRP service. WSRP standardizes web services at the presentation layer on top of the existing web services stack, builds on the existing web services standards and will leverage additional web services standards efforts, such as security efforts now underway, as they become available. The WSRP interfaces are defined in the Web Services Description Language (WSDL). 
In addition, WSRP defines metadata for self-description for publishing and finding WSRP services in registries. All WSRP services are required to implement a SOAP binding and optionally may support additional bindings. The OASIS WSRP standard will enable thousands of portals to aggregate content from tens of thousands of content and application providers offering hundreds of thousands of user-facing, pluggable web services for millions of end users/devices..." See also from 2002-12-18: "Updated IBM Web Services Toolkit Supports WSRP Specification."

  • [December 18, 2002] "The RDDL Challenge." By Micah Dubinko. From XMLhack (December 17, 2002). "On behalf of the W3C TAG [Technical Architecture Group], Tim Bray asked the XML community for examples of what documentation at the end of a namespace should look like, using RDDL (Resource Directory Description Language) as a starting point. Several interesting alternatives have been proposed. The challenge was to describe a vocabulary based on XHTML that would be able to express a few basic pieces of information: 'I'm surprised at the lack of response to date. The amount of namespace-bearing XML in the world is increasing at a very high rate, and that rate will accelerate dramatically sometime next year with the release of MS Office 11. I think it's important that we get some consensus as to what those namespaces can usefully point at, to provide some interoperability in the marketplace. So, here's my proposal. Give up on XLink because this is not an end-user-oriented browsing application, and because using "role" and "arcrole" for Nature and Purpose is a kludge. Give up on RDF because there is a poor match between RDDL's goals (dereference a URI and use the results to look up other URIs based on nature & purpose) and RDF's goals (building an inference-capable network of assertions about everything). I propose...'" See also Examples of RDDL in RDF and the list archives. General references in "Resource Directory Description Language (RDDL)."
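To make the debate concrete: a classic RDDL directory entry embeds XLink attributes in XHTML, using "role" for the Nature and "arcrole" for the Purpose of a related resource -- the very usage Bray calls a kludge. An illustrative fragment (the schema file name is invented; the role and arcrole URIs follow RDDL convention):

```xml
<rddl:resource xmlns:rddl="http://www.rddl.org/"
               xmlns:xlink="http://www.w3.org/1999/xlink"
               xlink:type="simple"
               xlink:role="http://www.w3.org/2001/XMLSchema"
               xlink:arcrole="http://www.rddl.org/purposes#schema-validation"
               xlink:href="myvocab.xsd">
  <p>An XML Schema for validating documents in this namespace.</p>
</rddl:resource>
```

A processor dereferencing the namespace URI reads entries like this to find, say, the schema to validate against, keyed by that nature/purpose pair.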

  • [December 18, 2002] "Supply Chain Management Use Case Model." Edited by Scott Anderson (Visuale, Inc.), Martin Chapman (Oracle), Marc Goodner (SAP), Paul Mackinaw (Accenture), and Rimas Rekasius (IBM). WS-I Working Group Draft. Date: 2002/11/10. 28 pages. ['This is not a final document. This is an interim draft published for early review and comment.'] "This document presents a high level definition of a Supply Chain Management (SCM) application in the form of a set of Use Cases. The application being modeled is that of a Retailer offering Consumer electronic goods to Consumers; a typical B2C model. To fulfill orders the Retailer has to manage stock levels in warehouses. When an item in stock falls below a certain threshold, the Retailer must restock the item from the relevant Manufacturer's inventory (a typical B2B model). In order to fulfill a Retailer's request a Manufacturer may have to execute a production run to build the finished goods. In the real world, a Manufacturer would have to order the component parts from its suppliers. For simplicity in this application, we assume this is a manual process which is supported through the use of fax. Each use case includes a logging call to a monitoring system in order to monitor the activities of the services from a single monitoring service. The primary goal of the application is to demonstrate all of the scenarios in the WS-I Basic Profile..." See: "Web Services Interoperability Organization (WS-I)." [cache]
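The Retailer's restocking rule at the heart of these use cases can be sketched as follows. This is a hypothetical illustration: the threshold and quantities are invented, and the returned replenishment quantity stands in for the B2B Web service exchange with the Manufacturer:

```python
# Sketch of the WS-I SCM restocking rule: ship the Consumer's order (B2C),
# then restock from the Manufacturer (B2B) if stock falls below threshold.
def fulfill(order_qty, stock, threshold, restock_to):
    """Return (new stock level, quantity ordered from the Manufacturer)."""
    if order_qty > stock:
        return stock, 0                      # cannot fulfill; nothing ships
    stock -= order_qty                       # ship to the Consumer
    replenishment = 0
    if stock < threshold:                    # trigger the B2B restock
        replenishment = restock_to - stock   # quantity requested upstream
        stock = restock_to
    return stock, replenishment

print(fulfill(order_qty=8, stock=10, threshold=5, restock_to=20))
```

In the application itself each of these steps would also emit a logging call to the shared monitoring service, as the use cases require.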

  • [December 18, 2002] "Publish and Find UDDI tModels with JAXR and WSDL." By Frank Sommers. In JavaWorld (December 13, 2002). ['This article presents a programming model for publishing and discovering Web services based on service interfaces. It starts by defining reusable WSDL (Web Services Description Language) interface documents and shows how to register those interfaces as UDDI (Universal Description, Discovery, and Integration) tModels using the Java API for XML Registries (JAXR). Then the article focuses on how Web service clients use well-known tModels to discover and invoke services that adhere to a set of interfaces.'] "... The pattern of finding all implementations of a Web service interface and possibly invoking those service instances proves useful in other contexts as well. Portal Websites still rely on manual -- or semi-manual -- compilation of news articles, automobile inventories, available hotel rooms, or airline seats. Even when data is exchanged electronically, that automation often comes at the expense of lengthy and pricey system integration. Among the biggest motivations for building Web services is the desire to automate those tedious information-gathering tasks. This article provides a working example of how UDDI (Universal Description, Discovery, and Integration) registries, the Java API for XML Registries (JAXR), and the Web Services Description Language (WSDL) work together to initiate that automation. Currently, several industry groups are working to define Web service interface standards. Examples are the Open Travel Alliance for travel, the Star Consortium for automotive retail, and RosettaNet for supply chain management. Many of those groups employ a community-oriented process and make their specifications available to anyone for comments. Real-world interface specifications aim to be comprehensive and are rather complex. 
Thus, to make this article easy to follow, I use a simple interface definition for the example cruise ship destination Web service. That interface features only a single method. When invoked, that method produces a list of destinations a cruise company serves..." See: (1) "Web Services Description Language (WSDL)"; (2) "Java API for XML Registries (JAXR)."
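The key ingredient of the pattern is the reusable WSDL portType that the tModel points at: registering it means extracting the interface's name and operations from the WSDL document. A minimal sketch of that extraction step using Python's standard library (the interface and operation names echo the article's cruise example but are invented here, and the WSDL is stripped to its interface skeleton):

```python
import xml.etree.ElementTree as ET

WSDL_NS = "http://schemas.xmlsoap.org/wsdl/"

# A stripped-down, interface-only WSDL document, like the reusable
# documents the article registers as UDDI tModels (names illustrative).
wsdl = """
<definitions xmlns="http://schemas.xmlsoap.org/wsdl/" name="CruiseInterface">
  <portType name="DestinationPort">
    <operation name="getDestinations"/>
  </portType>
</definitions>
"""

root = ET.fromstring(wsdl)
for pt in root.findall(f"{{{WSDL_NS}}}portType"):
    # The portType name and operation list are what a tModel registration
    # (via JAXR in the article) would record for later discovery.
    ops = [op.get("name") for op in pt.findall(f"{{{WSDL_NS}}}operation")]
    print(pt.get("name"), ops)
```

Clients then query the registry for all services bound to that well-known tModel and invoke whichever implementations they find.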

  • [December 18, 2002] "The XML Papers: Lessons on Applying Topic Maps." By Steve Pepper and Lars Marius Garshol (Ontopia). Presentation given at the IDEAlliance XML 2002 Conference, Baltimore, December 2002. "This paper describes some of the basic steps in applying topic maps in a real world application, a topic map-driven web portal of conference papers. It covers the tasks of collecting and examining source data, defining an ontology and populating a topic map. It discusses tools for automatically creating topic maps, with particular emphasis on how the synergies between topic maps and RDF can be exploited in the process of autogenerating topic maps from structured and semi-structured source data. It also provides an introduction to the concept of published subjects, describes how they are being applied in this project and details the benefits that they are expected to bring in both this and other projects... We will briefly describe the goals of the XML Papers project and then concentrate on the work that has actually been performed to date, paying special attention to methodologies, technologies and the lessons we have learned. The project is not yet complete, although a substantial topic map and application already exists that covers a dozen or so conferences. That application will be demonstrated during the presentation and the conference exhibition. The idea of producing a 'next generation' topic map of not just one GCA conference, but all of them was conceived by the present authors and embraced by IDEAlliance (the GCA's successor). The goal of the project is to collate (as much as possible of) a decade's papers on XML and related technologies, index them using topic maps, and make them accessible through a topic map-driven web portal. A secondary goal is to provide input to the XMLvoc technical committee working on defining published subjects for the domain of XML. 
The XMLvoc TC [Vocabulary for XML Standards and Technologies TC] is one of several committees working under the auspices of OASIS in the area of published subjects, which are described later..." General references: "(XML) Topic Maps."
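The autogeneration step the authors describe -- turning structured records about conference papers into a topic map -- can be sketched in miniature. The paper identifiers and titles below are invented; the output element structure follows the XTM 1.0 DTD:

```python
import xml.etree.ElementTree as ET

XTM_NS = "http://www.topicmaps.org/xtm/1.0/"

# Structured source records, as might be harvested from proceedings
papers = [("p1", "The XML Papers"), ("p2", "The RDDL Challenge")]

# Autogenerate an XTM 1.0 topic map: one topic per paper record
tm = ET.Element(f"{{{XTM_NS}}}topicMap")
for pid, title in papers:
    topic = ET.SubElement(tm, f"{{{XTM_NS}}}topic", id=pid)
    basename = ET.SubElement(topic, f"{{{XTM_NS}}}baseName")
    ET.SubElement(basename, f"{{{XTM_NS}}}baseNameString").text = title

ET.register_namespace("", XTM_NS)
print(ET.tostring(tm, encoding="unicode"))
```

Because XTM processors merge topic maps, a fragment generated this way can later be merged with hand-authored ontology and published-subject information, which is the workflow the paper describes.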

  • [December 18, 2002] "The Economics of the Topic Maps Reference Model." By Steven R. Newcomb (Coolheads Consulting). Presentation given at the IDEAlliance XML 2002 Conference, Baltimore, December 2002. "The draft ISO standard Reference Model of the Topic Maps paradigm meets technical requirements that are strikingly parallel to the same economic principles that are evidently most conducive to worldwide economic growth... The Topic Maps Reference Model is basically a distinction -- and a well-defined boundary -- between two things: (1) The minimum set of structural features that knowledge must always be regarded as having, if we are to have a convenient way of aggregating any kinds of independently-maintained knowledge with any other kinds of independently-maintained knowledge, in a lossless, predictable, useful, and affordable fashion. (2) All the other features that knowledge can have, including those that may need to be constrained in diverse ways within diverse contexts. The Topic Maps Reference Model takes the position that the minimum set of structural features that must be common to all knowledge, in order to allow all kinds of knowledge to be aggregated, are a set of constraints on the structure of semantic networks. The kind of semantic network that is defined by the Reference Model is called a 'topic map graph'. In a topic map graph, every node is a surrogate for exactly one subject (as in 'subject of conversation'), and no two nodes are surrogates for the same subject. All nodes are connected to each other by nondirectional arcs. The Reference Model provides exactly four kinds of arcs, each of which is used in the same very specific way in each 'assertion'. In the Reference Model, assertions are the primary units of knowledge. Every assertion is a set of specific nodes interconnected in specific ways by specific kinds of arcs. 
Assertions represent relationships between subjects, and in each assertion, each related subject plays a specific role (called a 'role type'), which is itself a subject. Each assertion can itself be an instance of an 'assertion type', which is also a subject..." See: (1) "ISO SC34 Publishes Draft Reference Model for Topic Maps (RM4TM)."; (2) general references: "(XML) Topic Maps."
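The graph structure Newcomb describes -- one node per subject, assertions relating subjects through roles that are themselves subjects -- can be modeled minimally as follows. This is a hypothetical data-structure sketch to convey the idea, not the normative RM4TM node-and-arc taxonomy:

```python
# Minimal sketch in the spirit of the Topic Maps Reference Model:
# every node is a surrogate for exactly one subject, no two nodes share
# a subject, and assertions relate subjects through typed roles.
class TopicMapGraph:
    def __init__(self):
        self.nodes = {}        # subject -> its single surrogate node
        self.assertions = []   # (assertion type, {role type: role player})

    def node_for(self, subject):
        # Reuse the existing surrogate, so no two nodes share a subject
        return self.nodes.setdefault(subject, object())

    def assert_(self, assertion_type, roles):
        # Every participant, including role types and the assertion
        # type itself, is a subject with its own surrogate node.
        for subj in (assertion_type, *roles.keys(), *roles.values()):
            self.node_for(subj)
        self.assertions.append((assertion_type, roles))

g = TopicMapGraph()
g.assert_("employment", {"employer": "Ontopia", "employee": "Steve Pepper"})
print(len(g.nodes))  # five subjects: the type, two role types, two players
```

Aggregating two such graphs then reduces to merging nodes that are surrogates for the same subject, which is the "lossless, predictable" aggregation the Reference Model is after.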

  • [December 18, 2002] "Topic Map Authoring With Reusable Ontologies and Automated Knowledge Mining." By Joshua Reynolds and W. Eliot Kimber (ISOGEN International, LLC). Presentation given at the IDEAlliance XML 2002 Conference, Baltimore, December 2002. "Topic Maps and their supporting infrastructure are quickly achieving the level of maturity needed to make them useful as part of the basic information management toolkit. With increasing vendor support, standardization activities, and interest in the field of Knowledge Representation and Interchange, it is clear that Topic Maps are here to stay. Unfortunately all of this progress and interest in no way eases the formidable task of authoring Topic Maps. Our experience indicates that XSLT works well for Topic Map generation over sets of XML resources. Markup, through its design and implementation, frequently captures a good deal of semantic information, making it a perfect candidate for knowledge extraction. There are essentially two ways of extracting that knowledge into a Topic Map when those marked-up resources conform to a known schema (DTD, RELAX-NG, XSD, or even just an in-house convention). The first is hand authoring. This involves reading the document and using human reasoning to interpret the markup and its content, then creating the Topic Map from this information. The second is to use the schema itself. By applying knowledge extraction techniques to the schema, we can use the same logic across an arbitrarily large set of conforming documents. As markup is easily machine processed, incorporating this reasoning in some sort of algorithmic form is clearly desirable. Going from markup (XML) to markup (XTM) makes XSLT the prime candidate for expressing this algorithm. Topic Map merging enables these generated XTMs to be combined with topical information that can't be extracted using a style-sheet. 
Although the former allows for more precision, the latter implies far less cost, both in terms of initial effort, as well as maintenance (only the style-sheet must be authored/maintained). This paper provides a case study used to illustrate how to ease the task of Topic Map creation through a multi-stage modularized process. The first stage is hand authoring a relatively invariant 'ontology' Topic Map. This consists of defining the ontology of types and associations that capture the data model for a particular subject domain. The assumption is that this ontology would be relatively stable over time, and a good candidate for reuse. The second is generating additional Topic Maps through an algorithmic process (XSLT) applied to XML document instances. The third is hand authoring those things not captured in the first two stages. This consists of the capture of information not directly discernible from the markup, or stored in non-XML resources. The resultant Topic Maps are merged giving a Topic Map that can be as rich as if completely hand authored. We present the source documents and code (stylesheets) used in an exploratory implementation of this approach, and lay out a more generalized approach to using this methodology. We finish by identifying possible issues with this approach, as well as enumerating alternatives, and stating the conclusions we were able to infer from our exploration..." General references: "(XML) Topic Maps."
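The second, stylesheet-driven stage might look roughly like this for a document type whose `<paper>` elements carry an `id` attribute and a `<title>` child. This is a hypothetical sketch of the technique, not a stylesheet from the ISOGEN case study:

```xml
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
    xmlns="http://www.topicmaps.org/xtm/1.0/">
  <!-- Knowledge extraction: each <paper> in the source becomes a topic -->
  <xsl:template match="/proceedings">
    <topicMap>
      <xsl:for-each select="paper">
        <topic id="{@id}">
          <baseName>
            <baseNameString><xsl:value-of select="title"/></baseNameString>
          </baseName>
        </topic>
      </xsl:for-each>
    </topicMap>
  </xsl:template>
</xsl:stylesheet>
```

Because the mapping lives in the stylesheet rather than in any one document, the same extraction logic applies across every instance conforming to the schema, which is the maintenance advantage the paper claims.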

  • [December 18, 2002] "Using DAML+OIL as a Constraint Language for Topic Maps." By Eric Freese (LexisNexis). Presentation given at the IDEAlliance XML 2002 Conference, Baltimore, December 2002. "Over the past year or so, the World Wide Web Consortium (W3C) and DARPA have been working to create a framework to model information ontologies contained on the Web. The result of that work is known as DARPA Agent Markup Language + Ontology Interface Layer (DAML+OIL). DAML+OIL provides a rich set of constructs, using RDF, to create ontologies and mark up information to be machine readable and processable. Some of the constructs are much more powerful than what is currently enabled by the topic map model. These include: (1) The ability to define not only subclass-superclass relationships but also disjoint relationships; (2) The ability to place restrictions on when specific relationships are applicable; (3) The ability to apply cardinality to relationships. The topic map standard provides several features that RDF cannot match, especially in its association model and the ability to define scopes for information (even though there is still discussion about how scope should really work). While certain inferences about how objects are related can be derived from a topic map, DAML+OIL has extended RDF to do things topic maps can't. The constructs mentioned above would be very useful in topic maps to allow intelligent inferences about objects (whether or not they are topics or resources) and to accurately build a knowledge base from a set of information, be it a small corporate document repository or the entire Web. This paper will demonstrate how DAML+OIL can be used to provide additional capabilities that are currently missing from the topic map model. It will discuss possible additions to the topic map model or its companion standard, Topic Map Constraint Language (TMCL) [REQs], to enable DAML+OIL to process and enhance topic maps. 
It will also discuss methods for using DAML+OIL in conjunction with topic maps to take advantage of the best from both worlds..." General references: "(XML) Topic Maps."
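The constructs in question look like this in DAML+OIL's RDF syntax. The fragment is illustrative only: the class and property names are invented for a topic-map-like domain, and only the namespaces are taken from the published DAML+OIL release:

```xml
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
         xmlns:rdfs="http://www.w3.org/2000/01/rdf-schema#"
         xmlns:daml="http://www.daml.org/2001/03/daml+oil#">
  <!-- (1) Disjointness: nothing is both a Topic and an Association -->
  <daml:Class rdf:ID="Topic">
    <daml:disjointWith rdf:resource="#Association"/>
  </daml:Class>
  <daml:Class rdf:ID="Association"/>
  <!-- (2) and (3) Restriction with cardinality: every Occurrence
       points at exactly one resource -->
  <daml:Class rdf:ID="Occurrence">
    <rdfs:subClassOf>
      <daml:Restriction daml:cardinality="1">
        <daml:onProperty rdf:resource="#resourceRef"/>
      </daml:Restriction>
    </rdfs:subClassOf>
  </daml:Class>
</rdf:RDF>
```

None of these constraints can be stated in the XTM 1.0 model itself, which is the gap the paper proposes DAML+OIL (or TMCL) should fill.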

  • [December 18, 2002] "Topic Maps: Backbone of Content Intelligence." By Jean Delahousse (Mondeca). Presentation given at the IDEAlliance XML 2002 Conference, Baltimore, December 2002. "Industrial enterprises, administrations and editors have a long history of organizing information so as to facilitate access and exchange. Developed for the world of print publication, these earlier solutions need to be incorporated now into the newer ones that are specifically adapted for digital content. The term 'Content Intelligence' encompasses the different tools and methods that would allow companies to capitalize on their existing skills and resources even while offering fresh solutions for new stakes... Topic Maps tools are ideally suited for providing the infrastructure to content intelligence solutions because of: (1) the separation between subject management, subject organization and contents; (2) the ability to implement decentralized interoperable solutions; (3) the ability to reuse and exploit the existing information organization within the enterprises. These strengths explain the proliferation of operational projects based on the Topic Maps standard and tools within various industries, the publishing world and in administration... 
Several elements are still missing in the Topic Maps standard and its derivative software for the solutions implemented to be complete and interoperable: (1) the finalization of the PSI standard; (2) a standardization, within the Topic Maps framework, of the most commonly used models of knowledge organization, so as to ensure interoperability of the solutions implemented; (3) the definition of generic APIs, Web services and standards that would together facilitate the emergence of a dialogue between the different software components of a Content Intelligence solution. But, as of today, these different challenges have already been identified by the different actors and are progressively finding solutions within the framework of standardization committees, operational projects for clients and industrial collaborations among the different providers of technical solutions..." General references: "(XML) Topic Maps."

  • [December 18, 2002] "Articulating Conceptual Spaces Using the Topic Map Standard." By Helka Folch and Benoît Habert (Human-Machine Communication Department - LIMSI CNRS). Presentation given at the IDEAlliance XML 2002 Conference, Baltimore, December 2002. "Topic Maps (ISO 13250) is a powerful standard for the semantic annotation of document collections... However, the construction of a topic map can be very costly and can quickly become a bottleneck in any large-scale application if recourse is not made to automatic methods. Apart from the initial cost of defining the topic map, problems of maintenance and coherence may arise when the topic map is applied to renewable information sources, as manual construction and maintenance cannot keep pace with any significant amount of incoming documents. A manual approach to topic map construction is appropriate if the conceptual model is stable and is linked to a circumscribed collection of resources, but is ill suited to manage dynamic, loosely-structured information sources. The volume of data channeled through the Internet and large intranets today requires not only indexing new resources 'on the fly' but re-structuring the semantic model as new concepts and varying points of view spring up. One possible approach to topic map construction involves recycling of structured or semi-structured data and exploiting pre-existing knowledge sources for semantic indexing. This implies that a semantic model is available in advance, to which information instances are attached in a top-down manner. However, the large majority of electronic resources available today are unstructured. This has generated a need to extract semantic information and build topic maps from unrestricted text for which the vocabulary and conceptual models are not known in advance. 
In contrast to a top-down approach, in this latter data-driven approach a document collection is not a passive repository of topic instances but rather a tool for discovery from which semantic categories are made to emerge through inductive methods. This approach has been applied within the context of 'monitor corpora' such as the electronic version of the Wall Street Journal. The periodic analysis of these corpora is aimed at monitoring a given domain with a view to detecting emerging topics, for a technology watch task, for instance. The work we present in this paper describes an inductive method for building topic maps from unrestricted text. Our approach does not use pre-existing knowledge sources, but rather exploits regularities of word pattern distribution within a collection of documents..." General references: "(XML) Topic Maps."

  • [December 17, 2002] "W3C Finalizes Disability Guidelines." By Paul Festa. In CNET (December 17, 2002). "Bringing a five-year project to a significant milestone, the World Wide Web Consortium finalized guidelines for building browsers and media players that work better for people with disabilities. The W3C's recommendation of its User Agent Accessibility Guidelines (UAAG) 1.0 brings to completion the third guideline document under development by the group's Web Accessibility Initiative (WAI). Two other sets of guidelines, already finalized, deal with the creation of authoring tools, recommended in February 2000, and Web pages, recommended in March 1999... And now the WAI is thinking about tacking on a fourth set of guidelines to the existing trio. A working group is drafting XML Accessibility Guidelines, which will suggest how to make XML applications more accessible. That work could wind up as its own document, or be integrated with existing guidelines, according to the WAI. The user agent guidelines released Tuesday urge designers to make a number of accommodations for disabled users. For example, the guidelines suggest that designers make commands executable through the keyboard, as well as the mouse. The guidelines ask that designers make their applications work smoothly with so-called assistive technologies, like screen readers or refreshable Braille output..." See details in the 2002-12-17 news item "W3C Publishes User Agent Accessibility Guidelines 1.0 as a Recommendation."

  • [December 17, 2002] "Standardizing VoiceXML Generation Tools." By David L. Thomson. In VoiceXML Review (December 2002). "An area where we have an opportunity to make VoiceXML easier to use and more portable is in development and runtime tools. VoiceXML provides two significant advantages in authoring speech-enabled applications, when compared to previous methods. It allows a developer to build speech services with less effort and it allows applications written for one speech platform to run on another speech platform. These advantages are diminished, however, if software tools used to create and support VoiceXML code are inadequate or incompatible. The VoiceXML Tools Committee, under the direction of the VoiceXML Forum, has been working on methods for improving the quality and uniformity of tools as described below. To define a process for improvement, we must first outline an architecture that illustrates how tools are connected. Companies currently building tools include application developers, speech server suppliers, speech engine vendors, speech hosting service bureaus, stand-alone tool developers, and customers... Development tools and runtime software on the VoiceXML page server must use the same meta language. Since the meta language is generally unique to a given tool vendor, runtime software on the VoiceXML page server will only work with development tools from the same vendor... the VoiceXML Tools Committee is studying ways to standardize the meta language. Vendors would then use the standard meta language to represent parameters of the call flow, even if vendor tools otherwise provide different features. Two proposals under consideration are: (1) the XForms standard under development by the W3C and (2) an XML-based standard where style sheets convert between formats used by different vendors. This rather ambitious goal will, if successful, improve the interoperability of development and runtime tools and make applications portable across vendors... 
Tools for developing VoiceXML-based speech applications are a critical factor in making VoiceXML easy to use. While VoiceXML itself may be well-defined, industry software for generating VoiceXML code lacks uniformity. We have launched an effort to define two standards that will help VoiceXML systems interoperate across different vendors. The effort will define how applications are represented and how runtime data is transported and stored. We hope that this effort will foster the creation of better tools and make developing VoiceXML services faster and easier..." See "VoiceXML Forum."
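Whatever meta language the committee settles on, the artifact the tools ultimately emit is an ordinary VoiceXML document. For orientation, here is a minimal VoiceXML 1.0-style form (the field name, prompt text, and submission URL are invented for illustration):

```xml
<?xml version="1.0"?>
<vxml version="1.0">
  <form id="greeting">
    <field name="city">
      <prompt>Which city are you calling about?</prompt>
      <!-- a real service would reference a recognition grammar here -->
    </field>
    <block>
      <!-- submit the recognized value back to the origin server -->
      <submit next="http://example.org/weather" namelist="city"/>
    </block>
  </form>
</vxml>
```

The portability problem the article describes sits one level above this: two vendors' tools can both generate documents like this one, yet store the call flow that produced it in incompatible internal representations.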

  • [December 17, 2002] "Enhancing VoiceXML Application Performance By Caching." By Dave Burke. In VoiceXML Review (December 2002). "The VoiceXML architectural model specifies a partitioning of application hosting, and application rendering. Specifically, the application is served from a Web Server and is typically created dynamically within the framework of an Application Server or equivalent. The VoiceXML Interpreter renders the resultant VoiceXML document, transmitted across a network by HTTP, into a series of instructions interpreted by the Implementation Platform. Implied in this model is a geographical distribution of the application hosting environment and the VoiceXML platform and thus the incursion of network latencies. An application might make many subsequent requests for new VoiceXML documents during its lifetime and thus these latencies may have considerable adverse effects on performance. In this article we will discuss how caching can be used to enhance the performance of VoiceXML applications. Caching is a strategy for storing temporary 'objects' (e.g., VoiceXML resources) local to the VoiceXML Interpreter that can be employed by the application developer for optimising these latencies. In what follows we will use the phrase 'origin server' to denote the application hosting environment, and 'user agent' to refer to the VoiceXML Interpreter and Implementation Platform... HTTP caching provides a powerful mechanism for improving performance of applications. A performant VoiceXML application that yields customer satisfaction will promote customer retention and also save money on deployment costs. Caching is often poorly understood and under-utilised on the Internet, yet can be effectively harnessed by observing some simple practices as outlined in this article..." See "VoiceXML Forum."
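The freshness logic at the heart of the HTTP caching strategy the article describes can be sketched in a few lines. The class and method names below are our own, not from the article; this models only the Cache-Control max-age check an interpreter-side cache would perform before refetching a document from the origin server:

```python
import time


class DocumentCache:
    """Minimal sketch of an interpreter-side cache keyed by URL.

    Each entry records the document body, when it was fetched, and the
    freshness lifetime (seconds) declared via Cache-Control: max-age.
    """

    def __init__(self):
        self._entries = {}  # url -> (body, fetched_at, max_age)

    def store(self, url, body, max_age, fetched_at=None):
        if fetched_at is None:
            fetched_at = time.time()
        self._entries[url] = (body, fetched_at, max_age)

    def get(self, url, now=None):
        """Return the cached body if still fresh, else None (refetch needed)."""
        if now is None:
            now = time.time()
        entry = self._entries.get(url)
        if entry is None:
            return None
        body, fetched_at, max_age = entry
        current_age = now - fetched_at
        return body if current_age < max_age else None
```

A real VoiceXML interpreter would also honour Expires headers, no-cache directives, and conditional revalidation (If-Modified-Since), as the article discusses; the sketch captures only the basic age-versus-lifetime comparison.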

  • [December 17, 2002] "OASIS Seeks Web Services Standards for Language Translation." By Paul Krill. In InfoWorld (December 17, 2002). "The OASIS (Organization for the Advancement of Structured Information Standards) consortium on Tuesday announced that members have formed a technical committee to develop standards for Web services-based language translation services. The OASIS Translation Web Services Technical Committee will develop a Web service definition language for interactions between publishers of content who need it translated into different languages and the translation vendors, said Patrick Gannon, president and CEO of OASIS, in Billerica, Mass. Participating in the effort are DataPower, IBM, the Localisation Research Centre, Microsoft, Oracle, SAP, and others in a collaboration to use Web services for a workflow linking tasks that comprise a complex software localization process, according to OASIS. 'What [the committee wants] is a standard way of interfacing between those who are requesting the service and those who are providing the services,' Gannon said. The mechanism for the translation will be a set of standard WSDL packets, he said. Gannon cited software companies as potential beneficiaries, with these companies requiring documentation be translated into multiple languages. The translation is part of an effort in which various industries each need to agree on a standard set of Web services definitions, Gannon said..." See: (1) the 2002-12-17 press release: "OASIS Members to Develop Web Services Standard for Translation. DataPower, IBM, Microsoft, Oracle, SAP, and Others Collaborate on Localization Specification."; (2) the news item of 2002-11-26: "New OASIS Translation Web Services Technical Committee."

  • [December 17, 2002] "XBRL: Still A Ways Away From Saving the Day." By Eileen Colkin Cuneo. In InformationWeek (December 17, 2002). "...But the standard is seen as an important part of restoring consumer and investor confidence in Big Business... Sixty percent of senior execs at financial institutions believe that trust in their industry has been eroded by the corporate scandals of the past year, according to surveys and interviews by PricewaterhouseCoopers and The Economist Intelligence Unit, an Economist Group business. To help cure the problem, the two organizations have presented a five-step recovery plan for financial institutions suffering from an erosion of public trust, including a strong endorsement for the Extensible Business Reporting Language, or XBRL. The financial-services vertical market, which stands to lose the most from accounting scandals, needs to move forward on Internet-based reporting standards, the report says. Pushing the use of XBRL, a markup language that uses standard data tags to identify specific information in financial reports, would help the financial-services industry in several ways. The language would make it easier for investors and regulators to navigate financial statements and more difficult for executives to hide financial information in footnotes, thus increasing the integrity of the reports. The standard also could help move the financial-reporting process toward real-time disclosure. Already companies such as Microsoft and Morgan Stanley are using the language for their financial statements... XBRL needs widespread adoption. That isn't immediately likely, as executives still know little about the language. According to the survey, only 42% of financial executives believe XBRL will make reports more useful, while 47% say they don't know what role XBRL could play..." See: "Extensible Business Reporting Language (XBRL)."

  • [December 17, 2002] "Creative Types: A Lot in Common." By Kendra Mayfield. In Wired News (December 16, 2002). "Roger McGuinn, founder of legendary folk-rock band The Byrds, has made over 25 albums in his recording career. But besides modest advances, he's never made money on record royalties. When McGuinn decided to record new versions of traditional folk songs, he made these songs available as MP3s for free download on his website. He's since made 'thousands of dollars' from the sale of these recordings. Like McGuinn, many artists are turning to the Web to maximize exposure, yet retain some control over their work. McGuinn is just one of the artists who will publish works under a new set of licenses that offer an alternative to conventional copyright. On Monday, Creative Commons will release its collection of free, machine-readable licenses. The idea is to give copyright holders another way to get the word out that their works are free for copying and other uses under specific conditions... While industry organizations like the Recording Industry Association of America try to curtail the distribution of copyrighted works, Creative Commons' licenses will complement existing efforts to make online sharing and collaboration easier. Open-source movements like the Free Software Foundation's General Public License inspired the Creative Commons' model... Creators can decide whether they want to release their work into the public domain or license it with one of Creative Commons' custom licenses. Authors can use Creative Commons' site to dedicate their works to the public domain without restriction, declaring 'no rights reserved,' Brown said. Alternatively, they can select among four different licensing tools, or they can mix and match preferences. Each license grants the public the right to copy and redistribute a work freely, but under certain conditions..." 
See technical details on the use of RDF/Dublin Core in the 2002-12-16 news item: "Creative Commons Project Offers RDF-Based Licenses for Rights Expression."
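The "machine-readable" part of each license is a small RDF/XML description that pairs a work with the permissions and requirements of its chosen license. A sketch of the idea (the work URL is invented; the cc: namespace and property names follow the vocabulary Creative Commons published at launch, so treat the details as illustrative):

```xml
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
         xmlns:cc="http://web.resource.org/cc/">
  <!-- the licensed work, pointing at the license it is offered under -->
  <cc:Work rdf:about="http://example.org/song.mp3">
    <cc:license rdf:resource="http://creativecommons.org/licenses/by-nc/1.0/"/>
  </cc:Work>
  <!-- the license terms themselves, stated as machine-readable properties -->
  <cc:License rdf:about="http://creativecommons.org/licenses/by-nc/1.0/">
    <cc:permits rdf:resource="http://web.resource.org/cc/Reproduction"/>
    <cc:permits rdf:resource="http://web.resource.org/cc/Distribution"/>
    <cc:requires rdf:resource="http://web.resource.org/cc/Attribution"/>
    <cc:prohibits rdf:resource="http://web.resource.org/cc/CommercialUse"/>
  </cc:License>
</rdf:RDF>
```

Because the terms are expressed as RDF properties rather than prose, software (a search engine, for example) can filter works by what their licenses permit.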

  • [December 17, 2002] "Expressing Simple Dublin Core in RDF/XML." Dublin Core Metadata Initiative Recommendation, Approved 2002-10-25. Edited by Dave Beckett (Institute for Learning and Research Technology - ILRT, University of Bristol), Eric Miller (W3C), and Dan Brickley (W3C/ILRT). Draft issued 2002-07-31. In October 2002 the DCMI Directorate announced that the document Expressing Simple Dublin Core in RDF/XML had been approved as a DCMI Recommendation. "This is the first in a series of recommendations for encoding Dublin Core metadata using mainstream Web technologies. Other guidelines that are in process at this moment are guidelines for the expression of Qualified Dublin Core in RDF/XML, guidelines for implementing Dublin Core in XML and an updated version of the guidelines for expressing qualified Dublin Core in HTML/XHTML meta elements." ['This document explains how to encode the DCMES in RDF/XML, provides a DTD to validate the documents and describes a method to link them from web pages.'] From the Introduction: "The Dublin Core Metadata Element Set V1.1 (DCMES) can be represented in many syntax formats. This document gives an encoding for the DCMES in XML using simple RDF, provides a DTD and W3C XML Schemas to validate the documents and describes a method to link them from web pages. This document describes an encoding for the DCMES in XML subject to these restrictions: (1) The Dublin Core elements described in the DCMES V1.1 reference can be used; (2) No other elements can be used; (3) No element qualifiers can be used; (4) The resulting RDF/XML cannot be embedded in web pages. The primary goal for this document is to provide a simple encoding, where there are no extra elements, qualifiers, optional or varying parts allowed. This allows the resulting data to be validated against a DTD and guaranteed usable by XML parsers. 
A secondary goal was to make the encoding also be valid RDF which allows the document to be manipulated using the RDF model. We have tried to limit the RDF constructs to the minimum, and the result is a standard header and footer for every document..." See: "Dublin Core Metadata Initiative (DCMI)."
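Under those restrictions a record reduces to the "standard header and footer" wrapped around plain DCMES elements. A minimal example in that spirit (the resource URL and element values are invented):

```xml
<?xml version="1.0"?>
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
         xmlns:dc="http://purl.org/dc/elements/1.1/">
  <!-- one rdf:Description per resource; only unqualified DC elements inside -->
  <rdf:Description rdf:about="http://example.org/report">
    <dc:title>Quarterly Report</dc:title>
    <dc:creator>Jane Example</dc:creator>
    <dc:date>2002-12-17</dc:date>
    <dc:language>en</dc:language>
  </rdf:Description>
</rdf:RDF>
```

Because nothing optional or varying is allowed, a document like this can be checked against the recommendation's DTD while remaining valid RDF, which is exactly the dual goal the editors describe.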

  • [December 17, 2002] "'Office 11' for Developers: Build Solutions with XML, Smart Tags, and Visual Studio." From Microsoft 'Office XP for Developers'. December 09, 2002 or later. Referenced in XML-DEV 2002-12-17 by Dare Obasanjo. "'Office 11,' the code name for the next release of Microsoft Office, helps enable developers to create intelligent business solutions that address today's demanding business requirements while giving information workers a powerful user interface. Using the support for customer-defined Extensible Markup Language (XML) schemas and XML Web services in Office 11, developers can more easily build documents and applications that connect with business processes and data. In addition, new tools in Office 11 help you build managed code for the Microsoft .NET Framework and take advantage of the ease and security of deploying solutions from a server. This page previews some of the Office 11 technologies and tools that will help you create rich, custom solutions..." (1) Word: "Rich XML programmability in Office 11, including support for Extensible Stylesheet Language (XSL) and XPath, allows developers to build solutions that capture and reuse document content across applications, processes, devices, and platforms. XML support enables Word 11 to function as a smart client for XML Web services and a host for smart document solutions..." (2) Excel: "Spreadsheets in 'Excel 11,' the code name for the next version of Microsoft Excel, can be designed with an underlying XML structure, including support for virtually any industry-standard or customer-defined XML schema. By defining schemas, businesses can implement more flexible connections between the desktop and data stored on servers to help better meet the needs of diverse sets of users..." (3) Visual Studio Tools for Office: "The new Visual Studio Tools for Office will enable developers to build a new generation of business solutions on Word 11 and Excel 11..." 
(4) Smart Documents: "Office 11 smart document technology will enable developers to create solutions that give users more useful, contextual, and customized content in the Office task pane. Developers can create these document-based solutions by using underlying XML schemas -- which define the structure of a Word 11 or Excel 11 document -- and a custom DLL... Smart documents can also automatically update themselves from a trusted server location. Developers never have to install or manage the code directly on the computer..." (5) Smart Tags, Version 2: "Office 11 improves on the smart tag feature introduced in Office XP and enables developers to create more flexible, powerful smart tag solutions... Smart tag functionality has also been improved to include the capability to execute actions immediately on recognition, without requiring user intervention. For example, a custom smart tag could recognize a product name and automatically start an action that turns the text into a link to a related document or Web page..." Dare Obasanjo's note: "For those of you who couldn't be at XML 2002, you can now check out what all the buzz about Office 11's XML support is about first hand. The link below provides screenshots of Excel 11 and Word 11, white papers describing the XML features of Office 11 for developers and business decision makers, and a preview of Visual Studio Tools for Office." See: "Microsoft Office 11 and XDocs."

  • [December 16, 2002] "Web Services Security Core Specification." Produced by the OASIS Web Services Security TC (WSS). Working Draft version 08. 12-December-2002. Document identifier: WSS-Core-08. 56 pages. Edited by Phillip Hallam-Baker (VeriSign), Chris Kaler (Microsoft), Ronald Monzillo (Sun), and Anthony Nadalin (IBM). ['An updated draft of WSS-Core to address action items that occurred in the F2F as of 12/12/2002, posted to the TC mailing list by Anthony Nadalin 2002-12-16.'] "This specification proposes a standard set of SOAP extensions that can be used when building secure Web services to implement message level integrity and confidentiality. This specification refers to this set of extensions as the 'Web Services Security Core Language' or 'WSS-Core'. This specification is flexible and is designed to be used as the basis for securing Web services within a wide variety of security models including PKI, Kerberos, and SSL. Specifically, this specification provides support for multiple security token formats, multiple trust domains, multiple signature formats, and multiple encryption technologies. The token formats and semantics for using these are defined in the associated binding documents. This specification provides three main mechanisms: ability to send security token as part of a message, message integrity, and message confidentiality. These mechanisms by themselves do not provide a complete security solution for Web services. Instead, this specification is a building block that can be used in conjunction with other Web service extensions and higher-level application-specific protocols to accommodate a wide variety of security models and security technologies. These mechanisms can be used independently (e.g., to pass a security token) or in a tightly coupled manner (e.g., signing and encrypting a message and providing a security token path associated with the keys used for signing and encryption)..." See: "Web Services Security Specification (WS-Security)."
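Concretely, the "security token as part of a message" mechanism amounts to a Security header carried inside the SOAP envelope. A sketch of the shape (the wsse: namespace URI below is from the early pre-OASIS drafts and the token values are invented, so treat the details as illustrative rather than normative):

```xml
<S:Envelope xmlns:S="http://schemas.xmlsoap.org/soap/envelope/"
            xmlns:wsse="http://schemas.xmlsoap.org/ws/2002/07/secext">
  <S:Header>
    <wsse:Security>
      <!-- one of the several supported token formats -->
      <wsse:UsernameToken>
        <wsse:Username>alice</wsse:Username>
      </wsse:UsernameToken>
      <!-- an XML Signature over the body would also appear here
           to provide the message integrity mechanism -->
    </wsse:Security>
  </S:Header>
  <S:Body>
    <!-- application payload; may be encrypted per XML Encryption
         for the message confidentiality mechanism -->
  </S:Body>
</S:Envelope>
```

Putting the security information in a header, rather than in the transport, is what lets it survive intermediaries and mixed transports, which is why the specification calls itself a building block rather than a complete solution.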

  • [December 16, 2002] "Microsoft Office Embraces XML." By Eric van der Vlist. From XMLhack (December 16, 2002). "For many participants, the most memorable event of XML 2002 will be Jean Paoli's presentation of Office 11, which promises to deliver easier access to XML for hundreds of millions of work stations... The impression given in the presentation by Jean Paoli, co-editor of the XML 1.0 recommendation and pilot of the 'XMLization' of Office, is that Microsoft is doing with XML what worked with the Internet. XML was already well-supported by Microsoft BackOffice products from Biztalk to CMS through XML Server, but a key piece was missing - tools to let users manipulate XML on their work stations. The issue of editing XML documents is still as difficult as the connection to the Internet in the early 90s: technically there is a solution and many XML editors are available, but the financial, organizational and human impact of deploying these tools on a large scale is considered a major obstacle by many organizations. 'XML for the masses' is the target of Office 11 and the presentation suggests that Microsoft will likely meet the target. Without major innovations (except maybe the linking of XML documents to Excel spreadsheets using XPointer references), Office 11 appears to be doing what has been announced using largely standard technologies, and to enable the manipulation of arbitrary XML documents using customer-chosen schemas... Often criticized for their 'embrace and extend' strategy, Microsoft has finally decided to continue to play the game with XML even though the extensibility of XML opens new and unpredictable possibilities. But they need to control all the major market segments for fear that XML might give the masses the possibility of emancipation from Microsoft's domination. While the deployment of XML on millions of work stations is good news in the short term, it will certainly modify the landscape..." See: "Microsoft Office 11 and XDocs."

  • [December 16, 2002] "Increased Attendance at IDEAlliance XML Conference & Exposition 2002." By Dianne Kennedy. Special Report from the IDEAlliance XML Files (December 2002). With photos. "Despite the lagging economy, the IDEAlliance XML 2002 Conference saw a dramatic increase in attendance. 1,275 attendees filled the Baltimore Convention Center this week to attend tutorials, hear the latest announcements from software vendors and learn from fellow delegates. This year's conference, 'Putting the Pieces Together,' was chaired by Dr. Lauren Wood, a member of the advisory council for the World Wide Web Consortium. XML Conference & Exposition 2002 is the latest in the IDEAlliance XML conference series, the largest and longest-running annual gathering of XML users and developers in the world..."

  • [December 16, 2002] "Security Audit and Access Accountability Message Data Definitions for Healthcare Applications." By Glen Marshall (Siemens Medical Solutions Health Services). IETF Internet Draft. Reference: 'draft-marshall-security-audit-00.txt'. December 2002, expires April 2003. XML Schema in Section 6, pages 24-32. "This document defines data for privacy and security policy assurance applications to be output from healthcare application systems. It supplements existing system-specific security audits with healthcare application-specific requirements. It also anticipates the existence of common repository systems that receive audit data from multiple application systems and the associated operating infrastructure components." Goals of the document are to: "(1) Define data to be communicated for evidence of compliance with, and violations of, a healthcare enterprise's security and privacy policies; (2) Depict the data that would reside in a common audit engine or database; (3) Allow useful queries against audited events; (4) Provide a common reference standard for healthcare IT standards development organizations. This document consolidates previously disjoint viewpoints from Health Level 7 (HL7), Integrating the Healthcare Enterprise (IHE), and the ASTM International Healthcare Informatics Technical Committee (ASTM E31). It is intended as a reference for these three groups and other healthcare standards developers... Security subsystems found in most system infrastructures include a capability to capture system-level security relevant events like logon and security object accesses. We assume such functions are enabled and capable of recording and supplying the data defined in this document, although transformation to conform to the common XML schema definition will be required..." [cache]

  • [December 16, 2002] "A Quantitative Analysis of Dublin Core Metadata Element Set (DCMES) Usage in Data Providers Registered with the Open Archives Initiative (OAI)." By Jewel Ward (Graduate Student, The School of Information and Library Science, University of North Carolina at Chapel Hill). November 2002. 68 pages. A Master's paper for the MS degree in I.S. Abstract: "This research describes an empirical study of how the Dublin Core Metadata Element Set (DCMES) is used by 100 Data Providers (DPs) registered with the Open Archives Initiative (OAI). The research was conducted to determine whether or not the DCMES is used to its full capabilities. Eighty-two of 100 DPs have metadata records available for analysis. DCMES usage varies by type of DP. The average number of Dublin Core elements per record is eight, with an average of 91,785 Dublin Core elements used per DP. Five of the 15 elements of the DCMES are used 71% of the time. The results show the DCMES is not used to its fullest extent within DPs registered with OAI..." See: (1) "Dublin Core Metadata Initiative (DCMI)"; (2) "Open Archives Metadata Set (OAMS)."

  • [December 16, 2002] "Microsoft Highlights Security in Web Services Development Pack." By Paul Krill. In InfoWorld (December 16, 2002). "Microsoft on Monday [2002-12-16] is officially unveiling the general release of Web Services Enhancements 1.0 (WSE) for Microsoft .Net, focusing on security and support of proposed standards. The free product, which had been available in an unsupported technical preview format since August, is a plug-in to Microsoft's Visual Studio .Net. It is built on existing Web services standards such as XML, SOAP (Simple Object Access Protocol) and WSDL (Web Services Description Language). New in the general release is an API for log-in, session handling and audit trails, but Microsoft officials are focusing particularly on support for the proposed WS-Security standard now under consideration by OASIS (Organization for the Advancement of Structured Information Standards). WS-Security was first published by Microsoft, IBM and VeriSign in April. Through use of WS-Security, developers can build more secure Web services, Microsoft officials stressed. "It secures the actual message and payload," said Rebecca Dias, product manager for Web services marketing at Microsoft, in Redmond, Wash. Security is maintained when messages are passed to multiple points, she said. "What we're doing is allowing you to sign that part of data that's flowing over the Internet," Dias said. Other features included as part of WS-Security support include backing for X.509 digital certificates and Kerberos, to encrypt messages... Microsoft also is featuring support for the Microsoft-proposed WS-Routing specification. This enables messages to be sent through intermediaries and across different transports, to leverage load balancing as well as geographic balancing, to enable a local server to process Web services. Also featured is support for the WS-Attachments proposal now sitting before the IETF (Internet Engineering Task Force). 
This support together with backing of DIME (Direct Internet Message Encapsulation) enables binary files to be sent via SOAP-based transports..." See: (1) the announcement "Microsoft Delivers Latest Developer Tools for Building Advanced, More Secure Web Services. Web Services Enhancements 1.0 Available for Visual Studio .NET. Developers, With Support From Companies Such as F5 Networks, WestGlobal and WRQ."; (2) "WS-Security Authentication and Digital Signatures with Web Services Enhancements," by Matt Powell; (3) "Web Services Security Specification (WS-Security)."

  • [December 16, 2002] "Programming with Web Services Enhancements 1.0 for Microsoft .NET." By Tim Ewald (Microsoft Corporation). From Microsoft MSDN Library. December 2002. ['Tim Ewald examines the architecture of WSE and explains how to use it to build a simple, WSE-enabled ASP.NET Web service and client.'] "Web Services Enhancements 1.0 for Microsoft .NET, or WSE, is a new .NET class library for building Web services using the latest Web services protocols, including WS-Security, WS-Routing, DIME, and WS-Attachments. WSE integrates with ASP.NET Web services, offering a simple way to extend their functionality. This paper explores the architecture of WSE and explains how to use it to build a simple Web service... The design of WSE reflects the principles of the protocols themselves, as outlined in 'Understanding GXA': (1) Decentralization and federation; (2) Modularity; (3) XML-based data model; (4) Transport neutrality; (5) Application domain neutrality. WSE provides a very 'close to the metal' programming model focused on directly manipulating the protocol headers included in SOAP messages... At its heart, WSE is an engine for applying advanced Web service protocols to SOAP messages. This entails writing headers to outbound SOAP messages and reading headers from inbound SOAP messages. It may also require transforming the SOAP message body -- for instance, encrypting an outbound message's body and decrypting an inbound message's body, as defined by the WS-Security specification..." See also: (1) the announcement "Microsoft Delivers Latest Developer Tools for Building Advanced, More Secure Web Services. Web Services Enhancements 1.0 Available for Visual Studio .NET. Developers, With Support From Companies Such as F5 Networks, WestGlobal and WRQ."; (2) the main page Web Services Enhancements for Microsoft .NET; (3) "WS-Security Authentication and Digital Signatures with Web Services Enhancements," by Matt Powell; (4) Inside the Web Services Enhancements Pipeline.

  • [December 16, 2002] "A Question of Identity: Passport, Liberty, and the Single Sign-On Race." By Amit Asaravala. In New Architect Volume 8, Issue 01 (January 2003), pages 22-24. "... new options are emerging that lend SSO capabilities to independent businesses. Perhaps the largest public SSO network in existence is Microsoft's .Net Passport. Launched in 1999, it now boasts nearly ninety participating sites and claims to host 200 million user accounts... While Passport membership is free to end users, participating businesses must pay an annual fee of $10,000, plus a vaguely defined "compliance testing fee" of $1,500. According to Microsoft, the latter covers the cost of having an outside vendor verify a Passport implementation, and is usually -- though not always -- a one-time fee. From the developer's standpoint, a Passport subscription amounts to a license to use the Passport development libraries in a production environment. Subscribers need not use a Microsoft Web server or even a Microsoft operating system -- the libraries are available for Solaris and Linux systems running the Apache and iPlanet Web servers, in addition to Windows and IIS. In order to activate Passport, developers must go through their sites and add API calls to each Web page or resource that needs authentication. The alternative for companies not interested in joining Passport or waiting for Magic Carpet is the Liberty Alliance Project. Formed in September 2001, the Liberty Alliance is a consortium made up of 130 organizations from various industries, including such diverse companies as American Express, Bank of America, Hewlett-Packard, Sun Microsystems, and United Airlines. Unlike Microsoft, the Liberty Alliance isn't itself a software company. Rather than providing a service or creating a product, the consortium's goal is to define and maintain a standard to which SSO services and other identity management solutions should be built. 
For instance, AOL Time Warner joined the Alliance in December 2001, and has agreed to have Magic Carpet conform to the Liberty specification. Organizations can use the specification regardless of whether they are members of the Liberty Alliance, however. The Alliance released its Liberty 1.0 specification in July 2002. As an industry standard, Liberty will make integration with potential partners easier than it would be using proprietary solutions. For example, if two companies merged and the Web applications on their respective intranets both used Liberty for authentication and authorization, the total cost of merging the companies' infrastructures would be dramatically reduced. There are a number of competing enterprise-class SSO products on the market, like Computer Associates' eTrust SSO and Novell's SecureLogin... If you're an online retailer looking to join a large, established SSO network right away, Passport is the answer. If you can wait six months to a year, or you only need to offer SSO capabilities within a limited group of sites and applications, then Liberty-based solutions are definitely worth a look, particularly because of their full support for SAML. Finally, if you're planning to create Web services that need authentication and authorization capability, building to the WS-Security specification will help you plug into federated identity services at a later date. Of course, another option is simply to wait for de facto standards to emerge before deciding which technologies to adopt..." See: (1) "Liberty Alliance Specifications for Federated Network Identification and Authorization."; (2) Microsoft .NET Passport.

  • [December 16, 2002] "Instant Update: Making Your Data And Spreadsheets Web Viewable Through MVC." By Paul Sholtz. In New Architect Volume 8, Issue 01 (January 2003), pages 26-29. "The Model-View-Controller (MVC) pattern is an established and well-understood software design method. It works well in the context of rich-client GUI applications that maintain persistent state. However, the Web is an entirely different medium than a rich client GUI, and porting MVC to the Web isn't straightforward. The goal in MVC is to separate the application data (the model) from the way the data is rendered to the user (the view) and from the way in which the user controls the data (the controller). Dividing the concerns of the application in this way dramatically increases the system's modularity, and enhances the overall flexibility of the design. Such flexible design is particularly useful for e-commerce... A number of frameworks are available to help developers architect MVC-compliant interfaces for their Web applications. Perhaps the best known is Struts, an open source project that is part of the Apache Jakarta initiative. The Struts package provides a unified set of reusable components for building user interfaces that can easily be adapted for any Web-based connection (e.g., HTTP requests, WAP, or even standard socket-level applications). Struts ships with a controller servlet, custom JSP tag libraries, and some utility classes. Struts has a relatively steep learning curve, but most programmers find it worth the effort. If, however, you're looking for something a little easier to manage so that you can get up and running a bit faster, Maverick might be the way to go. Maverick offers many of the same features as Struts, as well as some unique XSLT transformation features... Different software architectures are applicable in different contexts and use cases. 
If you're rapidly prototyping a proof of a concept, chances are that a simple Model 1 pattern [a JSP is responsible for both controller and view responsibilities] will be sufficient for your requirements. If, on the other hand, you are developing an application that you intend to roll into production and maintain for years, you should design with flexibility, extensibility, and modularity in mind. Software patterns like MVC and Model 2 [server-side implementation of MVC] provide a powerful design technique that will help ensure that your code remains in use for years to come..."
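
The MVC separation the article describes can be reduced to a few lines. This sketch (Python, with hypothetical names -- not Struts or Maverick code) shows why swapping the view or the model leaves the other two parts untouched:

```python
# Model: application data, independent of any presentation.
class ProductModel:
    def __init__(self):
        self._products = {"sku-1": {"name": "Widget", "price": 9.99}}

    def get(self, sku):
        return self._products.get(sku)

# View: renders model data; an alternate view (WML, XSLT, ...) could be
# substituted without touching the model or controller.
def html_view(product):
    return f"<p>{product['name']}: ${product['price']:.2f}</p>"

# Controller: interprets the request, invokes the model, selects the view.
def controller(request, model, view):
    product = model.get(request.get("sku"))
    if product is None:
        return "<p>Not found</p>"
    return view(product)

model = ProductModel()
print(controller({"sku": "sku-1"}, model, html_view))
# <p>Widget: $9.99</p>
```

In "Model 2" terms, the controller plays the role of the servlet and the view plays the role of the JSP; the "Model 1" shortcut collapses controller and view into one component.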

  • [December 16, 2002] "How Open is the New Office?" By Joe Wilcox. In CNET (December 16, 2002). "Microsoft says it's opening up its Office desktop software by adding support for XML -- a move that should help companies free up access to their shared information. But there's a catch: Microsoft has yet to disclose the underlying XML dialect that it's using. The software giant intends to make Extensible Markup Language (XML) a supported file format -- in addition to existing proprietary formats -- for its upcoming Office 11 desktop software, which is in the hands of about 12,000 beta testers. XML is a widely used standard for Web data exchange. With the Office 11 update, Microsoft is allowing files saved in the XML format to be viewable through any standard Web browser. That's a big change from the company's previous stance of using only proprietary file formats. But Office's XML support will allow larger companies to extract and use data from documents more efficiently, according to Microsoft... The software maker says it plans to disclose additional information on Office 11's XML schemas, possibly when the update ships next spring. Right now, a limited number of beta testers have access to some schema information. But it's unclear how complete the information Microsoft intends to release will be. Whether the company will disclose enough to allow interoperability with competing programs, and whether the schema information will be governed by licensing terms, are still unknown... Microsoft executives acknowledged that the company's XML support in Office is governed by a proprietary schema and that XML documents created by Office 11 applications may not be readable in a competing product... 
That scenario leaves "Microsoft with the initiative," said Gartner analyst Wes Rischel. To add new features to an XML format, "somebody has to define that (the format). If they're the only ones that can define that, then for the other office products, the best they can do is keep up with Microsoft. They can't have any initiative of their own... The issue of Office's XML support has come into focus in recent months. Many large businesses, particularly those stung by recent Microsoft licensing price increases, have become increasingly aware of how dependent they are on Microsoft because of their use of proprietary Office file formats, said analysts. "The problem for many organizations is that there are years of institutional knowledge that are locked into the Office formats forever," said [Jupiter's Michael] Gartenberg..." See: "Microsoft Office 11 and XDocs."

  • [December 16, 2002] "Generic Namespace Dispatch Behavior for XML." By Mark Baker (Idokorro Mobile, Inc). Reference posted to XML-DEV. June 21, 2002 or later. Draft IETF document 'draft-baker-generic-xmlns-dispatch-00.txt' (which was not submitted): "...The time doesn't seem right, as the problem this addresses isn't yet recognized as a problem. Nor is it even clear that application/xml won't evolve in practice to make this problem moot (i.e., dispatch behaviour will be expected)..." Abstract: "To date, the promise of constructing compound XML documents using XML namespace declarations, and having the resultant document be processed as a seamless whole, has not been realized. This document defines rules for processors and content that should allow a significant degree of generic processing to occur for many compound documents. These rules are then bound to a new generic XML media type, 'application/xmld'... This document aims to achieve two things. First, to lay out rules for how namespace declarations in documents can be used to dispatch processors for processing that content. And second, to bind this behaviour to a new generic XML media type..." See also the thread on XML-DEV. [cache]
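
The dispatch rules the draft proposes can be approximated as a registry keyed by namespace URI: each element of a compound document is routed to whatever processor is registered for its namespace. A minimal sketch (Python, with hypothetical handlers):

```python
import xml.etree.ElementTree as ET

# Hypothetical handler registry: namespace URI -> processor.
HANDLERS = {
    "http://www.w3.org/1999/xhtml": lambda el: f"render XHTML <{el.tag.split('}')[1]}>",
    "http://www.w3.org/2000/svg": lambda el: f"draw SVG <{el.tag.split('}')[1]}>",
}

def dispatch(element):
    """Route each element of a compound document to the processor
    registered for its namespace; unknown namespaces are skipped."""
    results = []
    for el in element.iter():
        ns = el.tag[1:].split("}")[0] if el.tag.startswith("{") else None
        handler = HANDLERS.get(ns)
        if handler:
            results.append(handler(el))
    return results

doc = ET.fromstring(
    '<h:div xmlns:h="http://www.w3.org/1999/xhtml" '
    'xmlns:s="http://www.w3.org/2000/svg"><s:circle/></h:div>'
)
print(dispatch(doc))
# ['render XHTML <div>', 'draw SVG <circle>']
```

This is the "seamless whole" the draft wants: one generic media type, with per-namespace behavior resolved at processing time rather than baked into the document's root type.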

  • [December 16, 2002] "Intercommunication between XSD and ASN.1." By Paul Thorpe (OSS Nokalva, Inc). Abstract from the XML 2002 Conference presentation, Baltimore. December 2002. "The Joint ISO/IEC and ITU-T ASN.1 standards committee has produced a new encoding rule for ASN.1 called XML Encoding Rules (XER) and a canonical variant of this, CXER. This has been approved by the ITU-T as Recommendation X.693 and is currently undergoing FDIS balloting in ISO as the International Standard ISO/IEC 8825-4, with the ballot to be completed in November 2002. This International Standard specifies the rules for creating valid XML documents given any ASN.1 schema. However, the XML documents produced using XER or CXER do not provide access to some of the XML encoding facilities such as attributes and lists. The ASN.1 standards committee therefore began work on adding encoding instructions that could direct an XER encoder to vary the XML documents produced with an ASN.1 schema to provide any form of XML document that the designer might desire, including use of attributes and lists and arbitrary patterns of XML elements (such as are provided by RELAX NG). Along with these encoding instructions, the ASN.1 committee embarked on producing a new standard ITU-T Rec X.694 - ISO/IEC 8825-5, which defines the mapping from XSD to ASN.1. This enables an ASN.1 specification to define the same valid XML documents as the XSD specification, but with the advantage of providing for additional binary encodings and for mapping into C, C++ and Java data structures. The ASN.1 September 2002 meeting in Paris finalized much of this work, and it is expected to be fully complete in the November 2002 meeting in Geneva..." See: (1) the 2002-12 announcement "OSS Nokalva Announces XML Support"; (2) the online XML Schema to ASN.1 translator; (3) "ASN.1 Markup Language (AML)."
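
The difference between XER's default element-per-field output and the attribute-oriented output enabled by encoding instructions can be sketched as follows (Python, illustrative only -- not an XER implementation):

```python
import xml.etree.ElementTree as ET

record = {"name": "Alice", "age": 30}

# XER-style default: every field of the ASN.1 value becomes a child element.
def as_elements(tag, fields):
    root = ET.Element(tag)
    for k, v in fields.items():
        ET.SubElement(root, k).text = str(v)
    return ET.tostring(root, encoding="unicode")

# With attribute-style encoding instructions, fields may map to attributes.
def as_attributes(tag, fields):
    root = ET.Element(tag, {k: str(v) for k, v in fields.items()})
    return ET.tostring(root, encoding="unicode")

print(as_elements("Person", record))
# <Person><name>Alice</name><age>30</age></Person>
print(as_attributes("Person", record))
# <Person name="Alice" age="30" />
```

The point of X.694 is that the same abstract value can also be emitted in compact binary encodings (BER, PER) from the one schema, which a plain XSD definition cannot offer.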

  • [December 11, 2002] "Delivery Context Overview for Device Independence." Edited by Roger Gimson (HP). W3C Working Draft 13-December-2002. First public working draft. Latest version URL: This document provides an overview of the role of delivery context in assisting device independent presentation for the Web. It surveys current techniques for conveying delivery context information, and identifies further developments that would enhance the ability to adapt content for different access mechanisms... The Device Independence Principles [DIP] document (W3C Working Draft 18-September-2001) set out a number of principles that can lead to greater device independence in delivering Web content and applications. The term delivery context was introduced in DIP to refer to the set of attributes that characterize the delivery environment. Among these attributes, the ones that are most relevant for achieving device independence are those that characterize the presentation capabilities of the access mechanism, the delivery capabilities of the network and the presentation preferences of the user. Delivery context information is typically used to provide an appropriate format, styling or other aspect of some web content that will make it suitable for the capabilities of a presentation device. The selection or adaptation required to achieve this may be performed by an origin server, by an intermediary in the delivery path, or by a user agent. From the point of view of device independence, the main concern is accurately reflecting the capabilities of the access mechanism and the presentation preferences of the user. Given appropriate information about the delivery context, the presentation of the delivered content can be selected or adapted to be functional on that device, or may be further customized to provide a better user experience (as defined in [DIP]). 
The possible adaptations that could be performed on the available content can only be determined once the delivery context information is known. In this document, techniques for representing, conveying and processing delivery context are considered. Existing technologies that address these needs are reviewed. Areas that need further work are highlighted... The W3C is proposing CC/PP as a candidate standard for representing delivery context information. The W3C Device Independence Working Group is currently chartered to continue work on two key further topics in this area, a protocol for CC/PP and a core vocabulary for device presentation capabilities..." See the W3C Device Independence Activity Statement, the DI Working Group Charter, and the 'www-di' mailing list archives.
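
The select-or-adapt step the draft describes comes down to branching on delivery-context attributes. A toy sketch (Python; the attribute names here are invented for illustration, not CC/PP vocabulary):

```python
# Hypothetical delivery-context attributes; CC/PP would convey such
# capability/preference data as an RDF profile.
def adapt(content, context):
    """Select a presentation of `content` suited to the delivery context."""
    if context.get("markup") == "wml":
        return f"<card><p>{content}</p></card>"           # small-screen WAP phone
    if not context.get("images", True):
        return f"<html><body>{content}</body></html>"     # text-only user agent
    return f"<html><body><img src='banner.png'/>{content}</body></html>"

phone = {"markup": "wml", "screen_width": 120}
desktop = {"markup": "html", "images": True}
print(adapt("Hello", phone))    # <card><p>Hello</p></card>
print(adapt("Hello", desktop))  # <html><body><img src='banner.png'/>Hello</body></html>
```

As the draft notes, this adaptation could equally run at the origin server, at an intermediary proxy, or in the user agent itself; only the source of the context attributes changes.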

  • [December 13, 2002] "XML Security Time Stamping Protocol." By Axelle Apvrille and Vincent Girier (Storage Technology European Operations, Toulouse, France). Presented at ISSE 2002 (Information Security Solutions Europe, Paris Disneyland, October 2-4, 2002). "Existing time stamp protocols are based on an exchange between a client, sending a document hash, and an authority which signs the submitted document with the current time. This protects against content and date forgery. Unfortunately, the protocol relies upon ASN.1, whose encoding is difficult to manipulate -- an issue at which XML is very successful. Consequently, this paper proposes an XML adaptation of the time stamping protocol. It focuses on possible integration with XML Signatures and uses XML schemas to describe time stamping messages. Finally, this work provides the basis for wider XML developments of time stamping models... Designing a time stamp protocol benefits from XML's most evident advantages: readability and extensibility. Readability is necessary for a protocol designed to be widely used; it is achieved thanks to an output often consisting of explicit names instead of numbers or codes. Extensibility is even more critical as protocols evolve permanently. In this perspective, it is important that the Time Stamp Protocol benefits from XML Signatures' evolutions as security is at stake. Moreover, we believe efficiency of the protocol is not sacrificed. In the 80's, ASN.1's compactness was an important feature, but nowadays, the difference in compactness between ASN.1 and XML notations only introduces minimal additional processing. Actually, the efficiency of the overall process mainly depends on the TSA's ability to quickly perform computations like hashing and signing..." See: (1) Digital Time-Stamping; (2) OASIS Digital Signature Services Technical Committee; (3) "Digital Signatures." [alt URL 1, 2]
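
The hash-then-sign exchange the paper adapts to XML can be sketched end to end. In this toy version (Python) the authority's XML Signature is stood in for by an HMAC, and all message framing is omitted:

```python
import hashlib
import hmac
import time

TSA_KEY = b"tsa-secret"  # stand-in for the authority's private signing key

def client_request(document: bytes) -> str:
    # The client submits only a digest; the document itself never leaves it.
    return hashlib.sha256(document).hexdigest()

def tsa_stamp(digest: str) -> dict:
    # The authority binds the digest to the current time and "signs" both.
    # (A real TSA would produce an asymmetric XML Signature; HMAC is a sketch.)
    stamped_at = int(time.time())
    token = f"{digest}|{stamped_at}".encode()
    sig = hmac.new(TSA_KEY, token, hashlib.sha256).hexdigest()
    return {"digest": digest, "time": stamped_at, "signature": sig}

def verify(document: bytes, stamp: dict) -> bool:
    token = f"{hashlib.sha256(document).hexdigest()}|{stamp['time']}".encode()
    expected = hmac.new(TSA_KEY, token, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, stamp["signature"])

doc = b"<contract>...</contract>"
stamp = tsa_stamp(client_request(doc))
print(verify(doc, stamp))          # True
print(verify(b"tampered", stamp))  # False
```

Because the signature covers both digest and time, neither the document content nor the claimed date can be altered afterwards without invalidating the stamp, which is exactly the forgery protection the paper describes.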

  • [December 11, 2002] "WS-Security Authentication and Digital Signatures with Web Services Enhancements." By Matt Powell (Microsoft Corporation). Microsoft MSDN Library. December 2002. ['Part of the documentation for Web Services Enhancements 1.0 for Microsoft .NET. Web Services Enhancements 1.0 for Microsoft .NET (WSE) provides advanced Web services functionality for Microsoft Visual Studio .NET and Microsoft .NET Framework developers to support the latest Web services capabilities. Enterprise ready applications can be developed quickly with the support of security features such as digital signature and encryption, message routing capabilities, and the ability to include message attachments that are not serialized into XML.'] "With the advent of the Web Services Enhancements 1.0 for Microsoft .NET (WSE), Microsoft has provided its first toolset for implementing security within a SOAP message. No longer are Web services tied strictly to using the security capabilities of the underlying transport. Now a SOAP message on its own can be authenticated, its integrity verified, and can even be encrypted all within the SOAP envelope using the mechanisms defined by the WS-Security specification. In this article, we will be looking at how you can use WSE to take advantage of WS-Security for authenticating and signing data for your Web services. Simply sending X.509 certificates with a request is actually not much of a way to authenticate anything. The certificates are considered public knowledge so anyone could include anyone else's certificate with their request. The mechanism for using certificates for authentication is based off the idea we mentioned earlier in regards to signing some entity with the private key that corresponds to the public key in the certificate... WSE supports creating digital signatures in a very straightforward manner... 
The WSE SOAP extension will validate the syntax of a digital signature, but simply knowing that a valid signature exists in a message is not enough to know that the message came from a particular individual. WS-Security and the XML Digital Signature specification, (Extensible Markup Language) XML-Signature Syntax and Processing, are very flexible in the way that they allow signatures to be included within XML... This article illustrates how Web Services Enhancements for Microsoft .NET offers you a glimpse into the capabilities of the WS-Security specification by looking at how you can use WSE to do UsernameToken authentication and by using X.509 authentication to digitally sign portions of the SOAP message. WSE lets you inspect and verify the kind of WS-Security support that is included with a SOAP message. There are a number of other WS-Security capabilities WSE provides that are not discussed here, such as using various forms of encryption..." See also "Inside the Web Services Enhancements Pipeline," by Tim Ewald; the article explains "how individual filters and pipelines of filters work, how to configure the default pipeline, how to build custom filters, and where DIME fits into the picture." See: "Web Services Security Specification (WS-Security)."
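
The UsernameToken mechanism mentioned above rests on the password-digest formula in the WS-Security UsernameToken profile, Base64(SHA-1(nonce + created + password)), so the password itself never travels in the clear. A minimal sketch of the digest computation (Python, outside any SOAP framing):

```python
import base64
import datetime
import hashlib
import os

def username_token_digest(password: str, nonce: bytes, created: str) -> str:
    """WS-Security UsernameToken password digest:
    Base64( SHA-1( nonce + created + password ) )."""
    digest = hashlib.sha1(nonce + created.encode() + password.encode()).digest()
    return base64.b64encode(digest).decode()

# A fresh nonce and creation timestamp accompany each token, so a captured
# digest cannot simply be replayed.
nonce = os.urandom(16)
created = datetime.datetime.now(datetime.timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
print(username_token_digest("secret", nonce, created))
```

The server, which knows the password, recomputes the digest from the transmitted nonce and timestamp and compares; rejecting stale timestamps and previously seen nonces defeats replay.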

  • [December 12, 2002] "Datatypes for XML Topic Maps (XTM): Using XML Schema Datatypes." By Murray Altheim (Knowledge Media Institute, The Open University, Milton Keynes, Bucks, UK). Draft version 1.4 2002/12/12. Topic map available in XML format, also zipped. Reference posted to the OASIS Topic Maps Published Subjects Technical Committee list. Abstract: "The W3C Recommendation XML Schema Part 2: Datatypes ('XSD') provides a specification of datatypes and their facets, and forms the semantic basis of this document, which establishes Published Subject Indicators (PSIs) for each XML Schema datatype and facet. A PSI is a (relatively) stable URL used as a canonical identifier for a subject, particularly within an XML Topic Map (XTM) document, though application of PSIs is not limited to XTM. This document does not alter any XML Schema datatype definition; for definitions of each datatype..." Author's note: "I'm pleased to announce a first draft of something that's been in the works for over a year... The ability to constrain or 'type' topic characteristics is something necessary within the topic map community. Constraints on topic map structures may be provided by various forms of schema facilities, such as TMCL, but a simple datatyping feature is still something sorely missing. There are some examples provided in the document... I welcome comments or suggestions towards establishing 'best practices' for use of these PSIs, as well as comments on the structures within the provided topic map. I am willing to add visualizations of various parts of the topic map if that is considered helpful." The purpose of the Topic Maps Published Subjects Technical Committee is "to promote the use of Published Subjects by specifying requirements, recommendations, and best practices, for their definition, management and use. Published Subject was defined by the ISO 13250 Topic Maps standard and further refined in the XML Topic Maps (XTM) 1.0 Specification." 
See: (1) "(XML) Topic Maps."; (2) "XML Schemas."

  • [December 12, 2002] "Information Technology: A Quick-Reference List of Organizations and Standards for Digital Rights Management." Prepared by Gordon E. Lyon (Convergent Information Systems Division, Information Technology Laboratory, National Institute of Standards and Technology - NIST). NIST Special Publication 500-241. October 2002. 20 pages. "The field of digital rights management (DRM), sometimes called intellectual property management and protection (IPMP), is today a chaotic and not always workable mix of technology, policy, law and business practice. There are many organizations active in DRM. Under such circumstances, even a modest guide or index of active organizations can be useful. In March 2002, experts at a NIST cross-industry DRM workshop recommended that NIST take first steps toward such a guide. With the help of numerous workshop participants and others, this is the first edition of a DRM quick-reference list... The list has definite tradeoffs to preserve compactness and to limit maintenance. For example, the text contains many secondary-level acronyms that are left undefined -- a reader must resolve these terms via Web searching or similar external referencing. Additionally, entries are placed, dictionary style, in alphabetical order rather than under topics. Entries in the table largely constitute an attempt to assemble a short set of descriptions about organizations in DRM (who they are, what they are doing). The left side of each entry is a descriptor. The entry right side supplements by helping a reader explore further, usually at some Web site of an organization..." Note: 'To overcome barriers to usability, scalability, interoperability, and security in information systems and networks, the NIST Information Technology Laboratory programs focus on a broad range of networking, security, and advanced information technologies, as well as the mathematical, statistical, and computational sciences. 
This Special Publication 500-series reports on ITL's research in tests and test methods for information technology, and its collaborative activities with industry, government, and academic organizations.' General references in "XML and Digital Rights Management (DRM)." [cache]

  • [December 11, 2002] "Novell Rolls Out UDDI Server." By Cathleen Moore. In InfoWorld (December 11, 2002). "Hoping to kick-start adoption of the emerging UDDI (Universal Description, Discovery, and Integration) Web services specification, Novell on Wednesday introduced a UDDI server based on its eDirectory software. The offering is designed to add security and identity management capabilities to UDDI. The UDDI specification aims to provide a registry of available Web services and address issues such as how to manage identity for Web services applications. The Novell Nsure UDDI Server taps capabilities of the company's eDirectory technology to add directory-based features such as access control and secure identity management to UDDI services, according to Justin Taylor, chief strategist for directory services at Novell, based in Provo, Utah. 'Instead of just simply storing things in a database, which is where the traditional UDDI deployments are today, adding UDDI support to our directory service [contributes] access control and identity management they need to manage these applications,' Taylor said... In addition, the UDDI server attempts to resolve some elements not addressed in the UDDI specification, including how to secure data in the UDDI server, how to securely handle identity information, and how to add fault tolerance and back-up capabilities... Early on UDDI was considered a key element of Web services architectures currently under development, but adoption of UDDI has been slow due primarily to the lack of need for large-scale registry services, according to Earl Perkins, analyst at Meta Group in Stamford, Conn..." See: (1) the announcement "Novell Adds Secure Identity Management to Key Web Services Standard"; (2) general references in "Universal Description, Discovery, and Integration (UDDI)."

  • [December 11, 2002] "IBM, BEA Take Royalty-Free Stance on BPEL4WS." By Paul Krill. In InfoWorld (December 11, 2002). "Removing a potential barrier to Web services choreography standards efforts, an IBM official on Wednesday pledged that the vendor will not seek royalties for its contributions to the BPEL4WS (Business Process Execution Language for Web Services) proposal. BEA Systems, a co-author of BPEL4WS, released a statement with a similar pledge shortly thereafter. Still remaining to be heard from is Microsoft, also a BPEL4WS co-author, pertaining to its stance on royalties related to the proposal. Bob Sutor, director of Web services strategy at Armonk, N.Y.-based IBM, said he could not speak for the positions of the two co-authors of BPEL4WS pertaining to royalties. 'As far as IBM is concerned, we will license BPEL4WS on a royalty-free basis,' Sutor said during a presentation at the CNet Networks 'Building a Web Services Foundation' conference... BEA released a statement that it would submit its technologies for BPEL4WS sans royalties if the other two vendors do as well. 'BEA has been on record for a long time now advocating that all standards be royalty-free and have committed that any standard we have control over will be offered royalty-free. However, BPEL4WS is a standard backed by three companies, so all three have to agree on the royalty-free position. BEA has long pushed for it to be royalty-free -- so if IBM and Microsoft also come on board agreeing with that position, then we will of course continue to advocate it be submitted as a royalty-free standard,' said San Jose, Calif.-based BEA. Sutor cautioned that observers should not become simplistic in regards to royalty-free proposals in general. IBM, for example, could have conditions such as requiring that other vendors provide their technology royalty-free if IBM does. 
Web services choreography standardization pertains to defining industrywide mechanisms and formats for interaction among Web services. An example where this would come into play would be an e-business transaction that requires multiple Web services to interact on matters such as filling orders, checking inventory, and performing credit checks, according to Sutor. Choreography is considered a major factor in furthering use of Web services, and it is the subject of both BPEL4WS, which has yet to be submitted to an industry standards organization, and Web Services Choreography Interface, a Sun Microsystems-led proposal already under consideration by the World Wide Web Consortium (W3C). Although the BPEL4WS authors plan to submit the plan to a standards organization, no decision has yet been made on which one, and that is not expected for a month or two, Sutor said..." See: (1) "Business Process Execution Language for Web Services (BPEL4WS)."; (2) "Patents and Open Standards."

  • [December 11, 2002] "IBM Backs Royalty Free Approach on Web Services Choreography." By Gavin Clarke. In Computer Business Review Online (December 12, 2002). "IBM signaled a partial stand-down in an emerging cold war over royalties for web services yesterday, saying it would not seek payment for its contributions to BPEL4WS. IBM's director of e-business standards Bob Sutor told a web services conference in San Francisco, California, he believes IBM will license Business Process Execution Language for Web Services (BPEL4WS) on a royalty free basis. Sutor's position will come as a relief to vendors, standards bodies and customers who are concerned that organizations and individuals who contribute their technologies to emerging web services standards might charge royalties at a later date. IBM is among those causing most concern. The company, a patent powerhouse, has been instrumental in driving a number of specifications increasingly regarded as fundamental to web services, such as Simple Object Access Protocol (SOAP). Steve Holbrook, IBM program director for emerging e-business standards, recently told ComputerWire IBM would not charge royalties if vendors with similar intellectual property (IP) claims on technologies followed suit. BPEL4WS was developed by IBM working with Redmond, Washington-based Microsoft Corp and San Jose, California-based BEA Systems Inc. BEA yesterday released a statement saying it would continue to advocate BPEL4WS be submitted as a royalty-free standard if IBM and Microsoft agreed with that position... BPEL4WS defines mechanisms and formats for interaction between web services. Santa Clara, California-based Sun Microsystems Inc has published its own alternative, Web Services Choreography Interface (WSCI) with Walldorf, Germany-based SAP AG..." A related article "IBM Backs Royalty Free Approach on Web Services Choreography" was published in The Register. 
See (1) "Web Service Choreography Interface (WSCI)"; (2) "Business Process Execution Language for Web Services (BPEL4WS)."

  • [December 11, 2002] "The .NET Schema Object Model." By Priya Lakshminarayanan. From December 04, 2002. ['A detailed exposition of Microsoft's .NET Schema Object Model (SOM). The .NET SOM is an API for programmatic manipulation of W3C XML Schemas. Priya Lakshminarayanan provides plenty of example code in his explanation of the SOM, and shows how to create, navigate, and amend schemas.'] "Despite the many articles explaining W3C XML Schema (WXS), it's not enough to discuss WXS as a specification only. Educational materials should also discuss tools which aid the development of XML applications which employ WXS. This article focuses on an API in the .NET platform, the XML Schema Object Model (SOM). SOM is a rich API which allows developers to create, edit, and validate schemas programmatically -- one of the few such tools available so far. SOM operates on schema documents analogously to the way DOM operates on XML documents. Schema documents are valid XML files that, once loaded into the SOM, convey meaning about the structure and validity of other XML documents which conform to the schema. SOM is indispensable for a certain class of application, like a schema editor, where it needs to construct the schema in memory and check the schema's validity according to the WXS specification. The SOM comprises an extensive set of classes corresponding to the elements in a schema. For example, the <xsd:schema> ... </xsd:schema> element maps to the XmlSchema class -- all schema information that can possibly be contained within those tags can be represented using the XmlSchema class. Similarly <xsd:element> ... </xsd:element> maps to XmlSchemaElement, <xsd:attribute> ... </xsd:attribute> maps to XmlSchemaAttribute and so on. This mapping makes the API easy to use..." See related resources: (1) .NET Framework Developer's Guide. XML Schema Object Model (SOM); (2) Programming .NET Web Services, by Alex Ferrara and Matthew MacDonald (September 2002; ISBN: 0-596-00250-5).
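
The SOM's element-to-class mapping has a rough analogue in any XML toolkit: building the schema document itself programmatically, as a schema editor working in memory would. This sketch uses Python's ElementTree (an analogy, not the .NET SOM) to construct a one-element schema:

```python
import xml.etree.ElementTree as ET

XSD = "http://www.w3.org/2001/XMLSchema"
ET.register_namespace("xsd", XSD)

# Build <xsd:schema> containing one global element declaration, the in-memory
# equivalent of SOM's XmlSchema holding an XmlSchemaElement.
schema = ET.Element(f"{{{XSD}}}schema")
ET.SubElement(schema, f"{{{XSD}}}element", {"name": "book", "type": "xsd:string"})

print(ET.tostring(schema, encoding="unicode"))
```

What the .NET SOM adds over this generic approach is the typed object layer (XmlSchemaElement, XmlSchemaAttribute, ...) plus compilation and validity checking against the WXS specification, which a plain tree of elements cannot provide.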

  • [December 11, 2002] "Using XSLT to Assist Regression Testing." By Sal Mangano. From December 04, 2002. ['In "Using XSLT to Assist Regression Testing," Sal Mangano, author of O'Reilly's XSLT Cookbook, develops techniques for regression testing of programs that output XML. Sal reaches a solution via an interesting exploration of various methods of normalizing XML documents.'] "Regression testing is an important software-testing technique in which the output of a program before a change was made is compared to the output after the change in order to determine whether the change introduced bugs. This sort of testing is useful after refactoring code to improve its structure or performance without altering its behavior. It is also useful when new features are added to software, as a way to test whether or not the old features have been affected. Recently, colleagues of mine asked if I knew of a tool that could help them regression-test some code that outputs XML. The problem was, they explained, that their changes might affect the order of elements. However, for their application, the order did not matter as long as the hierarchical structure and element content remained the same..."
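Mangano's solution is built in XSLT; the underlying idea -- normalize both documents so that sibling order becomes irrelevant, then compare -- can be sketched in a few lines of Python. This is an illustrative analogue, not the author's code:

```python
import xml.etree.ElementTree as ET

def normalize(elem):
    """Reduce an element to a canonical tuple, recursing into children
    and sorting them so that sibling order does not affect comparison."""
    children = sorted(normalize(child) for child in list(elem))
    return (elem.tag,
            tuple(sorted(elem.attrib.items())),
            (elem.text or "").strip(),
            tuple(children))

def same_content(xml_a, xml_b):
    """True if two documents have the same hierarchical structure and
    element content, ignoring the relative order of sibling elements."""
    return normalize(ET.fromstring(xml_a)) == normalize(ET.fromstring(xml_b))

before = "<order><item sku='a'/><item sku='b'/></order>"
after  = "<order><item sku='b'/><item sku='a'/></order>"
print(same_content(before, after))  # True: sibling order differs, content matches
```

A real XSLT-based pipeline would instead transform each document into a sorted canonical serialization and diff the results, but the comparison logic is the same.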

  • [December 11, 2002] "Normalizing XML, Part 2." By Will Provost. From December 04, 2002. ['The second and final part of Will Provost's investigation into the application of relational database normalization techniques to XML schema design. Will looks at some subtle issues of XML data design, and completes his examination of relational normal forms and their applicability to XML.'] "... we resume our discussion of the applicability of concepts of data normalization to XML document design. In part one of this article, we observed that while XML's hierarchical model is somewhat at odds with the rectangular structure of relational data, the goals of data normalization as stated in the relational model are certainly worthwhile. We've also seen the usefulness of the normal forms of relational theory -- perhaps not applied literally, but posed as challenges to find equally strict guidelines for XML design. The basic trick of RDB normalization -- the foreign-key relationship -- has been duplicated for W3C XML Schema (WXS) using the key and keyref components to avoid redundancy in a model. Everything's just humming along. But nothing is simple. In this second and final part, we'll look at some of the subtler issues of 'normalized' XML data design, as well as complete our run through the normal forms to see how well they apply to XML...if fourth normal form is irrelevant, can we safely ignore fifth normal form? Not really. Where fourth normal form concerns unrelated multivalued facts, fifth normal form addresses the issue of how to handle multivalued facts that are related by some additional rule. Where there are cycles of relationships between more than two record types, fifth normal form enforces complete decomposition of those relationships. The spirit of this rule certainly applies well to XML. Let's say we need to record information on musicians: what instruments they play and what styles of music they know. 
If instrument and style are independent, then this is a problem of the fourth normal form. But let's add the rule that only certain instruments are appropriate to certain musical styles. Do we list each pairing of instrument and style in a collection under a musician? This would be appropriate if the pairings were chosen by individual musicians, but if we're stating a general rule that excludes, say, rock and roll clarinet playing, then we should capture this in a separate tree (or matrix, really) and keep the relationships to instrument and style independent under each musician. The benefits of fifth normal form in storage efficiency are a little harder to quantify for XML than for relational databases, but they are certainly there, as well. The broader point of fifth normal form, as with all the others, is to avoid redundant statements of fact, and that is as valid for XML as for relational data..."
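The separate-tree design Provost describes might be instantiated along these lines (all element and attribute names here are invented for illustration, not taken from the article):

```xml
<music>
  <!-- The general rule is stated exactly once, in its own tree... -->
  <compatibility>
    <pair instrument="guitar" style="rock"/>
    <pair instrument="clarinet" style="klezmer"/>
  </compatibility>
  <!-- ...so each musician lists instruments and styles independently,
       with no redundant restatement of which pairings are legal. -->
  <musician name="Pat">
    <plays instrument="guitar"/>
    <plays instrument="clarinet"/>
    <knows style="rock"/>
    <knows style="klezmer"/>
  </musician>
</music>
```

Listing instrument-style pairs under each musician instead would repeat the general rule once per musician, which is exactly the redundancy fifth normal form is meant to eliminate.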

  • [December 11, 2002] "SVG's Past and Promising Future." By Antoine Quint. From December 04, 2002. ['Antoine Quint takes an end-of-year look at SVG. Antoine notes that vendor support for SVG has broadened significantly, including the re-entry of Corel to the SVG game. He also looks at the upcoming SVG specifications: SVG Mobile, SVG 1.1, and SVG 1.2.'] "In this column I want to update the state of SVG by examining what's been happening lately. Among technical people around the world SVG awareness has risen dramatically over the last year. The SVG-Developers list has reached 3,000 subscribers, while more specialized lists, ranging from job postings to pure coding, have popped up. Popular tech sites like Slashdot have covered SVG. This was also the year of success for the first SVG conference, SVG Open, held in Zürich last July; it attracted more than 200 visitors, presenters, and exhibitors. SVG has been discussed at various high-profile conferences, too, including JavaOne, XML 2001, XML Europe, SIGGRAPH, WWW10, and O'Reilly's Open Source Convention. Over the past year vendor support for SVG has significantly broadened. With the notable exception of Macromedia, most graphics vendors have shipped updated versions of popular products with new support for SVG. Corel followed Adobe's Illustrator 10 with the release of Draw 11, which has extended support for SVG (including import and export). SVG also appeared in more specialized packages like Toon Boom Studio (cartoon-oriented) and Swift 3D (3D-oriented). Other vendors haven't hesitated to make SVG the centerpiece of product offerings. PCX Software has pitched a suite of SVG-centric tools which are currently in development; some beta releases are available from their site... While SVG 1.1 is a great leap forward for the mobile market, it adds no new functionality. The W3C SVG Working Group released two weeks ago the first public draft of SVG 1.2, which introduces several anticipated enhancements. 
It's a very early draft, but it already shows the areas in which SVG is progressing. One of the most-requested new features for SVG is text wrapping... There are other significant things to highlight in SVG 1.2. Although not as detailed as text wrapping or DOM extensions, some are eagerly anticipated by the SVG community. The WG is looking at more extensive integration with other W3C technologies such as XForms and SMIL, as well as ways to ease the rendering of arbitrary XML with SVG. Two other promising areas are streaming and modification of drawing order. SVG watching in 2003 should be as exciting as it was in 2002..." See: "W3C Scalable Vector Graphics (SVG)."

  • [December 11, 2002] "Feds Eye XML, Web Services." By Darryl K. Taft. In eWEEK (December 11, 2002). "The federal government is looking at XML and Web services as a key technology for integrating disparate systems and is developing an overall architecture to support interoperability based on XML. Robert Haycock, manager of the Office of Management and Budget's E-Gov Office, called XML and Web services the 'keys to the palace.' He said, 'We have so many legacy applications that Web services will help us to provide interoperability amongst and between these systems.' Haycock delivered a keynote speech at the XML 2002 Conference and Exposition here Tuesday... Haycock said the OMB is developing the Federal Enterprise Architecture (FEA), a business-based framework for governmentwide improvement and interoperability between systems in various government agencies. XML lies at the heart of the FEA, he said. The FEA is being developed through a series of five interrelated reference models designed for cross-agency integration of systems and reuse of technology and components. The FEA's Business Reference Model is out now, the Performance Reference Model will soon be out, and the three other reference models -- the Service Component Reference Model, the Technology Reference Model and the Data and Information Reference Model -- will be released by spring, Haycock said. The FEA will provide a business and technology framework for the 24 Presidential Management E-Gov initiatives and help align federal IT investments within the president's management agenda, Haycock said. Haycock called XML and Web services 'game changing' technologies that will facilitate horizontal and vertical information sharing and will provide an underlying framework for delivering services. 
Immediate uses for the FEA and the XML and Web services technology include integrating systems to support homeland security, disaster management, trade and financial management, among other applications, Haycock said..." See the Federal Enterprise Architecture website and the published reference models. The downloads page links to several XML resources, including (1) the FEA Business Reference Model (BRM) -- "This XML document contains the content of the Federal Enterprise Architecture relating to the Business Reference Model. It includes the Federal Business Areas, Lines of Business, Sub-Functions and their detailed descriptions. This document should be downloaded in conjunction with the FEA Business Reference Model (BRM) Schema"; (2) FEA Business Reference Model (BRM) XML Schema v1.0 -- "This document is used to describe and define the type of content including the entities, attributes, elements and notation of the Federal Enterprise Architecture"; (3) OMB 300 Submission XML Schema -- "This document is used to describe and define the type of content including the entities, attributes, elements and notation used for the submission of Exhibit 300s..."

  • [December 11, 2002] "Corel XMetaL 4 Adds ActiveX Support." By Darryl K. Taft. In eWEEK (December 10, 2002). "Corel Corp. highlighted the XML 2002 Conference and Exposition on Tuesday [2002-12-10] with a couple of announcements aimed at souping up XML content creation and simplifying the management and deployment of XML applications. The Ottawa company announced XMetaL 4, a new version of its XML authoring solution that adds support for Microsoft's ActiveX technology. Corel officials said the new product is targeted at both developers and users, and is made up of four separate components: Corel XMetaL Author, Corel XMetaL for ActiveX, Corel XMetaL Developer, and Corel XMetaL Central. Corel XMetaL 4 features integration capabilities with content management solutions and supports content re-use and distribution to the Web, print, and mobile devices, the company said. Corel also announced Corel Smart Graphics Studio, a development platform for building Scalable Vector Graphics (SVG)-based smart graphics, which Corel officials said could change old XML and legacy data into dynamic graphics for use on a variety of platforms and devices. The product will be available mid-2003..." Notes also on Veridocs Corp.'s XMLdocs technology and the VorteXML Server from Datawatch Corp. See: (1) 'Corel Smart Graphics Solutions White Paper'; (2) the announcement "Corel to Preview New Tools for Smart Content Creation at XML 2002. Company to Unveil New Technologies for Creating XML and SVG Applications."; (3) Corel Smart Graphics; (4) Corel XMetaL 4.

  • [December 11, 2002] "XMPP Core." By Jeremie Miller [WWW] and Peter Saint-Andre [WWW] (Jabber Software Foundation). IETF Network Working Group, Internet-Draft. December 06, 2002, expires June 6, 2003. Reference: 'draft-ietf-xmpp-core-00'. Relevant XML DTDs/Schemas are presented in appendices. "This document describes the core features of the eXtensible Messaging and Presence Protocol (XMPP), which is used by the servers, clients, and other applications that comprise the Jabber network... The eXtensible Messaging and Presence Protocol (XMPP) is an open XML protocol for near-real-time messaging and presence. The protocol was developed originally within the Jabber community starting in 1998, and since 2001 has continued to evolve under the auspices of the Jabber Software Foundation and now the XMPP WG. Currently, there exist multiple implementations of the protocol, mostly offered under the name of Jabber. In addition, there are countless deployments of these implementations, which provide instant messaging (IM) and presence services at and among thousands of domains to a user base that is estimated at over one million end users. The current document defines the core constituents of XMPP; XMPP IM defines the extensions necessary to provide basic instant messaging and presence functionality that addresses the requirements defined in RFC 2779 ['A Model for Presence and Instant Messaging']. General references in "Extensible Messaging and Presence Protocol (XMPP)." [cache]

  • [December 11, 2002] "XMPP Instant Messaging." By Jeremie Miller [WWW] and Peter Saint-Andre [WWW] (Jabber Software Foundation). IETF Network Working Group, Internet-Draft. December 06, 2002, expires June 6, 2003. Reference: 'draft-ietf-xmpp-im-00'. Appendices present 4 XML DTDs/Schemas. "This document describes the specific extensions to and applications of the eXtensible Messaging and Presence Protocol (XMPP) that are necessary to create a basic instant messaging and presence application... The core features of the XMPP protocol are defined in XMPP Core. These features, specifically XML streams and the 'jabber:client' and 'jabber:server' namespaces, provide the building blocks for many types of near-real-time applications, which may be layered on top of the core by sending XML stanzas that are scoped by specific XML namespaces. This document describes the specific extensions to and applications of XMPP Core that are used to create the basic functionality expected of an instant messaging and presence application as defined in RFC 2779. Extended namespaces for many other functionality areas have been defined and continue to be defined by the Jabber Software Foundation, including service discovery, multi-user chat, search, remote procedure calls, data gathering and forms submission, encryption, feature negotiation, message composing events, message expiration, delayed delivery, and file transfer; however, such functionality is not described herein because it is not required by RFC 2779..." General references in "Extensible Messaging and Presence Protocol (XMPP)." [cache]
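As a concrete illustration of the stanza model the two drafts describe, a chat message carried over an XML stream in the 'jabber:client' namespace might look like the following sketch (addresses and text are invented; consult the Internet-Drafts for normative syntax):

```xml
<!-- The stream element is the container for the whole session;
     individual stanzas such as <message/> flow inside it. -->
<stream:stream xmlns='jabber:client'
               xmlns:stream='http://etherx.jabber.org/streams'
               to='example.com'>
  <message to='juliet@example.com' from='romeo@example.net' type='chat'>
    <body>Wherefore art thou?</body>
  </message>
</stream:stream>
```

Extended functionality (service discovery, multi-user chat, and so on) is layered on by placing child elements scoped by additional namespaces inside such stanzas.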

  • [December 11, 2002] "JEP-0054: vcard-temp." By Peter Saint-Andre [WWW] (Jabber Software Foundation). JEP Number: 0054. Version 0.2. 2002-11-06. ['Informational documentation of the vCard-XML format currently in use within the Jabber community.'] A 2002-10-03 communiqué from Peter Saint-Andre references this informational Jabber Enhancement Proposal (JEP) that "more fully and accurately describes the vCard XML format used in the Jabber community. This document, numbered JEP-0054, is now the canonical description of the Jabber usage." From the introduction: "This JEP documents the vCard-XML format currently in use within the Jabber community. A future JEP will recommend a standards-track protocol to supersede this informational document. The basic functionality is for a user to store and retrieve an XML representation of his or her vCard using the data storage capabilities native to all existing Jabber server implementations. This is done by sending an <iq/> of type "set" (storage) or "get" (retrieval) to one's Jabber server containing a <vCard/> child scoped by the 'vcard-temp' namespace, with the <vCard/> element containing the actual vCard-XML elements as defined by the vCard-XML DTD. Other users may then view one's vCard information. vCards are an existing and widely-used standard for personal user information storage, somewhat like an electronic business card. The vCard format is defined in RFC 2426. In 1998, Frank Dawson submitted several Internet-Drafts proposing to represent the standard vCard format in XML. When the Jabber project was originally looking for a method to store personal user information, the only available XML representation of vCards was version 2 of Frank Dawson's Internet-Draft. He also submitted a third Internet-Draft on November 15, 1998. Unfortunately, neither proposal moved forward within the IETF. The Jabber project continued to use the version 2 proposal and DTD, and appeared to be the only users thereof. 
Because no similar DTD arose, the Jabber project continued to use the version 2 DTD, making two small modifications to adapt it for use within Jabber. This format was then implemented under the 'vcard-temp' namespace..." See: "vCARD in XML and RDF." [cache/snapshot]
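The set/get exchange the JEP describes can be sketched as follows (the id value and vCard contents are invented for illustration):

```xml
<!-- Retrieval: an <iq/> of type "get" with an empty <vCard/> child
     scoped by the 'vcard-temp' namespace. -->
<iq type='get' id='v1'>
  <vCard xmlns='vcard-temp'/>
</iq>

<!-- The server's reply carries the stored vCard-XML elements. -->
<iq type='result' id='v1'>
  <vCard xmlns='vcard-temp'>
    <FN>Juliet Capulet</FN>
    <EMAIL><USERID>juliet@example.com</USERID></EMAIL>
  </vCard>
</iq>
```

Storage works the same way with type='set' and a populated <vCard/> child.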

  • [December 10, 2002] "Too Many Web Services Standards Bodies?" By Paul Krill. In InfoWorld (December 10, 2002). "Representatives from Web services standardization bodies at a conference here Tuesday [Building a Web Services Foundation, December 10 - 11, 2002. Hotel Nikko, San Francisco, California] pondered the notion of whether there are, in fact, too many of these groups and whether it might be a good idea to consolidate efforts in one organization. Participating were officials from OASIS (Organization for the Advancement of Structured Information Standards), W3C (World Wide Web Consortium), WS-I (Web Services Interoperability Organization), and Liberty Alliance. The officials sat on the panel at a CNET Networks conference... 'It would be ideal if we had a single organization dealing with Web services standards, but I think it's a practical impossibility,' said Tom Glover, president and chairman of WS-I. Different organizations focus on different problems, he stressed... OASIS President and CEO Patrick Gannon echoed his sentiments, arguing that there is not one organization that governs every aspect of development of any single set of standards. 'I think what we have here is an opportunity for cooperation among different organizations,' he said... W3C's representative, Michael Sperberg-McQueen, architecture domain leader with the organization, said that W3C provides a neutral meeting ground for the Web community to find consensus and prevent the Web from dissolving into mutually non-interoperable sub-Webs... Liberty Alliance's Michael Barrett, meanwhile, said Web services enable coupling and interoperability of services developed over the Internet. He acknowledged that the Liberty Alliance's network identity mission is less Web services-centric than the other bodies represented on the panel. 
On the issue of intellectual property, panelists concurred that their organizations seek to include technologies in standards or specifications on a royalty-free basis, so as to not encumber the specification with obligations of licensing fees or royalties to particular vendors. 'My personal view is I wish Congress would reform [the law],' so that software cannot be patented, Barrett said. 'I personally believe it's not helpful to the industry' to patent software, he said. In another session pertaining to WS-I at the conference, Edward Cobb, a member of the WS-I board of directors and vice president of architecture and standards at BEA Systems, said security would be a focus of WS-I efforts going forward, as the group looks to provide guidance to standards bodies..."

  • [December 10, 2002] "VeriSign Unveils New Online Identity-Verification Services." By Todd R. Weiss. In Computerworld (December 10, 2002). "Trust services vendor VeriSign Inc. today unveiled a new online identity verification service that allows Web customers to positively establish their identities with online merchants. The new Consumer Authentication Service (CAS) provides online businesses with instant, round-the-clock information taken from more than 50 public and private databases to ensure a customer's identity, said Anil Chakravarthy, director of product management for VeriSign's enterprise group. The information comes from state motor vehicle department, government, credit bureau and telephone number databases, Chakravarthy said. Optimized scoring models are used to help businesses quickly and easily determine whether potential buyers are who they claim to be... The new service uses a predefined set of XML standards to connect to an enterprise's customer-facing Web application. The authentication data entered by the consumer is then automatically routed using XML and encryption through VeriSign's services and is checked against varied databases to cross-verify and risk-rank the consumer identity in real time. Verification is then automatically sent back to the enterprise application and to the consumer, using underlying XML data..." See the press release: "VeriSign Announces General Availability of Online Consumer Identity Verification Service. XML-Based Web Service Enables Enterprises to Conduct Real-Time, Identity Verification for Risk Management and Fraud Prevention."

  • [December 10, 2002] "VeriSign Draws Road Map For Web Services." By Brian Fonseca. In InfoWorld (December 10, 2002). "VeriSign drew up a road map of its Web services plans and predictions on Tuesday, backed by the debut of its two new Web services offerings. At the same time, company officials cautioned that quantifiable and positive results of Web services deployments will be out of reach until necessary integration and trusted security measures are put in place. VeriSign chairman, president, and CEO Stratton Sclavos said IT budgets should start loosening in 2003 to hammer out integration difficulties which plague Web services adoption. To illustrate his position, he pointed to such examples as the fierce platform battles surrounding Web services architecture and application development tools, active standards efforts, heavy vertical industry and U.S. government interest, and the flexibility that XML provides... To address that problem and tackle lingering security questions, VeriSign on Tuesday [2002-12-10] introduced its Open Source WS-Security Implementation and Integration Toolkit. Designed to assist developers in incorporating encryption and digital signatures for Web services, the toolkit marks an accelerated push by VeriSign to re-invent its full range of services to be Web services front-ended and delivered without human intervention. That initiative will be driven by a new platform being developed by VeriSign called a 'Trust Gateway.' The product will be integrated and configuration-ready with all major application server platforms and will comply with multiple Web services standards efforts... In addition to its Open Source Toolkit, VeriSign announced the availability of its Online Consumer Authentication Service. The service aims to provide customers with automated and managed access to authentication data through an application using pre-defined XML standards and encryption that cross-references and risk-ranks consumers' identity..." 
See: (1) the announcement: "VeriSign Offers Open Source WS-Security Implementation and Integration Toolkit to Help Developers Integrate Security Into Web Services. Effort Continues VeriSign's Commitment to Driving Trusted Web Services With Royalty-Free Implementation."; (2) "VeriSign Announces General Availability of Online Consumer Identity Verification Service. XML-Based Web Service Enables Enterprises to Conduct Real-Time, Identity Verification for Risk Management and Fraud Prevention."; (3) "Web Services Security Specification (WS-Security)."

  • [December 10, 2002] "XML Encryption Specs Approved." By Jim Hu. In CNET (December 10, 2002). "The Web's leading standards group approved on Tuesday two XML encryption specifications, a move that promises a boost for the development of secure Web services. The two specs, XML Encryption Syntax and Processing and Decryption Transform for XML Signature, will enable Web pages using Extensible Markup Language to encrypt parts of a document being exchanged between Web sites, the World Wide Web Consortium said. While other methods exist for encrypting XML documents, the W3C's specifications make it possible to encrypt selected sections or elements of a document--for instance, a credit card number entered in an XML form. 'This provides a way to identify parts of an XML document that may be secured by the author, so you can choose the parts that are most important and encrypt those,' said W3C representative Janet Daly. The new specifications are expected to help speed the development of Web services built on XML, code that lets developers create specialized languages for exchanging specific types of data..." See details in the 2002-12-10 news item "XML Encryption and Decryption Specifications Published as W3C Recommendations."
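Element-level encryption of the kind described works by replacing the sensitive element in place with an EncryptedData element. The sketch below follows the structure defined in the Recommendation; the payment vocabulary, algorithm choice, and cipher value are invented for illustration:

```xml
<!-- Only the credit card element of a payment record is encrypted;
     sibling data such as the customer name stays in the clear. -->
<PaymentInfo xmlns='urn:example:payment'>
  <Name>John Smith</Name>
  <EncryptedData xmlns='http://www.w3.org/2001/04/xmlenc#'
                 Type='http://www.w3.org/2001/04/xmlenc#Element'>
    <EncryptionMethod
        Algorithm='http://www.w3.org/2001/04/xmlenc#tripledes-cbc'/>
    <CipherData>
      <CipherValue>A23B45C56...</CipherValue>
    </CipherData>
  </EncryptedData>
</PaymentInfo>
```

A recipient holding the key decrypts the CipherValue and restores the original element; everyone else can still process the unencrypted parts of the document.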

  • [December 10, 2002] "Intel Conducts $5B in Transactions Via RosettaNet." By Tom Krazit. In InfoWorld (December 10, 2002). "Intel is forging ahead with making purchases and selling its products through the RosettaNet Internet exchange standard. More than 10 percent of its customer and supplier transactions were executed through RosettaNet in 2002, a total of about $5 billion, Intel said Tuesday. RosettaNet is a standard based on XML (Extensible Markup Language) for business-to-business e-commerce transactions within the semiconductor industry. It allows a company's suppliers and customers to electronically buy and sell products without the need to set up EDI (electronic data interchange) links or depend on paper orders, said Chris Thomas, chief strategist for Intel's solutions markets development group. By using RosettaNet, Intel can receive electronic orders from smaller suppliers that previously couldn't afford to set up EDI links, Thomas said. Over 90 companies are now trading with Intel through RosettaNet, which resulted in 10 times the number of customer orders and five times the number of supplier transactions in 2002 as compared to 2001, he said... RosettaNet reduces the time to process orders from a day for paper orders to less than an hour, Thomas said. Intel was an original member of the RosettaNet consortium, which was formed in 1998. Over the last five years the consortium has slowly grown from its first connection between Intel and Arrow Electronics, an electronic component manufacturer, to the 90 partner companies currently connected. About 30,000 transactions per month are now executed over the RosettaNet standard. Other large industry consortiums trying out similar processes include CIDX, a chemical industry consortium, and UCCnet in the retail industry, Derome said. RosettaNet is in its early stages, and the EDI standard remains the choice of most businesses for electronic transactions between suppliers and producers, Derome said. 
But RosettaNet is the pre-eminent standard for XML-based electronic business, he said..." See: (1) "Intel Conducts $5 Billion in RosettaNet e-Business, Web Services. e-Business Technology Provides Productivity Gains and Faster Supply Chain Throughput."; (2) general references in "RosettaNet."

  • [December 10, 2002] "Intel To Use RosettaNet Standards. Chipmaker Aims to Squeeze $500 Million Out of Its Supply Chain." By Larry Greenemeier. In InformationWeek (December 09, 2002). "Intel is doing more for E-business than simply making faster processors. The chipmaker also is a big proponent of business-to-business integration and revealed Tuesday that it will process $5 billion (10%) of its revenue and supplier purchases using RosettaNet E-business technology standards. RosettaNet is an XML-based standard introduced in 1998 that lets supply-chain partners automate IT system interactions responsible for collaborative demand forecasting, order management, shipping and receiving logistics, invoicing, and payments. Intel says its long-term goal is to squeeze as much as $500 million in annual costs out of its supply chain, although the company hasn't publicly set a deadline for this. Much of these savings will come from eliminating the use of electronic data interchange to communicate with suppliers and customers, although this could take as long as a decade, says Chris Thomas, Intel's chief Web services strategist... Intel positions RosettaNet as a favorable alternative [to EDI] because it's a standard that multiple companies can adopt, eliminating the need for individual customer interfaces. Thomas says companies are missing out on cost savings if they wait for widespread adoption of Web-services standards SOAP and UDDI rather than using XML today. He noted that all 90 of Intel's trading partners in 17 countries are using RosettaNet standards. Says Thomas, 'XML is a key to getting started with Web services'..." See the 2002-12-10 announcement: "Intel Conducts $5 Billion in RosettaNet e-Business, Web Services. e-Business Technology Provides Productivity Gains and Faster Supply Chain Throughput." General references in "RosettaNet."

  • [December 10, 2002] "OASIS Tackles XML for Taxes." By Paul Krill. In InfoWorld (December 10, 2002). "A new OASIS Tax XML Technical Committee, featuring representatives from government agencies in Canada, Germany, the Netherlands, the United Kingdom, and the United States, plus businesses and financial institutions, will develop a common vocabulary for tax reporting and compliance information. The project will enable exchange of tax information across international boundaries based on XML, said Gregory Carson, director of electronic tax administration modernization for the U.S. Internal Revenue Service, in Washington. For example, income in XML would mean the same in different countries, he said. The work of other efforts, such as ebXML (electronic business XML) and XBRL (Extensible Business Reporting Language), will be leveraged by OASIS, he said. Most data interchange related to taxes currently is done via EDI, but the IRS plans to develop using XML, Carson said. 'It will probably be a slow transition from the EDI format to XML,' as most businesses, at least in the United States, have invested in EDI infrastructures, Carson said. 'We're going to keep [EDI] up and running so we don't pull the rug out from underneath [businesses using EDI],' Carson said. 'But as the IRS modernizes with new e-filing offerings, we want to do that in XML so we have that flexibility without the overhead of EDI infrastructure'..." See details in: (1) 2002-11-11 news item "OASIS Members Form Tax XML Technical Committee"; (2) 2002-12-09 announcement "OASIS Members Form Tax XML Technical Committee"; (3) general references in "XML Markup Languages for Tax Information."

  • [December 10, 2002] "The Standard Application Model for Topic Maps." By Lars Marius Garshol and Graham Moore (JTC1/SC34). With cover page. Document reference: ISO/IEC JTC 1/SC34 N0356. Editor's draft for review and comment. Date: 2002-12-04. Produced for ISO 13250 (Project editors: Steven R. Newcomb, Michel Biezunski, Martin Bryan), ISO/IEC JTC 1/SC34 Information Technology - Document Description and Processing Languages. ['The Standard Application Model for Topic Maps (SAM) is a formal data model for topic maps, which will be used to define the topic map syntaxes (XTM and HyTM), and also be the foundation for the query language (TMQL) and the schema language (TMCL).'] "This document defines the structure and interpretation of topic maps by defining the semantics of topic map constructs using prose, and the structure of such constructs using a semi-formal data model. Together with the Reference Model specification and the HyTM syntax specification this document will supersede ISO 13250 [ISO/IEC 13250:2002 Topic Maps, ISO, Geneva, 2002]. Together with the XTM syntax specification this document will supersede XTM [XML Topic Maps (XTM) 1.0 Specification, TopicMaps.Org, 2001]. It is intended to become part of the new ISO 13250 standard... This international standard requires that topic map implementations must have internal representations of topic maps that have a clearly documented correspondence to the model defined in this document. This international standard also defines a number of structural constraints and operations on instances of the model, to which its implementations must conform. The process of exporting topic maps from an implementation's internal representation to an instance of a topic map syntax is known as serialization. The opposite process, that of building such a representation from information encoded using a topic map syntax, is known as deserialization. 
Conforming specifications of topic map syntaxes must define these processes in terms of the model specified in this specification, although they are not required to define both operations. A syntax may be conforming even if it does not represent all the information in this model. A topic map processor is any module or system that can process topic maps in conformance with this standard. It is assumed that the topic map processor does its work on behalf of another module known as the topic map application. This international standard assumes that a topic map processor will do deserialization on behalf of the application, and that the processor will manage the topic maps on behalf of the application..." The SAM home page provides other references. For a TM document overview, see: (1) "Guide to the Topic Map Standardization"; (2) "ISO SC34 Publishes Draft Reference Model for Topic Maps (RM4TM)." General references in "(XML) Topic Maps."
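The serialization/deserialization distinction the draft draws can be illustrated with a toy round trip. This sketch is a drastic simplification, not the SAM itself; the flat topic structure and the element names are loosely borrowed from XTM 1.0 for illustration only:

```python
import xml.etree.ElementTree as ET

# Hypothetical in-memory representation: a topic map as a list of topics,
# each with an id and one base name (far simpler than the real SAM).
topic_map = [
    {"id": "xml", "name": "Extensible Markup Language"},
    {"id": "tm", "name": "Topic Maps"},
]

def serialize(topics):
    """Export the internal representation to an XTM-like syntax instance."""
    root = ET.Element("topicMap")
    for t in topics:
        topic = ET.SubElement(root, "topic", {"id": t["id"]})
        base = ET.SubElement(topic, "baseName")
        ET.SubElement(base, "baseNameString").text = t["name"]
    return ET.tostring(root, encoding="unicode")

def deserialize(xml_text):
    """Rebuild the internal representation from the serialized form."""
    root = ET.fromstring(xml_text)
    return [
        {"id": topic.get("id"),
         "name": topic.find("baseName/baseNameString").text}
        for topic in root.findall("topic")
    ]

round_tripped = deserialize(serialize(topic_map))
```

A conforming syntax definition, in the draft's terms, would pin down exactly these two mappings against the standard model rather than against an ad hoc structure like the one above.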

  • [December 09, 2002] "SAML Unlocks Door to Web Services." By Jim Rapoza. In eWEEK (December 09, 2002). "Early last month, a key element in using Web services for business applications reached a milestone when SAML 1.0 was released as a standard by the XML consortium OASIS, or Organization for the Advancement of Structured Information Standards. Security Assertion Markup Language, which is based on XML, provides a framework for authentication and authorization in Web services -- something that has been sorely missing. SAML also makes it possible to provide single-sign-on capabilities, one reason that it is a core technology behind the Liberty Alliance's ID management effort. Although not all security and access control applications may be up to the final standard specification, many already incorporate some form of SAML support. This isn't surprising, given that the SAML working group comprises representatives from most of the leading authentication vendors. However, even if your business isn't using one of these applications, incorporating SAML into your Web services is not difficult. eWeek Labs found the SAML specification to be simple and straightforward. If you can write an XML-based Web service, you can easily define authentication using SAML. In its most basic form, SAML associates an identity (such as an e-mail address or a directory listing) with a subject (such as a user or system) and defines the access rights for this subject within a specific domain. One of the biggest strengths of SAML is how well it can interoperate with any kind of system. For example, when it comes to authentication, SAML supports almost everything, from passwords to hardware tokens to public keys to secure certificates. SAML also has built-in support for XML signatures, making it possible to handle not only authentication but also message integrity and nonrepudiation of the sender... 
SAML can handle single-sign-on capabilities because a SAML authentication authority can receive and send authentication assertions. This means that as a user authenticates and takes actions in a domain, the SAML authority is aware of past authorizations and assertions..." See: "Security Assertion Markup Language (SAML)."
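A minimal authentication assertion of the kind the article describes can be sketched as follows. The namespace is SAML 1.0's assertion namespace, but the attribute set is abbreviated (the full schema requires version attributes, among other things), and the issuer, ID, and timestamps are hypothetical:

```python
import xml.etree.ElementTree as ET

SAML = "urn:oasis:names:tc:SAML:1.0:assertion"
ET.register_namespace("saml", SAML)

# Sketch: an asserting authority binds an identity (an e-mail address)
# to a subject via an authentication statement. Illustrative, not a
# complete or normative SAML 1.0 assertion.
assertion = ET.Element(f"{{{SAML}}}Assertion", {
    "AssertionID": "a1",                   # hypothetical ID
    "Issuer": "https://idp.example.com",   # hypothetical authority
    "IssueInstant": "2002-12-09T00:00:00Z",
})
stmt = ET.SubElement(assertion, f"{{{SAML}}}AuthenticationStatement", {
    "AuthenticationMethod": "urn:oasis:names:tc:SAML:1.0:am:password",
    "AuthenticationInstant": "2002-12-09T00:00:00Z",
})
subject = ET.SubElement(stmt, f"{{{SAML}}}Subject")
name_id = ET.SubElement(subject, f"{{{SAML}}}NameIdentifier")
name_id.text = "alice@example.com"

xml_text = ET.tostring(assertion, encoding="unicode")
```

Single sign-on then amounts to a second domain accepting this assertion from the authority instead of re-authenticating the user itself.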

  • [December 09, 2002] "Office 11 Gets Developer Tools." By Darryl K. Taft. In eWEEK (December 09, 2002). "Microsoft Corp. Monday announced a new set of Visual Studio-based tools to help developers build business solutions based on the next version of Microsoft Office, code-named Office 11... Known as Visual Studio Tools for Office, the new offering is a set of tools, frameworks, Office integration solutions and customer-assistance solutions from Microsoft. The technology includes classes and libraries for Visual Studio .Net languages -- such as Visual Basic .Net and Visual C++ .Net -- and the .Net Framework. Users will be able to build solutions based on Microsoft Word and Excel documents, as well as several new XML-focused best practices. Visual Studio Tools for Office represents a convergence of Microsoft's developer tools with its Office programming model. Company officials said Visual Studio Tools for Office will be available with the release of Office 11, slated for mid-2003. Office 11 features enhanced XML support and also supports user-defined XML Schema Definitions (XSDs). XSDs enable users to structure their data to fit their business needs, allowing companies to create customized Office solutions in documents that interact with other XML-based Web services. The new tools also support the Extensible Stylesheet Language (XSL) and XPath..." See details in the announcement: "Microsoft Empowers Developers to Build Smart Business Solutions With 'Office 11'. Developers Gain Enhanced XML Support and New 'Visual Studio Tools for Office'."

  • [December 09, 2002] "USATODAY.com Puts XML to Work At The Editor's Desk." By Rich Seeley. In Application Development Trends (December 04, 2002). "... While editors at USA Today still mark up stories, they now use the Extensible Markup Language (XML), the electronic descendant of the old pen and pencil marks that date back to editors in green eyeshades. Editors for the Web site are also beginning to work with software tools that scan stories and automatically compose headlines, write summaries of the content and list 10 keywords for search engines. The XML editorial system at USATODAY.com, which rolled out this summer, began with a commitment to implementing XML technology for editing and publishing news stories, according to Chat Joglekar, information management architect for the project. The development project, aimed at producing faster, more intelligent news coverage, began a year-and-a-half ago, he added... Joglekar said he was able to buy two applications -- XMetal Editor from SoftQuad, now part of Corel Corp., Ottawa; and XML Categorizer and Concept Tagger ontology-based tools from Los Angeles-based Applied Semantics Inc. -- and plug them into the system. The editors at USATODAY.com use XMetal to add XML tags to the wire stories that come into the system from Reuters, the Associated Press and other news services, Joglekar said. While there is an XML-based News Markup Language, called NewsML, USATODAY.com created its own DTD that is 'loosely based' on it, but which incorporates document definitions specific to the way editors for the Web site work with stories, he added. 'It's basically our own XML format,' Joglekar said. 'It incorporates stuff we use for the meta data we want to capture, including the type of handling we do -- such as where it will end up on the site.' Applied Semantics' Categorizer and Concept Tagger tools are used to automate some of the work copy editors have traditionally done in news organizations. 
Based on an ontology of categories and terminology, the tools scan news stories for key terms and generate headlines, summaries and keyword lists. Applied Semantics maintains an ontology of millions of terms specifically geared to the news business, so organizations do not have to customize it..." See "NewsML."
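The keyword-listing step can be approximated very crudely without an ontology at all. The frequency-count sketch below is only a stand-in for Applied Semantics' ontology-based approach, which maps terms to concepts rather than counting surface words:

```python
import re
from collections import Counter

# Toy keyword extractor: rank the most frequent non-stopword terms in a
# story as candidate keywords. A real categorizer would consult an
# ontology of terms; this is purely illustrative.
STOPWORDS = {"the", "a", "an", "of", "in", "to", "and", "for", "on", "is"}

def keywords(story, n=10):
    words = re.findall(r"[a-z]+", story.lower())
    counts = Counter(w for w in words if w not in STOPWORDS)
    return [word for word, _ in counts.most_common(n)]

story = ("The editors use XML tags on wire stories. XML tags let the "
         "site route stories to the right section of the site.")
top = keywords(story, 3)
```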

  • [December 09, 2002] "Towards a Digital Rights Expression Language Standard for Learning Technology." Draft DREL White Paper for discussion. A Report of the IEEE Learning Technology Standards Committee Digital Rights Expression Language Study Group. Principal Authors: Norm Friesen, Magda Mourad, Robby Robson. Contributing Authors: Tom Barefoot, Chris Barlas, Kerry Blinco, Richard McCracken, Margaret Driscoll, Erik Duval, Brad Gandee, Susanne Guth, Renato Ianella, Guillermo Lao, Hiroshi Maruyama, Kiyoshi Nakabayashi, Harry Picarriello, Peter Schirling. 29 pages. Posted 2002-12-09 by Robby Robson (Eduworks) to the LTSC-DREL mailing list. "This report has been generated within the context of the Learning Technology Standards Committee (LTSC) of the Institute of Electrical and Electronics Engineers (IEEE). The LTSC develops accredited technical standards, recommended practices and guides for learning technology. 'Digital rights' is an area of vital importance to all industries that deal with digital content, including the industries of learning, education, and training. As a consequence the LTSC formed a study group in 2002 to examine digital rights standards and standards development efforts in light of applications to learning technology. This effort has focused on digital rights expression languages, i.e., languages in which rights can be expressed and communicated among cooperating technologies. Digital rights themselves exist as policy or law and are therefore not within the scope of a standards development organization. Technology is involved in enforcing digital rights, for example by disabling the ability to make unauthorized copies, but the LTSC almost exclusively deals with standards that support interoperability and not with implementation issues of this type. 
In this spirit the LTSC study group concentrated on making recommendations for standardizing a digital rights expression language with the specific charge to (1) Investigate existing standards development efforts for digital rights expression languages (DREL) and digital rights. (2) Gather DREL requirements germane to the learning, education, and training industries. (3) Make recommendations as to how to proceed. Possible outcomes included, a priori, recommending the adoption of an existing standard, recommending the creation of an application profile of an existing standard, or creating a new standard from scratch. (4) Feed requirements into ongoing DREL and digital rights standardization efforts, regardless of whether the LTSC decides to work with these efforts or embark on its own. This report represents the achievement of these goals in the form of a white paper that can be used as a reference for the LTSC, that makes recommendations concerning future work, and that can be shared with other organizations..." See background in LTSC DREL Project and general references in "XML and Digital Rights Management (DRM)." [source .DOC]

  • [December 09, 2002] "What's New for 'Office 11' Developers?" By Paul Cornell (Microsoft Corporation). From Microsoft MSDN Library (December 09, 2002). ['Paul Cornell delves into Office 11 Beta 1 and provides an overview, as well as examples and screenshots, of the many enhanced features for developers.'] "The next version of Microsoft Office, code-named Office 11, offers a wide array of solution development enhancements and options. Although Office 11 is not scheduled to release until mid-2003, I'll provide you with a first look at some possibilities for Office 11 solution developers... Office 11 provides lots of improvements over Office XP solution development, including: (1) Expanded XML support in Word 11 and Excel 11, including object model programmability and the use of customer-defined XML schemas. (2) A new Office 11 solutions model called smart documents. (3) Visual Studio Tools for Office. (4) Significant smart tag enhancements. (5) List technology. (6) Office 11 object model additions. (7) A new business-form oriented product code-named Microsoft XDocs, which includes a programmable object model... Word 11 supports XML file creation, storage, editing, and XSL data transformation. In Microsoft Word 2000 and 2002, documents saved in HTML format had some islands of XML data saved within them, but you could not use these products to natively create or save XML documents. In Word 11, you can save documents into several different XML document formats, including full-fidelity round-trippable Word documents conforming to the Microsoft Word XML document schema or other customer-defined XML schemas. You can even apply XSL transforms to XML data during save operations... A new Office 11 family application, code named Microsoft XDocs, provides an intuitive, graphical user interface for filling out, reviewing, signing, and submitting rich, dynamic business forms in a flexible, XML-based format, without exposing the inner workings of XML to your end users. 
XDocs form developers can write code to process the submitted XML data behind the online business forms according to a business's needs. XDocs form developers can modify the sample XDocs business forms out of the box to fit their unique business needs, create and design new XDocs forms for end users, and modify existing XDocs XML Schemas for advanced form template creation and customization, in many cases using the XDocs graphical design environment. XDocs form developers can programmatically automate and extend XDocs forms through Web scripting code, authored in Microsoft Visual Basic Scripting Edition (VBScript) or Microsoft JScript, using the Microsoft Script Editor. XDocs business forms are actually a collection of XML data files, XML Schema files, XSL Transformation files, and Web scripting files. An XDocs form template includes a form definition file, which is a manifest of all of the component files that make up an XDocs form template file..." See: "Microsoft Office 11 and XDocs."
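The "manifest of component files" idea can be sketched in miniature. The element and attribute names below are invented for illustration, since the article does not show the real XDocs form-definition format:

```python
import xml.etree.ElementTree as ET

# Hypothetical form-definition manifest: one entry per component file
# (schema, view transform, script) making up a form template.
manifest_xml = """
<formTemplate name="expense-report">
  <file href="schema.xsd" type="schema"/>
  <file href="view1.xsl" type="view"/>
  <file href="script.js" type="script"/>
</formTemplate>
"""

root = ET.fromstring(manifest_xml)
components = {f.get("href"): f.get("type") for f in root.findall("file")}
```

A form runtime would then load each component by role: validate submitted data against the schema, render with the view transform, and attach the script for custom processing.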

  • [December 09, 2002] "What Is XSL-FO? When Should I Use It? [Tech Watch.]" By Stephen Deach (Adobe). In The Seybold Report Volume 2, Number 17 (December 9, 2002), pages 3-8. ISSN: 1533-9211. ['XSL-FO stands for eXtensible Stylesheet Language formatting objects. It's a language for completely describing a styled document. The second question "When Should I Use It?" is not so easy, but we can oversimplify by saying that layout- and design-intensive publications are probably not good candidates for the XSL-FO approach. Content-driven documents fit the model much better, especially if they must be personalized, produced in high volume or generated in response to Web queries.'] "Generically, XSL is actually a family of three Recommendations produced by the W3C's XSL Working Group: XSL Transformations (XSLT), XML Path Language (XPath), and eXtensible Stylesheet Language (the specific use of XSL). To ease the confusion over the specific and generic uses of XSL, most people refer to the last specification (which actually specifies the formatting objects) as XSL-FO... XSL-FO is an intermediate form between media-neutral XML and media-dependent output. You feed your XML structured content and an XSLT stylesheet to an XSLT processor. The result is XSL-FO. Then you feed the XSL-FO, along with font metrics and any external graphics, into an XSL-FO formatter. The result is a paginated document (in PDF or printer-native code) that can be displayed or printed... The requirements of the publishing industry are broad and diverse. Some documents require the creative professional to interactively modify the layout, the content and the typography to get an acceptable result; other documents are best generated and formatted on demand using rule-based systems. XSL-FO is designed to fill a hole in the format-on-demand sector of this industry; thus it is an ideal fit for batch-pagination products such as Adobe Document Server. 
The requirements of the format-on-demand sector are quite different from those addressed by interactive, WYSIWYG composition tools such as FrameMaker and InDesign. Today's WYSIWYG composition tools are very good at creating and perfecting individual documents, which is the primary thrust of much of the traditional publishing industry. In these WYSIWYG products, the content and styling and layout are tightly bound within the document, so reflowing the document to an alternate medium (display vs. paper, displays with differing sizes or form-factors) is difficult. Another area where WYSIWYG tools are not so good is producing thousands of documents whose content is pulled from a database and poured into templates. Examples include insurance policies, contracts, manuals, mail-order catalogs with regional or demographic variations, and textbooks and teaching aids with state-by-state or school-district-specific variations. In circumstances like these, the important issue is the ease with which whole collections of documents can be given a common look and feel. In content-driven print-production environments that have hierarchical styling models (such as those commonly used with SGML), users will have no trouble adapting to the use of XSL-FO. It is only necessary to learn XSL-FO's element names, attribute names and attribute-value names... XSL-FO has been a recommendation for only about a year. We are just starting to see the first tools come on the market that simplify its authoring. This is about the same rate of progress as with later versions of HTML, and is faster than we saw with CSS. It took about four years for CSS-1 to be fully adopted. As Web standards become more sophisticated, the adoption rates become slower. For example, the adoption of the majority of CSS-2 is progressing at a somewhat slower rate than CSS-1. Therefore, I expect the significant adoption of XSL-FO to develop over a three-to-five-year time frame..." 
For related resources, see "Extensible Stylesheet Language (XSL/XSLT)."
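The intermediate form that pipeline produces is itself just XML. A minimal XSL-FO result document, of the kind an XSLT stylesheet would emit for a formatter to paginate, can be sketched and sanity-checked like this (element names follow the XSL 1.0 Recommendation; the page geometry and text are arbitrary):

```python
import xml.etree.ElementTree as ET

FO = "http://www.w3.org/1999/XSL/Format"

# Minimal XSL-FO document: a layout-master-set defining one A4 page
# master, and a page-sequence whose flow holds a single block.
fo_doc = f"""
<fo:root xmlns:fo="{FO}">
  <fo:layout-master-set>
    <fo:simple-page-master master-name="page"
        page-width="210mm" page-height="297mm">
      <fo:region-body/>
    </fo:simple-page-master>
  </fo:layout-master-set>
  <fo:page-sequence master-reference="page">
    <fo:flow flow-name="xsl-region-body">
      <fo:block font-size="12pt">Hello, XSL-FO.</fo:block>
    </fo:flow>
  </fo:page-sequence>
</fo:root>
"""

root = ET.fromstring(fo_doc)
blocks = root.findall(f".//{{{FO}}}block")
```

An XSL-FO formatter (not shown here) would consume this document, together with font metrics, and produce the paginated PDF or printer-native output Deach describes.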

  • [December 09, 2002] "Advent Boosts XML and Scripting In New 3B2. XSL-FO and SVG Support, Perl Scripts Within Commands, More Table Options." By George Alexander. In The Seybold Report Volume 2, Number 17 (December 9, 2002). ISSN: 1533-9211. "Advent has announced a new release (version 8) of its 3B2 publishing system software. The package will not be generally available until the second quarter of 2003, but a prerelease version was circulated to customers at a recent user gathering. The new version touches on all aspects of the package, but its focus is making 3B2 a better and more versatile platform for composing XML documents. A variety of specifically XML-based features have been added. These include a full, validating XML parser plus processors for XPath (the XSL tree-processing and node-identification standard) and XSLT (the XSL transformation processing standard). In addition to a forthcoming 'free or nearly free' stand-alone XSL-FO rendering engine, support for XSL-FO is being provided within 3B2. There will also be support for SVG (the XML-based scalable vector graphics standard), to the extent that SVG graphics can be represented within the 3B2 graphics model. In anticipation of the requirements of an XML environment, a lot of scripting-related features have been added. Several of these involve support for Perl scripts. Scripts can be added to layout commands, permitting the arguments of the commands to be calculated by the scripts. Conversely, 3B2 expressions can be evaluated within Perl scripts using a new 'expr3b2' function. There are numerous improvements to the formatting and layout functionality of 3B2. Several of these are related to tables. For example, there are new ways to specify options (e.g., 90-degree rotation of the whole table) that should be tried when tables are too wide to fit the column measure. Tables can now be nested within tables. 
Widow-and-orphan control, previously available in running text but not in tables, has been added for tables... There are dozens of other enhancements, but you would have to be a 3B2 user to appreciate most of them. For example, there are several 3B2 functions that formerly used special characters and have now been given names..."

  • [December 09, 2002] "Tridion's Strength in CM Basics Earns Success." By Luke Cavanagh. In The Seybold Report Volume 2, Number 17 (December 9, 2002). ISSN: 1533-9211. ['A strong XML story leads the way for Amsterdam vendor. Along with a useful 'blueprinting' approach to multilingual publishing and related versioning requirements, it has strong in-the-box functionality for standard operations'] "Through a commitment to standards and recognition of the need for multilingual Web publishing, Tridion has established a solid footing in the European content-management market, competing with the likes of Interwoven and Vignette -- North American vendors with a global presence -- as well as European vendors Mediasurface and Gauss... The current challenge for content-management vendors is not so much to invent ground-breaking features as it is to arrange the individual pieces of the content-management puzzle into a cohesive whole that strikes a balance between what's offered out of the box and what can be customized. With roots in publishing and attention to standards, particularly XML and those related to integration of third-party applications, Tridion presents a strong mix of framework and functionality in R5... Tridion R5 uses a database-driven approach running Oracle, Microsoft SQL Server, or (new in R5) IBM's DB2 underneath. Those that want to use Software AG's Tamino XML Server have the option to do so... Using XML to separate content from presentation and delivery, the system is well suited for multi-channel delivery. Tridion even says its foundation is suited for handling print if need be; however, few customers use the system this way and it lacks features such as publication or edition views that are usually necessary to spin print publications out of a content-management system. Tridion is planning to introduce a full 'print integration' module in 2003. 
Much of the static content that comes from the system -- and is stored in XML in the content repository -- can be authored through content input forms and poured into componentized page templates. R5 streamlines the set-up process for authoring by automatically building input forms, if the data structures are based on XML schemas. XML content can also be authored in Word or XMetaL using direct plug-ins to R5, and Tridion recently completed two-way integration with Macromedia's Dreamweaver MX... With an easy DTD editing interface and support for schemas, XSLT, XLink and Web services, R5's XML story is more advanced than many of its Web-centric content-management counterparts. Because of its technical underpinnings, Tridion will be well-positioned to take a leadership position in cross-media publishing, once it completes the planned print integration module..."
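The schema-driven form building described above can be sketched in miniature: walk a schema's element declarations and emit one input field per string-typed element. The schema content and the HTML shape are invented for illustration; Tridion's actual form builder is not documented in the article:

```python
import xml.etree.ElementTree as ET

XSD = "http://www.w3.org/2001/XMLSchema"

# Hypothetical content schema: an article with two string fields.
schema = f"""
<xs:schema xmlns:xs="{XSD}">
  <xs:element name="article">
    <xs:complexType>
      <xs:sequence>
        <xs:element name="headline" type="xs:string"/>
        <xs:element name="body" type="xs:string"/>
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>
"""

# Collect the string-typed element declarations and generate one
# labeled input field for each -- the essence of "auto-building" a
# content input form from the schema.
root = ET.fromstring(schema)
fields = [e.get("name") for e in root.iter(f"{{{XSD}}}element")
          if e.get("type") == "xs:string"]
form_html = "".join(
    f'<label>{name}<input name="{name}"/></label>' for name in fields)
```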

  • [December 09, 2002] "[W3C TAG] Draft of Position on SOAP's use of XML Internal Subset." By Noah Mendelsohn (IBM). Posted 2002-12-06 to the W3C 'xml-dist-app' list, "a forum for discussion of XML in distributed applications, network protocols, and messaging systems," used by the W3C XML Protocol Working Group for its technical discussions. "A few days ago I was asked to draft a note that would explain to the TAG [Technical Architecture Group] and other concerned members of the W3C community some of the reasons behind SOAP's restrictions on the use of XML... The XML Protocols Workgroup appreciates this opportunity to clarify our design decisions regarding use of XML features such as the Internal Subset (for those not familiar with the term, 'Internal Subset' is the official term for a DTD that appears within an XML document)..." See "Simple Object Access Protocol (SOAP)."
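SOAP's restriction on DTDs (and hence the internal subset) can be enforced mechanically before a message is processed. The guard below is a deliberately simplistic sketch; a real SOAP processor would fault such messages during parsing rather than via a string scan, and the sample envelopes are illustrative:

```python
import re

# SOAP forbids messages from carrying a DTD; a DOCTYPE declaration in
# the prolog is therefore grounds for rejection.
def has_doctype(message: str) -> bool:
    return re.search(r"<!DOCTYPE", message) is not None

ok_msg = ('<?xml version="1.0"?>'
          '<env:Envelope '
          'xmlns:env="http://schemas.xmlsoap.org/soap/envelope/"/>')
bad_msg = ('<?xml version="1.0"?>'
           '<!DOCTYPE env:Envelope [<!ENTITY x "y">]>'
           '<env:Envelope '
           'xmlns:env="http://schemas.xmlsoap.org/soap/envelope/"/>')
```

Beyond interoperability, rejecting DTDs also closes off entity-expansion tricks, one practical reason a messaging protocol might exclude them.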

  • [December 09, 2002] "Navy Preps XML Policy. Policy Seeks to Drive Data Interoperability." By Matthew Frenck. In Federal Computer Week (December 09, 2002). "The Navy Department is finishing a policy that, for the first time, will set standards for the Navy's use of Extensible Markup Language as it attempts to put more of its applications and data online. Navy chief information officer David Wennergren said he expects to sign the final policy this week. The document, which has been widely circulated within the department, will set the standard for how XML will be used within the service so that XML-tagged data is fully interoperable servicewide. The policy will outline how the Navy will implement XML to better find, retrieve, process and exchange data. Navy officials said they want sailors and officers to be able to send out all information, anywhere, anytime, but this has been difficult, with disparate systems and applications spread throughout the service. XML facilitates information exchange among applications and systems because it enables agencies to tag data and documents... The overall goals of the policy are to promote XML as an enabling technology to help achieve enterprise interoperability Navy-wide and serve as a guideline to support interoperability between the Navy and other DOD components. Jacobs said five teams would each have a different responsibility in determining the best way to begin implementing the policy [Five teams: strategic vision team, standard implementation team, enterprise implementation team, marketing and outreach team, and unnamed fifth team, which will work on integrating the acquisition and budgeting requirements]. However, a timeline has yet to be established for when XML deployment across the Navy would be completed. The Navy's XML standard, which also applies to the Marine Corps, already is receiving high marks from other government XML leaders. 
"I read their policy document and found it to be excellent and comprehensive -- the best I have seen in the federal government, or anywhere for that matter," said Brand Niemann, a computer scientist and XML Web services solutions architect with the Environmental Protection Agency. Niemann also heads the XML Web Services Working Group established by the CIO Council. Niemann said it's unusual for any agency as large as the Navy to have a comprehensive policy regarding XML deployment, and that other agencies should follow the Navy's lead..." See references in: (1) "DON XML WG XML Developer's Guide" [Version 1.1, from the DON US Department of Navy XML Working Group, edited by Brian Hopkins; May 2002. 97 pages]; (2) "U.S. Federal CIO Council XML Working Group Issues XML Developer's Guide"; (3) Navy XML Quickplace; (4) DON Interim Policy; (5) "Navy CALS Initiatives XML."

  • [December 09, 2002] "Schema Checklist for Navy XML Developer's Guide 1.1." DONXML Schema Checklist. By Mark Wolven. November 22, 2002. 3 pages. "The purpose of the DONXML Schema Checklist is to help developers create schemas that comply with Version 1.1 revised of the DON XML Developer's Guide. The guide contains more detail on the items noted in this checklist. Items in this checklist are designated according to their requirement level (MUST, SHOULD) and reference the appropriate section of the Developer's Guide. For more information about the efforts of the DON XML Work Group or to view the latest version of the Developer's Guide, visit the Department of the Navy XML Quickplace. For more information about the DONXML Work Group, the Developer's Guide, or this checklist, please contact us." Available from the Navy XML Quickplace, Home > Library > Developer's Guide.

  • [December 09, 2002] "XML - an Enabling Technology to Achieve Interoperability." By Dan Porter (The Department of the Navy, Chief Information Officer). In CHIPS Magazine (Fall 2002). [US Department of the Navy Information Technology Magazine] "As we advance in the pursuit of our goals of Web enablement, the use of authoritative data sources, and information portals, XML will be a key component of our overall enterprise architecture and information technology strategy. The Department of the Navy Chief Information Officer (DON CIO) is committed to leading the Department in fully exploiting XML as an enabling technology to achieve interoperability in support of maritime information superiority. In both the public and private sectors, organizations have struggled for years to improve data sharing. Within the Department of Defense, efforts to standardize data have achieved varying levels of success. The ability to exchange data seamlessly, however, is a prerequisite to network-centric operations. To achieve our network-centric goals, the Department of the Navy must be proactive in addressing data exchange challenges. XML provides an alternative to previous data exchange techniques that have fallen short in achieving interoperability objectives. Although no single technology can solve interoperability problems, XML is an additional tool that DON developers and data architects can employ to achieve our goals... Prior to my tenure as the DON CIO, while serving in the capacity of DON Acquisition Reform Executive, one of my responsibilities was to lead the Department in the elimination of unique government-developed standards. The changing landscape of information technology demands that the government participate more actively in Voluntary Consensus Standards bodies. All of the current Internet standards, including XML, have been developed by such bodies. To shape our destiny, we must actively engage with these groups to ensure that DON requirements are met. 
The DON CIO recently joined one of these bodies - the Organization for the Advancement of Structured Information Standards (OASIS). With this membership DON personnel can participate in technical committees that are setting the standards for XML data exchange."

  • [December 09, 2002] "Promise and Potential: The Navy Works to Leverage XML's Power." By Jack Gribben (Research Fellow, Logistics Management Institute - LMI). In CHIPS Magazine (Fall 2002). "Over the past several years, the DON has launched multiple initiatives designed to increase interoperability at the enterprise level, including Data Management and Interoperability (DMI), Enterprise Resource Planning, and Task Force Web (TFWeb). It seemed that a Department-wide effort to aggressively implement XML, where appropriate, was a logical addition. Initially, however, it was not clear how the new technology would fit into the larger DON interoperability picture. When DMI was launched in 1999, XML was just beginning to emerge as an enabling technology for improving interoperability. While it would be referenced in DMI documents, XML was not truly tied into the data management processes that were being developed at the time as part of the initiative. The climate had changed, however, by the time TFWeb was initiated in April 2001. IT experts in the DON and elsewhere had developed a fuller understanding of XML and its potential. As part of TFWeb's charter to Web-enable all Department applications by 2004, planners identified XML as a primary component of the TFWeb technical architecture and directed that data exchange between Web-enabled applications and the DON Enterprise Portal use XML. With a healthy assist from TFWeb, hundreds of XML efforts are underway today within the DON. In addition to applications of the technology like the Coronado demonstration, XML is being used in other, less visible ways to improve interoperability and enhance the DON's ability to carry out its warfighter mission... At the Space and Naval Warfare Systems Command (SPAWAR), program managers and directorates under the Horizontal Integration Initiative are using XML to help improve Fleet operational readiness. 
By developing common XML vocabularies, SPAWAR is linking technical and training data for the IT-21 LAN, the suite of IT systems SPAWAR fields aboard Navy warships for command and control operations... XML is also being integrated into the DON's business-to-business operations. The Naval Supply Systems Command (NAVSUP) headquartered in Mechanicsburg, Pa., is working with XML to improve the processes DON employees use to access critical supply data. NAVSUP's current supply software system, One Touch, uses Java business logic to connect to a large number of mainframe and non-mainframe supply databases that provide users supply check... XML's extensibility, which is helping SPAWAR, NAVSUP, and other DON organizations come up with interoperability solutions to information problems, also presents a challenge to Department leaders. Because individual developers have the discretion to create the various XML tags, namespaces, and schemas, the Department must establish an XML governance strategy with a defined set of business rules so that XML vocabularies and their meanings are easily understood across the DON... To assist developers, the Work Group's Standard Implementation Team released the XML Developer's Guide (Version 1.1) in May 2002. The guide provides developers and program managers with information about the use of XML specifications, XML component selection/creation, XML schema design, and XML component naming conventions. Available on the Work Group Web site, the guide is to be a "living document" that will continue to evolve and change as the XML technology matures..."

  • [December 07, 2002] "XML's One-Two Punch." By Bill Trippe (New Millennium Publishing). In Transform Magazine (December 2002). "[One] problem is the proliferation of XML vocabularies -- the many initiatives and schemas that have been developed to express data in XML format. These range from EDI-style business documents to scientific data to supporting data and message formats for software development itself. At this point, it would take a team of analysts to develop a comprehensive and accurate catalog of all the XML initiatives out there. The threat of so many competing XML initiatives is that interoperability will become impossible. If n groups devise n ways of expressing, say, genomic data in XML, how will company A and company B end up successfully communicating with company C? The result, some fear, is a 'Balkanization' of the XML community, where competing factions end up speaking different languages. I really don't worry too much about either Balkanization or hype. Why? Because the good news about XML decisively outweighs the bad. In these days of renewed interest in profitability and return on investment, XML-based development is clearly the way to go. To begin with, XML-based development is your best chance to preserve existing systems while bringing business processes to the Web. XML is the lynchpin in Web services -- the best means by which legacy systems can communicate with your users via the Web. The buzz around Web services is significant precisely because it represents the best of both worlds; you can leverage the existing systems you choose to keep while bringing whole new groups of users to your business via the Web... One example is the U.S. Air Force, which is using XML to drive the electronic distribution of maintenance and flight manuals for E3 AWACS aircraft. The documentation had been maintained for decades on proprietary systems that produced paper output.
When the Department of Defense mandated paperless delivery, Crystal City, VA-based military contractor Veridian developed an XML delivery platform using Ixiasoft's TextML Server. The legacy data is still maintained on the proprietary systems, but is also converted to XML for search, retrieval and distribution. The retrieved data is converted to HTML on the fly and displayed on a customized browser. The result is a low-cost system that lets the Air Force preserve its legacy system while delivering up-to-date paperless manuals. Secondly, XML is about intelligent use of local data so it can work with other internal and external systems. The fallacy of the Balkanization argument begins with the fact that your organization likely does speak some of its own language. There are valid business reasons why your version of a purchase order may be different from the external versions you share with partners. You may not want to share all internal data, or at least not share it in the precise form you maintain. The real beauty of XML is the ease with which local data can be merged with or converted to external data..."
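A conversion pipeline of the kind described above can be sketched in miniature: document-oriented elements are mapped onto HTML elements at retrieval time. The element names and tag mapping here are hypothetical, and a production system would use XSLT rather than hand-rolled code, but the on-the-fly idea is the same:

```python
import xml.etree.ElementTree as ET

# Hypothetical maintenance-manual fragment; element names are illustrative only.
manual = ET.fromstring(
    "<task><title>Check hydraulic pressure</title>"
    "<step>Open access panel</step><step>Read gauge</step></task>")

# Map document elements onto HTML elements as they are retrieved.
TAG_MAP = {"title": "h2", "step": "p"}

def to_html(elem):
    parts = ["<{0}>{1}</{0}>".format(TAG_MAP[c.tag], c.text) for c in elem]
    return "<div>" + "".join(parts) + "</div>"

print(to_html(manual))
```

The legacy source stays authoritative; the HTML is a disposable rendering generated per request.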

  • [December 07, 2002] "The Python Web Services Developer: RSS for Python. Content Syndication for the Web." By Mike Olson and Uche Ogbuji (Principal Consultants, Fourthought, Inc). From IBM developerWorks, Web services. November 2002. ['RSS is one of the most successful XML services ever. Despite its chaotic roots, it has become the community standard for exchanging content information across Web sites. Python is an excellent tool for RSS processing, and Mike Olson and Uche Ogbuji introduce a couple of modules available for this purpose.'] "RSS is an abbreviation with several expansions: 'RDF Site Summary,' 'Really Simple Syndication,' 'Rich Site Summary,' and perhaps others. Behind this confusion of names is an astonishing amount of politics for such a mundane technological area. RSS is a simple XML format for distributing summaries of content on Web sites. It can be used to share all sorts of information including, but not limited to, news flashes, Web site updates, event calendars, software updates, featured content collections, and items on Web-based auctions... RSS was created by Netscape in 1999 to allow content to be gathered from many sources into the Netcenter portal (which is now defunct). The UserLand community of Web enthusiasts became early supporters of RSS, and it soon became a very popular format. The popularity led to strains over how to improve RSS to make it even more broadly useful. This strain led to a fork in RSS development. One group chose an approach based on RDF, in order to take advantage of the great number of RDF tools and modules, and another chose a more stripped-down approach. The former is called RSS 1.0, and the latter RSS 0.91. Just last month the battle flared up again with a new version of the non-RDF variant of RSS, which its creators are calling 'RSS 2.0.' RSS 0.91 and 1.0 are very popular, and used in numerous portals and Web logs. 
In fact, the blogging community is a great user of RSS, and RSS lies behind some of the most impressive networks of XML exchange in existence. These networks have grown organically, and are really the most successful networks of XML services in existence. RSS is an XML service by virtue of being an exchange of XML information over an Internet protocol (the vast majority of RSS exchange is simple HTTP GET of RSS documents). In this article, we introduce just a few of the many Python tools available for working with RSS. We don't provide a technical introduction to RSS, because you can find this in so many other articles. We recommend first that you gain a basic familiarity with RSS, and that you understand XML. Understanding RDF is not required..." See "RDF Site Summary (RSS)."
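The shape of an RSS feed, and why Python handles it so comfortably, can be sketched with the standard library alone. The feed content below is invented, and this does not use the specific modules the authors introduce in their article:

```python
import xml.etree.ElementTree as ET

# A minimal RSS 0.91-style document (invented for illustration).
rss = """<rss version="0.91"><channel>
  <title>Example Feed</title>
  <item><title>First post</title><link>http://example.org/1</link></item>
  <item><title>Second post</title><link>http://example.org/2</link></item>
</channel></rss>"""

channel = ET.fromstring(rss).find("channel")
items = [(i.findtext("title"), i.findtext("link"))
         for i in channel.findall("item")]
for title, link in items:
    print(title, "->", link)
```

In practice the document would arrive via a plain HTTP GET, which is all the "service" layer of RSS amounts to.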

  • [December 07, 2002] "XML Takes on a Documentation Nightmare." [Integration & Customization] By Lowell Rapaport. In Transform Magazine (December 2002). ['European pharmaceutical companies have been struggling with a long, drawn-out process that requires hundreds of separate documents to get the go-ahead before just one drug can reach the market. A new initiative spearheaded by GlaxoSmithKline and supported by industry and regulators may replace the paper with streams of XML data.'] "Faced with myriad documents and drawn-out approval processes, pharmaceutical companies have been in the vanguard of XML-based publishing. Nowhere is the need more pronounced than in Europe, where drug companies must generate container labels, inserts, advertisements, product characteristic summaries and other documents in no fewer than 13 languages. Pharmaceutical companies may have to create and gain approval of more than 600 separate documents for each medication... In hopes of speeding approval and cutting costs while also improving accuracy, GlaxoSmithKline has taken the lead in a Product Information Management (PIM) project. The initiative is supported by both the European Federation of Pharmaceutical Industries and Associations (EFPIA), an industry trade association for pharmaceutical companies, and the European Medicines Evaluation Agency (EMEA), the body that tests and regulates drugs for safety and effectiveness... GlaxoSmithKline enlisted the aid of Arbortext, the Ann Arbor, MI-based maker of Epic XML authoring software. 'The Epic editor offers a user interface similar to Microsoft Word and other word processing programs,' says Laughland. 'With it, we were able to create standard forms and document views familiar to users.' According to Laughland, the Arbortext editor was easily integrated with Oracle IFS, an Internet-based file system that lets users access and revise content online. XML-tagged content is imported into the Oracle database in which the master document is generated.
Once complete, the master document is forwarded to the appropriate government agencies where it can be broken up into smaller documents for approval. Oracle IFS supports version tracking of individual pieces of content as well as entire documents. When a change is made to an individual piece of XML-tagged data, only the changed content needs to be sent to government agencies for approval. Since all published documents are derived from the master document, the approval process has been streamlined from the chaos of multiple versions of separate documents. And since all documents are generated on the fly from the master document, updates can be published almost instantly..." See also: (1) the earlier description by Frances Holly ("PIM: European Drug Labeling Goes XML"), presented at XML 2001; (2) related work referenced in "Electronic Common Technical Document (eCTD) for Pharmaceuticals."

  • [December 06, 2002] "Multimodality Starts Walking and Talking." By Ellen Muraskin. In Communications Convergence (December 04, 2002), pages 18-25. ['We witness mobile-phone and Pocket PC demos, sort out confusion over markup languages, note trial deployments, and highlight the platform maker furthest along in supporting apps that work both visually and aurally.'] "The time has come to talk about multimodality. The volume of buzz at speech, telecom, and wireless trade shows has gotten too loud to ignore. Microsoft and IBM have stomped in with all four feet, the former buying network primetime to show the mass-market scenarios of pocket PCs and PDAs you can talk to, talk through, browse on, tap on, photograph, barcode scan, and sing along with - all retrieving and affecting data on the other side of a wireless network. In a string of recent application rollouts - mostly games and messaging apps, thus far -- we see the wireless operators hanging their revenue-per-customer hopes on data... Multimodal apps combine PC- and phone-type input and output (screen, stylus, mouse, keypad, keyboard, depending on device and context) -- with audio and speech recognition. The ultimate goal is to make apps capable of interacting transparently across a wide range of access devices, in whatever modes a user prefers or requires... Kirusa's markup for the simultaneous multimodal banking application is its own variant of VoiceXML-plus-HTML, which they've wryly named 'Pepper.' But, they would hurry to add - and have demoed as well -- their KMMP can administer multimodal applications through SALT markup, too. SALT, for Speech Application Language Tags, was initially promoted by Microsoft and then by the SALT Forum, whose founding members were the Redmond folks plus Comverse, Cisco, Intel, Philips, and SpeechWorks. SALT had the multimodal stage to itself for a while. 
This was at least partly because VoiceXML made (and still makes) no claims to anything but voice -- but also because SALT had marketing muscle behind it, and because companies such as Kirusa were making no audible noise. While SALT V. 1.0 itself was only received for comment by the W3C last August, it has since had to share developer mindshare with IBM's, Motorola's, and Opera's joint announcement of X+V, or XHTML + VoiceXML. So at this point, we're up to Pepper, SALT, and X+V... SALT is somewhat ahead on the tools front... A bit behind, IBM/Opera/Motorola's X+V SDK was announced at October's SpeechTEK show. They say they have some pilots going in the U.S., but can't specify carriers. As with their VoiceXML interpreter and speech server, IBM will promote an end-to-end, speech engine-to-web server range of offerings... The SALT-vs.-X+V comparison is best laid out by James Larson, Intel's Manager, Advanced Human I/O, and can be viewed [online: 'VoiceXML, SALT, and XV: Approaches for Developing Telephony and Multimodal Applications']. Larson also lays out excellent cases for when multimodality beats voice or data alone; these can be found in a slide presentation he promises to post online. XHTML is an XML-ified version of HTML, but for now let's lump Kirusa's Pepper, i.e., VoiceXML-plus-HTML, together with X+V. In very broad strokes, Pepper looks like recognizable VoiceXML code followed in the same page by HTML code, with linking tags embedded in both. A lot of its visible 'cleanliness' owes itself to the fact that a VoiceXML browser and its forms-interpretation algorithm take care of most of the control and coordination activities, leaving the programmer to specify the prompts, grammars, and event handlers using a declarative style of programming..." See also: (1) "Voice Gateways: Moving To Open Standards With Vocalocity."; (2) "VoiceXML"; (3) "Speech Application Language Tags (SALT)."; (4) "Multimodal Browser Extension for MSIE."

  • [December 06, 2002] "Multimodal Interaction Use Cases." W3C NOTE 4-December-2002. Edited by Emily Candell and Dave Raggett. "The W3C Multimodal Interaction Activity is developing specifications as a basis for a new breed of Web applications in which you can interact using multiple modes of interaction, for instance, using speech, hand writing, and key presses for input, and spoken prompts, audio and visual displays for output. This document describes several use cases for multimodal interaction and presents them in terms of varying device capabilities and the events needed by each use case to couple different components of a multimodal application." See also the W3C Multimodal Interaction Framework document which "introduces a general framework for multimodal interaction, and the kinds of markup languages being considered."

  • [December 06, 2002] "XML Body Creates Global e-Gov Forum. Experts Warn that Committee Will Face Challenges." By John Geralds and Peter Williams. In (December 06, 2002). "A global extensible markup language (XML) standards body has set up a committee to agree standards for e-government - but analysts have expressed doubts that it will deliver. The Organisation for the Advancement of Structured Information Standards (Oasis) - the body responsible for the electronic business XML (ebXML) document interchange format standard - said its e-Gov Technical Committee (TC) will help deliver open, international XML standards to meet the needs of e-government strategies. John Borras, assistant director interoperability and infrastructure in the Office of the e-Envoy, has been appointed the first e-Gov TC chairman. Borras said the committee would provide an excellent opportunity for governments from around the world to have a significant say in future XML standards. The committee will identify and prepare plans for developing new standards, with recommendations formally submitted to appropriate Oasis working groups. Further technical committees may be formed if needed. The committee will create best practice documents to push the adoption of Oasis open standards within governments... But Neil Macehiter, senior consultant at analyst Ovum, warned: 'The challenge is to come up with real deployments. I question what [e-Gov TC] can deliver that is tangible in a timescale to bring real benefits.' He pointed to OASIS's own ebXML standard, originally built on electronic data interchange and still not finally ratified after years of development. The committee is to look at ebXML along with web services, regarding them both as emerging technologies. Macehiter said there was a need to grapple with the issue of government processes, but added: 'The complexity is large and the scope is large. They will need to rein in the scope'..." 
See details in "OASIS Announces Formation of e-Gov Technical Committee."

  • [December 06, 2002] "E-gov XML Committee Formed." By Diane Frank. In Federal Computer Week (December 06, 2002). "A global standards consortium on December 5 announced the creation of a new technical committee devoted to e-government XML standards, and its membership includes several U.S. federal experts and officials. The e-Gov Technical Committee of the Organization for the Advancement of Structured Information Standards (OASIS) will bring together government officials from around the world to talk about common e-government needs. The recommendations that the committee develops will be submitted to the OASIS working groups developing the standards. The officials who are leading the U.S. E-Government Strategy and Federal Enterprise Architecture want to take full advantage of the possibilities that XML brings to online services. But the number of different XML standards, or schemas, is challenging. The OASIS effort will help in several ways, and the U.S. government will fully support it, said Mark Forman, associate director for information technology and e-government at the Office of Management and Budget. 'Perhaps most important, setting global standards for e-government and e-business should enable significant reduction in government's burden on citizens, businesses, and government employees,' he said in a statement. 'Better standards should also improve security and cut the cost of government IT purchases.' OASIS formed the committee to help ensure that XML schemas and emerging technologies relying on XML, such as Web services, are not developed solely for private-sector needs, according to the organization... The committee is chaired by John Borras of the United Kingdom's Office of the e-Envoy, and its membership includes representatives from the U.S. General Services Administration and the Navy; the Danish Ministry of Science, Technology and Innovation; the government of Ontario, Canada; and the U.K. Ministry of Defense.
It also includes software developers, such as BEA Systems Inc., Booz Allen Hamilton, Entrust Inc., Fujitsu Ltd., the Logistics Management Institute, Microsoft Corp., Sun Microsystems Inc., and webMethods Inc..." See details in "OASIS Announces Formation of e-Gov Technical Committee."

  • [December 06, 2002] "H.R. 2458, E-Government Act of 2002. Selected Portions of Interest to the XML Working Group." By [staff]. Provided in connection with the Federal CIO Council XML Working Group, maintained by Owen Ambur. November 26, 2002. The text is at: E-Government Act of 2002, Engrossed as Agreed to or Passed by House. See also: (1) "E-gov Agenda Takes Shape. E-Government Act Promotes Web Standards, Procurement Reform, Security Policies."; (2) "Senate Passes H.R. 2458, the 'E-Government Act of 2002'" [cache]

  • [December 04, 2002] "XML Class Warfare." By Uche Ogbuji (Fourthought Inc). In Application Development Trends (December 04, 2002). "It seems there is no escaping class warfare. XML is a young society, but it is already succumbing to age-old divisions. The most direct roots of XML are in the processing of text documents, and XML is fundamentally a text-processing format. Many XML users, however, come from backgrounds in relational databases and object-oriented development; these are the gentry of the piece. To some, XML is the perfect technology for managing problems of interoperability and data rot that have plagued programmers for years. The XML bohemians are more concerned with the text content of XML data than they are with any class or type that might be associated. I count myself firmly among the bohemians. For a while, these two groups have rubbed elbows in the XML community with a great deal of tension, but with little outright conflict and friction. But this uneasy peace has come to an end. The main battleground is that dutiful engine of so much XML processing, XPath. The gentry, however, would like more class consciousness from these workhorse technologies. They reason that if they have gone to the trouble of specifying in the schema that '1.0' represents a floating point number, the XPath and XSLT processors should make this information available, and the processor should use such type information far more broadly. The gentry take the view that such capabilities should be built into the foundations of XPath and XSLT. The XPath 2.0 and XSLT 2.0 drafts are manifestations of this gentrification. They build in extensive facilities for handling WXSDT (W3C XML Schema datatypes)... The bohemians have rallied (for the most part) behind a powerful champion -- RELAX NG -- which is an XML schema standard that competes with WXS. It is designed more for document-style XML than for XML born in programming data. It supports type annotations, but only as separate optional modules (which can include WXSDT).
The bohemians insist that the next-generation XML technologies should not only learn from RELAX NG's isolation of class consciousness, but should avoid bias toward WXS, supporting RELAX NG and other alternatives as well. The battle rages on at present. Certainly, if you want your data to outlast your code, and to be more portable to unforeseen, future uses, you would do well to lower your own level of class consciousness. Strong data typing in XML tends to pigeonhole data to specific tools, environments and situations. This often raises the total cost of managing that data. Not everyone will decide to join me in the ranks of the bohemians..."
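The dispute is easy to make concrete. XPath 1.0 processors, including the limited subset implemented in Python's standard library, hand back untyped text; any typing is the application's job, which is exactly the step the XPath 2.0 / XSLT 2.0 drafts would move into the query layer via schema types. A small illustration:

```python
import xml.etree.ElementTree as ET

doc = ET.fromstring("<prices><price>1.0</price><price>2.5</price></prices>")

# The XPath 1.0 view (the bohemian position): content is just text nodes...
texts = [p.text for p in doc.findall("price")]
assert texts == ["1.0", "2.5"]

# ...so the application imposes the type itself -- the very step that
# schema-aware XPath 2.0 would perform in the query layer instead.
total = sum(float(t) for t in texts)
print(total)  # 3.5
```

Whether that `float()` call belongs in application code or in the query engine is, in miniature, the class war the article describes.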

  • [December 03, 2002] "Web Services Security Core Specification". Edited by Phillip Hallam-Baker (VeriSign), Chris Kaler (Microsoft), Ronald Monzillo (Sun), and Anthony Nadalin (IBM). Working Draft 05. 02-December-2002. 49 pages. A working draft ["subject to change"] from the OASIS Web Services Security Technical Committee. Updated from the (referenced) Version 03 specification of 2002-11-03. Posted to the OASIS Web Services Security TC mailing list 2002-12-03 by Anthony Nadalin. See the TC website for related XML Schema documents and Bindings documents (Web Services Security Kerberos Binding; Web Services Security SAML Token Binding Draft 3; Web Services Security X509 Binding; Web Services Security XrML Token Binding). "This specification describes enhancements to SOAP messaging to provide quality of protection through message integrity, message confidentiality, and single message authentication. These mechanisms can be used to accommodate a wide variety of security models and encryption technologies. This specification also provides a general-purpose mechanism for associating security tokens with messages. No specific type of security token is required; it is designed to be extensible (e.g., support multiple security token formats). For example, a client might provide one format for proof of identity and provide another format for proof that they have a particular business certification. Additionally, this specification describes how to encode binary security tokens, a framework for XML-based tokens, and describes how to include opaque encrypted keys. It also includes extensibility mechanisms that can be used to further describe the characteristics of the tokens that are included with a message..." See: "Web Services Security Specification (WS-Security)." [source]
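The token-carrying structure the specification describes -- a Security element in the SOAP header holding, for example, a UsernameToken -- can be sketched as follows. The SOAP envelope namespace is the real one; the wsse namespace URI below is a placeholder, not the draft's actual URI, and no signing or encryption is shown:

```python
import xml.etree.ElementTree as ET

SOAP = "http://schemas.xmlsoap.org/soap/envelope/"
WSSE = "urn:example:wsse"  # placeholder namespace for illustration only

# Build a SOAP envelope whose header carries a security token.
env = ET.Element("{%s}Envelope" % SOAP)
header = ET.SubElement(env, "{%s}Header" % SOAP)
security = ET.SubElement(header, "{%s}Security" % WSSE)
token = ET.SubElement(security, "{%s}UsernameToken" % WSSE)
ET.SubElement(token, "{%s}Username" % WSSE).text = "alice"
ET.SubElement(env, "{%s}Body" % SOAP)

print(ET.tostring(env).decode())
```

The extensibility point is that the Security element can carry any token format -- binary, XML-based, or encrypted keys -- not just the username token sketched here.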

  • [December 03, 2002] "XML Takes a Step Forward, Hits Snag on Another Front." By Lisa Vaas. In eWEEK (November 25, 2002). "While support for XML grew last week as IBM released DB2 Universal Database 8 with support for the language, support of a limited query language from a standards group could limit the broad use of XML. With the addition of Extensible Stylesheet Language Transformations, a SQL function for automatic style transformation, DB2 now has about 100 extensions to SQL that are built to support XML data. DB2 has caught up to Microsoft Corp.'s SQL Server 2000 and Oracle Corp.'s Oracle9i in its ability to handle Web services -- and it includes support for a Universal Description, Discovery and Integration registry. More XML support is what DB2 users are looking for as they anticipate business partners' and customers' use of XML. One such user runs a business-to-business e-commerce system that automates purchasing and inventory management for large ambulatory surgery centers, many of which run DB2... Janet Perna, general manager of IBM Data Management Solutions, in Armonk, N.Y., told eWeek that future XML support includes XQuery -- the XML query language -- and native database support for XML in DB2... But at the World Wide Web Consortium Advisory Committee meeting last week, members confirmed that Version 1.0 of the working draft of XQuery will not include support for full-text search operations. As a result, most vendors of document-oriented XML databases will be forced to maintain their existing approaches to queries, which will limit the short-term usefulness of the proposed specification. Nelson Mattos, an IBM distinguished engineer, said a full-text version, which the W3C has developed in parallel, is still on track. 'One goal of developing it in parallel is that they could publish the XQuery portion without it if there was any delay with the full-text version,' said Mattos, in San Jose, Calif.
Analysts say that, as the standard becomes more widely used, it has become imperative for relational DBMS vendors to support XQuery..." See: (1) W3C XML Query website; (2) "XML and Query Languages."
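Pending a standard full-text facility in XQuery, vendors of document-oriented XML stores keep their own mechanisms, and applications often fall back on ad-hoc text filtering. A deliberately naive sketch of the gap being discussed (real engines add tokenization, stemming, and relevance ranking):

```python
import xml.etree.ElementTree as ET

# A tiny invented document store.
docs = ET.fromstring(
    "<library>"
    "<doc><title>XQuery Basics</title><body>Querying XML data</body></doc>"
    "<doc><title>SQL Primer</title><body>Relational queries</body></doc>"
    "</library>")

def fulltext(root, word):
    # Naive case-insensitive substring match over every text node.
    word = word.lower()
    return [d.findtext("title") for d in root.findall("doc")
            if any(word in (t or "").lower() for t in d.itertext())]

print(fulltext(docs, "xml"))  # ['XQuery Basics']
```

Structured queries (paths, joins) are what the XQuery 1.0 draft covers; the word-level search above is the part deferred to the parallel full-text effort.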

  • [December 03, 2002] "Declaring Web applications with AppML. [Exploring XML.]" By Michael Classen. In (November 25, 2002). "Writing Web applications is a lot of work: Creating forms, programming business logic, developing database access. As part of the Mozilla project, the Extensible User Interface Language (XUL) created a lot of buzz around the concept of declaring -- rather than programming -- user interfaces. Instead of writing code in a particular programming language, an XML document contains a description of all the used GUI elements and their connections. Along these lines, fellow XML site author Jan Egil Refsnes has developed an XML vocabulary for defining whole Web applications: Application Markup Language (AppML). While XUL is still awaiting its first use in a major application, Jan has implemented a complete Web application for the Norwegian Handball Federation in a fraction of the time it would take with conventional tools. AppML is built around a handful of concepts that should be familiar to every Web application developer: database, reports, filters, lists, and forms..." [From the documentation: "All future applications will be Web applications. Future applications will run over the Internet. The client will be an Internet browser and the server will be an Internet server.... Future distributed applications will use Internet communication. Clients will request servers with standard HTTP requests. Servers will respond with standard HTTP responses. The connection between the Client and the Server will be stateless (the client will not maintain a permanent connection to the server)... we need an application markup language. 
With an application markup language we can: create platform independent applications, create compile free applications, create expandable applications, create distributable applications, reach a larger audience, reduce the development costs, reduce the maintenance costs, use vendor independent Web standards, use thin browser clients... AppML applications are described using AppML; AppML applications are executed by Web Services... You describe the elements of your application using AppML. You save the description as an XML file on your Web server, and you ask a Web Service to execute your application. Anytime you want to change your application, you just have to edit your AppML description. Your Web Service will take care of the rest..."]
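The declare-rather-than-program idea can be sketched with a tiny, AppML-flavored description interpreted by a generic engine. The element names below are illustrative guesses for this sketch, not actual AppML vocabulary:

```python
import xml.etree.ElementTree as ET

# A made-up, AppML-style application description: a data source and a list report.
appml = """<application>
  <database table="members"/>
  <list><field>name</field><field>club</field></list>
</application>"""

# Stand-in for the data the described table would supply.
data = [{"name": "Anna", "club": "Oslo"}, {"name": "Berit", "club": "Bergen"}]

app = ET.fromstring(appml)
fields = [f.text for f in app.find("list").findall("field")]

# A generic engine "executes" the declared list report against the data source.
for row in data:
    print(" | ".join(row[f] for f in fields))
```

Changing the application then means editing the XML description, not the engine -- which is the maintenance argument the documentation quoted above is making.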

  • [December 03, 2002] "Rich Client Redux. Action Engine Bolsters the Wireless Experience by Bringing Web Services to Mobile Devices." By James R. Borck. In InfoWorld Issue 47 (November 25, 2002), pages 33-34. ['This solid solution improves wireless application delivery environments, which can lead to attracting and retaining customers. Compression and store/forward reduce requirements for network and bandwidth overhead. Good Web services model enables rapid integration of applications and content.'] "Action Engine Mobile Web Services Platform 2.5 is a standards-based infrastructure for wireless Web services deployment that is following in the path of rich Internet applications... The platform comprises a comprehensive server-side framework for aggregating and managing Web services-derived content and applications. In addition, a number of utilities and applications perform functions such as mobile device backup, usage accounting, and remote application management to enable better administration of devices in the field. On the user side, Action Engine smartens up the wireless experience by installing a fat client application onto the wireless device, rather than relying on simple browser-based interaction. By pushing the processing load onto the device and using a localized database, more data is kept in hand, reducing the typical flurry of server calls seen in browser-based Pocket IE transactions... Action Engine's XML parser and Web services layer facilitate content aggregation, enabling it to cull content from any Web services-compliant data source or in-house system. Web services can be built and deployed relatively easily for a carrier or enterprise to integrate new vendors and content providers into its stable of content and application offerings. The smart mobile device also gets an update with the installation of a client application of modest footprint.
A local XML parser, database, and processing engine are installed to facilitate application execution, XSLT (XSL Transformations) rendering, and communication and synchronization with the Action Engine platform. Communication between the wireless devices and the Action Engine server is compressed and encrypted (using GZIP and HTTPS/SSL respectively) to increase throughput and enforce security. Key strength can be customized to suit the requirements of carrier and application... Action Engine sports an average first attempt at a software development kit. With requisite development server and Pocket PC emulator, the SDK would benefit from tools, graphical or otherwise, to streamline the development process. However, the Java API and XML syntax structure provided everything we needed to begin authoring transactional applications, manipulate XML objects, and take full advantage of client-side resources. Although setup of the extensive system requirements is an involved process, we found the resulting system architecture, with its capacity for fail-over and redundancy, a vital necessity for carrier-grade performance requirements. All told, we found the experience a straightforward one that should present no challenge to seasoned developers..."
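The payload compression the platform applies is easy to demonstrate: repetitive, tag-heavy XML compresses very well under GZIP, which is where the bandwidth savings on a wireless link come from. A quick sketch with invented data:

```python
import gzip

# A repetitive XML payload of the sort a wireless client might exchange.
payload = ("<items>"
           + "<item><sku>A-100</sku><qty>1</qty></item>" * 50
           + "</items>").encode()

compressed = gzip.compress(payload)
print(len(payload), "->", len(compressed), "bytes")
assert len(compressed) < len(payload)
```

Encryption (the HTTPS/SSL half of the pipeline) is orthogonal and is applied after compression, since ciphertext does not compress.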

  • [December 03, 2002] "Getting Started with XOM." By Michael Fitzgerald. From November 27, 2002. ['XOM is a new Java API for XML, the XML Object Model. XOM has been proposed by Elliotte Rusty Harold, noted XML and Java expert, to provide a simpler API than the existing and popular SAX or DOM interfaces. In "Getting Started with XOM" Michael Fitzgerald provides an introduction to the XML Object Model, and highlights the ease of use and simplicity of the API.'] "Elliotte Rusty Harold's new XML Object Model (XOM) is a simple, tree-based API for XML, written in Java. XOM attempts to build on good ideas from other Java XML APIs -- SAX, DOM, and JDOM -- and to leave behind some of their frustrations. The result is a high-level open-source API that is easy to learn and use, assuming that you are already familiar with Java and XML. Unlike SAX, XOM is written with classes instead of interfaces, making it more straightforward to use. With SAX you must first implement interfaces before you can get it to work. This work is eased somewhat by helper classes like DefaultHandler; but overall, interfaces make programming in SAX somewhat more complex, even though they also make SAX uniform and flexible. XOM's classes provide some flexibility by offering a number of check methods that may be overridden in subclasses. XOM does not stand by itself. It depends on an underlying SAX parser, such as a recent version of Xerces, to handle well-formedness checking and validation. XOM provides a simple interface to a parser, in effect hiding code without much of a performance hit. I like XOM for the same reasons I like RELAX NG: you can pick it up in a snap if you already have a reasonable familiarity with Java idioms. And, like RELAX NG, the more I use XOM, the more I like it. It is well considered and doesn't try to do everything or please everybody... It's worth noting that XOM avoids convenience methods like the plague. 
But it is flexible enough to allow users to write their own methods in subclasses..." See also Elliotte Rusty Harold's presentation to the NY XML SIG on September 17, 2002: "What's Wrong with XML APIs (and how to fix them)."
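A short sketch of the concrete-classes style Fitzgerald describes: no interfaces to implement, just constructors and method calls. This assumes the XOM library (package nu.xom) and an underlying SAX parser such as Xerces are on the classpath; the element name is illustrative.

```java
import nu.xom.Builder;
import nu.xom.Document;
import nu.xom.Element;

public class XomHello {
    public static void main(String[] args) throws Exception {
        // Build a small tree with concrete classes -- unlike SAX,
        // nothing needs to be implemented before this works.
        Element root = new Element("greeting");
        root.appendChild("Hello, XOM");
        Document doc = new Document(root);
        System.out.println(doc.toXML());

        // Round-trip through the underlying SAX parser, which XOM
        // delegates well-formedness checking to.
        Document parsed = new Builder().build(doc.toXML(), null);
        System.out.println(parsed.getRootElement().getValue());
    }
}
```

The Builder is the thin interface to the parser that the article mentions: XOM hides the SAX plumbing but still relies on it for well-formedness checking and validation.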

  • [December 03, 2002] "Hacking XUL and WXS-based Transformations." By John E. Simpson. From November 27, 2002. ['John offers introductory advice for customizing Mozilla skins with XUL, and suggests a way to use W3C XML Schema and XSLT to do XML transformations.'] "Q: How do I build a new skin for the Mozilla browser? A: The user interface of Mozilla (and Mozilla-based browsers -- such as recent versions of Netscape) is controlled by what you might think of as XML-based 'driver files'. Specifically, the vocabulary used is the XML-based User interface Language or XUL (pronounced 'zool'). The XUL code is contained in files with a .xul extension and is referenced in URIs by way of the Mozilla-specific chrome:// scheme... A typical XUL document consists of the standard XML declaration, a link to a CSS stylesheet, and a root window element. Within the latter might be a mixture of other XUL elements representing various user-interface widgets (pushbuttons, progress meters, textboxes, and so on) and plain old XHTML elements... Q: Can I transform an arbitrary document into a specific vocabulary defined by W3C XML Schema (WXS)?... On XUL, see "XML Markup Languages for User Interface Definition."
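A minimal XUL document of the shape Simpson describes (XML declaration, CSS stylesheet link, root window element, widgets) might look like this; the file name, ids, and widget choices are illustrative, not taken from Mozilla's actual chrome:

```xml
<?xml version="1.0"?>
<!-- hello.xul: loaded in Mozilla via a chrome:// URI -->
<?xml-stylesheet href="chrome://global/skin/" type="text/css"?>
<window id="hello-window" title="Hello XUL"
        xmlns="http://www.mozilla.org/keymaster/gatekeeper/there.is.only.xul">
  <button id="hello-button" label="Press Me"/>
  <progressmeter id="hello-progress" mode="determined" value="40%"/>
</window>
```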

  • [December 03, 2002] "RDF Update." By Shelley Powers. From November 27, 2002. ['RDF has recently seen several new draft specifications released by the W3C. Shelley Powers, author of O'Reilly's upcoming book on RDF, has read through the new specs and reports on their highlights.'] "The W3C working group tasked to update and clarify the RDF specification recently released six new working drafts. The group is collecting comments, concerns, and corrections, which will be incorporated into the documents. At the end of November [2002], in preparation for submitting the documents for review as Candidate Recommendations, the working group will begin its final review... Rather than signalling an increase in complexity of the RDF specification, the documents actually clarify it, primarily by separating its different aspects instead of keeping them bundled together in a confusing jumble of syntax, concept, and semantics. Additionally, two of the documents were written with very specific goals in mind: the Test Cases document aids RDF tool developers; the RDF Primer provides an introduction to RDF which is less formal than the specification itself... In this article I examine the purpose and scope of each document. I also highlight some of the significant changes between these working drafts and the original release of the RDF specifications... Note: the RDF documents include [1] RDF Primer (W3C Working Draft 11 November 2002); [2] RDF Semantics (W3C Working Draft 12 November 2002); [3] RDF Vocabulary Description Language 1.0: RDF Schema (W3C Working Draft 12 November 2002); [4] RDF/XML Syntax Specification - Revised (W3C Working Draft 8 November 2002); [5] Resource Description Framework (RDF): Concepts and Abstract Syntax (W3C Working Draft 08 November 2002); [6] RDF Test Cases (W3C Working Draft 12 November 2002). See the W3C Resource Description Framework (RDF) website and local references in "Resource Description Framework (RDF)."
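For readers new to the vocabulary, a minimal RDF/XML description of the kind the revised syntax specification covers could look like the following; the subject URI and the Dublin Core properties are illustrative, not drawn from the working drafts:

```xml
<?xml version="1.0"?>
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
         xmlns:dc="http://purl.org/dc/elements/1.1/">
  <!-- One resource (the subject) with two property/value statements -->
  <rdf:Description rdf:about="http://www.example.org/reports/rdf-update">
    <dc:title>RDF Update</dc:title>
    <dc:creator>Shelley Powers</dc:creator>
  </rdf:Description>
</rdf:RDF>
```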

  • [December 03, 2002] "IBM Bets On 'Smart' Computing." By Steve Gillmor and Mark Jones, with Steve Mills (IBM). In InfoWorld (December 03, 2002). ['Steve Mills, IBM's senior vice president in charge of the company's software solutions group, is responsible for shaping IBM's overall software strategy. Mills met with InfoWorld Test Center Director Steve Gillmor, News Editor Mark Jones, Editor at Large Ed Scannell, Lead Analyst Jon Udell, Technical Director Tom Yager, and Senior Editor Tom Sullivan to discuss IBM's grid computing and autonomic capabilities strategies.'] Mills on DB2, Grid, and autonomic systems: "DB2 has had unstructured data support in various forms since 1995. So this just continues that process of schema mapping and additional beta type support within the database. We'll be finally moving to XML as a full natively supported structure on top of the relational table architecture... We are putting grid protocol support into WebSphere in the 5.x time period [and] will be doing pilot projects with customers in 2003. Grid is the topology for deployment and autonomic is a set of capabilities within the products that hopefully reflect themselves consistently in products so you could make systems more autonomic. Autonomic [computing] is all about configuration, it's about self-diagnosis, it's about taking corrective action, self-healing capabilities when there's [a problem]. We put together a taxonomy for describing these capabilities. And what you see is systematically, over time, more and more capabilities appear in products that makes them more autonomic-like in nature. If you decide to deploy those products in a grid environment, the attributes of the products move across to the topology. But it's not topology unique, it's a statement of function. We're in the midst of making all of our log schemas consistent across IBM. Then when you have your log schemas consistent, you [can] have common tracing, common debugging. 
Once you get common log schemas, then you begin to understand the interaction between products that causes failures, start to develop a collection of [data] that allow you to create probes [and] first-failure data capture... For the system to be autonomic there's a level of interaction between the elements that begins around things like log schemas. My view is the industry needs to move to a standard XML log schema so you can do correlation between products in a heterogeneous environment. How do you solve that problem? You could work on consistency and correlation tools and debugging tools. You could adhere to certain standard structures for being able to run traces across multiple systems [and] probe architectures so that you could in fact do more consistent event tracking across systems. There are a lot of things that could be done at the industry level." Mills on the synergy between Designer and the Microsoft technology XDocs: "I view XDocs as a derivative of OLE, [where] you're trying to just get interoperability between program elements. In the case [of XDocs], you're trying to get linkages set up between document-centric expressions of things. We'll see how they evolve the technology. I don't view it as a threat to Designer. Designer is a forms-based development [environment] and it's based on Lotus Spread, which is a VB-derivative technology [and] looks like VB in terms of its underlying structure. So it's a virtual machine environment in which from a dialog box interface you're crafting forms-based applications. Next year the forms-based Design structure [will] move onto Java and we'll deliver a Java-based forms Designer -- that's the successor to the Domino Designer product that's in the marketplace today. I don't view XDocs in that same context; I view it more as a linking, embedding initiative on Microsoft's part. 
It's a logical extension of the way they think about portals and what portals are supposed to do, which is personal use against largely document-based data as opposed to transaction-based data..."
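No standard XML log schema existed at the time Mills spoke; purely as a hypothetical sketch of the idea, a cross-product log entry of the kind he describes might carry fields like these (the namespace and every element name here are invented for illustration):

```xml
<!-- Hypothetical: a common log entry that heterogeneous products
     could all emit, enabling correlation and first-failure data capture -->
<logEntry xmlns="urn:example:common-log">
  <timestamp>2002-12-03T14:22:05Z</timestamp>
  <product>ExampleAppServer</product>
  <component>ConnectionPool</component>
  <severity>ERROR</severity>
  <message>Connection timeout; first-failure data capture triggered</message>
</logEntry>
```

With a shared shape like this, the correlation and tracing tools Mills mentions could consume events from any vendor's product.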

  • [December 03, 2002] "Where the Standards Stand." By Gary H. Anthes. In ComputerWorld (November 25, 2002). ['The Web services standards glass is half full, claim vendors and standards groups, but can they support complex business processes?'] "By their very nature, Web services require strong standards for interoperability, security and reliability. Yet so far the standards are immature, and they aren't yet adequate for the most sophisticated business processes. Standards developers and IT vendors generally see the standards glass as half full, pointing out that many companies have already set up useful and workable Web services applications. But many users see it as half empty, saying more remains to be done before they would consider using mission-critical Web services applications. Some even question whether the functions contemplated by some of the standards can be automated at all. Users also say they're confused and worried about what they see as overlap and competition among IT vendors and standards bodies. Web services are a language-neutral, platform-independent way to loosely couple applications across an intranet, extranet or the Internet. They exchange documents, transactions and remote procedure calls using Web-based protocols... [Survey of the major 'Web Services' standards follows] A number of efforts are under way to address two broad areas in which Web services standards are weak or incomplete: security and the accommodation and coordination of business process rules for multiple interacting Web services. These specifications are to be layered on top of the core standards developed by the Reston, Va.-based Internet Engineering Task Force (IETF) and the W3C. For example, IBM, Microsoft and VeriSign Inc. in April proposed a set of security standards under an umbrella called WS-Security, which is now at OASIS..." See the companion article "The Groups That Set the Standards," also by Gary H. Anthes.

  • [December 03, 2002] "IBM Bolts Voice Support Onto Existing Applications." By Carmen Nobel. In (December 02, 2002). "IBM is continuing its push to let customers interact with corporate applications through speech. The software supports VoiceXML and Java and comes standard with support for IBM's Lotus Software division's Notes and Microsoft Corp.'s Exchange e-mail platforms. The software, which includes sample portlets, will be available Dec. 20 for about $60,000 per processor. So far, Cisco Systems Inc. and Nuance Communications Inc. have agreed to support the WVAA software. Cisco plans to integrate it into its IP Communications infrastructure, and Nuance is supporting access to back-end voice servers from its client-side, speech-to-text software. IBM officials said that although Nuance and IBM compete head-to-head on front-end voice software, IBM wants to ensure that WebSphere supports Nuance because that company's speech recognition software is widely used in enterprises. IBM next week will announce WVAA support from fellow voice companies V-Enable Inc. and Voxsurf Ltd., as well as from the professional services company Viecore Inc. Beyond that, it will be up to third-party developers to build portlets that support the WebSphere software. IBM officials said customers can also expect to see voice-enabled portlets for customer relationship management and sales force automation software. While IBM has close relationships with SAP AG and Siebel Systems Inc., those companies have yet to announce plans to support WVAA..." For a beta version of the WebSphere Voice Application Access product, see alphaWorks Voice Portal. See: (1) the news item "IBM WebSphere Voice Application Access Supports VoiceXML"; (2) details in the announcement "IBM Advances Pervasive Computing Strategy With New Software. Voice Portal Technology and Tools Extend IBM Momentum With Device Manufacturers, Service Providers and Enterprises."; (3) VoiceXML references in "VoiceXML Forum."
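A minimal VoiceXML 2.0 document of the sort such voice portlets produce looks like this; the prompt text is illustrative:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<vxml version="2.0" xmlns="http://www.w3.org/2001/vxml">
  <form id="welcome">
    <block>
      <prompt>Welcome to the voice portal.</prompt>
    </block>
  </form>
</vxml>
```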

  • [December 03, 2002] "IBM Boosts Voice Portals." By Paul Krill. In InfoWorld (December 03, 2002). "IBM on Monday announced several products intended to make it easier to build and manage voice portals and also to extend enterprise applications to devices. Included in the announcement was a new version of IBM's mobile database, DB2 Everyplace, Version 8.1, as well as WebSphere Voice Application Access, middleware to simplify building and managing voice portals and extending Web-based portals to voice... DB2 Everyplace 8.1 is a mobile database designed to ensure zero downtime by synchronizing multiple servers. The database provides data encryption at the table level to help prevent unauthorized access to data if a device is lost or stolen. WebSphere Voice Application Access, due to ship Dec. 20, leverages IBM's WebSphere Portal and enables mobile workers to more easily access information from multiple voice applications via a single telephone number. The offering includes IBM WebSphere Voice Server as well as e-mail, personal information management, and sample portlets. It supports VoiceXML and Java, and the Eclipse open-source development tools platform is supported as well. Also unveiled was WebSphere Studio Device Developer 5.0, which assists developers in the creation, deployment, and testing of J2ME applications for cell phones, PDAs, handheld computers, and other wireless and wire-line devices. It, too, supports Eclipse..." See: (1) the news item "IBM WebSphere Voice Application Access Supports VoiceXML"; (2) details in the announcement "IBM Advances Pervasive Computing Strategy With New Software. Voice Portal Technology and Tools Extend IBM Momentum With Device Manufacturers, Service Providers and Enterprises."; (3) VoiceXML references in "VoiceXML Forum."

  • [December 02, 2002] "Multi-Party Business Transactions." By Bob Haugen (Logistical Software LLC, ebXML and UN/CEFACT TMG Business Process Work Group) and Tony Fletcher (Choreology Ltd, OASIS Business Transaction Protocol Committee and UN/CEFACT TMG Business Process and Architecture Work Groups). 'UNCEFACT White Paper'. Version 1.01. 25-November-2002. Principal document is 31 pages; a separate 4-page Appendix (Multi-Party Business Transaction Appendix) sketches an "Alternative UMM Multi-Party 'Transactional Collaboration' Model." Note of Tony Fletcher from the 'business-transaction' mailing list: "The paper shows how the thinking that went into BTP can influence other standards that have come at the need for transactions from a different angle, and how the Business Transaction protocol itself can be used as an 'underlay' for application protocols, even when not initially considered. The paper is relevant to discussions about business process coordination, transactions, choreography, convergence of competing business process standards, and how to handle complex multi-party business scenarios. One goal of our collaboration was to see how much we could harmonize the workings of the developing UN/CEFACT Business Collaboration Protocol (BCP) and the OASIS Business Transaction Protocol (BTP). We found that the two protocols had enough in common so that BTP-compliant software would be able to implement the business collaboration/transaction patterns required by the UMM and which are part of the current Business Collaboration Protocol effort." UN/CEFACT-BCP was derived from RosettaNet and is the metamodel behind ebXML-BPSS. And BTP participants have been working on convergence between BTP and WS-T. So observant readers can see evidence that all of these specifications can be harmonized. For example, comparing BCP and BTP, we found that each specification brought something different to the views of the elephant, and then they had some overlap. 
BCP focuses on business semantics and coordination, while BTP focuses on transaction completion and recovery. They overlap some on transactions..." Paper Abstract: "We present a business scenario, the Drop-Dead Order, that is best handled as a multi-party electronic business transaction. We present models of such transactions using the UN/CEFACT Modeling Methodology (UMM) and the OASIS Business Transaction Protocol (BTP). We claim that the two models are sufficiently equivalent that the same runtime software could execute either. We recommend that BTP be considered as an underpinning implementation technology for UMM, thus harmonizing two specifications that have been considered incompatible. We suggest that further efforts be made to harmonize the other major proposals for electronic business processes so as to converge from a confusion of incompatibility to one global standard, or at least good interoperability. Along the way, we dip into several other important topics, such as the benefits of transactional behavior, the meaning and scope of transaction contracts, the use of coordinators and commitments to structure complex business collaborations, and the distinction between business coordination and transaction completion..." [cache]

  • [December 02, 2002] "Adobe Ships Graphics, Document Servers." By [ Staff]. In Internet Week (December 02, 2002). "Adobe Systems Monday [2002-12-02] said it has begun shipping its previously announced enterprise-ready document and graphics servers. The Adobe Document Server lets enterprises automatically generate and customize PDF files and forms. The server is built on an architecture to enable integration with existing enterprise systems, such as enterprise resource planning (ERP), customer relationship management (CRM), and content management systems. Developers can use XML, Java, Perl, and COM interfaces to build scripts and program logic for automating the creation of PDF documents. Companies can use the server to dynamically create documents like technical manuals, brochures, proposals, forms, and more. Adobe also shipped Adobe Document Server for Reader Extensions, which lets companies assign usage rights to the PDF documents it dynamically creates. Adobe Document Server pricing starts at $20,000 per CPU; new capabilities include support for EPS and PDF; conversion of SVG files to PDF; enhanced support for image metadata and Adobe Photoshop 7.0 native files and engines; CMYK image manipulation; and clipping path support..." See details in: (1) "Adobe Ships Server Products to Automate Document Processes in the Enterprise. Based on Adobe Acrobat and Adobe PDF, New Server Products Enable Organizations to Reduce Document Production Costs."; (2) "Enhanced Adobe Document Servers Support XML-Based Workflow and Digital Signature Facilities."; (3) "Adobe Announces Immediate Availability of New Graphics Server. Adobe Graphics Server 2.0 Automates Image Production Process for More Effective Business Communications."

  • [December 02, 2002] "Using WSDL in a UDDI Registry." UDDI Spec TC Best Practice document. Reference: Version 1.08, uddi-spec-tc-bp-using-wsdl-v108-20021110. 12 pages. Edited by John Colgrave (IBM) and Karsten Januszewski (Microsoft). Contributors: Francisco Curbera (IBM), David Ehnebuske (IBM), and Dan Rogers (Microsoft). A posting from Luc Clément (Co-chair, OASIS UDDI Spec TC) announced the results of a TC vote which approved version 1.08 of Using WSDL in a UDDI Registry as a Best Practice document. A Best Practice "is a non-normative document accompanying a UDDI Specification that provides guidance on how to use UDDI registries. Best Practices not only represent the UDDI Spec TC's view on some UDDI-related topic, but also represent well-established practice." Using WSDL in a UDDI Registry "describes the current Best Practice for constructing UDDI entities from, or relating to, WSDL descriptions of web services." From the Introduction: "The Universal Description Discovery and Integration (UDDI) specification provides a platform-independent way of describing services, discovering businesses, and integrating business services using the Internet. The UDDI data structures provide a framework for the description of basic business and service information, and architects an extensible mechanism to provide detailed service access information using any standard description language. Many such languages exist in specific industry domains and at different levels of the protocol stack. The Web Services Description Language (WSDL) is a general purpose XML language for describing the interface, protocol bindings and the deployment details of network services. WSDL complements the UDDI standard by providing a uniform way of describing the abstract interface and protocol bindings of arbitrary network services. The purpose of this document is to clarify the relationship between the two, describe how WSDL can be used to help create UDDI business service descriptions."
References in: (1) "Universal Description, Discovery, and Integration (UDDI)"; (2) "Web Services Description Language (WSDL)."
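Under this Best Practice, a WSDL interface is typically represented as a UDDI tModel whose overviewURL points at the WSDL document and whose categoryBag carries the uddi-org:types value 'wsdlSpec'. A hedged sketch follows: the name and URL are illustrative, the new tModel's own key is left empty for the registry to assign, and the keyedReference uses the commonly published key for the uddi-org:types taxonomy:

```xml
<tModel tModelKey="">
  <name>example-org:StockQuote:PortType</name>
  <overviewDoc>
    <overviewURL>http://www.example.org/stockquote.wsdl</overviewURL>
  </overviewDoc>
  <categoryBag>
    <!-- Marks this tModel as a WSDL service description -->
    <keyedReference tModelKey="uuid:C1ACF26D-9672-4404-9D70-39B756E62AB4"
                    keyName="uddi-org:types" keyValue="wsdlSpec"/>
  </categoryBag>
</tModel>
```

A businessService's bindingTemplates can then reference this tModel, tying concrete access points back to the abstract WSDL interface.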

  • [December 02, 2002] "Traversing the Tree: Using the 'get_relatedCategories' API in UDDI Services." By Karsten Januszewski (Microsoft Corporation; UDDI Blog). From Microsoft MSDN Library. November 2002. ['Karsten Januszewski explains how and why to use the get_relatedCategories API, an extension API for accessing and retrieving a list of values within a given categorization scheme in a Microsoft UDDI node.'] "The ability to use categorization schemes to classify data in Universal Description Discovery and Integration (UDDI) is key to the UDDI mission of description and discovery. While the ability to categorize UDDI entries is a core part of the UDDI specification, the specification provides no API facility for querying to retrieve a list of values within a given categorization scheme. In order to address this need, Microsoft UDDI nodes (both the publicly available UDDI Business Registry node as well as private UDDI Services nodes available in Microsoft Windows .NET Server 2003) provide an extension API for accessing this information called the get_relatedCategories API. This paper explains how and why to use this API. There are several reasons why one might want to gain access to the set of values of a categorization scheme in UDDI: (1) Writing custom user interfaces to UDDI; (2) Providing categorization metadata when installing applications that register themselves in UDDI; (3) Using UDDI to dynamically configure applications; one scenario for using UDDI is as an abstract layer between Web services and their clients. In particular, clients might query UDDI at run time to determine the best service to bind to. Because those queries may be based on categorization information, the client may need to dynamically navigate a categorization scheme to discover services... The get_relatedCategories API is a powerful enhancement to UDDI provided with the Microsoft UDDI Services shipping with Windows .NET Server 2003. 
It allows developers to access and traverse the hierarchy of a taxonomy that has been imported into a UDDI Services instance, giving developers the freedom to create applications that take advantage of the UDDI metadata strategy..." Note in this connection the posting by Luc Clément in response to the question "Do we have a schema to describe a taxonomy with all parent-child relationships, possible values and all?" -- [Luc:] "Unfortunately, there is nothing that has been published by the Working Group or this TC to date. This is clearly an area that deserves some attention. To support the needs of Microsoft's UDDI customers, Microsoft has created a schema and an API to navigate and retrieve values from a given value set. The API is entitled the get_relatedCategories. The schema used to support these value sets is available at" See also "The Importance of Metadata: Reification, Categorization and UDDI." General references in "Universal Description, Discovery, and Integration (UDDI)." [cache .XSD]

  • [December 02, 2002] "IBM Adds New WebSphere Tool for Voice Apps." By Stacy Cowley. In InfoWorld (December 02, 2002). "IBM will release later this month WebSphere Voice Application Access (WVAA), a tool for developers seeking to voice-enable corporate applications for mobile access. WVAA supports VoiceXML (Voice Extensible Markup Language) and Java, and includes sample portlets and preconfigured functions to speed development time for customized voice portals. Developers can use the technology to build voice interfaces for retrieving information from corporate databases and systems, such as stock quotes or customer information. In conjunction with other IBM development tools, WVAA can enable information to be requested with one device, such as a mobile phone, but delivered to another, like a handheld computer. The technology is particularly well-suited to mobile workers, said Sunil Soares, director of product management for IBM's Pervasive Computing Division. IBM has been working with one real estate company using the technology to allow agents to call and retrieve listings using keywords, he said... Nuance Communications, which competes with IBM on voice software, will support WVAA, as will infrastructure provider Cisco Systems. Voice software companies V-Enable and Voxsurf will also support the technology, as will services firm Viecore..." See also in eWEEK: "IBM Bolts Voice Support Onto Existing Applications," by Carmen Nobel. General references for VoiceXML in "VoiceXML Forum."

  • [December 02, 2002] "Office 11 Needs XML." By Timothy Dyck. In eWEEK (December 02, 2002). "Hear that giant sucking sound? It's all the IT data that employees create that gets laboriously created, checked and formatted, and then is never used again. Word processors and spreadsheets are the top offenders here, and whole industries (such as search engine products and content management systems) have grown up to help improve data reuse rates of word processor and spreadsheet files. One very simple reason that databases are so useful is they make data reuse easy. Databases require data to be carefully structured and checked before the data can be stored in the first place, so it's immediately ready for use in another document. The support that Microsoft is building into Office 11, the next release of Microsoft Office to save documents in XML format, promises to do something to remove the deservedly bad-boy reputation that desktop applications have as repositories for corporate data. While the default file format for Office 11 documents will still be the proprietary Office data format, there will be a new option to save Office documents in XML format. (Sun's OpenOffice has done much better in this area so far with its default and well-documented XML format files.) Office 11 will have its own default XML document structure, but a more promising option is the ability to save documents in an XML format of the customer's choice. Using the industry-standard XML Schema language, organizations will be able to define their own tags and then map them to existing or new Microsoft Word or Microsoft Excel files. When employees type data into the files and then save them, the file saved is compliant with the customer's own defined schema..." See "Microsoft Office 11 and XDocs."
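A customer-defined schema of the kind described, onto which Office 11 could map Word or Excel fields, might be as simple as the following W3C XML Schema fragment (the element names are invented for illustration):

```xml
<?xml version="1.0"?>
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <!-- A company-defined structure that saved Office documents must conform to -->
  <xs:element name="expenseReport">
    <xs:complexType>
      <xs:sequence>
        <xs:element name="employee" type="xs:string"/>
        <xs:element name="total" type="xs:decimal"/>
        <xs:element name="submitted" type="xs:date"/>
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>
```

Because the saved file validates against the customer's own schema, its data can be reused by any schema-aware tool, which is the database-like reuse the article argues for.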

  • [December 02, 2002] "E-gov Agenda Takes Shape. E-Government Act Promotes Web standards, Procurement Reform, Security Policies." By Judi Hasson, Diane Frank, and William Matthews. In Federal Computer Week (December 02, 2002). "In one of the most dramatic changes in information technology policy since the passage of the Clinger-Cohen Act of 1996, President Bush is expected to sign into law this week the E-Government Act of 2002, which lays out the rules of engagement for agencies providing information and services online. The bill affects nearly every aspect of IT management and rules. It defines e-government and its basic parameters -- from Web sites for managing crises, to electronic archives and directories that give the public a road map to government information. For the first time, lawmakers earmarked money -- $345 million during four years -- to fund e-government programs. The legislation also extends existing rules for security, outlines new initiatives for training the federal government's IT workforce and creates new buying rules to help drive down the cost of IT... The bill, which comes more than a year after the Bush administration launched its 24 e-government initiatives, does not stake out new territory, federal IT experts said. Instead, it provides a formal management structure for a disparate array of e-government concepts and initiatives. 'It is an accelerator,' McClure said. 'It really is broadening citizen interaction and access to government.' As expected, the bill calls for the creation of a permanent position in the Office of Management and Budget for an e-government administrator, appointed by the president, to develop policies related to e-government. The role is similar to the one played by Mark Forman, OMB's associate director for IT and e-government. Some e-government projects are already under way, from e-grants to e-filing, as part of the administration's 24 e-government initiatives. 
But the bill will give those projects greater vitality and more velocity, according to experts..."

  • [December 02, 2002] "Liberty Alliance Waves White Flag at Passport." By Peter Galli and Dennis Fisher. In eWEEK (December 02, 2002). "A growing rift among members of the Liberty Alliance authentication project is placing the technology's future in question. At the core of the problem is exactly where to target the single-sign-on technology in the face of stiff and growing client-side competition from Microsoft Corp.'s Passport service. Officials at the Liberty Alliance's founder and chief sponsor, Sun Microsystems Inc., last week went so far as to concede defeat to the Passport authentication service on the Windows platform. 'There is no way we can compete with them there. They have that market tied down really tight,' said Jonathan Schwartz, executive vice president at Sun's software group, in Menlo Park, Calif. For Liberty to compete, a new, pervasive computing client such as a smart phone that is not based on Microsoft software will have to emerge to challenge Windows and Passport, Schwartz said. Such a device would give business and consumer users an alternative for authentication and would be a way for Liberty to come into its own, he said. 'I don't think it will be very long before we have a pervasive non-Microsoft client,' Schwartz said. 'Have you seen the latest cell phones, with color screens and keyboards and cameras? That's the way it'll go.' [...] Another influential Liberty member said that Microsoft may have the lead on the Windows platform, but Passport falls short in the enterprise. 'The true value in single sign-on is in cross-platform, cross-domain interaction, and in that space Microsoft has nothing,' said Deepak Taneja, chief technology officer at security vendor Netegrity Inc., in Waltham, Mass. 'Windows is only one part of the equation. Passport has been a huge failure, really. Microsoft managed to get tens of millions of users to register but only because it's become mandatory'..." 
See: "Liberty Alliance Specifications for Federated Network Identification and Authorization."

  • [December 02, 2002] "Reality Check: What it Takes to Achieve Standards Success." By Matthew Josefowicz (Senior Analyst, Celent Communications). Paper presented September 18, 2002 at the Insurance Standards Leadership Forum (ISLF), New York City, New York. 27 pages. "Main Benefits of Standards Adoption: (1) Speed integration with external business partners; (2) Speed integration with external technology vendors; (3) Enable internal integration between disparate systems; (4) Save time and effort from developing internal standards; (5) Avoid internal political conflict by adopting external standard... Summary: (1) Standards have the potential to relieve some of the pain of internal and external systems integration. (2) Efficiencies of 20%-30% are typical, and over 50% is not unknown. (3) Potential to save the industry over US $250 million annually in new projects alone. (4) Three major hurdles to adoption: Current lack of adoption, Lack of resources, Lack of faith in current state of standards. All are essentially fears of being first mover..."

  • [December 02, 2002] "Realising XML Benefits in Life Insurance." By Umesh Kumar Rai (Business Analyst, Insurance Practice). Wipro Technologies White Paper. July 27, 2002. 20 pages. "XML is impacting the life insurance industry in a big way, and has taken hold at a large number of major life insurance companies. Most companies are in the process of exploring XML usage and the need for a standard vocabulary in order to facilitate communication. Standing in the way of the widespread adoption and success of XML in the industry is the proliferation of several different versions of XML being created by vendors, institutions, and other organisations. For the organisations that are looking at XML, the key is to understand how the technology works and the specific business benefits that it can bring. From there, a life insurer can begin investigating the technology components it will need in order to implement and support XML within its own operations. This paper takes a close look at how life insurance organisations can benefit from the use of XML in their technology initiatives, both for sustenance and to gain competitive advantage in a changing marketplace..."

Earlier XML Articles

Robin Cover, Editor: