Other collections with references to general and technical publications on XML:
- XML Article Archive: [October 2002] [September 2002] [August 2002] [July 2002] [April - June 2002] [January - March 2002] [October - December 2001] [Earlier Collections]
- Articles Introducing XML
- Comprehensive SGML/XML Bibliographic Reference List
[November 30, 2002] "Cornering the Office." By P.J. Connolly. In InfoWorld (November 29, 2002). ['Office 11 beta's collaboration and XML features almost justify the upgrade.'] "... We actually found that although Office 11 Beta 1 contains the usual assortment of prerelease bugs, one or two of the new features do justify an upgrade for a small number of users. Among the enhancements to collaboration features, users of Word-created forms will find it easier to restrict access to selected portions of those documents. For the XML-obsessed, the Office 11 applications will provide easier ways to convert Office file formats into XML and vice versa (see 'Modeling biz docs in XML'). The new release of Office will prod sales of Windows XP, because Office 11 won't run on legacy Win9x/Millennium platforms, or even on Windows NT. It's Windows 2000 or Windows XP, or no upgrade for you. Not that this is entirely a bad thing, given our disdain for the 9x family and its obsolete, security-heedless architecture. But this underscores Microsoft's overwhelming dominance of the desktop and its ability to pressure customers into paying for endless upgrades by one means or another... The improvements to individual applications, although valuable, don't really justify the designation of a major upgrade. For example, Excel, PowerPoint, and Word now support ink markup using a Windows-based tablet PC. Access and PowerPoint 11 now support SmartTags. PowerPoint's 'Package to CD' feature is a noticeable improvement over earlier attempts, and one can finally access Word's thesaurus from inside PowerPoint 11. And as we noted, Word now offers more control over document entry, making it a better form-filler than ever. But the enhanced XML support in the Office 11 Beta applications qualifies the suite as a major upgrade. Word 11 supports XML as a native file format, while Access 11 and Excel 11 are better consumers of XML data sources; all three offer improved support for user-defined XML...
Although we don't expect consumers and end-users to rush to Office 11, due primarily to the requirements, Beta 1 shows us that enterprises using XML to transform their business process will reap the most benefit from its improved support for user-defined schemas and better import and export facilities. Some of the collaboration improvements, including selective document protection, can make a difference now. But the impact of others such as STS [SharePoint Team Services] must wait to be seen..." See: "Microsoft Office 11 and XDocs."
[November 30, 2002] "Building Rules into Web Services Applications." By Henry Bowers (ILOG Inc). In Web Services Journal Volume 02, Issue 12 (December 2002), pages 7-8. "This article discusses several ways in which applications can be intelligently partitioned to make the best use of business rules... Companies that haven't carefully segregated rules from other business logic are confronted with two choices: divide the application as best they can, or simply slap a Web services front end on the monolithic application. ... the latter approach is the most common. It doesn't require a lot of effort at a programming level. Such implementations generally place software such as a principal mortgage application module on an intranet as a Web service. Other applications send the module large SOAP-wrapped XML records containing all relevant applicant data for approval. When the software approves the application, it returns to the client the necessary text to drop into loan document templates. This solution is quick, inexpensive, and effective; however, it does not make optimal use of business-logic components. The second approach is to separate business rules from the business logic. Depending on the construction of the applications, this can be fairly easy... other applications can now make use of the same rules... Web services provide a unique opportunity to implement this kind of reusability by carving out sets of rules into callable services. In addition, they encourage the conversion of many standalone data formats into XML, which has the benefit of simplifying overall application integration. Sites that choose this path are often surprised at how quickly benefits become apparent. The separation of rules processing as Web services can be implemented progressively. Complete conversions of applications are unnecessary. Also, once rules are segregated as Web services, new opportunities for their use appear quickly. 
For example, business analysts can much more easily perform what-if scenarios by rolling hypothetical data through the rules. A common implementation detail of this approach is the migration of rules from a code-based implementation to a business rule engine (BRE), an integral component of a business rule management system. A BRE applies business rules to application data, generally in a highly optimized fashion. However, its real importance is that it enables firms to specify rules in business terms rather than in code. This step permits sites to move rules formulation and testing outside of IT and into the hands of business analysts. These analysts now can formulate rules using their own vernacular, test the results, and go live when they're ready. Meanwhile, programmers are freed from the conversion of business policies into code and so can focus on other portions of the application. By using Web services and a BRE this scenario becomes possible. The rest of the application then interacts with the rule sets, which are now flexible and highly amenable to use in one-off situations. Because of this benefit, BREs are increasingly viewed as a central best practice for the implementation of policy-intensive applications... The rule-vending service is implemented as a Web service with a BRE and an interface to a business-rules repository... The rule-vending service has the ability to return an XML-based file containing the applicable business rules, a link to a Web service that applies a specific set of business rules, or the results produced by executing the business rules on the spot..." See also "Rule Markup Language (RuleML)." [alt URL]
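[Editor's illustration] The rule-vending idea above -- business rules kept as data, separate from application logic, and applied by a small engine -- can be sketched in a few lines of Python. All names, rules, and thresholds here are invented for illustration; a real BRE evaluates rules written in a business vocabulary and manages them in a repository.

```python
# Rules expressed as data rather than as branches buried in code.
# Each rule pairs a name with a predicate over the applicant record.
# The rule names and thresholds are hypothetical.
MORTGAGE_RULES = [
    ("min_credit_score", lambda a: a["credit_score"] >= 620),
    ("max_debt_ratio", lambda a: a["debt_ratio"] <= 0.43),
]

def evaluate(applicant, rules):
    """Apply every rule; return the decision plus the names of failed rules."""
    failures = [name for name, predicate in rules if not predicate(applicant)]
    return ("approve" if not failures else "reject", failures)

# Because the rules are data, analysts can run what-if scenarios by
# rolling hypothetical applicant records through them without touching code.
decision, reasons = evaluate({"credit_score": 700, "debt_ratio": 0.30}, MORTGAGE_RULES)
```

Swapping in a different rule list changes behavior without redeploying the calling application, which is the separation the article credits to BREs.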
[November 30, 2002] "Building Interactive Web Services With WSIA and WSRP. One Step Toward Making Web Services What They Were Meant to Be." By Eilon Reshef. In Web Services Journal Volume 02, Issue 12 (December 2002), pages 10-16. December Feature Article. ['WSIA and WSRP are new Web services standards that enable businesses to create user-facing, visual, and interactive Web services that organizations can easily plug-and-play into their applications and portals. This article will familiarize you with these technologies and illustrate how they can help your businesses.'] "One of the main promises of Web services is enabling the assembly of Web applications from functional components distributed across multiple locations. However, until now the assembly of visual, rich, interactive Web applications with a cohesive flow and look-and-feel has been a challenge. Custom programming is required to create a user interface tier for each new Web service, resulting in set-up and maintenance efforts that render business initiatives cost prohibitive as the number of components increases. Web Services for Interactive Applications (WSIA) and Web Services for Remote Portals (WSRP) are standards for user-facing, presentation-oriented, interactive Web services intended to simplify the creation of distributed interactive applications. With WSIA and WSRP, Web services include presentation and multipage, multi-step user interaction. This lets service users plug them into sites and portals without custom development and leverage new functionality available in future versions of the service without the need for additional custom development. WSIA and WSRP define a set of APIs that allow developers to produce and consume remote interactive Web services. 
They define three types of actors: (1) Producer: Represents the service provider hosting the remote interactive Web service -- for example, weather.com as a weather service provider; (2) Consumer: Represents the entity integrating the remote service into its Web application, oftentimes using a portal toolkit -- for example, Yahoo Weather or a corporate portal; (3) End User: Represents the person who comes to the Consumer Web site to use the Producer's application in the Consumer's context. In a nutshell, WSIA and WSRP fulfill the following roles: (1) Define the notion of valid markup fragments based on the existing markup languages, such as HTML, XHTML, VoiceXML, cHTML, etc.; (2) Provide a set of standard calls to enable a Consumer to request these markup fragments from a Producer based on the existing state; (3) Supply a set of calls that support the concept of multistep user interaction and preservation of state across those calls. There are four central parts to the WSIA and WSRP APIs: (1) Retrieving markup fragments, encapsulated in the getMarkup() call; (2) Managing state; (3) Handling user interaction, encapsulated in the performInteraction() call; (4) Supporting customization and configuration... WSIA and WSRP are important technologies that help bring the promise of Web services to end users by providing a standard to manage user interaction and application display. They enable business partners to integrate each other's online applications seamlessly, offering a more compelling experience to their customers. These technologies are complex under the hood and can only thrive if vendors deliver on the promise of building tools to manage the complexity. Given the breadth of the specification, it is not expected that companies will develop WSIA and WSRP solutions in-house. It is likely that they will instead rely on tools from vendors, keeping their focus on creating the application functionality. 
See: (1) OASIS Web Services for Interactive Applications TC website; (2) OASIS Web Services for Remote Portals (WSRP) TC website; (3) WSRP-WSIA TC subcommittee mailing list. [alt URL]
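[Editor's illustration] The Producer/Consumer call pattern the article describes -- markup fragments requested via getMarkup() and interaction state advanced via performInteraction() -- can be sketched in Python. The actual specifications define SOAP operations and richer signatures; the classes, method signatures, and weather example below are invented to show the shape of the exchange only.

```python
class WeatherProducer:
    """Stands in for a remote, user-facing interactive Web service."""

    def get_markup(self, state):
        # Return a markup *fragment*, not a full page: the Consumer
        # supplies the surrounding page and look-and-feel.
        city = state.get("city", "unknown")
        return f"<div class='weather'>Forecast for {city}</div>"

    def perform_interaction(self, state, action):
        # A user interaction (e.g., picking a city) yields new state
        # that the next get_markup() call will render.
        new_state = dict(state)
        if action.get("name") == "select_city":
            new_state["city"] = action["city"]
        return new_state

class PortalConsumer:
    """Stands in for a portal embedding the Producer's fragment."""

    def render_page(self, producer, state):
        fragment = producer.get_markup(state)
        return f"<html><body>{fragment}</body></html>"

producer = WeatherProducer()
consumer = PortalConsumer()
state = producer.perform_interaction({}, {"name": "select_city", "city": "Paris"})
page = consumer.render_page(producer, state)
```

The point of the pattern is that the Consumer never implements weather-specific UI code: it only relays interactions and embeds fragments, so a new Producer version can change its pages without Consumer-side development.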
[November 30, 2002] "Modeling Biz Docs in XML." By Jon Udell. In InfoWorld (November 29, 2002). ['Unlocking Office 11's XML features means coming to grips with its data definition language, XML Schema. That won't be easy, but the sooner we start, the better. The future of Web services depends on our ability to model business documents in XML. Yes, XML Schema is complex, but some of the issues are more general. Even experts disagree on the best practices for object-oriented data modeling. Office 11 creates an environment in which we can start to codify those best practices as they apply to ordinary business documents.'] "The good news is that Office 11 supports XML Schema. The bad news is that XML Schema has been described even by XML experts as 'confusing,' 'impenetrable,' 'fuzzy,' and 'as user-friendly as a stick in the eye.' A successor to the SGML/XML DTD (Standard Generalized Markup Language/XML document type definition), XML Schema is a language for writing rules that constrain the kinds of elements that can appear in documents and the ways in which they can be sequenced, grouped, and nested. XML Schema is still a relatively new specification. The W3C Recommendation for XML Schema was published in May 2001. XML parsers that support XML Schema haven't done so for very long, and there is not yet much experience using it. Most people who are adept at defining document structure learned how to do so by writing DTDs. Some of the allergic reaction to XML Schema can, therefore, be chalked up to normal reluctance to learn new skills... Upgrading the word processors and spreadsheets on those government computers to versions that not only can read and write XML, but, more crucially, can enforce rules about datatypes and structures, is part of the solution. Assuming, of course, that such rules can be written, deployed, and unobtrusively applied and maintained over time. 'Therein,' observes Windley, 'lies the rub.' 
There is very little extant knowledge about how to model unstructured and semistructured data in XML. Unlike SGML, the XML DTD was always optional, because the framers of XML knew there was enormous value in documents that were merely well-formed, even if not valid with respect to a DTD. RSS (Rich Site Summary), for example, the wildly popular XML format for content syndication, has no DTD or schema... One possibility is to infer schemas from example documents. Tools can do this, but so far, not with much sophistication. Microsoft, for example, offers a .Net namespace (Microsoft.XsdInference) that will infer a schema from an XML document, and even refine that schema based on further examples. The results make a useful starting point, and inferencing is a promising technology that can and should evolve, but the fact is that modeling XML data is a complex subject that even the best human experts have yet to codify. XML Schema delivers a much richer set of modeling tools than were available to DTD authors. Learning to use them well is going to be a challenge..." General references in "XML Schemas"; see also "Microsoft Office 11 and XDocs."
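[Editor's illustration] The schema-inference idea Udell mentions can be shown in miniature: walk an example document and record, for each element name, which child elements were observed under it. Real inferencing tools (such as the Microsoft.XsdInference namespace cited above) emit full XML Schema and refine it across multiple examples; this stdlib sketch only demonstrates the principle of deriving structure from instances.

```python
import xml.etree.ElementTree as ET

def infer_outline(xml_text):
    """Map each element name to the set of child element names seen under it."""
    outline = {}
    root = ET.fromstring(xml_text)
    for elem in root.iter():  # visits the root and every descendant
        children = outline.setdefault(elem.tag, set())
        children.update(child.tag for child in elem)
    return outline

# A hypothetical example document; a second example with extra elements
# would refine the outline, mirroring how inference tools improve with
# more instances.
example = "<order><item><sku>A1</sku><qty>2</qty></item><item><sku>B2</sku></item></order>"
outline = infer_outline(example)
```

Note what even this toy shows: the second item lacks a qty, so a real tool must decide whether qty is optional -- exactly the kind of modeling judgment the article says tools cannot yet make with much sophistication.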
[November 27, 2002] "A New Way of Collaborating." By David M. Ewalt. In Information Week (November 25, 2002). ['Standards are being developed with which companies will create business processes.'] "Standards bodies and software vendors are putting the final touches on a number of Web-services specifications that could revolutionize the way companies collaborate. The standards are related to XML, a language used by businesses to model enterprise data that's become an instrumental part of Web services. While the technology that underlies each of the new specs marks up data similarly to XML, its capabilities go far beyond XML's. 'This is something weird and different,' says Howard Smith, chief technology officer at Computer Sciences Corp. Europe. 'It's not Web services, it's not the reinvention of workflow, it's not process-management workflow, it's new. It unifies those things. It's like taking the best of every other paradigm and building a nice new model.' BPML, the Business Process Modeling Language, is published by the Business Process Management Initiative, a group backed by dozens of major IT vendors, including BEA Systems, CSC, SAP, and Sun Microsystems. It released the first draft of the language in August. Compared with XML, BPML lets users model a company's business processes from top to bottom. Using BPML, a company can define every action in a complex business process--anything from sending a price bid to executing a purchase and shipping goods. If every company used the language to define its processes, the processes would become interchangeable. Two companies that want to team on an order, a project, or a transaction could interact at the process level, not so much trading data as working together to perform different parts of common procedures. 
Tools based on BPML will do to processes what spreadsheets did to data: let companies treat their processes like easily definable objects that can be changed or linked to other processes with a simple point and click... A consortium of businesses, including BEA, IBM, and Microsoft, has a competing standard, BPEL, or Business Process Execution Language, under construction. The first draft of the BPEL4WS (as in 'for Web services') spec was born in August, when IBM and Microsoft brought together two existing technologies (IBM's WSFL, or Web Services Flow Language, and Microsoft's XLANG) to create the new standard. Paraic Sweeney, IBM's VP for business integration, describes the function of the new specification as similar in scope to technologies like Open Database Connectivity and Java... ODBC and Java Database Connectivity provide a standardized way of talking to a database, independent from the developer that produces it, Sweeney says, and the same thing has occurred in application development and application servers with Java 2 Enterprise Edition... BPEL and BPML are similar in scope, yet having two standards would probably compromise the goal of getting everyone speaking the same language. Sweeney says he's confident the two standards will merge at some point. Others, like CSC's Smith, figure BPML will win... Along with Microsoft, Siebel is pushing BPEL as the key standard for that revolution; versions of the Universal Application Network due next year will support the spec, among other standards. 'This is basically the next generation of computing languages,' Siebel says. 'It's a very exciting idea.' CSC is implementing the new languages into software incrementally, Smith says. Later this year, its e3 architecture will be updated to include a full-featured business-process management engine capable of interpreting BPML..."
[November 27, 2002] "Ipedo Upgrades Native XML-Based Suite." By Lisa Vaas. In eWEEK (November 27, 2002). "Ipedo Inc. next week will ship Dynamic Information Suite 3.2, an update of a native XML-based suite that features virtual document links, full-text XQuery searches, XML indexing and flexible content organization to simplify access to distributed content. The upgrade, originally announced in September, provides a platform for integrating, organizing and managing change for mixed content in portals, custom Web applications and Web services. Virtual Documents is a feature that allows users to define documents consisting of references to any content, in other documents or in other systems, through XML views. The feature allows content to be linked from any point in an XML document. Links can be modified over time as new or updated content comes online, without affecting the actual content or its storage. Full-text XQuery search enables searches that return only those components of a document that match the query, rather than the entire document. Self-managing XML indexes simplify the creation and management of indexes on content. They are automatically managed and optimized based on use and query patterns on individual XML documents and document collections... Pricing is on a per-CPU basis and starts at $75,000..."
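[Editor's illustration] The component-level search behavior described above -- returning only the parts of a document that match a query, rather than the whole document -- can be imitated with the Python standard library. This is a stand-in for what a full-text XQuery engine does, not a representation of Ipedo's API; the function and document below are invented.

```python
import xml.etree.ElementTree as ET

def search_components(xml_text, term):
    """Return the matching sub-elements, serialized, rather than the whole doc."""
    root = ET.fromstring(xml_text)
    return [ET.tostring(e, encoding="unicode")
            for e in root.iter()
            if e.text and term.lower() in e.text.lower()]

doc = ("<report><section>XML indexing basics</section>"
       "<section>Pricing model</section></report>")
hits = search_components(doc, "indexing")
```

A query for "indexing" yields just the one matching section element, which is the granularity gain the article attributes to full-text XQuery search.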
[November 27, 2002] "Explore the Web Services Bus, Part 1. Using a UDDI-Based Discovery Mechanism to Make Publishing and Discovering Web Services a Breeze." By Greg Flurry (Senior Technical Staff Member, IBM Software Group). From IBM developerWorks, Web services. November 2002. ['If you've downloaded version 3.2.2 of the Web Services Toolkit from IBM alphaWorks, you already have the Web Services Bus, a framework for constructing Web services processors. In this two-part series, Greg Flurry shows you how to get started using the Bus to get your Web services into production faster and more easily. In this first installment, you'll learn about the Bus's UDDI-based discovery mechanism, and examine an experimental practice that will help automate the process of publishing Web services.'] "The Web Services Bus, available in version 3.2.2 of the Web Services Toolkit at IBM alphaWorks, is a framework for constructing Web services processors. With the Bus, you can create Web service clients, servers, gateways, intermediaries, and more. The Bus goes beyond other Web services frameworks to offer an enhanced environment for conducting dynamic e-business with Web services. It supports service requestors with a client-side on-ramp and supports service providers with a server-side off-ramp. The Bus, through its Web Services Invocation Framework (WSIF) heritage, has WSDL at its core; Web services deployed in the bus off-ramp (called bus services) can be described with WSDL, and the off-ramp can dynamically generate WSDL for bus services. Further, the Bus off-ramp optionally publishes information about bus services in UDDI registries. In addition, the Bus supports formalized, pluggable discovery and selection mechanisms in the Bus on-ramp. Briefly, the Bus accepts requests for a particular implementation of a WSDL portType, but invokes the discovery and selection mechanisms to direct the requests to different implementations. 
The pluggable discovery and selection mechanisms can use any means desired to find candidate implementations and subsequently select one desired implementation from among the candidates... [You will] learn about a number of aspects of the Web Services Bus related to WSDL, UDDI, and Bus discovery. In particular, how to create a UDDI registry-based discovery mechanism, and how to leverage an experimental practice implemented by the Bus to do efficient searches in a UDDI registry for services implementing a particular portType. A more sophisticated discovery mechanism could examine other UDDI registries, WSIL registries, databases, or other sources of candidate implementations for the portType. Similarly, though the sample code selects a service at random, a more sophisticated selection mechanism could make an intelligent choice, perhaps based on cost or service level agreement terms, if such information were available. One final thought: Though the discovery implementation in this example ran in the client on-ramp, similar techniques can be used in the Bus's server on-ramp. This would allow the Bus server side to be configured to do interesting things as an intermediary, similar to the Web Services Gateway, also available on alphaWorks... In a future article, I'll examine bus filters, another feature of the Web Services Bus that enhances the Bus's abilities as a framework for Web services..." General references in (1) "Web Services Description Language (WSDL)"; (2) "Universal Description, Discovery, and Integration (UDDI)."
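[Editor's illustration] The pluggable discovery-and-selection pattern Flurry describes can be sketched as follows: a discovery mechanism finds candidate implementations of a portType, and a separate, swappable selection policy picks one. The registry is a plain dict standing in for a UDDI query, and all endpoints and cost figures are invented.

```python
import random

# portType -> candidate endpoint descriptions (hypothetical data
# standing in for results of a UDDI registry search).
REGISTRY = {
    "StockQuotePortType": [
        {"endpoint": "http://example.com/quotesA", "cost": 5},
        {"endpoint": "http://example.com/quotesB", "cost": 2},
    ],
}

def discover(port_type, registry=REGISTRY):
    """Discovery mechanism: find candidate implementations of a portType."""
    return registry.get(port_type, [])

def select_random(candidates):
    """The article's sample policy: pick any candidate at random."""
    return random.choice(candidates)

def select_cheapest(candidates):
    """A more intelligent pluggable policy, e.g., based on cost terms."""
    return min(candidates, key=lambda c: c["cost"])

candidates = discover("StockQuotePortType")
chosen = select_cheapest(candidates)
```

Because discovery and selection are independent functions, either can be replaced -- a WSIL or database-backed discover(), or a service-level-agreement-aware selector -- without touching the caller, which is the point of making both mechanisms pluggable.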
[November 27, 2002] "IBM developerWorks Web Services Demos." By [IBM Staff]. From IBM developerWorks, Web services. November 2002. ['Since its first release in 2000, the IBM Web Services Toolkit has become one of the most popular downloads from the IBM alphaWorks site and one of the leading Web services implementations. The Toolkit has led the way in demonstrating emerging technologies, allowing people an unprecedented glimpse into the future of Web services. Due to its popularity, we have decided to include some of the demos and their matching documentation through our Web Services Demo Collection, allowing you to try out the Toolkit before downloading and installing it on your system.'] "The Web Services Toolkit (WSTK) for dynamic e-business is a software development kit that includes demos and examples to help in designing and executing Web service applications that can automatically find one another and collaborate in business transactions without additional programming or human intervention. Simple examples of Web services are provided, as well as demonstrations of how some of the emerging technology standards, such as SOAP, UDDI, and WSDL, work together. The WSTK is based on top of the stable development platform and runtime environment of the IBM WebSphere SDK for Web Services (WSDK) also available from developerWorks. The WSTK showcases emerging technology in Web services and is a good way to get an understanding of the most recently announced Web services specifications. However, for a product-level development environment for Web services, IBM WebSphere Studio Application Developer is recommended. It allows creation, building, testing, publishing, and discovering of Web service-based applications that support UDDI, SOAP, WSDL, and XML. An overview of the WSTK objectives and relationship to IBM products is available in an article about WSTK on the IBM dynamic e-business site. 
A description of the new features added to WSTK is located in the FAQ document on alphaWorks... There are several Web Services Toolkit demos to choose from. Each demo provides a link to overview information and directions for running the demo..." See details in: (1) the related article "IBM Web Services ToolKit: A Showcase For Emerging Web Services Technologies," by John Feller (Senior Development Manager, IBM Emerging Technologies Development); (2) Updated IBM Web Services Tool Kit (WSTK) Version 3.3.
[November 27, 2002] "XML Content Standard Could Challenge Microsoft." From [Analytical Source] Rita Knox. Gartner FirstTake Report. Reference: FT-18-9097. 25 November 2002. ['The Organization for the Advancement of Structured Information Standards (OASIS) seeks to advance open XML-based file specifications for office applications. If OASIS succeeds, Microsoft will face greater competition.'] "Although the outcome of the OASIS initiative may be interesting and useful, it will be incomplete, because Microsoft is not a participant. Microsoft has a virtual monopoly in the office application market. The OASIS committee's aim is to establish standards for data interoperability among applications such as word processing, spreadsheets, charts and graphs, while retaining high-level information for editing. If it achieves this aim, content (regardless of form) will be created in any application, opened from within any other application and edited as needed. The eventual goal is to enable interchange among any type of application, including databases, search engines and Web services. Vendors are backing the initiative because they have a vested interest in loosening Microsoft's stranglehold on the market -- particularly in the face of Microsoft's XML-based Office 11, which is scheduled to be launched in mid-2003. Microsoft's absence also may be due to OASIS's intellectual property policy for participation (royalty-free, and reasonable and nondiscriminatory licensing), which may make Microsoft reluctant to participate when it is about to release its XML-aware product suite. The involvement of Boeing is significant because it provides user endorsement and strong user input to the committee's work. Gartner will closely monitor progress on this standard's activity..." See references in: (1) "OASIS Technical Committee for Open Office XML File Format"; (2) "XML File Formats for Office Documents." [Also in HTML format]
[November 27, 2002] "Web Services We'd Like To See." By Timothy Appnel. From O'Reilly Network Weblogs. November 26, 2002. "In a recent CNET article ['Amazon, Google Lead New Path to Web Services'], Margaret Kane reports on Google and Amazon's success with Web services and the benefits they are beginning to reap. Tim O'Reilly provides his commentary on the piece [in 'Why Amazon and Google Web Services Matter']. O'Reilly notes a key takeaway from Amazon and Google's success is '...the importance of a decentralized approach rather than a top-down approach by a single vendor.' In addition to his comments, I think it's also interesting to note that two service providers are driving real Web service adoption and not software vendors such as Microsoft and IBM. (Could this be an indication of a significant shift in the industry?) Being a developer, I want to see more service providers support general use Web services. I've been pondering this for a while, but the recent CNET article and the attention it has received put my thinking into high gear. Here are some of the service providers that I would like to see follow Google and Amazon's lead: [eBay, PayPal, FedEx, UPS, MapQuest, Yahoo, Any site with information and news lacking RSS support...]
[November 26, 2002] "Message Queuing for Business Integration." By Dieter Gawlick (Oracle, Server Technology Division). In EAI Journal Volume 4, Number 10 (October 2002), pages 30-33. ['Given the title, you may think this article was first published a few years ago. But, in fact, it looks at database-driven MQ -- specifically, the reasons that companies may choose to move their communication data out of file-based message queuing systems and into database-enabled message queuing systems.'] "In response to increasingly demanding communication requirements, message queuing technology has become a central part of today's computing infrastructure. While most communication is still based on e-mail technology, the increased demand for structured communication with high service quality levels has led more companies to embrace and extend the use of message queuing technology. Message queuing technology has rapidly evolved to meet the challenges of companies integrating their businesses... Probably the best-known and most widely used example of a file-based message queuing system is IBM's WebSphere MQ (formerly MQSeries). In database-oriented message queuing systems, one or more queues are mapped to tables and messages are mapped to database records, normally called rows. This approach to message queuing is much more recent. Either the file- or database-oriented approach can handle the basic message queuing requirements. Both methods provide support for a message format and methods for sending and receiving messages... With file-based message queuing, creation and consumption of messages transactionally consistent with database updates require: (1) Support for distributed two-phase commit coordination; (2) Journaling; (3) Journal management; (4) Recovery. The cost of distributed two-phase commit processing for the synchronization with the companion database can diminish performance and scalability. 
While the technology is well-understood, the actual implementation proves to be extremely difficult and expensive. Only leading file-based message queuing systems provide transactional support for message queuing. In contrast, transaction support is inherent in the database. With a database-enabled messaging system, messages can be transactionally stored in queues mapped to database tables. These operations can be committed or rolled back as part of a transaction that includes other standard database operations. By collocating the operational data with the queues, the need for distributed, two-phase commit is eliminated... Autonomous applications perform tasks that need to be coordinated with tasks performed by other applications within the same or in different companies. Message queuing, one of the core technologies for this task, has to provide the communication in a highly reliable, scalable, secure manner while providing auditing and tracking. Similar requirements are found in other environments such as business-to-consumer (B2C) and peer-to-peer. For many of the same reasons that companies have chosen to move their operational data out of files and into a database -- reliability, security, scalability, performance -- they now can choose to move their communication data out of file-based message queuing systems and into database-enabled message queuing systems..."
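[Editor's illustration] The database-enabled queuing the article describes can be sketched with Python's sqlite3 module: the queue is a table, a message is a row, and enqueueing commits or rolls back in the same local transaction as ordinary operational updates, so no distributed two-phase commit is needed. Table names and the order scenario are invented for the sketch.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE msg_queue (id INTEGER PRIMARY KEY, body TEXT)")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, status TEXT)")

def place_order_and_notify(conn, order_id, message):
    """Update operational data and enqueue a message in ONE transaction."""
    try:
        with conn:  # commits on success, rolls back on any exception
            conn.execute("INSERT INTO orders VALUES (?, 'placed')", (order_id,))
            conn.execute("INSERT INTO msg_queue (body) VALUES (?)", (message,))
    except sqlite3.Error:
        pass  # the order row and the message roll back together

def dequeue(conn):
    """Consume the oldest message transactionally."""
    with conn:
        row = conn.execute(
            "SELECT id, body FROM msg_queue ORDER BY id LIMIT 1").fetchone()
        if row:
            conn.execute("DELETE FROM msg_queue WHERE id = ?", (row[0],))
        return row[1] if row else None

place_order_and_notify(conn, 1, "order 1 placed")
body = dequeue(conn)
```

If the order insert fails, the queued message disappears with it -- the atomicity that file-based queues must buy with two-phase commit coordination and journaling.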
[November 26, 2002] "Group Calls For New Disaster Warnings." By Robert Lemos. In CNET News.com (November 26, 2002). "The diffuse emergency warning systems in the United States need a revamp, which should include a mandated messaging standard, a panel of emergency-response experts concluded in a report... The panel -- formed of experts in disaster response from the government, academic, and private sectors -- maintained that the current hodgepodge of warning systems, including the Emergency Alert System and the NOAA Weather Radio, don't work well. 'While many federal agencies are responsible for warnings, there is no single federal agency that has clear responsibility to see that a national, all-hazard, public warning system is developed and utilized effectively,' stated the Partnership for Public Warning in the report, which called for the newly formed Department of Homeland Security to take charge... The new system must be able to communicate with a variety of devices, including computers, cell phones, pagers and any new electronic gizmo, stated the report. For that reason, the report highlights XML (Extensible Markup Language), as a likely candidate, but other protocols might be desirable for noncomputing platforms. In addition, the report says messages must be able to have a unique identifier, a way to specify geographic regions for different levels of warning, and encryption methods for validating the sender of the message. The panel also called for additional research into the efficacy of such warnings, into the extent of trust that can be placed in the public as a source of information about disasters, and into the effect of a great number of false warnings on the public. Currently, a variety of alerts can be sent out using several different systems. The National Weather Service warns of dangerous weather patterns and incidents in specific areas of the country, while the U.S. 
Geological Survey alerts affected parts of the country of earthquakes, volcanic eruptions and landslides. The Department of Justice issues notice of criminal activities, and the Environmental Protection Agency sends out warnings concerning air and water quality..." General references in "XML and Emergency Management."
[November 26, 2002] "Adobe FrameMaker 7.0." By Robert J. Boeri. In EContent Magazine (November 2002). EContent Decision-Maker Review. "FrameMaker's support of XML, a Web-capable subset of SGML, is new in version 7.0. The World Wide Web Consortium (W3C), developers of XML, released the original XML standard in February 1998. This review emphasizes FrameMaker's XML capabilities rather than the older (and less used) SGML standard... FrameMaker 7.0 comes bundled with Quadralay's WebWorks standard edition, a tool for producing HTML (optionally styled with cascading stylesheets), outputs to MS Reader and Palm Reader, and XML styled either with cascading stylesheets or XSL, the family of W3C XML formatting and transformation standards. FrameMaker's major new features are: Full support for XML 'round tripping,' assuring valid XML both into and from FrameMaker. Support for three sample XML applications: DocBook 4.1, xDocbook 4.1.2, and one of the three XHTML DTDs; CSS Export, which allows automatic generation of CSS style definitions for XML files; expanded support for SVG graphics -- another Adobe strength -- as either raster or pass-through graphics in their original XML renditions; ability to generate Acrobat 5 'tagged PDF.' This can enhance document accessibility by allowing flexible reflows of text on a broad range of reader devices; support for Unicode, 8- and 16-bit UTF characters. This can eliminate the need to buy special Asian-language printing devices; and extensive support for non-English fonts, for example: double-byte for Japanese, simplified and traditional Chinese, Korean, as well as many European and Scandinavian languages. There are no XML analysis and design tools bundled with FrameMaker. To design XML stylesheets or DTDs, you must purchase separate XML suites such as Altova's XML Spy or Tibco's Turbo XML. There is also no FrameMaker integration with those tools. 
Adobe provides the best solution for long and richly structured technical documents, and Adobe has an undisputed track record for supporting detailed layouts and multilingual fonts. The closest competitors are hybrid XML Word offerings (such as BroadVision's One-to-One Content) and native XML publishing offerings such as Arbortext's Epic and Corel SoftQuad's XMetaL. Because native XML publishing systems immediately recognize DTDs (or schemas), mapping files are not necessary with them..." See also the review in "XML Powers Document Builders."
[November 26, 2002] "Seeds of SVG Being Planted: Will they Grow?" By Luke Cavanagh. In The Bulletin: Seybold News and Views On Electronic Publishing Volume 8, Number 9 (November 27, 2002). "E-periodicals are a growing blip on the publishing radar screen these days, though their ultimate success is still a point of contention. Some believe Microsoft's efforts to make the Tablet PC a reading platform could spur sustainable interest; others think the future will remain cloudy until significant all-around improvements are made. Still, vendors search for a formula that would make e-publications -- magazines, newspapers, newsletters, journals, etc. -- a primary reading choice for the mass market rather than a digitized supplement to print editions... Two companies, Mattercast and Texterity, recently introduced early-stage software based on Scalable Vector Graphics (SVG) that can easily reproduce print layouts in online editions without the need to use the Adobe Acrobat Reader. The vendors tout faster download speeds, optimization of graphics for the Web and the ability to customize the look and feel of the content's packaging as advantages of using SVG instead of plain PDF. Although it is important to realize that the efforts of both Texterity and Mattercast are in their early stages, each thus far provides little compelling evidence that supports the use of SVG in such applications... The Web would really benefit from increased use of vector graphics, and we see great potential for publishers of all types to embrace SVG in their online publications. For the right type of material, vector graphics load faster, look better, and are much easier to make interactive than their raster counterparts. Unfortunately, drawing e-renditions of printed pages is not the right application for SVG..." On Texterity's SVG support in TextCafe, see: (1) the announcement "TextCafe Version 3.0 Now Supports SVG. 
Generates Scalable Vector Graphics, Enhanced Navigation XML, Document Indexes."; (2) the online presentation "TextCafe 3.0 Release Overview Scalable Vector Graphics (SVG)." General references in "W3C Scalable Vector Graphics (SVG)."
[November 26, 2002] "Perspective: The Five Biggest Myths About Web Services." By Bob Sutor (IBM). In CNET News.com (November 26, 2002). "Two-and-a-half years into the evolution of Web services, and the hype surrounding this technology has become deafening. The good news is that after you strip away the bravado, there is still a lot to be excited about. Web services is starting to pay early dividends for some companies, but the big payoff is still two or three years down the road. By then, Web services will be the standard for doing business with customers, suppliers, and partners. Right now, the trick is separating reality from fantasy. With that in mind, here are five big myths and facts about Web services that can help guide you from illusion to truth... Web services is the distillation of knowledge and experience gained from decades of working with distributed technologies. Essentially, Web-services technologies allow businesses to share the information they have stored in their computer applications with other applications in the company or with those run by customers, suppliers and partners. By connecting these processes online, companies can significantly increase the efficiency -- and thus lower the cost -- of running their enterprise... Web services exists because interoperability is not only possible; it's happening on IT systems every hour of every day. Interoperability is managed through middleware, software that allows applications to run and communicate on multiple operating systems. The adoption of open standards by more and more companies means that this middleware will allow IT systems to interact seamlessly, no matter what operating systems or applications are used... When we log on to the Internet for personal use, we don't think about whether our software will be compatible with whatever is on the other end of our web browsers. The important standards work at the World Wide Web Consortium (W3C) made this happen for the Internet. 
Why should this not be true for business applications as well? Companies and organizations around the world are cooperating on standards and programming languages, such as Java, that can enable all software, including older applications, to interoperate productively on the Web and to be upgraded or transitioned rather than 'ripped and replaced.' There are honest differences of opinion on how all of this should be accomplished, but overriding the differences is a spirit of cooperation driven by an industry desire to make Web services work. That's why the important Web services work in the Organization for the Advancement of Structured Information Standards (OASIS) and the W3C is proceeding so well..."
[November 26, 2002] "XML Spy Tops as XML Editor." By Timothy Dyck. In eWEEK (November 25, 2002). "Altova GmbH's XML Spy has long been a strong player in the XML space, and Version 5 of the XML editor raises the bar even higher. Of all the XML editors eWEEK Labs has seen -- and we've seen a lot -- the $990 XML Spy 5.0 Enterprise Edition provides the best overall combination of editing power and usability, along with wide database and programming language integration support. This earns the product an eWEEK Labs Analyst's Choice award. XML Spy's user interface -- particularly its graphical schema editing tools and grid-based XML data editor -- keeps impressing us with its versatility, intuitiveness and power. For quick, ad hoc XML transformations, such as converting a series of attributes into elements, XML Spy is a perfect tool. The Enterprise Edition of XML Spy is new to the product line. It includes cutting-edge HTML-to-XML conversion capabilities; Java and C++ code generation; and Web services features, including a Simple Object Access Protocol debugger and a graphical WSDL (Web Services Description Language) file editor. The $399 Professional Edition of XML Spy does not have these features but does include the XML and XML Schema editing features, database import and export capabilities, and the XSLT (Extensible Stylesheet Language Transformations) debugger found in 5.0... The XSLT debugger in the new XML Spy line is important not only because it will be highly useful to developers, but also because it was one feature the competition had that XML Spy didn't. Excelon Corp.'s Stylus Studio 4.5, for example, has a very effective XSLT debugger... Also significant in XML Spy 5.0 is a new feature that helps automate the conversion of an HTML-based site to one that is based on XML technologies. XML Spy accomplishes this by transforming XML source data through Extensible Stylesheet Language into HTML... 
Plus: HTML-to-XML conversion capabilities; XSLT processor and debugger; graphical WSDL editor; Java and C++ code generators for XML data structures; Web services debugger; powerful XML editing features. Minus: Lacks support for DB2 and Sybase Adaptive Server Enterprise XML database extensions." For schema description and references, see "XML Schemas."
[November 25, 2002] "XML Common Biometric Format (XCBF)." OASIS TC Working Draft, Version 01. Sunday, 17-November-2002. 81 pages. Edited by Phillip H. Griffin (Griffin Consulting). Produced by members of the OASIS XML Common Biometric Format (XCBF) TC. Contributors: Tyky Aichelen (IBM), Ed Day (Objective Systems), Dr. Paul Gérôme (AULM), Phillip H. Griffin (Chair, Griffin Consulting), John Larmouth (Larmouth T&PDS Ltd), Monica Martin (Drake Certivo), Bancroft Scott (OSS Nokalva), Paul Thorpe (OSS Nokalva), and Alessandro Triglia (OSS Nokalva). "Biometrics are measurable physical characteristics or personal behavioral traits that can be used to recognize the identity of an individual, or to verify a claimed identity. This specification defines a common set of secure XML encodings for the patron formats specified in CBEFF, the Common Biometric Exchange File Format (NISTIR 6529). These CBEFF formats currently include the binary biometric objects and information records in two ANSI standards. These XML encodings are based on the ASN.1 schema defined in ANS X9.84:2002 Biometrics Information Management and Security. They conform to the canonical variant of the XML Encoding Rules (XER) for ASN.1 defined in ITU-T Rec. X.693, and rely on the same security and processing requirements specified in X9.96 XML Cryptographic Message Syntax (XCMS). Values of the Biometric Information Record (BIR) defined in ANSI/INCITS 358-2002 - Information technology - BioAPI Specification, which can be represented in the X9.84 biometric object format, can also be represented using XML markup and secured using the techniques in this standard. This standard defines cryptographic messages represented in XML markup for the secure collection, distribution, and processing of biometric information. These messages provide the means of achieving data integrity, authentication of the origin, and privacy of biometric data in XML based systems and applications. 
Mechanisms and techniques are described for the secure transmission and storage of biometric data, and for its integrity and privacy protection." Sources: ZIP archives for the v01 spec and schemas. General references in "XML Common Biometric Format (XCBF)."
[November 25, 2002] "Web Services Security XCBF Token Profile." Edited by Phillip H. Griffin (Griffin Consulting) and Monica J. Martin (Drake Certivo Inc). Submitted to the OASIS Web Services Security TC (WSS). OASIS TC Working Draft. Version 1.0. Monday, 25 November 2002. 15 pages. From the Introduction: "This document describes the use of XML Common Biometric Format (XCBF) cryptographic messages within the WS-Security specification. XCBF messages are validated against an ASN.1 schema. This schema definition language is used to define X.509 certificates and CRLs, and the cryptographic messages used to secure electronic mail in RFC3369 and X9.96 XML Cryptographic Message Syntax. In an instance of communication, XCBF messages may be represented in a compact binary format or as well-formed XML markup. A common XCBF security token is defined to convey and manage biometric information used for authentication and identification. Each binary representation of an XCBF message has an XML markup representation. Both representations share the same schema. This characteristic allows XML markup to be used in resource rich environments, but transferred or stored in a compressed binary format in resource poor environments, e.g., smart cards, wireless and remote devices, and high volume transaction systems. XCBF messages may include digitally signed or encrypted information. The binary format used to represent XCBF messages relies on the canonical Distinguished Encoding Rules (DER) used to encode X.509 certificates. The XML markup format used in this Standard is the canonical variant of the XML Encoding Rules (XER)." See also: (1) OASIS XML Common Biometric Format (XCBF) TC website; (2) "Web Services Security Specification (WS-Security)"; (3) general references in "XML Common Biometric Format (XCBF)."
[November 25, 2002] "XMP: The Path to Metadata Salvation? [Content Management.]" By Bill Rosenblatt. In The Seybold Report Volume 2, Number 16 (November 25, 2002). ISSN: 1533-9211. ['Everyone admits that metadata is central to managing digital publishing workflows and distribution processes. XML and the related RDF gave us a common syntax for representing metadata, but they can't solve the whole problem, which includes communicating metadata between systems and standardizing the schemas. That's where Adobe's Extensible Metadata Platform comes in, allowing applications to capture metadata as digital assets are being created and allowing files to carry their own metadata wherever they go. What XMP can't provide is widespread user agreement on consistent sets of values for metadata elements.'] "Now that XMP [Extensible Metadata Platform] has celebrated its one-year anniversary, it's a good time to re-evaluate it... The current version of the XMP spec is 1.6, dated June 2002. Adobe has incorporated XMP support into the following applications: Acrobat 5, Illustrator 10, InDesign 2, InCopy 2, GoLive 6, LiveMotion 2, FrameMaker 7, Photoshop 7, Adobe Graphics Server (née AlterCast), and Adobe Document Server. For all of these applications, Adobe has built in a standard XMP dialog box that allows users to enter and edit the metadata for each file. In addition to the native file formats used by the above applications, Adobe has tools for embedding and extracting XMP packets in generic JPEG, GIF, EPS and TIFF file formats... Adobe needs to continue to walk the fine line between too little and too much detail in its metadata schemas. Right now, XMP is an exemplar of a lean, tightly conceived standard with very little excess baggage. Adobe will need to fend off the inevitable pressure to bloat XMP's out-of-the-box schemas with extra elements just to satisfy one constituency or another. 
At the same time, Adobe will need to ensure that XMP is maximally compatible with existing segment-level metadata standards, especially those like NewsML and ONIX that are not based on Dublin Core. Adobe could accomplish this by working with the groups in charge of those standards and developing mappings (aliases, in XMP parlance) to those other standards. Will XMP finally solve the metadata problem? Well, not entirely. At the heart of the metadata problem [...] is the need to create values of metadata. As mentioned before, some of these are easy to create automatically, while others involve manual labor. Many publishers find it hard to get people in the habit of creating metadata, and even when they do, they may not do it consistently; for example, they may define the same terms with different keywords. General solutions to this problem are unknown, yet tools like taxonomies, controlled vocabularies and categorization technologies can help minimize the problem. The latter have reached the point where they can really help automate metadata-creation processes and make metadata more consistent. As more and more publishing processes go digital, there's a greater understanding of the nature of metadata as the key to process efficiencies and product value-add; and along with that understanding have come a growing number of vendors who continue to chip away at the metadata problem. The problem will likely never be entirely solved, but with XMP, Adobe is poised to take the biggest chunk out of it since the invention of XML..." See: "Extensible Metadata Platform (XMP)."
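The XMP approach described above embeds RDF metadata in the asset itself so the file carries its own metadata wherever it goes. A simplified, hypothetical sketch follows (the RDF and Dublin Core namespaces are real; the exact XMP packet envelope elements vary by specification version and are omitted here, and the field values are invented):

```xml
<!-- Simplified, illustrative XMP-style metadata: an RDF description
     embedded in a file, carrying Dublin Core properties.
     The XMP packet wrapper elements are omitted for brevity. -->
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
         xmlns:dc="http://purl.org/dc/elements/1.1/">
  <rdf:Description rdf:about="">
    <dc:title>Quarterly Catalog Cover</dc:title>
    <dc:creator>Jane Designer</dc:creator>
    <dc:date>2002-11-25</dc:date>
  </rdf:Description>
</rdf:RDF>
```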
[November 25, 2002] "What is an RSS Channel, Anyway?" By Mark Nottingham. Reference posted to the RSS-DEV Mailing List. November 25, 2002. [A working document that provides an 'examination of the uses of RSS channels, to better understand their nature and to move towards a rigorous definition of them.'] "... The headline syndication view of RSS is straightforward; a channel is a reverse-chronological sorted list of links to stories, along with metadata for each one indicating the title and sometimes a description. This is the model that most developers and users of RSS have in mind to this day, and it is a useful one. However, a few people quickly noticed that it was easy to extend RSS to say other things about those links; in fact, you could say almost anything just by adding an element as a child of <item>. This metadata summary view of RSS ('Rich Site Summary' or 'RDF Site Summary') treated the channel as a container for any kind of statement - from market data to mp3 playlists to event calendars or even order systems - as long as what they were talking about could be arranged in a list. The modularity of RSS 1.0 enables its use in a variety of contexts, from Wall Street to Open Source software distribution. Last but not least, Weblogs have been using RSS for something completely different - content syndication. Instead of just saying things about the channels' links, they reproduce the content at the other end, so that a Web resource can be replicated in whole in an aggregator or on another site. All of these views of RSS use the channel to group items together. None of them, however, establish what a channel actually is. In other words, although items are somewhat well-understood (having identifiers and metadata associated with them), the relationships between the items, in the context of a channel, haven't been explored so much as they have been assumed..." 
[Note from the 2002-11-25 post: "The discussion on [RSS] 'items' is very timely; for some time I've been trying to figure out what a channel is, and have spent a little time recently writing it down. I'm not done yet, but the document captures the current state of my thinking and discussions I've had... Comments welcome..."] See "RDF Site Summary (RSS)."
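The 'metadata summary' view Nottingham describes, in which a channel lists items that can carry arbitrary namespaced child elements, can be sketched as a minimal RSS 1.0 document (the channel and item content below are invented for illustration; the namespaces are the real RSS 1.0, RDF, and Dublin Core ones):

```xml
<!-- Minimal RSS 1.0 sketch: the channel groups items via an rdf:Seq,
     and each item can be extended with any namespaced element. -->
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
         xmlns="http://purl.org/rss/1.0/"
         xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel rdf:about="http://example.org/news/rss">
    <title>Example News</title>
    <link>http://example.org/news/</link>
    <description>Headlines from example.org</description>
    <items>
      <rdf:Seq>
        <rdf:li resource="http://example.org/news/1"/>
      </rdf:Seq>
    </items>
  </channel>
  <item rdf:about="http://example.org/news/1">
    <title>First Story</title>
    <link>http://example.org/news/1</link>
    <!-- any namespaced child of item adds further metadata -->
    <dc:date>2002-11-25</dc:date>
  </item>
</rdf:RDF>
```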
[November 25, 2002] "The Undecidability of the Schematron Conformance Problem." By Robert C. Lyons (Unidex, Inc). November 2002. "... The Post Correspondence Problem (PCP) is an undecidable problem... The proof that the PCP is undecidable can be built upon the undecidability of the Halting Problem. The undecidability of the PCP has often been used to prove the undecidability of other problems. It occurred to me that the undecidability of the PCP could be used to prove the undecidability of the Schematron Conformance Problem. In this paper, we'll describe the Schematron Conformance Problem (SCP) and Post Correspondence Problem (PCP). We'll then prove that the Schematron Conformance Problem is undecidable (even if we restrict ourselves to Schematron schemas that don't use the document and key functions). We assume that you are familiar with the basics of Schematron, which is a powerful XML schema language in which validity constraints are defined using XPath expressions... To prove that the SCP is undecidable, we show, by way of contradiction, that if the SCP is decidable, then the PCP is decidable..." Summary: "we showed that the Schematron Conformance Problem (SCP) is undecidable. This is true even if we restrict ourselves to Schematron schemas that do not use the document and key functions of XPath. To prove the undecidability of the SCP, we showed that if the SCP is decidable, then the Post Correspondence Problem is decidable; however, the Post Correspondence Problem is undecidable. Therefore, the SCP must be undecidable. The fact that the SCP is undecidable means that it's impossible to build a Schematron schema editor that evaluates the user's schema and always lets him/her know whether or not there are any XML documents that conform to the schema..." Note from Lyons in posting to XML-DEV: "Recently, I was reading about the Post Correspondence Problem (PCP), which is undecidable (i.e., unsolvable). 
I realized that the undecidability of the PCP could be used to prove that the following problem is undecidable: Given a Schematron schema, is there an XML document that conforms to this schema? If you're interested in reading the proof and a description of the PCP [see the paper]." See: "Schematron: XML Structure Validation Language Using Patterns in Trees."
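For readers unfamiliar with Schematron, a minimal schema of the kind the paper reasons about expresses validity constraints as XPath assertions against a context node. The sketch below uses the 2002-era Schematron namespace; the order/item constraints are invented for illustration:

```xml
<!-- Minimal Schematron sketch: each rule applies XPath assertions
     to nodes matching its context. Example constraints are invented. -->
<schema xmlns="http://www.ascc.net/xml/schematron">
  <pattern name="Order checks">
    <rule context="order">
      <assert test="count(item) &gt;= 1">
        An order must contain at least one item.
      </assert>
      <assert test="@total = sum(item/@price)">
        The order total must equal the sum of the item prices.
      </assert>
    </rule>
  </pattern>
</schema>
```

Because assertions like these can encode arbitrary XPath arithmetic and string tests, asking whether any document at all satisfies a given schema is the conformance question the paper proves undecidable.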
[November 25, 2002] "The Significance of CPPA v2." By Jon Bosak. Posting to 'ebXML-DEV' list 2002-11-23. "With the approval of ebXML CPPA v2 as an OASIS Standard this month, I think it would be useful to reflect for a moment on the significance of this milestone for ebXML and for electronic commerce in general. The standardization of OASIS ebXML CPPA v2 is of epochal importance for two reasons. First, the technology itself is central to the XML realization of the EDI trading model and, beyond that, to the implementation of large-scale B2B projects in general. It will be recalled that ebXML CPPA was originally developed by IBM as tpaML, the Trading Partner Agreement Markup Language. Some idea of the importance of IBM's contribution of tpaML to the ebXML initiative in early 2000 can be gained from an excellent IBM white paper on tpaML... a [related] presentation [from Martin W. Sachs] gives a good idea of the central role that tpaML was intended to play in IBM's plans for ecommerce and the similar role that CPPA will play in the ebXML ecommerce infrastructure.... The second reason that CPPA v2 will be of central importance for B2B has to do with the status of the intellectual property related to its implementation. ... [an IBM] patent [#6,148,290] covers features that might be essential to the implementation of a wide variety of ecommerce systems, not just ebXML. It's significant, therefore, that in May 2002, IBM promised to grant royalty-free licenses to all IBM patents required by CPPA v1 and v2 and committed to continue providing RF licenses for subsequent versions of CPPA. [See:] New IBM IP statement, IP statement from IBM, IBM OASIS ebXML Intellectual Property Disclosure, May 16, 2002. The royalty-free license was later extended to open-source software as well. 
Much has been made of the fact that CPPA v2 still cannot be considered completely unencumbered, but it remains, to my knowledge, the only application of the technology covered by the relevant IBM patents that IBM has publicly pledged to license royalty-free. The practical effect of this as far as I can see is to make ebXML the only TPA-based B2B ecommerce framework that can be implemented without incurring a future obligation to pay royalties to IBM for the TPA component..." See: (1) OASIS ebXML Collaboration Protocol Profile and Agreement TC website; (2) "Electronic Business XML Initiative (ebXML)."
[November 25, 2002] "Web-Services Security Quality of Protection." Version 0.9. November 22, 2002. 25 pages. Text and reference posted 2002-11-22 to the OASIS 'wssqop-discuss' list by Tim Moses (Entrust) with the "Subject: QoP liaison statement". From the introductory 'Problem statement': "Using the WS Security Specification, service end-points have a standard means for securing SOAP messages using XML Signature and XML Encryption. However, the WSS specification does not provide a means for a Web-service consumer and provider to negotiate how the SOAP messages used in the exchange have to be protected in order to successfully invoke the service. In this paper, a technique for negotiating a mutually-acceptable security policy based on WSDL is proposed. The term security policy is used in this context to mean: 'a statement of the requirements for protecting arguments in a WS API', including: (1) how actors are to be authenticated, using what mechanisms and with what parameter value ranges, (2) which SOAP messages, elements and attachments are to be encrypted, for what individual recipients, recipient roles or keys, using what algorithms and key sizes, (3) which SOAP messages, elements and attachments are to be integrity protected, using what mechanisms, with which algorithms and key sizes and (4) what initial parameter values are used in the signature validation procedure, including what keys or authorities are trusted directly. This is a relatively restrictive use of the term 'security policy'. A more comprehensive definition addresses such requirements as: (1) privacy (retention period, intended usage, further disclosure), (2) authorization (additional qualifications the service consumer must demonstrate in order to successfully access the Web service) and (3) non-repudiation (requirements for notarization and time-stamping)... In some situations (notably RPC-style service invocation), Web service interactions are conducted in a persistent session. 
In such cases, the provider's security policy may indicate the requirements for authenticity, integrity and confidentiality of the session. In other situations (notably document-style service invocation), messages must be protected in isolation. In these cases, while the service provider may declare a policy, the service consumer actually creates the service request. So, the consumer actually chooses which security operations to apply to the messages. Therefore, we need a way for the consumer to discover the service provider's policy and choose a set of security operations that are consistent with both its own and the service-provider's security policies..." The note: "Colleagues: The WSS QoP Discussion Group has made significant progress since it was established as agreed at the first WSS Face-to-Face meeting. The results of its work can be found [online]... There is a significant body of opinion that the topic of QoP for Web-services is an important, and even urgent, one. The paper demonstrates that the problem is sufficiently well understood that a sub-committee of WSS could tackle it without impacting the schedule that has been established for the SOAP Message Protection specification. We therefore request that the WSS TC consider forming a sub-committee to address this work, and include an agenda item in its meeting of 3-December-2002 to discuss this request..." [source]
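The excerpt does not give a schema for such a declared policy. Purely as a hypothetical illustration, the four requirement categories it lists (authentication mechanisms, encryption targets, integrity targets, and trusted keys or authorities) might be declared roughly as follows; every element and attribute name below is invented, not taken from any specification:

```xml
<!-- Hypothetical sketch only: all names below are invented to
     illustrate the four policy categories described in the paper. -->
<SecurityPolicy>
  <Authentication mechanism="UsernameToken"/>
  <Confidentiality algorithm="3DES" keySize="168">
    <Target>//CreditCardNumber</Target>
  </Confidentiality>
  <Integrity algorithm="RSA-SHA1" keySize="1024">
    <Target>//Body</Target>
  </Integrity>
  <TrustAnchors>
    <CertificateAuthority name="Example Root CA"/>
  </TrustAnchors>
</SecurityPolicy>
```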
[November 25, 2002] "Tip: Control White Space in an XSLT Style Sheet. Create the Document You Want by Understanding White Space Stripping." By Nicholas Chase (President, Chase and Chase Inc). From IBM developerWorks, XML zone. November 2002. ['Because the style sheet and the source document in an XSLT transformation have different rules regarding white space stripping, it often seems as though the production of spaces and line breaks has no rhyme or reason in the process. This tip shows you how to control the production of white space in a transformation's result, which can lead to documents that more closely align with your requirements. For this tip, you can use any XSLT processor, such as Xalan or Saxon, or a browser-based solution, such as Microsoft Internet Explorer or Mozilla.'] "Before processing a transformation, an XSLT processor analyzes the style sheet and the source document, and removes any applicable white space nodes. It then processes the document, building the result tree from the remaining nodes... When the processor strips the white space nodes from an element, it first checks to see if that element is on a list of white-space preserving elements. By default, all of the source nodes are added to this list, but you can remove one (or more) by adding them to the xsl:strip-space element. For example, you can strip all of the white space nodes out of the question element in order to compress any responses within the question or answer texts... Adding white space to the style sheet: When the processor strips the white space nodes from the style sheet, only one element is, by default, on the list of white space preserving elements: xsl:text. A text element is always preserved, so it can be useful for adding line breaks or spaces within a document... 
By understanding the rules for white space preservation as they apply to a style sheet or a source document, you can closely control the appearance of the final document, but white space will never be stripped from within a text node..." See also the series "Controlling Whitespace," by Bob DuCharme. For related resources, see "Extensible Stylesheet Language (XSL/XSLT)."
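The two mechanisms the tip describes, xsl:strip-space for the source document and xsl:text for the style sheet, can be sketched in a fragment like this (the question/answer element names follow the tip's example):

```xml
<!-- Strip insignificant white space nodes inside question elements
     in the source, then use xsl:text to emit a deliberate line break
     in the output (text nodes in xsl:text are always preserved). -->
<xsl:stylesheet version="1.0"
                xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:strip-space elements="question"/>
  <xsl:template match="answer">
    <xsl:value-of select="normalize-space(.)"/>
    <xsl:text>&#10;</xsl:text>
  </xsl:template>
</xsl:stylesheet>
```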
[November 25, 2002] "Coca-Cola, UCCnet Revamping Supply Chain." By Ted Kemp. In InternetWeek (November 25, 2002). "UCCnet, a product registry and subsidiary of the Uniform Code Council standards organization, said this month that Coke will begin implementing UCCnet's services in 2003. The beverage giant, which claims to have the world's largest distribution system, said it will use internal IT resources to integrate with retailers using UCCnet, first in North America and then globally. The announcement is a serious endorsement of UCCnet. The world's largest beverage company, Coca-Cola sells drinks in almost 200 countries at a rate of more than 1 billion servings a day. The agreement marks the latest in a string of endorsements for UCCnet. The Uniform Code Council and EAN International, its international counterpart, agreed in October to make UCCnet the central product data registry for international commerce. That followed endorsements from the Grocery Manufacturers of America and Food Marketing Institute, both major trade organizations. In August, UCC merged with RosettaNet, a developer of standards for e-business transactions... UCCnet currently stores information on only about 25,000 products, up from 15,000 in April. UCCnet's Monaghan said an upload from Procter & Gamble comprised a good portion of the growth. But he added that the 25,000 figure comes mostly from companies signed in the early part of this year and doesn't include the large number of companies signed since..." General references in: "Uniform Code Council (UCC) XML Program."
[November 25, 2002] "Groove Ups .Net Support." By Dennis Callaghan. In eWEEK (November 25, 2002). "Groove Networks Inc. is expected to release version 2.5 of its Groove Workspace peer-to-peer collaboration software by the end of the year, offering improved support for .Net and Web services integration. This version will include a new .Net development tool, the Groove Toolkit for Visual Studio.net, which company officials say will make it easier and faster to build and deploy custom applications in Groove. Also featured in this release will be the Groove Web Services area, giving developers a new route to integrating Groove with other applications using SOAP. Customers will be able to build Groove-awareness into other applications -- so-called contextual collaboration -- rather than building applications inside the Groove environment. Groove Workspace 2.5 will also feature improved integration with Microsoft Outlook, supporting Outlook calendars, meetings and contacts, and new integration with Microsoft SharePoint Team Services, which will allow Groove developers to synchronize SharePoint sites in a Groove space so that they can be used offline and extended outside the corporate firewall. Groove 2.5 will also support 'asymmetric files' -- the ability to share files with other members of a shared space but to have control over whether these files are actually distributed immediately to all the other users. This allows users to pull down only the files they need to each machine. That feature is expected to be most useful for remote users sharing large document libraries, officials said..." See other details in "Extending Groove," by Jon Udell (InfoWorld).
[November 22, 2002] "Microsoft, SAS, Hyperion Release New XMLA Specification." By Dennis Callaghan. In eWEEK (November 21, 2002). "... XML for Analysis (XMLA) is a Simple Object Access Protocol (SOAP)-based XML API, designed to standardize the data access interaction between a client application and a data provider via the Web. It requires no client software, unlike current data access techniques, such as OLE DB and ODBC, making it hardware, operating system and programming language independent. Version 1.1 of XMLA defines two new XML-based data access methods: Discover and Execute. Discover is used to obtain information and metadata from a Web service. This information can include a list of available data sources and data about the provider for a particular data source. Properties are used to define and shape what data is obtained. Discover allows users to specify the data obtained in a common way without rewriting existing functions. Execute is used to execute multidimensional expressions (MDX) or other provider-specific commands against a particular XML for Analysis data source. The Discover and Execute methods enable users to determine what can be queried on a particular server and, based on this, submit commands to be executed. The XML for Analysis provider then retrieves the requested data, packages it into XML and sends it back to the client. Members of the XMLA Council hope the specification, as it develops into a standard, will accelerate the adoption of Internet business intelligence software and increase the market for those technologies. The XMLA Council also announced seven new members today: Crystal Decisions, INEA Corp., MIS AG, MJM Consultant Corp., Panorama Software Systems, SAP AG and Silvon Software Inc., giving the council a total of 27 members..." See details in the 2002-11-21 news item "XMLA Advisory Council Announces XML for Analysis Specification Version 1.1."
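To make the two methods concrete, a Discover call is an ordinary SOAP request. The following sketch follows the general pattern of the examples in the XMLA specification; the property values shown are illustrative, not normative:

```xml
<SOAP-ENV:Envelope
    xmlns:SOAP-ENV="http://schemas.xmlsoap.org/soap/envelope/">
  <SOAP-ENV:Body>
    <!-- Ask the provider which data sources it exposes -->
    <Discover xmlns="urn:schemas-microsoft-com:xml-analysis">
      <RequestType>DISCOVER_DATASOURCES</RequestType>
      <!-- Empty restrictions: return everything -->
      <Restrictions/>
      <Properties>
        <PropertyList>
          <Content>Data</Content>
        </PropertyList>
      </Properties>
    </Discover>
  </SOAP-ENV:Body>
</SOAP-ENV:Envelope>
```

As described above, the Restrictions and Properties elements are how the caller shapes what data the Discover call returns; the provider answers with the result set packaged as XML in the SOAP response.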
[November 22, 2002] "The Myths of 'Standard' Data Semantics. Faulty Assumptions Must Be Rooted Out." By William C. Burkett (Senior Information Engineer, PDIT). In XML Journal Volume 3, Issue 11 (November 2002). "Much of the literature heralding the benefits of XML has focused on its application as a medium for application interoperability. With (a) the Internet as a platform, (b) Web services as the functional building block components of an orchestrated application, and (c) XML as a common data format, applications will be able to communicate and collaborate seamlessly and transparently, without human intervention. All that's needed to make this a reality is (d) for everyone to agree on and use XML tags the same way so that when an application sees a tag such as <firstName> it will know what it means. This intuitive understanding makes a lot of sense, which is why so many organizations have sprung into existence to create their own vocabularies (sets of tags) to serve as the 'lingua franca for data exchange in <insert your favorite industry, application, or domain>.' This intuitive understanding is so pervasive that it's even a key part of the U.S. GAO recommendations to Senator Joseph Lieberman (chairman of the Committee on Governmental Affairs, U.S. Senate) on the application of XML in the federal government. This report warns of the risk that: <q>...markup languages, data definitions, and data structures will proliferate. If organizations develop their systems using unique, nonstandard data definitions and structures, they will be unable to share their data externally without providing additional instructions to translate data structures from one organization and system to another, thus defeating one of XML's major benefits.</q> The perspective of these efforts is that the standardization and promotion of the data element definitions and standard data vocabularies (SDV) will solve the application interoperability problem. 
Unfortunately, this intuitive understanding -- like many intuitive understandings -- doesn't survive the trials of real-life application because important (and seemingly trivial) assumptions are poorly conceived. This article will examine some of these assumptions and articulate several myths of 'standard' data semantics. The notion that data semantics can be standardized through the creation and promulgation of data element names/definitions or vocabularies is based on several assumptions that are actually myths:  Myth 1: Uniquely named data elements will enable, or are enough for, effective exchange of data semantics (i.e., information).  Myth 2: Uniquely named data elements will be used consistently by everybody to mean the same thing.  Myth 3: Uniquely named data elements can exist -- uniquely named as opposed to uniquely identified data elements. Many will readily acknowledge that these are, in fact, myths and that they don't really hold these assumptions. However, it seems that users of namespaces and developers of SDVs and metadata registries are pursuing their work as if these assumptions were true. No mechanisms or strategies have appeared in the extant literature that acknowledge, explain, or address the challenges that arise due to these faulty assumptions. The reasons that these assumptions are faulty fall into the following three areas of SDV development and use: (1) Scope, (2) Natural language use, and (3) Schema evolution... The purpose of this article hasn't been to argue that the problems and the challenges that face the SDV/registry development projects are unsolvable. Rather, it is to suggest that the solution vision must be more expansive. Faulty assumptions must be rooted out, and the problems that are thereby exposed must be explicitly and directly addressed. Despite their intuitive appeal, namespaces, SDVs, registries, and unique data element names will not solve the problem of interoperability. 
What's needed is the recognition that the semantics of a schema (or, more precisely, the semantics of data governed by a schema) must be explicitly bound to a known community that it serves, and that bridges between the communities will be an inevitable part of any comprehensive solution..." See: "XML and 'The Semantic Web'." [alt URL]
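The second myth is easy to illustrate with a contrived pair of fragments. Both vocabularies below are invented; the point is that the same well-formed, "standard-looking" element name carries entirely different semantics in two communities:

```xml
<!-- A publisher's vocabulary: title names a work -->
<book>
  <title>War and Peace</title>
</book>

<!-- An HR vocabulary: title names a position -->
<employee>
  <title>Senior Information Engineer</title>
</employee>
```

No tag name by itself tells a receiving application which sense is intended; that knowledge lives in the community that governs each schema, which is exactly the binding the author argues must be made explicit.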
[November 22, 2002] "The 19th Annual Awards for Technical Excellence. 'Protocols' Winners: SAML and WS-Security." By the Editors of PC Magazine. In PC Magazine (November 19, 2002). "Securing Web services is no easy task. The same virtues that make Web services so promising for e-business -- they're platform-independent, text-based, and self-describing -- create major security concerns, giving pause to businesses considering a move to the hot new interoperability technology. Two standards are emerging to secure Web services: Security Assertion Markup Language (SAML) and WS-Security, both proposals submitted to the Organization for the Advancement of Structured Information Standards (OASIS). To protect confidentiality, WS-Security relies on XML Encryption, while SAML uses the slower HTTPS. WS-Security protects individual transactions, and the substantial infrastructure required by SAML pays off with single sign-on capability. The Liberty Alliance's authentication solution -- Liberty 1.0 -- builds on SAML, while Microsoft's competing technology, .NET Passport, uses WS-Security. No matter whether these two standards converge or remain separate, the success of Web services in e-business could depend on them..." See: (1) "Web Services Security Specification (WS-Security)"; (2) "Security Assertion Markup Language (SAML)."
[November 22, 2002] "Debug XSLT On The Fly: Debugging Tricks That Keep it Simple." By Uche Ogbuji (Principal Consultant, Fourthought, Inc). From IBM developerWorks, XML zone. November 2002. ['Debuggers are very handy in programming, but they can also be complex pieces of software in their own right -- complex to set up, to learn, and to use. Sometimes you just need a quick print-out of some values that you suspect to be at the heart of a specific problem you're seeing. In this article, Uche Ogbuji shows how to do quick debugging using xsl:message and other built-in facilities of XSLT, as well as common extensions in EXSLT.'] "Long before they turn to a debugger, most developers start with the equivalent of a print or printf statement in their programming language to try logging values that might shed some light on their code's errant behavior. XSLT is no exception. You can now get XSLT debuggers and even full integrated development environments (IDEs), but you probably won't always want to go through all the motions of firing up a debugger just to check a single value. This is especially true of those, like me, who use plain old XEmacs (with xslide, in my case) for editing XSLT transforms. xsl:message is usually the best solution for this. You can simply insert a message at the right spot in the right template as a quick debugging tool. The main downside is that there is no standard specification of xsl:message output: It can take the form of a pop-up dialog box, a log file, or -- for command-line XSLT processors -- a marked display to the standard error output. Usually, this is a minor matter; a quick trip to the documentation of any tool will tell you how it renders processor messages. Despite the potential differences in presentation, the nice thing about xsl:message is that it is a standard instruction and thus portable across processors. Here, I shall present several tips to make debugging with xsl:message even more effective..." 
For related resources, see "Extensible Stylesheet Language (XSL/XSLT)."
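The quick-debugging technique Ogbuji describes needs nothing beyond standard XSLT 1.0. A minimal sketch, where the order document and its attributes are invented for illustration:

```xml
<xsl:template match="order"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <!-- Dump the suspect values to the processor's message output -->
  <xsl:message>
    order id: <xsl:value-of select="@id"/>,
    item count: <xsl:value-of select="count(item)"/>
  </xsl:message>
  <!-- normal template processing continues here -->
</xsl:template>
```

Adding terminate="yes" to xsl:message turns the same instruction into an assertion that halts the transform, which is useful when the suspect value indicates an unrecoverable input problem.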
[November 22, 2002] "W3C XML Schema Design Patterns: Avoiding Complexity." By Dare Obasanjo. From XML.com. November 20, 2002. ['A year or so ago XML.com published an article called "W3C XML Schema Made Simple," which suggested, somewhat controversially, that you should avoid areas of the W3C XML Schema specification in order to keep things manageable. This week our main feature is both a companion piece and counterpoint to the "Made Simple" article. In "W3C XML Schema Design Patterns: Avoiding Complexity" Dare Obasanjo suggests that most of W3C XML Schema is indeed useful, but highlights areas the schema author should handle with care.'] "Over the course of the past year, during which I've worked closely with W3C XML Schema (WXS), I've observed many schema authors struggle with various aspects of the language. Given the size and relative complexity of the WXS [W3C XML Schema] recommendation, it seems that many schema authors would be best served by understanding and utilizing an effective subset instead of attempting to comprehend all of its esoterica. There have been a few public attempts to define an effective subset of W3C XML Schema for general usage, most notable have been W3C XML Schema Made Simple by Kohsuke Kawaguchi and the X12 Reference Model for XML Design by the Accredited Standards Committee (ASC) X12. However, both documents are extremely conservative and advise against useful features of WXS without adequately describing the cost of doing so. This article is primarily a counterpoint to Kohsuke's and considers each of his original guidelines; the goal is to provide a set of solid guidelines about what you should do and shouldn't do when working with WXS... The Guidelines: I've altered some of Kohsuke's original guidelines [...] 
I propose some additional guidelines as well: Do favor key/keyref/unique over ID/IDREF for identity constraints; Do not use default or fixed values especially for types of xs:QName; Do not use type or group redefinition; Do use restriction and extension of simple types; Do use extension of complex types; Do carefully use restriction of complex types; Do carefully use abstract types; Do use elementFormDefault set to qualified and attributeFormDefault set to unqualified; Do use wildcards to provide well defined points of extensibility... The WXS recommendation is a complex specification because it attempts to solve complex problems. One can reduce its burdens by utilizing its simpler aspects. Schema authors should ensure that their schemas validate in multiple schema processors. Schemas are an important facilitator of interoperability. It's foolish to depend on the nuances of a specific implementation and inadvertently give up this interoperability..." General references in "XML Schemas."
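Several of the proposed guidelines can be seen together in one small schema. The purchase-order vocabulary below is invented for illustration:

```xml
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema"
           targetNamespace="http://example.org/po"
           xmlns:po="http://example.org/po"
           elementFormDefault="qualified"
           attributeFormDefault="unqualified">
  <xs:element name="purchaseOrder">
    <xs:complexType>
      <xs:sequence>
        <xs:element name="item" maxOccurs="unbounded">
          <xs:complexType>
            <xs:attribute name="sku" type="xs:string"/>
          </xs:complexType>
        </xs:element>
      </xs:sequence>
    </xs:complexType>
    <!-- Per the guidelines: key/keyref/unique rather than ID/IDREF -->
    <xs:key name="skuKey">
      <xs:selector xpath="po:item"/>
      <xs:field xpath="@sku"/>
    </xs:key>
  </xs:element>
</xs:schema>
```

The xs:key constraint gives the same uniqueness guarantee as ID/IDREF but is typed and scoped to a selected set of elements, which is why the guideline favors it.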
[November 22, 2002] "XML Versus the Infoset." By Rich Salz. From XML.com. November 20, 2002. ['Rich Salz discusses the conflict between the notion of XML as syntax and XML as a collection of information items--the Infoset. This seemingly abstract matter has concrete repercussions when it comes to digital signatures of SOAP messages'] "Cryptography is all about manipulating bytes; it's impossible to sign an Infoset, which must be, at some point, concretely represented. SOAP 1.2 currently allows processors a great deal of flexibility about whitespace between elements in the SOAP header; after all, in the SOAP processing model they're irrelevant Infoset items... If an entity can modify this whitespace, it's impossible to reliably sign a SOAP header: you have to create a signature that signs each individual header element. Which isn't the same thing, because you then need to synthesize an additional signed datum in order to prevent someone from adding a new element which isn't signed. In all fairness, the SOAP WG is working to address this by pointing out the problems and resurrecting a SOAP Canonicalization scheme I circulated last year. But we should be concerned that this issue has shown itself only now, while the SOAP specification is in last call. I can only wonder what will happen the next time XML and its Infoset come into conflict..."
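The byte/Infoset mismatch Salz describes is easy to see with two serializations of the same hypothetical header (the Header, To, and MsgId elements below are invented). They are equivalent at the Infoset level, but a digest computed over one set of bytes will not verify against the other:

```xml
<!-- Serialization A: no whitespace between header children -->
<Header><To>http://example.org/svc</To><MsgId>42</MsgId></Header>

<!-- Serialization B: the same information items, pretty-printed.
     A signature computed over A's bytes fails to verify over B's,
     even though a SOAP processor treats the two identically. -->
<Header>
  <To>http://example.org/svc</To>
  <MsgId>42</MsgId>
</Header>
```

Any intermediary that reflows whitespace, as the SOAP processing model permits, therefore breaks a naive byte-level signature; this is the gap a SOAP canonicalization scheme would close.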
[November 22, 2002] "RPV: Triples Made Plain." By Kendall Grant Clark. From XML.com November 20, 2002. ['In the XML community over the past week, the debate over RDF rumbles on. As just about the oldest application of XML and Namespaces, the fact that RDF is still a source of debate four years later is testament to both its significance and its troubles. Kendall Grant Clark, custodian of the XML-Deviant column's watchful eye over the XML community, brings us the latest developments in the debate. Tim Bray has responded to the RDF discussion with a suggestion for a new RDF/XML syntax that aims to answer some of the criticisms of the existing format.'] "... If you don't like or understand or prefer RDF's XML serialization, find a way to avoid dealing with it directly. Using an RDF triplestore from a high-level language is one such way, while retaining some, perhaps all of the benefits of RDF's data model. So, my argument is a more focused variant of the suggestion Shelley Powers has been making repeatedly on XML-DEV lately: if you don't like or understand or prefer RDF, just don't use it. This seems fair enough. Most recent discussion of RDF, which has bubbled over the bounds of XML-DEV and moved out into the broader confines of the Web development community, has been by turns absurd and sublime. From foundational debates about whether RDF is complex, or fights over how to characterize its complexity, to awfully redundant discussions about whether its XML serialization is all that user-unfriendly, to meta-debates in which various sides jockey for position to see which side can be described as unfair or 'politically correct' (whatever that could possibly mean in this context) or dismissive or narrow-minded or high-handed -- and on and on. Yet the debate has also been productive at times, including Tim Bray's RPV proposal ['The RPV (Resource/Property/Value) Syntax for RDF']. Bray says his RPV proposal 'is an XML-based language for expressing RDF assertions ... 
designed to be entirely unambiguous and highly human-readable.' That two-part design goal is worth spending some time with insofar as it's emblematic of a good deal of the underlying debate over RDF. To say that an XML language is or should be 'entirely unambiguous' and 'highly human-readable' is to say that it should be as easily digestible by machines as by humans. It's that tension which runs all the way from XML to RDF. Further, Bray suggests that RDF has failed to gain traction because of this tension: his RPV proposal 'is motivated by a belief that RDF's problems are rooted at least in part in its syntax.' He elaborates on this point by saying, first, that RDF's XML serialization is 'scrambled and arcane,' preventing people from easily reading or writing it; second, that the XML serialization uses qualified names in a way that's not user-friendly and is in some conflict with the TAG's idea that significant resources be identified by URI; third, that metadata folks have no general problem thinking of things in terms of RDF's 3-tuples; fourth, that some alternatives to RDF-XML, like n3, suffer because, as non-XML, they can't get the network effect of ubiquitous XML support; and, fifth, that the idea of embedding RDF in XML languages, which seemed in the summer of 2000, both to Leigh Dodds and much of the rest of the XML development community, like a viable approach, 'has failed resoundingly in the marketplace'..." See "Resource Description Framework (RDF)."
[November 22, 2002] "The RPV (Resource/Property/Value) Syntax for RDF." By Tim Bray. November 2002. ['The RPV (Resource/Property/Value) syntax is an XML-based language for expressing RDF assertions. It is designed to be entirely unambiguous and highly human-readable.'] "The Resource Description Framework (RDF) is designed to facilitate the interchange and processing of metadata concerning Resources (where the word Resource is used in its Web sense). RDF models metadata as 3-tuples which assert that some resource has some property (identified by URI) which has a value identified either by URI or given literally. The centrality of metadata in many classes of application, and the simplicity and elegance of RDF's data model would seem to make it something that has many obvious applications on the Web. Despite this, RDF has been slow to catch hold. The RPV language proposal is motivated by a belief that RDF's problems are rooted at least in part in its syntax. Specifically: (1) The syntax of RDF/XML is sufficiently scrambled and arcane that it is neither human-writeable nor human-readable. (2) The RDF/XML syntax makes heavy use of qnames that is neither intuitive to humans nor conforms particularly well to Web Architecture, which requires that everything significant be identified by URI. (3) People who care about metadata have no trouble thinking in terms of resource/property/value triples. (4) Alternatives like N3 that make the RDF triples evident in syntax suffer in comparison to the XML/RDF syntax because they lack XML's widely-deployed base of software, i18n facilities, and APIs. (5) The notion that RDF can be mixed into XML transparently enough to be unobtrusive has failed resoundingly in the marketplace..." See John Cowan's summary: "Mini-review of Tim Bray's RPV syntax for RDF. This is mostly a remapping of a small subset of standard RDF/XML. 
It has the following features that RDF/XML has not: (1) Expresses property names as URIs rather than QNames; (2) String-valued property can have the string stored remotely, reachable by URL; (3) Incorporates xml:base; (4) Provides separate base attributes for resources, properties, and values..." See "Resource Description Framework (RDF)."
[November 21, 2002] "Web Services Choreography Standard 18-24 Months Away." By Paul Krill. In InfoWorld (November 20, 2002). "Standardization of Web Services choreography, which will enable development of complex Web services for business, is 18 months to two years away, said a World Wide Web Consortium (W3C) official Wednesday while the W3C met to take up the issue. The complexity of the concept means it will take time to settle on a standard, said Sinisa Zimek, SAP advisory committee representative to W3C and a member of various W3C working groups. 'Choreography is a pretty broad area,' Zimek said. 'It's much more complex to standardize choreographies than, for example, SOAP and WSDL.' Although WSDL and SOAP enable development of simple Web services such as stock quote notifications, more complicated Web services such as invoice processing will require choreography of processes, according to Zimek. The W3C this week held an event in Cambridge, MA... Upon conclusion of the advisory meeting, W3C members will continue until mid-December to provide input on a proposal to form a Web services choreography working group and afterward the director of the W3C, Tim Berners-Lee, will make a decision on whether to form such a panel, Zimek said. Members did concur that generally W3C should be the forum for standardizing Web services, according to Zimek. The panel is pondering the Sun-driven Web Services Choreography Interface (WSCI) proposal as a basis for Web services choreography and is waiting to hear from IBM and Microsoft on whether they will submit their rival plan, Business Process Extension Language for Web Services (BPEL4WS), to W3C for consideration as well, Zimek said. Zimek is an author of the WSCI specification..." See the draft W3C "Web Services Choreography Working Group Charter Proposal" posted by Jeff Mischkinsky on November 7, 2002 ["I have attached the proposed charter for a Choreography WG. 
This draft represents the WS Arch WG consensus recommendation that was taken on Nov 7, 2002..."] The discussion thread "Proposed Draft Charter for Choreography WG" provides additional background.
[November 21, 2002] "Raising the Bar on RSS Feed Quality." By Timothy Appnel. From the O'Reilly Web Services DevCenter. November 19, 2002. "RSS is an XML-based syntax for facilitating the exchange of information in a lightweight fashion through the distribution (or feeding) of resources. Publishers can use this versatile and increasingly essential format to assist end users in tracking and consuming content. Netscape originally developed the format but lost interest and eventually abandoned work on it. This created an identity crisis that devolved into varying interruptions, with dispute over even the meaning of the RSS acronym, RDF Site Summary or Rich Site Summary or Really Simple Syndication. But as divergent efforts work to develop RSS, one result has been a diminished overall quality in RSS feeds. In this article, I provide an overview of RSS's core syntax, then I examine the poor state of RSS feed quality and provide some recommendations for authoring more useful and effective feeds. This examination is not a review of the RSS specification, nor is it an emphatic plea for strict compliance. Instead, this article provides an approach to authoring RSS feeds that is neutral, practical, and conservative. RSS feeds are simply too useful a mechanism for information exchange services. It is imperative that we improve their effectiveness..." See "RDF Site Summary (RSS)."
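For orientation, the core syntax the article reviews is small. A minimal well-formed feed in the RSS 0.9x/2.0 style looks like this (all values below are invented):

```xml
<rss version="2.0">
  <channel>
    <title>Example News</title>
    <link>http://example.org/</link>
    <description>A hypothetical feed used for illustration.</description>
    <item>
      <title>First story</title>
      <link>http://example.org/stories/1</link>
      <description>A short, plain-text summary of the story.</description>
    </item>
  </channel>
</rss>
```

The channel-level title, link, and description plus at least one item are the common denominator across the competing RSS flavors, which is why feed-quality advice tends to start with populating these elements carefully.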
[November 21, 2002] "Sun Presents XML Office Challenge." By [ComputerWire Staff]. In The Register (November 21, 2002). "Sun Microsystems Corp has floated a series of XML-based specifications designed to crack-open Microsoft Corp's Office monopoly and improve interoperability with StarOffice. Sun has lined-up partners to form a technical committee at the Organization for the Advancement of Structured Information Standards (OASIS) that will drive the proposed formats. Joining Sun are Corel Corp, XML publishing specialist Arbortext Inc, standards specialist Drake Certivo Inc and aircraft giant Boeing Corp among others... Sun said the OASIS Open Office XML Format Technical Committee's work would enable exchange of data in XML-based formats while retaining a 'high-level' of formatting between text, spreadsheets, charts and graphs... [Microsoft] Office's market share could come under pressure as customers, who are unhappy with Microsoft's latest licensing program, switch to low-cost offerings from rivals such as Sun and Corel. Microsoft, an OASIS member, said it sees 'no benefits' to joining as its customers will have 'great' XML support in its planned Office 11 product. Microsoft said the company supports XML Schema Datatypes (XSD) 1.0, and anything that the technical committee develops will work with Office 11. That version of the suite will use XSD 1.0. However, Joerg Heilig, Sun director of software engineering, said Sun's proposed formats do not use XSD 1.0. Heilig said they use 'standard' XML and existing standards such as Mathematical Mark-up Language (MathML) and Scalable Vector Graphics (SVG)..." See: "XML File Formats for Office Documents."
[November 21, 2002] "OASIS Calls for OpenOffice XML Specification." By Margaret Kane. In ZDNet News.com (November 21, 2002). ['An array of companies working on Web services specifications is calling for a new open-source standard to handle desktop application documents.'] "... The working group is trying to develop a standard data format for the creation of content such as text, spreadsheets and charts. The goal is to develop an interface between the office software and other applications using XML (Extensible Markup Language). 'Our goal is to achieve consensus on an open standard that will protect content -- whether it is an 800-page airplane specification or a legal contract -- from being locked into a proprietary file format,' Michael Brauer, a Sun employee and chair of the OASIS Open Office XML Format Technical Committee, said in a statement. Microsoft, which controls more than 90 percent of the desktop application software market through its Office products, has decided to take a 'wait and see' approach with the working group, said Simon Marks, product manager for Office. Microsoft is an OASIS member, and can join the working group at a future date, he said. 'If this turns out to be something that we feel (is necessary) for customers we can join, but currently we'll just wait and see,' he said..." See: "XML File Formats for Office Documents."
[November 20, 2002] "OASIS Aims to Create Office Document Standard." By Matt Berger. In InfoWorld (November 20, 2002). "A standards body known for creating key technologies around XML (Extensible Markup Language) said Wednesday that it has launched an effort to develop a standard file format that would allow office documents such as spreadsheets and word processing files to be opened by applications from different vendors... One of the goals of the group, called the Open Office XML Format Technical Committee, is to free corporate data from proprietary file formats so they can be accessed for years to come no matter what office software a company is using. Proponents contend that companies are currently saving data in proprietary file formats, such as those written in Microsoft's Word software, and locking themselves into using that software indefinitely. 'This solves a number of problems for enterprises,' said Simon Phipps, chief technology evangelist at Sun Microsystems, which is an initial member of the technical committee. 'It means that their data becomes machine readable without having to commit to a single vendor.' Corel Corp., which makes the word processing software Word Perfect, is also an initial member of the technical committee, and said it could benefit from such a standard. Other members include content management software maker Arbortext Inc. and The Boeing Co. Boeing has a stake in office document standards as it is bound by government regulations to create and archive an immense amount of data such as manuals. OpenOffice.org, the open source project that developed the office suite of the same name, has contributed its published list of XML-based office file formats to the group, with hopes that it will help provide the foundation for a standard. OpenOffice.org's software is sold by Sun as StarOffice... 
Creating an open office file format suggests that documents created in an application that supports that file format could be opened in other applications that support it as well. A document written using Corel Corp.'s Word Perfect, for example, could be opened in StarOffice without affecting the layout or formatting... Microsoft, which dominates the office software market with its Office suite, is a member of OASIS. Microsoft is aware of the technical committee but will not initially take part, a spokesman from a Microsoft outside public relations firm said in an e-mail message Wednesday. The company has announced recently that the next version of its Office suite, Office 11, will be heavily reliant on XML..." See: (1) the 2002-11-04 news item "OASIS Technical Committee for Open Office XML File Format"; (2) "OASIS Members Collaborate to Advance Open XML Format for Office Applications. Arbortext, Boeing, Corel, Drake Certivo, Sun Microsystems, and Others Develop Open Office Standard at Global Consortium."; (3) "XML File Formats for Office Documents."
[November 20, 2002] "SAML 1.0 Signals Next Step in Evolution of Web Services Security. Industry Analysts Provide Insight on SAML's Role Among Other Prominent Security Standards." By Matt Migliore, James Kobielus (Senior Analyst, Burton Group) and Ray Wagner (Gartner Inc). In Enterprise Systems (November 20, 2002). ['SAML is an XML-based framework for Web services that allows the exchange of authentication and authorization information among independent networks. Through it, enterprises can enable a number of Web-based security functions -- such as single sign-on and role-based access control (RBAC) -- across sites hosted by multiple companies. Furthermore, SAML provides security functionality for more complex Web services integration, whereby Web services have the intelligence to reach out to a number of other components to perform a given task. Prior to its official 1.0 release, SAML had been receiving significant support among the vendor community. However, on the user side, SSL (Secure Sockets Layers) was being implemented to secure Web services, and another emerging standard, WS-Security, was also gaining momentum. Now that SAML 1.0 has been approved, it's unclear how these standards and others, such as Public Key Infrastructure (PKI), will fit into the Web services equation. To help answer some of these questions, Security Strategies held a brief Q&A session with James Kobielus, a senior analyst with Burton Group, and Ray Wagner, a research director with Gartner Inc.'] Excerpts from Kobielus: "...As an open Web services security standard with broad vendor support, SAML 1.0 will stimulate use of Web services for external integration among organizations' line-of-business applications (such as ERP and procurement). SAML 1.0 is one of the most fundamental interoperability standards for Web services security. 
Even in advance of the standard's formal ratification by OASIS, SAML 1.0 had already gained broad acceptance, adoption, and implementation by many vendors of Web services security products, especially vendors of Web access management platforms, including vendors such as IBM, Sun, Netegrity, Oblix, Entrust, Entegrity, and Novell... SAML 1.0 will not necessarily reinvigorate interest in or implementation of PKI beyond PKI's limited role in today's Web services environment. PKI is primarily used today to enable server-side SSL, and SAML will primarily leverage server-side SSL for secure sessions among SAML-enabled Web security platforms. Consequently, PKI will play a role -- albeit limited to server-side SSL -- in SAML implementations. But SAML does not require new or enhanced PKI capabilities, such as client-side SSL or digitally signed SAML assertions. And it's very unlikely that SAML implementors will layer these additional PKI capabilities on SAML-based Web services security environments when server-side SSL is sufficiently secure for most real-world applications, internal or external to organizations... SAML uses server-side SSL to support encryption of content flowing over HTTP/S sessions between SAML-enabled servers that are doing Web SSO and RBAC. What SAML offers over and above SSL is a SOAP-based messaging protocol for Web SSO, plus XML-based data structures -- known as SAML assertions -- that are exchanged between SAML-enabled servers over this messaging protocol, plus implementation profiles describing how users can transparently access SAML-based Web security services through standard Web browsers..." See: (1) "Security Assertion Markup Language (SAML) Version 1.0 an OASIS Open Standard"; (2) general references in "Security Assertion Markup Language (SAML)."
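For concreteness, the SAML assertions Kobielus mentions are small XML documents. A minimal sketch of a SAML 1.0 authentication assertion, following the structure defined by OASIS; all identifiers, times, and URLs below are invented:

```xml
<saml:Assertion
    xmlns:saml="urn:oasis:names:tc:SAML:1.0:assertion"
    MajorVersion="1" MinorVersion="0"
    AssertionID="example-assertion-001"
    Issuer="https://idp.example.org"
    IssueInstant="2002-11-20T12:00:00Z">
  <!-- The issuer asserts that this subject authenticated, and how -->
  <saml:AuthenticationStatement
      AuthenticationMethod="urn:oasis:names:tc:SAML:1.0:am:password"
      AuthenticationInstant="2002-11-20T11:59:30Z">
    <saml:Subject>
      <saml:NameIdentifier>alice@example.org</saml:NameIdentifier>
    </saml:Subject>
  </saml:AuthenticationStatement>
</saml:Assertion>
```

A relying site that trusts the issuer can grant access on the strength of such an assertion without re-authenticating the user, which is the basis of the Web single sign-on profile described above.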
[November 20, 2002] "Guidelines for the Use of XML within IETF Protocols." By Scott Hollenbeck (VeriSign, Inc.), Marshall T. Rose (Dover Beach Consulting, Inc.), and Larry Masinter (Adobe Systems Incorporated) [WWW]. IETF Network Working Group, Internet-Draft. Reference: 'draft-hollenbeck-ietf-xml-guidelines-07.txt'. November 2, 2002, expires May 3, 2003. IETF Approved BCP. An IETF posting "Protocol Action: Guidelines for The Use of XML within IETF Protocols to BCP" indicates that this "Guidelines" document has been approved by The Internet Engineering Steering Group (IESG) as a "Best Current Practice (BCP)" publication; IESG contact persons are Ned Freed and Patrik Faltstrom. See background in the ietf-xml-use mailing list. "The Extensible Markup Language (XML) is a framework for structuring data. While it evolved from the Standard Generalized Markup Language (SGML) -- a markup language primarily focused on structuring documents -- XML has evolved to be a widely-used mechanism for representing structured data in protocol exchanges... Many Internet protocol designers are considering using XML and XML fragments within the context of existing and new Internet protocols. This document is intended as a guide to XML usage and as IETF policy for standards track documents. Experienced XML practitioners will likely already be familiar with the background material here, but the guidelines are intended to be appropriate for those readers as well... This document is intended to give guidelines for the use of XML content within a larger protocol. The goal is not to suggest that XML is the 'best' or 'preferred' way to represent data; rather, the goal is to lay out the context for the use of XML within a protocol once other factors point to XML as a possible data representation solution. The Common Name Resolution Protocol (CNRP) is an example of a protocol that would be addressed by these guidelines if it were being newly defined. 
This document does not address the use of protocols like SMTP or HTTP to send XML documents as ordinary email or web content. There are a number of protocol frameworks already in use or under development which focus entirely on 'XML protocol' -- the exclusive use of XML as the data representation in the protocol. For example, the World Wide Web Consortium (W3C) is developing an XML Protocol framework based on SOAP. The applicability of such protocols is not part of the scope of this document. In addition, there are higher-level representation frameworks, based on XML, that have been designed as carriers of certain classes of information; for example, the Resource Description Framework (RDF) is an XML-based representation for logical assertions. This document does not provide guidelines for the use of such frameworks..." Note: The IETF BCP subseries of the RFC series "is designed to be a way to standardize practices and the results of community deliberations. A BCP document is subject to the same basic set of procedures as standards track documents and thus is a vehicle by which the IETF community can define and ratify the community's best current thinking on a statement of principle or on what is believed to be the best way to perform some operations or IETF process function..." [cache]
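Among the recurring themes in such guidelines are namespace qualification and UTF-8 encoding of XML payloads carried inside a larger protocol. A minimal sketch of that pattern, using Python's standard-library parser (the namespace URI and element names below are hypothetical, not from the draft):

```python
# Parse a small, namespace-qualified, UTF-8 XML payload of the kind that
# might be embedded in a larger protocol exchange. Names are hypothetical.
import xml.etree.ElementTree as ET

NS = "urn:example:demo-protocol:1.0"  # hypothetical protocol namespace

payload = (
    '<?xml version="1.0" encoding="UTF-8"?>'
    f'<msg xmlns="{NS}"><status code="200">ok</status></msg>'
).encode("utf-8")

root = ET.fromstring(payload)
# ElementTree exposes namespace-qualified names in Clark notation: {uri}local
assert root.tag == f"{{{NS}}}msg"
status = root.find(f"{{{NS}}}status")
print(status.get("code"), status.text)  # -> 200 ok
```

Qualifying every element with the protocol's own namespace is what lets an XML fragment travel safely inside documents that mix vocabularies.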
[November 20, 2002] "OASIS at Work on Standard for Office Apps." By Peter Galli. In eWEEK (November 20, 2002). "The Organization for the Advancement of Structured Information Standards, or OASIS, has established a working group to create a standard data format for office applications that will improve data interoperability across those applications. OASIS, a not-for-profit consortium that drives the development and adoption of e-business standards, will announce on Wednesday the formation of the working group, known as the OASIS Open Office XML-Format Technical Committee. Sun Microsystems Inc.'s Michael Brauer will chair the committee, which includes representatives from Boeing, Corel Corp., Drake Certivo and Arbortext. Not on the initial list of technical committee members, however, is Microsoft Corp., although it is an OASIS member. Microsoft Office and Microsoft's other desktop office productivity applications account for more than 90 percent of that market. Simon Marks, a Microsoft office product manager, told eWEEK that the Redmond, Wash., company is not going to participate in the committee at this time... The technical committee will initially focus on standardizing data for content creation and then go on to simplifying data exchange between any XML application and office productivity applications. That will include business process automation, Web services, databases, search engines and other applications. Sun is also going to donate the XML file format specification utilized in the OpenOffice.org 1.0 project to the new OASIS technical committee as an input. 'The way these standards committees work is they take an initial input, which is then evolved. This file format is a suitable starting point as it's pure XML and fully specified by an open-source group,' Phipps said. An increasing number of companies are using proprietary formats and can only use a single-vendor platform for data. 
Accessing this often also uses brittle macro-languages built into the word processing or office tools themselves. 'They wanted a guarantee that their data would still be readable at an indefinite time in the future. So, with those three objectives in mind, the technical committee's intent is to create a standard data format that uses XML and which is fully specified so there are no surprises about what does what in the format,' Phipps said... While participation in the technical committee remains open to all organizations and individuals, OASIS said contributions will only be accepted if they are granted under perpetual, royalty-free, non-discriminatory terms. OASIS will also host an open mail-list for public comment, and completed work will be freely available to the public without licensing or other fees, the organization said..." See details in: (1) the 2002-11-04 news item "OASIS Technical Committee for Open Office XML File Format"; (2) "OASIS Members Collaborate to Advance Open XML Format for Office Applications. Arbortext, Boeing, Corel, Drake Certivo, Sun Microsystems, and Others Develop Open Office Standard at Global Consortium."; (3) "XML File Formats for Office Documents."
[November 20, 2002] "Location, Location, Location-Based Services. Location-Based Security for Wireless Apps." By Harsha Srivatsa (Independent software consultant). From IBM developerWorks, Wireless Security. November 2002. ['Studies by industry analysts forecast even greater demand for wireless and mobile devices, creating substantial opportunities for wireless device application and service providers. Faced with an increasingly difficult challenge in raising both average revenue per user (ARPU) and numbers of subscribers, wireless carriers and their partners are developing a host of new products, services, and business models based on data services. We'll have a look at location-based services and how they boost both service and revenue.'] "Providers are gearing themselves to offer differentiated value added services through location-based services and commerce. Location-based services should play a major role in the evolution of wireless data services as well as provide significant revenue to mobile operators and content providers. The anticipated growth of location-based services necessitates our addressing information security issues, particularly for those applications that access valuable and proprietary information and perform sensitive operations, such as financial transactions. Even though specifying security assertions in XML may not make much sense for a limited bandwidth wireless network, the advantages far outweigh its bandwidth overhead. XML is also the communication data format of choice for the new generation of open, interoperable Web services applications for security services. SAML provides a much-needed interoperability between compliant Web access management and security products for wireless applications. Adding location information for authentication and authorization to the existing wireless security mechanisms is a value-added proposition for information assurance..."
[November 19, 2002] "Comdex: Open Mobile Alliance to Announce Eight Mobile Specs." By Ephraim Schwartz. In InfoWorld (November 19, 2002). "The OMA, Open Mobile Alliance, a cross-industry alliance between telecommunications and high tech, is expected to unveil a set of eight specifications for creating mobile applications and services. The OMA as a standards body governing the mobile industry is growing in importance as it adds new members and absorbs other mobile organizations. Over the last year, the OMA, which was founded by Nokia, Sun, Oracle, and others, admitted key rival Microsoft to its ranks and now totals 296 companies. 'You would be hard pressed to name any major company that is not in the OMA,' said Jacob Christfort, a vice chair on the OMA and CTO for Oracle's Mobile Products and Services Division. The organization has also absorbed the WAP Forum, the Location Interoperability Forum, the MMS Interoperability Group, the SyncML Initiative, and the Wireless Village Initiative. The eight specifications that will be announced this week include the following: mobile browsing, Multimedia Messaging, Digital Rights Management, Domain Name Server lookup via mobile devices, mobile content download, e-mail push notification, and user/device profiles... One industry analyst called both the consolidation of groups and the release of specifications a boon to the industry giving developers a single set of standards. 'Up until now wireless middleware companies and application developers had to have membership in five different organizations and it was difficult to allocate resources,' said David Hayden, president of MobileWeek, in Palo Alto, Calif. The set of OMA standards if adhered to by all of the constituent players in the mobile industry will allow developers to basically write once and deploy an application across a plethora of devices including cell phones, handhelds, and converged devices. 
However, Hayden remains somewhat skeptical over the ability of the organization to house numerous rival companies. 'There could be some issues and a potential power struggle between Microsoft and Nokia. Microsoft wants to push mobile .Net services using XML and SOAP and Nokia has its platform. Both companies have different objectives,' said Hayden. However, OMA members appear to be ready to overcome their differences, said Christfort. 'A good specification leaves all of the parties just a bit unhappy,' he said..." References: (1) the 2002-11-19 announcement "Open Mobile Alliance Announces New Specifications, OMA Release Program, and Additional Industry Forum Consolidation"; (2) "Technical Specifications and Releases of the Open Mobile Alliance"; (3) "The SyncML Initiative"; (4) "WAP Wireless Markup Language Specification (WML)"; (5) "Proposed Open Mobile Alliance (OMA) Rights Expression Language Based Upon ODRL."
[November 19, 2002] "Web Identity: Weighing the Alternatives." By Carol Sliwa. In Computerworld (November 11, 2002). "In July, the Liberty Alliance Project released its specifications for a standards-based mechanism for simplified sign-on and user identity management... The second phase of the specifications -- which will include guidelines for site-to-site authentication and user-attribute sharing -- isn't due until the first half of next year, says Paul Madsen, a member of the Liberty Alliance's technology expert group and manager for identity services at Addison, Texas-based Entrust Inc. Microsoft's Passport authentication service, which has primarily targeted consumers, relies largely on proprietary protocols that the company made available last month for inspection and development through its shared source code licensing program. But Passport is expected to shift to authentication tokens based on MIT's Kerberos technology and add support for Web services standards next year. That, in turn, has given many in the industry hope that Passport may someday interoperate with Liberty-based authentication and identity management systems... Currently, the approaches differ. One major distinction is the location where each model stores and maintains user data. Another is the means by which the systems share a user's authentication status information. Under the Microsoft service, users register either via www.passport.com or a member site that has an agreement with Microsoft. The member site must be running Passport Manager software, which serves as an intermediary between the site's server and the Passport server and helps decrypt incoming cookies. When a user logs into a member site, he is redirected to a page with the Passport user interface and branding from the referring site. The member site can decide how many of 10 possible fields of information it wants the user to fill in, and the information is stored in Microsoft's Passport servers... 
When a user signs in at a participating site, he is redirected to Passport and, if he doesn't have a cookie that meets the referring site's policy, Passport prompts him for a name and password. An encrypted authentication ticket containing the user's information is sent from the secure Microsoft database to the client machine by way of a Web address query string. That ticket is then sent to the member site... The Liberty Alliance takes a different tack. It has no universal, unique user identifier that is recognized across sites, and no single identity provider that centrally stores user data. Instead, a wide range of sites can serve as identity providers, and these may federate with one another, exchanging authentication tokens via the Security Assertions Markup Language (SAML) and SAML extensions. Under a Liberty-based system, a user accessing a password-protected site is redirected to the appropriate identity provider. Once there, the user logs in and is redirected back to the original site with a one-time random string called an artifact. The artifact is then presented and exchanged for a SAML assertion, which contains the information the site needs to authenticate the user. In contrast, Microsoft now uses proprietary protocols to transmit authentication tickets between its Passport servers and member sites. Adam Sohn, a product manager in Microsoft's .Net strategy group, says that even when Microsoft adds support for Kerberos-based authentication next year, it will not be 'switch flipping' from the current Passport authentication mechanism to Kerberos-based authentication; it will be more gradual, because there are 200 million existing Passport accounts..."
[November 19, 2002] "Liberty Alliance Updates Net Identity Specification." By John Blau. In InfoWorld (November 19, 2002). "The Liberty Alliance Project has released version 1.1 of its open specifications for federated network identity as a maintenance update to version 1.0 released in July, the business and technology consortium said Tuesday in a statement. The version 1.1 document is the first to be issued by Liberty Alliance for public review, according to the consortium. Members of the Liberty Alliance, including Sun Microsystems Inc., are developing a technology to link various single sign-on authentication systems using a standard specification. The technology will provide an alternative to the Passport system developed by Microsoft Corp. The Liberty Alliance system works like this: When users log onto a Web site that supports the specification, they can visit other password-protected Web sites supported by the technology without having to sign in again. Several companies, including AOL Time Warner Inc., American Express Co., Cisco Systems Inc., eBay Inc., General Motors Corp. and Nokia Corp., have pledged to support the Liberty Alliance specification when it becomes available... Version 2.0, to be released in 2003, will provide an infrastructure for developing and supporting identity-enabled Web services from companies, organizations or government entities. The infrastructure will include a framework for permissions-based attribute-sharing and will allow groups of organizations, often referred to as 'circles of trust' or authentication domains, to be linked together, as opposed to operating as separate islands..." See details in the 2002-11-19 news item "Liberty Alliance Releases Draft Version 1.1 Specifications for Public Review."
[November 19, 2002] "Liberty Alliance Updates Identity Specification." By John Fontana. In Network World (November 19, 2002). "The Liberty Alliance Project on Tuesday updated its specification for creating a standard for network identity and solicited for the first time public comment on the document, signaling the consortium's intention to act more like a traditional standards body. The group released version 1.1 of the spec, which corrects a security flaw and clarifies ambiguities in the text of the draft. The 130-member group in July released the first draft, which details how to create a universal user identity to be used for authentication as a user moves from Web site to Web site. The effort is similar to Microsoft's Passport single sign-on consumer service, which it is trying to adapt for corporate use. [Michael Barrett, president of the Liberty Alliance] says the enhancements were made to bring the specification more in line with corporations that have set policies on managing identity credentials. In addition to changes to the specification itself, the Alliance also opened the document to general review by the public for the first time. Version 1.0 was only open to comments by members of the Alliance. 'We are trying to make the Alliance as open as possible while respecting the rights of our members,' Barrett says. The members, which include both user companies and vendors, pay a fee to participate in the group, which has been coy about whether it may at some point turn its work over to a recognized standards body or continue to work as an independent organization. But by opening the specification for public review, the Alliance seems to be signaling that it will continue to do its own work..." See: (1) details in the 2002-11-19 news item "Liberty Alliance Releases Draft Version 1.1 Specifications for Public Review" (2) general references; (3) topic pages for "Security, Privacy, and Personalization."
[November 19, 2002] "ASC X12 Reference Model For XML Design." From ASC X12C Communications and Controls Subcommittee. [And from] Data Interchange Standards Association, Inc. (DISA). Technical Report Type II. ASC X12C/2002-61. [X12, X12C - Communications & Controls.] October 2002. 112 pages. With Executive Summary. "This Reference Model was motivated by the action item that X12's Communications and Controls Subcommittee (X12C) took at the August 2001 XML Summit to develop 'draft design rules for ASC X12 XML Business Document development'. Acting on that action item, X12C's EDI Architecture Task Group (X12C/TG3) determined that XML design rules could not be developed without a basis for determining which XML features to use and how to use them. Thus the group also set about developing a philosophical foundation and putting forth some general design principles. This Reference Model covers those topics in addition to a preliminary set of design rules. The approach discussed herein is a work in progress. It is intended to be the foundation for X12's future XML development, and will become the basis for XML equivalents to the X12 syntax based X12.6 and X12.59, and XML Design Rules. It is consistent with the decisions of X12's Steering Committee to develop its XML work within the ebXML framework..." The purpose of the document is to specify an approach to eBusiness messaging that: (1) links implementation with the standards, (2) enables cross-industry differentiation, and (3) supports industry-needed 'quick' solutions. The scope of the document includes: a granularity model, an architecture, metadata for storing architecture components and XML Syntax design with approaches to implementing XML syntax. The intended audience of this document is the X12 committee and others who are interested in collaborating with X12 in developing XML schemas for business documents. However, the broader initiative is aimed at a much larger audience. 
The X12 XML initiative is targeted at every sector of the business community, from international conglomerates to small and medium sized enterprises (SMEs) engaged in business-to-business (B2B), business-to-consumer (B2C) and application-to-application (A2A) trade. With that audience in mind, the X12 XML initiative is committed to developing and delivering products that will be used by all trading partners interested in maximizing XML interoperability within and across trading partner communities..." See also the 3-page CICA Executive Summary ['Context Inspired Component Architecture']. A related posting from David Barkley (ASC X12 Chair) to Margaret Pemberton and Mark Crawford identified the ASC X12 Reference Model for XML Design as a "formal contribution to the UN/CEFACT ATG XML message design efforts." See "ANSI ASC X12/XML and DISA."
[November 19, 2002] "Mixed Messages. Organizations Can Manage Both EDI and XML to Get the Best of Both Worlds." By John Moore. In Federal Computer Week (November 18, 2002). "For a decade or more, some agencies have been using EDI, typically the X12 protocol, which is the primary North American standard for defining EDI transactions. It's steady, secure and gets the job done whether processing mortgage insurance claims or ordering supplies from contractors. EDI has become a familiar communication tool in numerous information technology shops. Enter Extensible Markup Language. The specification provides a way to create Web documents that can be easily shared among organizations. In the business-messaging context, XML stands out for simplicity. Any organization with a Web server and an XML parser, which processes XML messages in much the same way a Web browser reads HTML documents, possesses the rudiments of business-to-business or business-to-government e-commerce. Some observers once believed that Web-powered XML would sweep EDI out of the business-to-business picture. Yet the overthrow of EDI has yet to take place. Indeed, industry watchers believe EDI and XML will coexist for years to come... Those who decide to juggle EDI and XML have a couple of choices. They can build and maintain separate EDI and XML messaging infrastructures, a rather expensive proposition. But, the better solution, observers suggest, is to pursue an integration strategy that allows EDI and XML messages to be transmitted over the same network. Technology solutions on the market are designed to do just that... XML's structure provides another edge. XML messages include embedded metadata, according to Jake Freivald, director of marketing at iWay Software, a business integration middleware vendor. The metadata provides business context, which cryptic EDI messages lack. Therefore, XML messages are less prone to errors in interpretation, he said. 
The promise of XML has left its mark on the EDI crowd. A recent survey of X12 users reveals that 56 percent are deploying XML, while 34 percent are evaluating the technology... EDI's ability to handle higher message volume is another reason to keep it around. EDI's compressed nature contributes to its complexity but is an asset in high-transaction environments. A messaging rule of thumb is that an XML message can be as much as 10 times larger than its EDI counterpart... messaging experts expect many organizations to pursue a coexistence strategy. Jeff Eck, global product manager with Global Exchange Services, said customers would employ both XML and EDI, depending on what business problem needs to be solved. Some organizations already are heading down that path. Freddie Mac uses a number of X12 EDI messages for loan servicing but is building XML messages for real estate finance. The XML messages will help automate loan-underwriting decisions... Global Exchange Services, for example, provides data transformation software as part of its trading partner network and as a stand-alone product. The software, called an application integrator, supports both XML and EDI data formats. It works with value-added networks, but also supports protocols for direct buyer/seller connectivity via the Internet, Eck said. Organizations with mature EDI programs and an interest in XML can use data transformation software to take advantage of existing resources, Eck said. Customers, however, don't always take advantage of this option. Using an established value-added network for EDI and XML offers some advantages. First, there's the simplicity of adding a new messaging format to an existing network infrastructure. They also offer stability in terms of private-network security and time-tested reliability mechanisms, such as those confirming message receipt. Cost is another issue. 
Achieving EDI/XML coexistence via enterprise application integration software can cost from $300,000 to $500,000..." See also the sidebars "EDI Standards Group Adopts XML" and "Bulletproofing XML."
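The "rule of thumb" about XML message size is easy to see with a toy comparison. Both renderings below are hypothetical and simplified (not real X12 segments or any standard XML schema), but they illustrate why self-describing tags cost bytes:

```python
# The same purchase-order line item, once as a terse X12-style EDI segment
# and once as self-describing XML. Element names are invented for the sketch.
edi = "PO1*1*10*EA*9.95**VP*WIDGET-7"

xml = (
    "<LineItem>"
    "<LineNumber>1</LineNumber>"
    "<Quantity>10</Quantity>"
    "<UnitOfMeasure>EA</UnitOfMeasure>"
    "<UnitPrice>9.95</UnitPrice>"
    "<VendorPartNumber>WIDGET-7</VendorPartNumber>"
    "</LineItem>"
)

ratio = len(xml) / len(edi)
print(f"EDI: {len(edi)} bytes, XML: {len(xml)} bytes, ratio: {ratio:.1f}x")
assert len(xml) > len(edi)  # the embedded metadata is the overhead
```

The flip side, as Freivald notes above, is that the XML version carries its business context with it, so a receiving system needs no out-of-band implementation guide to interpret the fields.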
[November 19, 2002] "XML For the Rest of Us. [Microsoft Office 11 and XML.]" By Jon Udell. In InfoWorld Issue 46 (November 18, 2002), pages 18-19. ['In Office 11, Word and Excel can display, edit, and save XML documents. Using XML Schema definitions bound to these documents, enterprise architects can for the first time ensure that users of common desktop applications will create and maintain high-quality, integration-ready data... In a dramatic breakthrough, Office 11's XML features target end-users with no knowledge of XML. Users of Word and Excel will be most productive when supported by developers who can fluently define data models, using XML Schema, and write XML transformations, using XSLT.'] "The first public beta of Microsoft Office 11 demonstrates, as promised, that XML has become a native Office file format. What's more, Word 11 and Excel 11 can associate documents with data definitions written in XML Schema, and they can interactively validate documents against schemas. These are transforming achievements. Previous Office upgrades have been yawners, but version 11 should rivet the attention of IT planners. We've known for many years that most of our vital information lives in documents, not databases. XML was supposed to help us capture the implicit structure of ordinary business documents (memos, expense reports) and make it explicit. Sets of such documents would then form a kind of virtual database. The cost to search, correlate, and recombine the XML-ized data would fall dramatically, and its value would soar. It was a great idea, but until the tools used to create memos and expense reports became deeply XML-aware, it was stillborn. XML did, of course, thrive in another and equally important way. It became the exchange format of enterprise databases and the lingua franca of Web services. 
Now Office 11 wants to erase the differences between XML documents written and read by people using desktop applications, and XML documents produced and consumed by databases and Web services. This is a really big deal. The first beta of Office 11 doesn't include any demonstrations of the new XML features, but the Office team put together some examples for us, and Jean Paoli talked us through them... Once valid, the document can be saved as XML in two ways. The default is to create WordML, which preserves Word's styles and formatting in an XML namespace that's separate from the one bound to the schema-controlled data. You can optionally save through an XSLT transformation which, in a publish-to-the-Web scenario, could translate WordML formatting into HTML/CSS formatting. Alternatively, if you tick the Save as Data option, you can instead save just the raw XML data. In that case, you can bind one or more XSLT stylesheets to the document, each of which can generate WordML styles and formatting. The XML expertise needed to create schemas and XSLT transformations is scarce today. Once Office 11 hits the streets, its mainstream applications could arguably commoditize those XML skills more quickly and broadly than have Web services technologies. What's more, Office is positioned as a bridge between the worlds of desktop applications and Web services. In the emerging architecture of the business Web, XML-wrapped remote procedure calls are giving way to XML documents. SOAP, we'll soon see, isn't just a way for services to talk to one another. A purchase order acquired from a Web service by means of a SOAP call will sometimes need to be modified by a person. The application used to edit that purchase order will have to be a familiar tool. It will also have to guarantee that the document it passes along contains well-structured, valid, and thus enterprise-ready data. Office 11 appears to meet both of these requirements. 
And it does so in ways that respect the inherent strengths of the applications..." See "Microsoft Office 11 and XDocs."
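The save-through-XSLT path Udell describes can be pictured with a small stylesheet of the kind a publish-to-the-Web save might apply. The source element names below are hypothetical stand-ins for schema-controlled business data, not real WordML:

```xml
<xsl:stylesheet version="1.0"
                xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <!-- Hypothetical mapping from schema-controlled business elements
       to HTML/CSS output. -->
  <xsl:template match="expenseReport">
    <html>
      <body>
        <h1>Expense Report</h1>
        <xsl:apply-templates select="item"/>
      </body>
    </html>
  </xsl:template>
  <xsl:template match="item">
    <p class="expense-item">
      <xsl:value-of select="description"/>: <xsl:value-of select="amount"/>
    </p>
  </xsl:template>
</xsl:stylesheet>
```

The same separation works in the other direction: saving "just the raw XML data" keeps the business elements, and bound stylesheets regenerate the presentation on demand.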
[November 19, 2002] "Developer's Introduction to JAX-RPC, Part 1: Learn the ins and outs of the JAX-RPC type-mapping system. Type mapping lays the foundation for interoperable Web services." By Joshy Joseph (Software Engineer, IBM OGSA Development Team). From IBM developerWorks, Web services. November 2002. ['The Java APIs for XML-Based Remote Procedure Call (JAX-RPC) are an important step forward in the quest for Web services interoperability. In this first of two articles, Joshy Joseph takes you to the heart of that interoperability effort: the JAX-RPC type-mapping system. You'll learn how XML types are translated into Java types to ensure a smooth exchange of data between Web service clients and Java-based applications.'] "Java APIs for XML-based Remote Procedure Call (JAX-RPC) have reached the final recommendation stage at the Java Community Process as JSR 101. XML Web services vendors have started using this package as a core API for building interoperable Web services on the Java platform. In this series, I'll provide a step-by-step tour through the major features provided by this standard, using sample code to guide you along the way. By the end of this series, you will be familiar with the core features of the JAX-RPC specification; this knowledge will help service, client, and toolset developers design Web services that are as interoperable as possible. For this discussion, I will assume that you are familiar with basic Web service concepts such as SOAP, WSDL, and XML. The code samples you'll encounter were all developed using Apache Axis beta 3 and Sun's Web Services Developer Pack -- in other words, they were developed using the reference implementation for the JAX-RPC specification. This article is the first in a two-part series. I'll begin the discussion by tackling one of the most important aspects of JAX-RPC: the type-mapping system. 
This system enables the run-time system to map each XML type defined in a WSDL document to its corresponding Java type as specified by the Java service interface, and vice versa. This is a major step towards interoperable Web services. JAX-RPC specifies extensible type mapping support for an extended set of XML and Java types. The JAX-RPC run-time system implements a serialization framework to support this type mapping..." See: (1) Java API for XML-based RPC (JAX-RPC) [Sun] and (2) JSR 101: Java APIs for XML based RPC.
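The JSR 101 type-mapping tables pair each built-in XML Schema type with a Java type. A small schema fragment with the standard mappings noted in comments can make this concrete; the complex type itself is hypothetical:

```xml
<!-- Hypothetical WSDL types fragment; the comments give the standard
     JAX-RPC (JSR 101) Java mapping for each XML Schema type. -->
<xsd:complexType name="StockQuote"
                 xmlns:xsd="http://www.w3.org/2001/XMLSchema">
  <xsd:sequence>
    <xsd:element name="symbol" type="xsd:string"/>   <!-- java.lang.String -->
    <xsd:element name="price"  type="xsd:double"/>   <!-- double -->
    <xsd:element name="volume" type="xsd:int"/>      <!-- int -->
    <xsd:element name="asOf"   type="xsd:dateTime"/> <!-- java.util.Calendar -->
  </xsd:sequence>
</xsd:complexType>
<!-- JAX-RPC maps the complex type itself to a JavaBean with matching
     getter/setter properties (getSymbol()/setSymbol(...), and so on). -->
```

It is this fixed, bidirectional correspondence that lets a JAX-RPC runtime serialize a Java method call into a SOAP message any conforming toolkit can deserialize.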
[November 19, 2002] "Scoping Out Web Services Problems." By Darryl K. Taft. In eWEEK (November 19, 2002). "Mindreef LLC on Tuesday announced at the Software Development East 2002 show, in Boston, the availability of Mindreef SOAPscope Personal 1.0, a new Web services diagnostics system for developers. SOAPscope Personal 1.0 features a logger and viewer for finding and troubleshooting problems in such areas as performance and interoperability, said Jim Moskun, a co-founder of Mindreef, based in Hollis, N.H. Moskun said the Mindreef product collects information about Simple Object Access Protocol (SOAP) transactions and uses it to shed light on Web services communications. And while most logging tools store data in a flat file, SOAPscope Personal stores it in a relational database for easier use, he said. SOAPscope will run in sniffer mode on a network or as a proxy, the company said. It also supports pseudocode, has Modify/Resend capability for trying new values and re-sending SOAP packets when errors occur and features debugging annotations for putting debug output in the log, Moskun said. "There's a lot of activity in Web services management, but we felt there weren't any tools targeted at problems people have today," Moskun said. "A lot of companies are ahead of the curve, building large infrastructures to handle Web services management, but people are still in the early adopter phase. Things like interoperability between toolkits is a problem, and getting from point A to point B. If a message doesn't get through, you need to find out what went wrong." Moskun said Mindreef joined SOAPbuilders, an informal group of grassroots Web services implementers -- including Sun Microsystems Inc., IBM Corp., IONA Technologies Inc., BEA Systems Inc. and others -- who gather quarterly to test the interoperability of their Web services platforms. He said the group used SOAPscope as one of the tools in the tests..." 
See the SOAPbuilders discussion list and description [from Sun Microsystems]. See also the announcement of 2002-11-19: "Mindreef Announces Availability of SOAPscope Personal 1.0 at SD East. Industry's First Web Services Diagnostics System Helps Solve Common Web Services Problems." General references in "Simple Object Access Protocol (SOAP)."
[November 18, 2002] "SAML 1.0 is Adopted - What Developers Can Expect." By Vance McCarthy. From Integration Developer News (November 18, 2002). ['Earlier this month, more than 200 security and web services vendors approved the Security Assertion Markup Language (SAML) v. 1.0. According to OASIS execs and member companies, this unanimous endorsement set in motion the first step toward providing a common security framework for a variety of vendor-specific web services solutions. SAML 1.0 spells out the specs and the schema for web services identity management and single sign-on, but last week's vote doesn't fill in the gaps on how hundreds of vendors will implement the standard or even test for compliance and vendor-to-vendor interoperability.'] "... By itself, SAML won't sort out all the scuffles, but it is quite central to many aspects of the security picture, including XACML (Extensible Access Control Markup Language), WS-Security, the Liberty Alliance Project, XKMS (the XML Key Management Specification), biometrics, and provisioning... So far, the peace talks between Liberty and Microsoft Passport have made progress, thanks in large part to OASIS and to the willingness of both sides to adopt some XML-based security-brokering technology specified by SAML. Sun is committed to SAML on a number of fronts in its Liberty Alliance Project and the progress of Java and J2EE. In Liberty, SAML will support one-to-one trusted relationships in which users will be able to link certain core accounts to a single username/password. In addition, the Java Community Process (JCP) has a proposal to natively support SAML (JSR 155) for use in J2EE..." See "Security Assertion Markup Language (SAML) Version 1.0 an OASIS Open Standard."
[November 18, 2002] "Enosys Puts Liquid in Liquid Data." By Carolyn A. April. In InfoWorld (November 18, 2002). "When BEA made a big splash with its Liquid Data announcement earlier this month, not much was said about the secret technology sauce underpinning the integration initiative. But this week, the key ingredient is under wraps no more. Startup Enosys Software, a self-described enterprise information integration (EII) company, revealed this week that its suite of XML- and XQuery-based products is the driving force behind Liquid Data's promise of real-time, transparent melding of data from multiple back-end sources to a single external target. Under the terms of their technology agreement, BEA will OEM components of Enosys' product suite, including the core XQuery Engine, XQuery Builder front-end design tool and metadata repository, to build into its WebLogic application server platform. This lineup will allow users to search for and aggregate data located in any format across silos of sources in the enterprise, according to Raj Pai, vice president of product marketing for Enosys, based in Redwood Shores, Calif... BEA sees Liquid Data as providing visibility to front-end applications such as those in customer service or portals, for example. The EII concept is gaining steam with a number of large vendors, namely IBM with its Xperanto initiative and Microsoft with its forthcoming Yukon release of SQL Server, as a way to unify reams of data floating around the enterprise. The field is also crowded with pure plays such as Enosys, as well as MetaMatrix, Nimble Technology, and others..." See: (1) the announcement "Enosys Powers New Class Of Enterprise Information Integration Applications. Selection of Enosys as BEA Systems Partner Endorses XQuery. New XQuery Products from Microsoft, IBM and Oracle Mark Growing Demand for XQuery-based EII."; (2) general references in "XML and Query Languages."
[November 18, 2002] "IBM Builds Mobile Web Services Toolkit." By [ComputerWire Staff]. In The Register (November 18, 2002). "IBM has created a wireless web services toolkit that uses open source implementations of popular XML-based standards for devices with a small footprint... The company's Web Services Tool Kit for Mobile Devices, posted to IBM's alphaWorks web site, uses kSOAP and kXML for Java web services and gSOAP for C-based web services. These open source implementations of key web services standards omit certain functionality, like array handling, to suit the needs of mobile devices that have limited memory and battery life. Mobile devices are increasingly seen as vital to web services, because they potentially provide ubiquitous access to information and services. The challenge for developers, though, is to get past familiar limitations thrown up by devices' small form factors. Redmond, Washington-based Microsoft Corp is offering its own approach to this problem. The company is producing a cut-down version of its PC and server-centric .NET Framework for Windows-based handsets that is called the .NET Compact Framework. Steve Holbrook, IBM program director of emerging e-business standards, said IBM's Web Services Toolkit for Mobile Devices has added appeal to developers, though, because it provides a single point of contact to develop for multiple platforms, not just Windows. Unlike the desktop market, the market for mobile operating systems is diverse..." See details in the 2002-11-18 news item "IBM alphaWorks Releases Web Services Toolkit for Mobile Devices."
[November 15, 2002] "Get Ahead With Java Web Services: Get Up to Speed on the Java Web Services Developer Pack." By James McCarthy (President, Symmetry Solutions, Inc.). From IBM developerWorks, Web services. November 2002. ['Java developers who are interested in getting started with Web services should check out the Java Web Services Developer Pack (WSDP). In this article, James McCarthy takes you on a quick tour of this package. You'll learn what the tools in this package can do for you, and find out which components are just for testing and which are ready for production use as-is.'] "With the recent release of version 1.0 of the Java Web Services Developer Pack (Java WSDP) from Sun Microsystems, Java developers now have a convenient all-in-one download to assist in the development of Web services on the Java platform. The Java WSDP includes all of the Java APIs for XML (JAX) that are in the Java XML Pack, along with the Apache Tomcat server and other components needed to provide a fully functional environment for developing and testing Web services... The Java WSDP is not a product, but rather a reference implementation of Web services standards in a convenient, easy-to-install package. The package comprises a combination of production-ready implementations, along with several components that should only be used for testing purposes. As a result, the Java WSDP is not meant to be an environment for deploying production applications, but rather for developing and testing Web services; it's mainly intended to help the Java developer get started with Web services. The Java WSDP is an excellent tool for understanding, developing, and testing Web services; and, since it is based upon open standards, you won't need to start over when you move into a production environment... When you install the Java WSDP distribution, you will create a directory where all of the components are located. 
By default, this directory contains a fully functional server environment ready to develop and test Web services. The main components that support Web services are also contained in the Java XML Pack, which includes all of the current production Java APIs for XML (JAX). The following is a list of the Java XML Pack components, with a brief description of their function: (1) Java API for XML Processing (JAXP) is a pluggable API that is open to any vendor's implementation of the W3C's recommended XML APIs -- that is, SAX, DOM, and XSLT. (2) Java API for XML Messaging (JAXM) is designed to enable applications to send and receive document-oriented XML messages using a pure Java API; JAXM includes a messaging provider that is the reference implementation of version 1.0 of the ebXML Transport, Routing, and Packaging specification. (3) SOAP with Attachments API for Java (SAAJ) is a package that enables developers to produce and consume messages that comply with the SOAP 1.1 specification, including SOAP attachments. (4) Java API for XML-based RPC (JAX-RPC) is the implementation package for supporting SOAP 1.1 XML-based RPC calls. (5) Java API for XML Registries (JAXR) supports XML registries -- now commonly used for storing information about published [UDDI] Web services; the JAXR API provides a uniform way to access this information..." See also the accompanying FAQ document and Java WSDP tutorial.
[November 15, 2002] "Sun Boosts Enterprise Java. The Upcoming J2EE 1.4 Release Supports New XML and Web Services Semantics." By B. J. Fesq. In Java World (November 15, 2002). ['In this article, B.J. Fesq provides a clear understanding of the enterprise Java platform's direction and introduces J2EE (Java 2 Platform, Enterprise Edition) 1.4's support for emerging Web services standards.'] "Sun Microsystems is expected to finalize the J2EE 1.4 specification this fall, with highly anticipated vendor implementations hitting the market in first quarter 2003. Core technologies, including servlets, JSP (JavaServer Pages), and EJB (Enterprise JavaBeans), are being revised; however, most changes focus on integrating extensive XML support throughout the platform. The various specification revisions include a few hidden gems, like EJB 2.1's new container-managed timer service. But the most high-profile changes involve XML support and new Web services semantics within the J2EE client and server environments... Whether you're a project manager, Web application developer, or a hard-core middleware and distributed applications guru, this article provides a clear understanding of the enterprise Java platform's direction and how J2EE 1.4 will support emerging Web services standards... Sun introduced the Java APIs for XML, or Java XML Pack, at JavaOne in June 2000, reintroduced them as part of the Web Services Developer Pack (WSDP) in summer 2001, and will include many of these APIs in the J2EE 1.4 platform this winter. The Java XML Pack enriches the Java platform with a wide spectrum of XML support. A brief recap of a Web service's lifecycle serves as a good template for understanding where each of the various Java APIs for XML fits into the J2EE platform. Web services must provide their service's definition, usually in the form of a WSDL (Web Services Description Language) document. 
They also must advertise their availability in a registry such as UDDI (Universal Description, Discovery, and Integration) or ebXML, both of which are evolving through the nonprofit e-business consortium OASIS (Organization for the Advancement of Structured Information Standards). Ultimately, Web services require runtime binding and RPC (remote procedure call)-style interaction, both synchronous and asynchronous. This involves the exchange, processing, and potential transformation of XML documents. The Java API for XML-based RPC (JAX-RPC) is the backbone of J2EE 1.4's Web services support, offering a fairly complete Java-to-SOAP (Simple Object Access Protocol) abstraction with the RMI (Remote Method Invocation) programming model's familiar semantics. JAX-RPC also includes tools for generating a WSDL document from a Java Web services interface and vice versa. The Java API for XML Registries (JAXR) provides a pluggable abstraction for Web services registry operation, lookup, and interaction. Finally, the Java API for XML Processing (JAXP) is the glue that binds all things XML, giving J2EE portable APIs for XML processing with support for pluggable parsers and transformers..." See Java Technology and XML.
[November 15, 2002] "ebXML Adoption Update." From the OASIS ebXML Awareness Team. November 2002. 17 pages. An update reporting on ebXML implementations. "ebXML (Electronic Business using eXtensible Markup Language), sponsored by UN/CEFACT and OASIS, is a modular suite of specifications that enables enterprises of any size and in any geographical location to conduct business over the Internet. Using ebXML, companies now have a standard method to exchange business messages, conduct trading relationships, communicate data in common terms and define and register business processes." Executive Summary: "The OASIS ebXML Awareness Team has brought together the information in this document in collaboration with OASIS member organizations and UN/CEFACT partners to give a wide-ranging picture of the adoption of ebXML today. The original technical specification work on ebXML was completed and made public in May of 2001, and subsequent work is now continuing between OASIS and UN/CEFACT jointly on further developing and enhancing the original specifications. Major vendors have developed ebXML support into their flagship products along with new vendors who are providing unique ebXML capabilities and implementations of the specifications. Key industry sectors and government entities are deploying new eBusiness applications that are serving as benchmarks in the adoption of ebXML globally. Our aim with this document is to show decision makers how ebXML is developing today, and how they too can gain critical advantage for their businesses and partners by implementing ebXML both for their established systems and also to solve previously difficult business needs through the technology that ebXML provides. No truly successful technology today is an island, so what is also important is to see how ebXML is adapting, evolving and growing through deployment into real world situations. 
Over 2,000 people contributed to the original ebXML development efforts, and now those continuing efforts are being augmented by projects showing how ebXML is being used in tandem with other technologies such as web services and XML as a whole. Seeing how industry groups and standards bodies are working to bring their vocabularies into alignment around ebXML is also the beginnings of a new level of interoperability that the industry as a whole has been seeking for more than a decade. The projects and implementations described here are truly groundbreaking and exciting in how they are delivering on the promise of ebXML and advancing and improving eBusiness worldwide..." See "Electronic Business XML Initiative (ebXML)." [cache]
[November 15, 2002] "Normalizing XML, Part 1." By Will Provost. From XML.com. November 13, 2002. ['Will Provost's Schema Clinic series on XML.com has so far taken an object-oriented view of W3C XML Schema design. This month, Will has written the first of a two-part series that examines the relational aspects of schema design. The series examines guidelines that achieve the goal of normalization -- the principles guiding database design -- using the mechanisms provided by W3C XML Schema.'] "The goal is to see what relational concepts we can usefully apply to XML. Can the normal forms that guide database design be applied meaningfully to XML document design? Note that we're not talking about mapping relational data to XML. Instead, we assume that XML is the native language for data expression, and attempt to apply the concepts of normalization to schema design. The discussion is organized loosely around the progression of normal forms, first to fifth. As we'll see, these forms won't apply precisely to XML, but we can adhere to the law's spirit, if not its letter. It's possible to develop guidelines for designing W3C XML Schema (WXS) that achieve the goals of normalization: (1) Eliminate ambiguity in data expression; (2) Minimize redundancy -- some would say, 'eliminate all redundancy'; (3) Facilitate preservation of data consistency; (4) Allow for rational maintenance of data. In this first of two parts, we'll consider the first through third normal forms, and observe that while there are important differences between the XML and relational models, much of the thinking that commonly goes into RDB design can be applied to WXS design as well. ... the key concept of reducing redundancy through key association is alive and well in W3C XML Schema design. While I'd love to finish on this bright note, I must report that there are devils inhabiting the details. 
In part two of this article, I'll point them out and discuss the implications for WXS design, as well as addressing the subtler fourth and fifth normal forms..." See: (1) "XML and Databases"; (2) "XML Schemas."
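Provost's redundancy point can be made concrete. The sketch below is illustrative only (the element names and the `id`/`pubref` linking mechanism are invented, not taken from the article): repeated inline publisher data is factored out into a keyed list that books reference, the XML analogue of normalizing a relation via key association. In an actual schema, the reference would be enforced with WXS key/keyref identity constraints.

```python
# Factor redundant inline data out into a keyed section -- an XML analogue
# of relational normalization. Purely illustrative; vocabulary is invented.
import xml.etree.ElementTree as ET

denormalized = ET.fromstring("""
<library>
  <book><title>A</title><publisher><name>Acme</name><city>NY</city></publisher></book>
  <book><title>B</title><publisher><name>Acme</name><city>NY</city></publisher></book>
</library>""")

normalized = ET.Element("library")
pubs = ET.SubElement(normalized, "publishers")
pub_ids = {}  # (name, city) -> assigned key

for book in denormalized.findall("book"):
    pub = book.find("publisher")
    key = (pub.findtext("name"), pub.findtext("city"))
    if key not in pub_ids:                          # first occurrence: assign a key
        pub_ids[key] = "p%d" % (len(pub_ids) + 1)
        entry = ET.SubElement(pubs, "publisher", id=pub_ids[key])
        entry.extend(list(pub))                     # keep publisher data once
    b = ET.SubElement(normalized, "book", pubref=pub_ids[key])
    b.append(book.find("title"))

print(ET.tostring(normalized, encoding="unicode"))
```

After the pass, the two books share one `<publisher id="p1">` entry instead of each carrying a redundant copy.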
[November 15, 2002] "RDF, What's It Good For?" By Kendall Grant Clark. From XML.com. November 13, 2002. ['Kendall Clark's been keeping an eye on the XML-DEV debate and the subject of the spec everyone loves to hate, RDF. Bob DuCharme and John Cowan's recent XML.com article on RDF spawned discussion about what RDF was actually good for.'] "The Resource Description Framework is still among the most interesting of W3C technologies. But it's got persistent troubles, including having had its reputation beaten up unfairly as a result of the many and often nasty fights about RSS. But, just like my eccentric old uncle, RDF is not entirely blameless. In a previous XML-Deviant article ('Go Tell It On the Mountain') I argued that RDF's trouble might have something to do with it having been the victim of poor technical evangelism. In some sense that's still true. Recently I googled for a comprehensive, up-to-date RDF tutorial, which proved as elusive as finding Uncle's dentures the morning after nickel beer night at the bingo hall. In fact, I was hard pressed to find an RDF tutorial which looked like it had been updated this year. And one which I did find simply listed 13 different ways to express the same basic set of assertions, which not only makes a terrible tutorial, but also exemplifies another of RDF's persistent troubles: its XML serialization... During the time I've tracked RDF in the XML community, I can't recall running across even one enthusiastic defender of RDF's XML serialization. Apparently everyone, or so it seems, thinks it's a nasty kludge at best. Now, I've been using RDF in some of my recent Python programming, using Daniel Krech's excellent rdflib -- which, as Andrew Kuchling reminded me, thanks to its new ZODB/ZEO storage layer, now does fully-distributed storage. One virtue of rdflib is that it shields me, the carefree application hacker, from having to deal with RDF's XML serialization. I never think about it or about its warts. I rarely even see it. 
Which is perfect. As long as, when I send the XML-serialized dump of my RDF triple store to someone else, they end up with the same graph of assertions, I'm happy. But everyone's needs are not as easy to satisfy..." See "Resource Description Framework (RDF)" and "RDF Site Summary (RSS)."
[November 15, 2002] "Proper XML Output in Python." By Uche Ogbuji. From XML.com. November 13, 2002. ['One of the first issues a newcomer to XML discovers is that of encoding characters: even if you're just using ASCII you bump up against the need to escape characters like '<' and '&'. The problems become worse when you're handling documents in Unicode or other encodings. Uche Ogbuji's Python and XML column this week addresses just these problems. Uche dismisses the notion that writing XML is as simple as a 'print' statement, and provides guidelines that are applicable both for Python programmers and anyone dealing programmatically with XML.'] "... First, I consider ways of producing XML output in Python, which might make you wonder what's wrong with good old print... Indeed, programmers often use simple print statements in order to generate XML. But this approach is not without hazards, and it's good to be aware of them. It's even better to learn about tools that can help you avoid the hazards... The main problem with simple print is that it knows nothing about the syntactic restrictions in XML standards. As long as you can trust all sources of text to be rendered as proper XML, you can constrain the output as appropriate; but it's very easy to run into subtle problems which even experts may miss. XML has been praised partly because, by setting down some important syntactic rules, it eases the path to interoperability across languages, tools, and platforms. When these rules are ignored, XML loses much of its advantage. Unfortunately, developers often produce XML carelessly, resulting in broken XML. The RSS community is a good example. RSS uses XML (and, in some variants, RDF) in order to standardize syntax, but many RSS feeds produce malformed XML. Since some of these feeds are popular, the result has been a spate of RSS processors that are so forgiving, they will even accept broken XML. Which is a pity. 
Eric van der Vlist -- as reported in his article for XML.com, 'Cataloging XML Vocabularies' -- found that a significant number of Web documents with XML namespaces are not well-formed, including XHTML documents. Even a tech-savvy outfit like Wired has had problems developing systems that reliably put out well-formed XML. My point is that there's no reason why Python developers shouldn't be good citizens in producing well-formed XML output..." See: (1) Python and XML, by Christopher A. Jones, Fred L. Drake, Jr.; (2) "XML and Python."
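Ogbuji's warning is easy to demonstrate. Here is a minimal sketch (mine, not from the article) contrasting naive string interpolation with the standard library's `xml.sax.saxutils` escaping helpers:

```python
# Why a bare print/format is hazardous for XML output: markup-significant
# characters in the data ("&", "<") must be escaped or the result is
# ill-formed. xml.sax.saxutils provides the standard helpers.
from xml.sax.saxutils import escape, quoteattr

title = "Q3 P&L <draft>"  # data containing markup-significant characters

# Naive approach: yields ill-formed XML ("&" and "<" pass through raw)
naive = "<title>%s</title>" % title

# Safe approach: escape character data; quoteattr also wraps and quotes
# a string for use as an attribute value
safe = "<title>%s</title>" % escape(title)
attr = "<report name=%s/>" % quoteattr(title)

print(naive)  # broken: <title>Q3 P&L <draft></title>
print(safe)   # well-formed: <title>Q3 P&amp;L &lt;draft&gt;</title>
print(attr)
```

The same rule applies in any language: never interpolate untrusted or arbitrary text into markup without escaping it first.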
[November 15, 2002] "W3C Promotes Royalty-Free Web Services Standards." By Paul Krill. In InfoWorld (November 14, 2002). "The World Wide Web Consortium (W3C), which is working on standardization of Web services technologies, on Thursday published a 'Last Call Working Draft' of its proposed Royalty-Free Patent Policy, which is intended to enable W3C technology recommendations to be implemented on a royalty-free basis. To achieve the goal of royalty-free specifications, the proposal stipulates that participants in development of W3C Recommendations must agree to license 'essential claims,' defined as patents that block interoperability, on a royalty-free basis, W3C said. The W3C's proposed policy does not require relinquishing an entire patent portfolio, just patent claims essential to implement a standard that a patent holder participates in developing at W3C, according to the W3C statement. The issue of royalty-free technologies has come up recently in W3C deliberations on development of the SOAP 1.2 specification and on Web services choreography, which is expected to be the subject of a new W3C working group. Sun Microsystems, which has submitted its Web Services Choreography Interface (WSCI) proposal on Web services choreography to the W3C, supports royalty-free standards, said Susy Struble, Sun manager of XML industry initiatives... Microsoft and IBM also have proposed a standard on Web services choreography, called Business Process Execution Language for Web Services (BPEL4WS). It has not yet been submitted for consideration by a standards organization. Both Microsoft and IBM released prepared statements upon inquiries about the companies' royalty policies pertaining to W3C choreography standardization efforts. [...] The W3C Patent Policy Working Group, which issued the Last Call document, will produce a final draft proposal for consideration by the W3C membership and the public... 
Additionally, a gathering called the W3C Advisory Committee Forum is to be held in Boston next week, at which time W3C members are expected to discuss the proposed formation of a Web services choreography working group, according to W3C..." See details in the 2002-11-15 news item "W3C Patent Policy Working Group Issues Last Call Royalty-Free Patent Policy Working Draft"; general references in "Patents and Open Standards."
[November 15, 2002] "New Oracle App Server To Bolster Mobile Support, Toplink Integration." By Barbara Darrow. In CRN (November 13, 2002). "The next release of Oracle 9i Application Server will offer better integration of business-to-business and Web services capabilities. Oracle 9i Application Server Release 2 Version 9.0.4 will include a 'full-fledged integration broker and business process monitor,' said Thomas Kurian, senior vice president of application servers for Oracle, Redwood Shores, Calif. Additionally, the product will build in more wireless and voice enhancements with J2ME support. Java 2 Micro Edition is a version of Java for PDAs and other devices. The new app server will also support the emerging XHTML Web design language and XForms, a technology aimed at easing creation of pages that display well on many screen sizes. The new app server will also bring Toplink capabilities into its own stack, according to Oracle's web site. Oracle acquired Toplink, a tool that maps between Enterprise JavaBeans (EJBs) and databases, from WebGain last June... In a sign of closer ties to Hewlett-Packard, Oracle also said the current version of the application server will be bundled with HP systems running HP-UX as well as ProLiant servers..."
[November 15, 2002] "On XML Objects." By Martin Kempa and Volker Linnemann (Institut für Informationssysteme, Universität zu Lübeck, Germany). Paper presented at the PLAN-X Workshop on Programming Language Technologies for XML (October 3, 2002, Pittsburgh, PA, USA). 11 pages, with 27 references. "In today's web applications we face the problem that there is the world of HTML and XML on one side and the world of objects (primarily Java objects) on the other side. Programs generating XML and HTML, for example Java Servlets, either have to generate and analyze XML on a string basis, which is rather tedious, or have to generate object structures in the Document Object Model or in JAXB. This requires switching between XML strings and corresponding objects manually, by programming. Moreover, the object structure is not guaranteed to conform to an underlying document type definition (DTD) or XML schema. Although the goal of JAXB is to guarantee this validity, it is achieved only up to a certain extent. In many cases, expensive runtime testing of validity is necessary, using a validation method provided by JAXB. Moreover, in JAXB XML strings and XML objects are two different things, requiring developers to switch between the two notions via methods called marshalling and unmarshalling. In this paper we propose that in object-oriented programming with XML there should be no distinction between XML documents and XML objects. In other words, XML in an object-oriented program always denotes XML objects, i.e., generating and analyzing XML is done conceptually only on the basis of objects. We propose, similarly to JAXB, to have a class for every element type of a DTD or an XML schema. In contrast to JAXB, these classes are defined such that XML objects are generated in a syntax-oriented manner, allowing the compiler to statically check the validity of all generated XML structures, i.e., XML objects. 
We believe that by eliminating the difference between XML objects and XML documents and by introducing absolutely type safe tools for generating XML objects, programming of web applications, i.e. Java Servlets, is much easier, much safer and much less error-prone..." [cache]
[November 15, 2002] "An Algorithm for Streaming XPath Processing with Forward and Backward Axes." By Charles Barton, Philippe Charles, Marcus Fontoura, and Deepak Goyal (IBM T.J. Watson Research Center); Vanja Josifovski and Mukund Raghavachari (IBM Almaden Research Center). Paper presented at the PLAN-X Workshop on Programming Language Technologies for XML (October 3, 2002, Pittsburgh, PA, USA). 10 pages, with 15 references. "We present a novel streaming algorithm for evaluating XPath expressions that use backward axes (parent and ancestor) and forward axes in a single document-order traversal of an XML document. Other streaming XPath processors, such as YFilter, XTrie, and TurboXPath handle only forward axes. We show through experiments that our algorithm significantly outperforms (by more than a factor of two) a traditional non-streaming XPath engine. Furthermore, since our algorithm only retains relevant portions of the input document in memory, it scales better than traditional XPath engines. It can process large documents; we have successfully tested documents over 1GB in size. On the other hand, the traditional XPath engine degrades considerably in performance for documents over 100 MB in size and fails to complete for documents of size over 200 MB... Our experiments reveal that significant performance benefits can be obtained by using the XAOS algorithm for evaluating XPath expressions on XML documents in a streaming fashion. We are working on extending the XAOS engine to handle more of XPath, building on the framework we have described in this paper..." [cache]
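The general streaming idea (though not the XAOS algorithm itself, which additionally handles the backward parent/ancestor axes) can be sketched with a plain SAX handler that evaluates a simple forward, child-axis path in one document-order pass, keeping only the open-element stack in memory:

```python
# Sketch of single-pass, streaming evaluation of a simple forward-axis
# path like /catalog/book/title. Illustrative only: real streaming XPath
# engines (YFilter, XTrie, TurboXPath, XAOS) handle far richer expressions.
import xml.sax
from io import BytesIO

class PathMatcher(xml.sax.ContentHandler):
    def __init__(self, path):
        self.path = path      # e.g., ["catalog", "book", "title"]
        self.stack = []       # names of currently open elements
        self.results = []
        self._text = None

    def startElement(self, name, attrs):
        self.stack.append(name)
        if self.stack == self.path:
            self._text = []   # a matching node opened: collect its text

    def characters(self, content):
        if self._text is not None:
            self._text.append(content)

    def endElement(self, name):
        if self.stack == self.path and self._text is not None:
            self.results.append("".join(self._text))
            self._text = None
        self.stack.pop()

doc = b"<catalog><book><title>XML</title></book><book><title>SAX</title></book></catalog>"
handler = PathMatcher(["catalog", "book", "title"])
xml.sax.parse(BytesIO(doc), handler)
print(handler.results)   # ['XML', 'SAX']
```

Because only the element stack and the matched text are retained, memory use stays bounded by document depth rather than document size, which is the scaling property the paper's experiments highlight.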
[November 15, 2002] "Static Analysis for Dynamic XML." By Aske Simon Christensen, Anders Møller, and Michael I. Schwartzbach (BRICS, Department of Computer Science, University of Aarhus, Denmark). Paper presented at the PLAN-X Workshop on Programming Language Technologies for XML (October 3, 2002, Pittsburgh, PA, USA). 12 pages, with 11 references. "We describe the summary graph lattice for dataflow analysis of programs that dynamically construct XML documents. Summary graphs have successfully been used to provide static guarantees in the JWIG language for programming interactive Web services. In particular, the JWIG compiler is able to check validity of dynamically generated XHTML documents and to type check dynamic form data. In this paper we present summary graphs and indicate their applicability for various scenarios. We also show that the expressive power of summary graphs is similar to that of the regular expression types from XDuce, but that the extra structure in summary graphs makes them more suitable for certain program analyses." [cache]
[November 14, 2002] "Web Services Development: Jean Paoli on XML in Office 11." By Jon Udell. In InfoWorld (November 14, 2002). ['Next week's issue of InfoWorld includes an article on the new XML capabilities of Office 11. While researching the story, I interviewed the architect of XML in Office 11, Microsoft's Jean Paoli, one of the primary co-creators of XML. Here are some of his remarks -- excerpts from Paoli'] "... The goal is to unleash the Excel functionality on generic schema, on customer-defined schema. Who knows how to create a data model better than the financial or health care company who uses the data? Until now, it was very difficult to find a tool which lets you pour the data belonging to any arbitrary schema, and then, for example, chart that data... All our tools are XML editors now: Word, Excel, XDocs. But we shouldn't think about XML editors, we should think about the task at hand. If I want to create documents with a lot of text, that's Word. With XDocs, the task is to gather information in structured form. And with Excel, it's to analyze information. We have this great toolbox which enables you to analyze data. We can do pie charts, pivot tables, I don't know how many years of development of functionality for analyzing data. So we said, now we are going to feed Excel all the XML files that you can find in nature... To create the schema for your spreadsheet, first look at the information which is captured in that spreadsheet. Give names to the data. The data is about the user's name and e-mail address, for example. I don't want to call it cell 1, cell 2, or F1 or F11. The whole thing about XML is to give names to things which are in general not named..." Udell says: "Modeling XML data using DTD (Document Type Definition) or, more recently, XML Schema, has been a fairly arcane discipline. Practitioners have included publishers seeking to repurpose content and Web services developers writing WSDL files for which XML Schema serves as the type definition language. But enterprise data managers have not, in general, seen much reason to model lots of data using XML Schema. With Office 11, Microsoft aims to rewrite the rules in a dramatic way. If every enterprise desktop can consume, process, and emit schema-valid XML data, the modeling of that data becomes a huge strategic opportunity. And the people who can do that modeling effectively become very valuable..." See "Microsoft 'XDocs' Office Product Supports Custom-Defined XML Schemas"; general references in "XML Schemas."
[November 13, 2002] "Business Process Standard Moves Forward." By Richard Karpinski. In InternetWeek.com (November 13, 2002). "A new XML standard for automating business process management was released as a final draft Wednesday, setting the stage for the addition of standards-based workflow capabilities to enterprise servers and applications. Business Process Modeling Language (BPML) 1.0 was released as a final draft by the Business Process Management Initiative. The group also released the first public draft of the Business Process Modeling Notation (BPMN 0.9), which provides a standards-based graphical notation that can be used to describe business processes. BPML is not the only such would-be standard making the rounds. Business Process Execution Language for Web Services (BPEL4WS) is being forwarded by IBM, Microsoft, BEA, and others. IBM is making BPEL4WS support a major feature in the upcoming release of WebSphere 5.0. The BPMI said BPML 1.0 is interoperable with BPEL4WS. Both XML-based approaches to business process management provide enterprises with the ability to define business process workflows in a standard way and provide a standard way for other applications to access and be a part of the process flow. BPML 1.0 leverages the Web Service Choreography Interface (WSCI) for the definition of public process interfaces, and is designed to support the emerging WS-Security, WS-Transaction and WS-Coordination specifications for the execution of collaborative business processes..." See details in the 2002-11-13 news item "BPMI.org Publishes BPML 1.0 and Business Process Modeling Notation (BPMN) Working Draft"; general references in "Business Process Modeling Language (BPML)."
[November 12, 2002] "XML Forms Specification Approved by W3C." By John Fontana. In Network World (November 12, 2002). "The World Wide Web Consortium on Tuesday [2002-11-12] gave standards approval to a technology for designing Web-based forms used to collect, input and extract native XML data from enterprise systems and business applications... Perhaps the most important development is that corporations will now have a way to capture data in native XML format, which can then be stored and shared among any systems that support XML. 'All of a sudden it creates the one bit of functionality that's been missing in the XML-based Web, which is interacting with XML documents. It's inputting and editing data natively in XML,' says Steven Pemberton, co-chairman of the XForms Working Group and a researcher at CWI, the Dutch national research institute. Forms-based development has been the backbone of electronic commerce for collecting data and executing transactions, but even with the advent of XHTML, forms technology remains labor intensive and somewhat rigid. XForms is the evolution of the XHTML effort that adds ease of development and flexibility to deployment. It will become part of the XHTML 2.0 specification, slated for completion sometime next year. But XForms also supports other markup languages, including Scalable Vector Graphics... Microsoft last month introduced a product called XDocs, which is vaguely defined as a forms application for inputting XML data, but Microsoft has not said if it will support XForms. 'There are already 17 implementations available, which shows a strong demand for what is being offered by XForms,' Pemberton says. Novell, for one, released on Tuesday a technology preview for integrating XForms with its Extend platform and tools. The platform's Web Application Server, Composer tool for data integration and Director for creating portal interfaces will all support XForms..." 
See: (1) "Novell to Provide Visual Application Development Tools Based on Emerging XForms Standard."; (2) "W3C XForms 1.0 Advances to Candidate Recommendation Status"; (3) "XML and Forms."
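The core XForms idea the article describes — form data living as a native XML instance, separate from the presentation, so submission is just serializing the instance — can be sketched loosely in Python. This is an illustration of the model/instance separation, not an XForms processor; the element names are hypothetical.

```python
import xml.etree.ElementTree as ET

# A minimal XForms-style instance: the data travels as native XML,
# independent of whatever markup renders the form controls.
instance = ET.fromstring("<payment><method>card</method><number/></payment>")

def set_value(inst, ref, value):
    # A form control bound by a ref (here a simple ElementTree path)
    # writes straight into the instance document.
    inst.find(ref).text = value

set_value(instance, "number", "4111111111111111")

# "Submitting" the form is nothing more than serializing the instance.
submitted = ET.tostring(instance, encoding="unicode")
```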
[November 12, 2002] "Using P3P to Negotiate Access Rights to User Profiles." By Wolfgang Woerndl (Technische Universität München, Munich, Germany). A Position Paper prepared for the W3C Workshop on the Future of P3P (November 12-13, 2002). "This position paper demonstrates the application of P3P and APPEL in the decentralized management of user profiles. After a short introduction, our ideas for privacy-preserving identity management are summarized... Commercial systems such as Microsoft .NET Passport or the Open Source Liberty Alliance Project are already being used or are under development. However, these applications presently lack strong privacy mechanisms. Users need to control access to their personal data. Access control based on privacy policies and preferences is an integral part of our project to decentralize user profile management... In our project Cobricks ('Bricks for community support systems'), we are exploring ideas for federated user profile management, especially to support (virtual) communities. We are interested in interoperability among systems and nevertheless preserving the privacy of personal information. In our scenario, a service agent requests user profile information from a user profile agent and the system needs to determine whether access should be granted or not. Therefore, an access control system based on the purpose and context of data accesses is needed. The proposed access control system for user profiles consists of two phases: (1) Negotiation of access rights using privacy policies and preferences, and generation of an Access Ticket; (2) Data access with the Access Ticket. The negotiation of access rights is based on P3P and APPEL. A user profile agent evaluates the access request and the P3P policy of the service with user preferences. These user preferences are APPEL rules with some extensions to facilitate access control principles such as access modes (e.g., 'read' or 'write'). 
If the user profile agent cannot reach a decision, user interaction may be necessary. The result of this semi-automatic negotiation process is an Access Ticket (AT). The Access Ticket is an XML document that manifests the access rights of a certain service to the user profile information. The AT is digitally signed by the user profile agent or ID Repository on behalf of the user and must be presented by the service with each data access... The Access Tickets are similar to other XML-based access control approaches such as the eXtensible Access Control Markup Language (XACML), but tailored for user profile data access. In our project and possibly related work such as the Liberty Alliance Project or eXtensible Name Service (XNS), P3P and APPEL are used (or could be used) to determine access rights to personal information..." A more detailed presentation is given in "Community Support and Identity Management." See: (1) the W3C Platform for Privacy Preferences (P3P) Project, and (2) general references in "Platform for Privacy Preferences (P3P) Project."
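The two-phase scheme the paper outlines — evaluate APPEL-like preferences against a service's declared purpose, then issue an Access Ticket only on a match — can be sketched as follows. The rule structure, field names, and dict-based "ticket" are all hypothetical stand-ins; real tickets are signed XML documents.

```python
# Hypothetical APPEL-like preference rules held by the user profile agent.
rules = [
    {"purpose": "order-fulfillment", "fields": {"name", "email"}, "mode": "read"},
]

def negotiate(purpose, requested_fields, mode):
    """Phase 1: match the service's request against the user's rules."""
    for rule in rules:
        if (rule["purpose"] == purpose
                and requested_fields <= rule["fields"]
                and rule["mode"] == mode):
            # A real Access Ticket is an XML document digitally signed by
            # the user profile agent; a plain dict stands in for it here.
            return {"granted": sorted(requested_fields), "mode": mode}
    return None  # no rule matched: escalate to the user for a decision

ticket = negotiate("order-fulfillment", {"email"}, "read")
denied = negotiate("marketing", {"email"}, "read")
```

Phase 2 (data access) would then require the service to present the ticket with every request, which is what makes the negotiation result enforceable.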
[November 12, 2002] "Why is P3P Not a PET?" By Ruchika Agrawal (Electronic Privacy Information Center - EPIC). A Position Paper prepared for the W3C Workshop on the Future of P3P (November 12-13, 2002). "This paper identifies a broad definition and necessary requirements of privacy-enhancing technologies (PETs), provides examples of effective PETs, questions why P3P does not satisfy the definition of PETs, and finally, raises other concerns about P3P. Privacy-enhancing technologies are protocols, standards, and tools that directly assist in protecting privacy, minimizing the collection of personally identifiable information, and when possible, eliminating the collection of personally identifiable information... Blind signatures are an extension of digital signatures. Digital signatures simply ensure authentication, while blind signatures ensure authentication of individuals without identification. One-way functions provide the mathematical foundation for blind signatures, ensuring that the identity of the individual signer cannot be computed in a reasonable amount of time. One application employing blind signatures is the use of "digital cash", which is analogous to the use of hard cash in that it cannot identify the spender while the service provider is assured of the transaction's authenticity. Blind signatures serve as a good example of an effective PET, since blind signatures eliminate the collection of personally identifiable information... P3P fails as a privacy-enhancing mechanism because P3P does not aim at protecting personal identity, does not aim at minimizing the collection of personally identifiable information, and is on a completely different trajectory than the one prescribed by the definition of PETs. 
P3P provides no genuine privacy protection: instead of being used to minimize the collection of personally identifiable information, P3P can easily be used to obtain data from consumers by facilitating the collection of personal information through the guise of notice and choice..." General references in "Platform for Privacy Preferences (P3P) Project."
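The blind-signature construction Agrawal cites as an effective PET is concrete enough to demonstrate. Below is the textbook Chaum RSA blind signature with deliberately tiny toy parameters (real systems use large keys and padding); the signer authenticates the message without ever seeing it.

```python
# Toy RSA key: p, q are far too small for real use; illustration only.
p, q = 61, 53
n = p * q                            # public modulus
e = 17                               # public exponent
d = pow(e, -1, (p - 1) * (q - 1))    # private exponent

m = 65                               # message, e.g., a digital coin's serial
r = 97                               # user's secret blinding factor, coprime to n

blinded = (m * pow(r, e, n)) % n     # user blinds m; signer learns nothing of m
s_blinded = pow(blinded, d, n)       # signer signs the blinded value
s = (s_blinded * pow(r, -1, n)) % n  # user strips the blinding factor

# Anyone can verify authenticity without identifying the signer's input:
valid = pow(s, e, n) == m
```

The unblinding works because (m·r^e)^d = m^d·r (mod n), so multiplying by r⁻¹ leaves a valid ordinary signature on m — authentication without identification, as the paper says.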
[November 12, 2002] "Securely Available Credentials Protocol." Edited by Stephen Farrell (Baltimore Technologies). IETF Internet-Draft. Reference: 'draft-ietf-sacred-protocol-bss-04.txt'. November 2002. Appendix A: XML Schema. "This document describes a protocol for the secure upload and download of cryptographic credentials. Discussion of this draft is taking place on the SACRED mailing list of the IETF SACRED working group. We describe a protocol whereby a user can acquire cryptographic credentials (e.g., private keys, PKCS #15 ['PKCS #15 v1.1: Cryptographic Token Information Syntax Standard'] structures) from a credential server, using a workstation that has locally trusted software installed, but with no user-specific configuration. This is somewhat less secure than a smart card, but can be used until smart cards and smart card readers on workstations become ubiquitous, and can be useful even after smart cards are ubiquitous, as a backup strategy when a user's smart card is lost or malfunctioning. The protocol's payloads are described in XML. This memo also specifies a BEEP [The Blocks Extensible Exchange Protocol Core] profile of the protocol. The protocol sets out to meet the requirements in 'Securely Available Credentials - Requirements'. In particular, security requirements are met by mandating support for TLS and/or DIGEST-MD5. The approach taken here is to define SACRED elements that are compatible with the elements used in XML Key Management Specification (XKMS 2.0) and XML-Signature Syntax and Processing, so that an implementation of this protocol can easily also support XKMS, and vice versa. It is also intended that other SACRED protocol instances (e.g., using a different authentication scheme, credential format or transport protocol) could re-use many of the definitions here..." See also: (1) the Credential Server Framework and (2) the IETF Securely Available Credentials Working Group. See "Security Standards." [cache]
[November 12, 2002] "Take Advantage of Existing External XML Schemas with a Custom Import Framework in ASP.NET." By Scott Short. In MSDN Magazine Volume 17, Number 12 (December 2002). ['Over the years, many industry-standard XML schemas and dialects have been developed. These industry-specific schemas embrace the original purpose of XML and are extremely valuable in promoting and supporting B2B interaction. Unfortunately, the ASP.NET Web Services runtime does not allow developers to directly reference external schemas from within their XML Web Services interface (the WSDL file). This article builds an external schema framework as an extension to the ASP.NET Web Services runtime to enable you to reference external schemas within your XML Web Service interface.'] "... Many industries have collaboratively developed XML schemas that define industry-specific concepts. Among others, the travel industry, the hospitality industry, and the education industry have published schemas. If your XML Web Service is intended for a particular industry, it makes sense to leverage targeted schemas. You can also use task-oriented XML dialects. Some examples include the Astronomical Instrument Markup Language (AIML), Robotic Markup Language (RoboML), and the Speech Application Language Tags (SALT). The more an XML Web Service adheres to recognized standards, the higher the probability that it will be consumed by others. In most cases, leveraging an industry standard involves referencing external schemas within the WSDL document of your XML Web Service. Unfortunately, the initial version of the ASP.NET Web Services platform does not provide out-of-the-box support for referencing external schemas. One way to overcome this limitation would be to write your WSDL file by hand and ensure that your XML Web Service supports the interface defined by the custom WSDL file. Doing so, however, makes development and maintenance more complex. 
First, it is pretty easy for the implementation and the interface to become out of sync. Second, not only does the original developer need to have a strong knowledge of WSDL, but any developer who maintains the code will have to as well. Another way to reference external schemas from your XML Web Service is to extend the ASP.NET Web Services platform, which provides a rich interception model that allows developers to add this kind of extended functionality. I used this interception model to create what I call an external schema framework (ESF)..." See: "Web Services Description Language (WSDL)."
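The mechanism at the heart of the article — pointing a service's WSDL types section at an external industry schema via xsd:import rather than inlining tool-generated types — can be sketched generically. The namespace and schema location below are hypothetical examples, and this builds only the schema fragment, not a full WSDL document.

```python
import xml.etree.ElementTree as ET

XSD = "http://www.w3.org/2001/XMLSchema"
ET.register_namespace("xsd", XSD)

# Build a WSDL types-section schema that imports an external industry
# schema instead of redefining its types locally.
schema = ET.Element(f"{{{XSD}}}schema")
imp = ET.SubElement(schema, f"{{{XSD}}}import")
imp.set("namespace", "http://example.org/travel")            # hypothetical
imp.set("schemaLocation", "http://example.org/travel/itinerary.xsd")

types_fragment = ET.tostring(schema, encoding="unicode")
```

Because the import is by reference, the industry schema can evolve independently and the service interface stays in sync with the published standard.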
[November 12, 2002] "Sending Files, Attachments, and SOAP Messages Via Direct Internet Message Encapsulation." By Jeannine Hall Gailey. In MSDN Magazine Volume 17, Number 12 (December 2002). ['Direct Internet Message Encapsulation (DIME) is a new specification for sending and receiving SOAP messages along with additional attachments, like binary files, XML fragments, and even other SOAP messages, using standard transport protocols like HTTP. In this article, the author explains what DIME is and how it differs from MIME encapsulation. A detailed description of the message format and how it is parsed, as well as working with SOAP and extending it with WSDL, is also included.'] "... DIME allows you to send attachments of various types along with your SOAP message, even when the attachments in question do not fit conveniently or efficiently into an XML format. DIME is designed to be a fast and efficient protocol to parse. The length and type of attached data are defined in a few simple header fields. The protocol is kept lean and mean by the assumption that any additional message metadata will be included as part of a SOAP message, since SOAP is already such a rich metadata-based protocol. Although designed to work with SOAP, the use of DIME is not strictly limited to SOAP, and it may prove useful whenever a simple, efficient message encapsulation is required... DIME has the potential to become a very useful encapsulation method for attachments to SOAP messages by utilizing the rich metadata in SOAP against a simple, efficient encapsulation mechanism. In addition to its technical merits, DIME is receiving the full support of Microsoft going forward, as is indicated by its inclusion in the newest version of the Microsoft SOAP Toolkit. As with any early specification-based technology, you can expect some changes as the technology matures..." See: "Direct Internet Message Encapsulation (DIME)."
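The "length and type of attached data defined in a few simple header fields" design is easy to see in a simplified form. The record layout below is an illustration of length-prefixed encapsulation in the spirit of DIME, not the actual DIME wire format (which also carries version bits, chunking flags, an ID field, and 4-byte padding).

```python
import struct

HEADER = "!HI"  # simplified header: 16-bit type length, 32-bit data length

def pack_record(media_type: bytes, payload: bytes) -> bytes:
    # Length-prefix the type and payload so a parser needs no scanning.
    return struct.pack(HEADER, len(media_type), len(payload)) + media_type + payload

def unpack_record(buf: bytes):
    type_len, data_len = struct.unpack_from(HEADER, buf)
    off = struct.calcsize(HEADER)
    media_type = buf[off:off + type_len]
    payload = buf[off + type_len:off + type_len + data_len]
    return media_type, payload

# Binary attachments ride alongside a SOAP message untouched -- no
# base64 inflation, unlike naive embedding of binary data in XML.
rec = pack_record(b"image/png", b"\x89PNG")
mt, data = unpack_record(rec)
```

The fixed-offset header is what makes such a format "fast and efficient to parse" compared with MIME's text-based boundary scanning.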
[November 12, 2002] "Place XML Message Design Ahead of Schema Planning to Improve Web Service Interoperability." By Yasser Shohoud. In MSDN Magazine Volume 17, Number 12 (December 2002). ['Web Services are all about exchanging data in the form of XML messages. If you were about to design a database schema, you probably wouldn't let your tool do it for you. You'd hand-tool it yourself to ensure maximum efficiency. In this article, the author maintains that designing a Web Service should be no different. You should know what kind of data will be returned by Web Service requests and use the structure of that data to design the most efficient message format. Here you'll learn how to make that determination and how to build your Web Service around the message structure.'] "When you build a data-centric application, how do you create the database schema? Do you begin by creating classes and then let your IDE or tools create the database schema for you, or do you design the database schema yourself, taking into account normalization, referential integrity, and performance optimizations? Chances are you design and create the database schema yourself. Even if you use a visual schema designer rather than data definition language (DDL) statements, you are still taking control of the database schema design. Web Services are all about supplying the right data at the right time. When a client calls a Web Service, an XML data message is sent over the wire and a response is returned to the client. When you program the Web Service and its clients, you are really programming against these messages. The data in these messages is ultimately what the application cares about. So why would you create a Web Service beginning with the classes and methods and let the tools create the message schemas for you? You should design the data (message) schema and implement the Web Service to fit this design, like you would when designing a database schema... 
Web Services are all about applications exchanging data over the Web in the form of XML messages, so building a Web Service requires careful design of these messages using XML Schema and WSDL. When you begin with message design rather than method design, the kind of data your Web Service expects to receive and return is made clear. By designing messages using XSD and WSDL, you create a formal interface definition that Web Service developers can implement and client developers can program against simultaneously. Next time you begin a Web Service project, begin by designing the message format using the Visual Studio XML Schema designer..." For schema description and references, see "XML Schemas."
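Shohoud's message-first discipline can be illustrated in miniature: fix the request and response formats before writing any implementation, then code the handler to the agreed shape and reject anything else. The element names and stand-in price data here are hypothetical.

```python
import xml.etree.ElementTree as ET

# The message contract is decided first; the implementation conforms to it.
def handle_get_quote(request_xml: str) -> str:
    req = ET.fromstring(request_xml)
    symbol = req.findtext("symbol")
    if symbol is None:
        # The message does not match the agreed schema: fail loudly
        # rather than guess, keeping interface and implementation in sync.
        raise ValueError("request does not match the agreed message format")
    price = {"MSFT": 57.25}.get(symbol, 0.0)   # stand-in data source
    return f"<quoteResponse><price>{price}</price></quoteResponse>"

resp = handle_get_quote("<quoteRequest><symbol>MSFT</symbol></quoteRequest>")
```

Because both sides program against the message, client developers can start from the same contract before the service is even implemented.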
[November 11, 2002] "Integrate Enterprise Applications with Web Services and J2EE. Combine These Enterprise Technologies to Ease EAI." By Daniela Rudrof (IT Specialist, IBM Security Operations) and Andre Tost (Solution Architect, IBM WebSphere Business Development Group). From IBM developerWorks, Web services. November 2002. ['In this article, Andre Tost and Daniela Rudrof offer a vision of how J2EE and Web services can work together to ease enterprise application integration (EAI). You'll see how the Java Messaging Service and the Java 2 Connector Architecture can be used in tandem with Web services technologies to bring the integration process to a new level of abstraction.'] "The Java 2 Platform, Enterprise Edition (J2EE) addresses the need for making existing applications and business processes available on the Web in a robust, secure, and transactional way. Several specifications -- most notably the Java Messaging Service (JMS) and the Java 2 Connector Architecture (JCA) -- have been established under the J2EE umbrella to focus on integrating J2EE applications with non-J2EE environments. In addition, Web services technologies have recently caught a lot of attention in the integration arena by defining common ways for applications to interact with each other across heterogeneous programming languages and operating systems. This is made possible because Web services use XML as the base for their data formats, be it for the description of a particular service (that is, the Web Services Definition Language, or WSDL), or for the actual invocation of a service (that is, the Simple Object Access Protocol, or SOAP). In this article, you'll learn how to take advantage of J2EE integration technologies -- specifically the JMS and JCA standards -- and enhance them with Web services technologies in order to implement enterprise application integration in a more standards-based and interoperable way. 
We will show how a common Web services-based interface can help you integrate a back-end system into a J2EE environment. By doing so, you'll support a higher level of automation and the use of various tooling environments, thus making it easier to connect back-end systems to each other without needing to worry too much about individual APIs and protocols... Web services technology introduces the notion of a service-oriented architecture to a business system. Business functionality is represented by abstract definitions based on XML, as described in WSDL documents. We can integrate J2EE and Web services technologies by defining specific protocol bindings for JMS and JCA in WSDL; this allows us to define back-end interfaces in a common, protocol-independent way. At runtime, the Web Services Invocation Framework handles the generation of service invocations regardless of protocol, which means that an application developer doesn't have to manage multiple programming interfaces to access business functions..."
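The article's key claim — that WSDL bindings let a caller name an abstract operation while a framework like WSIF resolves the transport at runtime — can be sketched very loosely. The operation names and string "transports" below are hypothetical stand-ins for real JMS and SOAP invocations.

```python
# Two interchangeable "transports"; in a real system these would be a
# SOAP/HTTP call and a JMS message send, respectively.
def soap_transport(operation: str, payload: str) -> str:
    return f"SOAP:{operation}:{payload}"

def jms_transport(operation: str, payload: str) -> str:
    return f"JMS:{operation}:{payload}"

# The binding table plays the role of per-operation WSDL bindings:
# swapping JMS for SOAP is a configuration change, not a code change.
bindings = {"getCustomer": jms_transport, "getOrder": soap_transport}

def invoke(operation: str, payload: str) -> str:
    # Callers name the abstract operation only; the binding picks the wire.
    return bindings[operation](operation, payload)

result = invoke("getCustomer", "<id>42</id>")
```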
[November 11, 2002] "Web Ontology Language (OWL) Abstract Syntax and Semantics." W3C Working Draft 8-November-2002. Edited by Peter F. Patel-Schneider (Bell Labs Research), Patrick Hayes (IHMC, University of West Florida), Ian Horrocks (Department of Computer Science, University of Manchester), Frank van Harmelen (Department of Artificial Intelligence, Vrije Universiteit Amsterdam). Produced by the W3C Web Ontology Working Group. Latest version URL: http://www.w3.org/TR/owl-semantics/. "The OWL Web Ontology Language is being designed by the W3C Web Ontology Working Group as a revision of the DAML+OIL web ontology language. This description of OWL contains a high-level abstract syntax for both OWL and OWL Lite, a subset of OWL. A model-theoretic semantics is given to provide a formal meaning for OWL ontologies (or knowledge bases) written in the abstract syntax. A model-theoretic semantics in the form of an extension to the RDFS model theory is also given to provide a formal meaning for OWL ontologies written as n-triples. A mapping from the abstract syntax to n-triples is given and the two model theories are shown to have the same consequences on OWL ontologies that can be written in the abstract syntax... This document contains several interrelated specifications of the several styles of OWL. First, Section 2 contains a high-level, abstract syntax for both OWL Lite, a subset of OWL, and a fuller style of using OWL, sometimes called OWL/DL. This document, however, defines neither a presentation syntax nor an exchange syntax for OWL. The official exchange syntax for OWL is RDF/XML; a document defining how RDF is used to encode OWL is the subject of the OWL Reference document. A mapping from the abstract syntax to n-triples is provided..." See recently "W3C Publishes Guide to the Web Ontology Language (OWL)"; general references in "OWL Web Ontology Language."
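The abstract-syntax-to-n-triples mapping the draft describes boils down to serializing each axiom as subject/predicate/object URIs terminated by " .". A single subclass axiom can be emitted by hand (the example ontology URIs are hypothetical; only the rdfs:subClassOf URI is real):

```python
RDFS_SUBCLASSOF = "http://www.w3.org/2000/01/rdf-schema#subClassOf"

def triple(subject: str, predicate: str, obj: str) -> str:
    # One n-triples statement: three URI references and a terminating dot.
    return f"<{subject}> <{predicate}> <{obj}> ."

# Abstract syntax Class(Dog partial Animal) maps (roughly) to this triple.
axiom = triple("http://example.org/ont#Dog",
               RDFS_SUBCLASSOF,
               "http://example.org/ont#Animal")
```

Richer OWL constructs (restrictions, boolean class expressions) map to several triples with blank nodes, which is why the draft proves the two model theories agree on ontologies expressible in the abstract syntax.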
[November 11, 2002] "Ontology Building: A Survey of Editing Tools." By Michael Denny. From XML.com. November 06, 2002. ['Earlier this year at the WWW2002 conference, there was a surprisingly strong interest in ontologies--structured models of known facts. Ontologies have come out of the research labs and into common use for modeling complex information. Our main feature this week is a survey of tools available for editing ontologies. As part of his survey Michael Denny also provides a great introduction to what ontologies are, how they vary, and how they are constructed.'] "The semantic structuring achieved by ontologies differs from the superficial composition and formatting of information (as data) afforded by relational and XML databases. With databases virtually all of the semantic content has to be captured in the application logic. Ontologies, however, are often able to provide an objective specification of domain information by representing a consensual agreement on the concepts and relations characterizing the way knowledge in that domain is expressed. This specification can be the first step in building semantically-aware information systems to support diverse enterprise, government, and personal activities... In the Semantic Web vision, unambiguous sense in a dialog among remote applications or agents can be achieved through shared reference to the ontologies available on the network, albeit an always changing combination of upper level and domain ontologies. We just have to assume that each ontology is consensual and congruent with the other shared ontologies (e.g., ontologies routinely include one another). The result is a common domain of discourse that can be interpreted further by rules of inference and application logic. Note that ontologies put no constraints on publishing (possibly contradictory) information on the Web, only on its (possible) interpretations... 
The wide array of information residing on the Web has given ontology use an impetus, and ontology languages increasingly rely on W3C technologies like RDF Schema as a language layer, XML Schema for data typing, and RDF to assert data... The 'Survey of Ontology Editors' covers software tools that have ontology editing capabilities and are in use today. The tools may be useful for building ontology schemas (terminological component) alone or together with instance data. Ontology browsers without an editing focus and other types of ontology building tools are not included. Otherwise, the objective was to identify as broad a cross-section of editing software as possible. The editing tools are not necessarily production level development tools, and some may offer only limited functionality and user support. Concise descriptions of each software tool were compiled and then reviewed by the organization currently providing the software for commercial, open, or restricted distribution. The descriptions are factored into a dozen different categories covering important functions and features of the software... Despite the immaturity of the field, we were able to identify a surprising number of ontology editors -- about 50 overall..." Local version.
[November 11, 2002] "Standards for Electronic Instructional Materials." By Alan Kotok. From XML.com. November 06, 2002. ['Following up on our recent look at Digital Talking Books, Alan Kotok reports this week on U.S. legislative proposals to implement an electronic standard for learning materials: such a standard, likely to be XML-based, could radically improve the lot of visually impaired children at school.'] "Most people with visual impairments rely on methods like Braille to consume material offered in books and other printed media. However, the availability of these materials is limited. Visually disabled school children have an even more extreme problem due to the scarcity of instructional materials in a format they can consume. A bill in the US Congress now addresses these issues for blind students, and if it becomes law, XML will likely play a key role in its implementation. The ability to create and capture text and images in electronic form creates the potential to create learning materials more quickly and easily than before. Ken Pittman's review of Digital Talking Book (DTB) technology in XML.com outlined the standards and critical role of XML behind the DAISY technology on which DTB is based. The publishing industry, advocates for the blind, and some states are already at work to put legislative muscle behind these developments, at least for classroom materials... The bill in question is called the Instructional Materials Accessibility Act (IMAA) of 2002. Senator Christopher Dodd of Connecticut introduced the bill (S. 2246) in April 2002, with 22 Senate co-sponsors from both parties. In July 2002 Rep. Thomas Petri of Wisconsin submitted a similar bill in the House of Representatives (H.R. 4582), with 84 co-sponsors also crossing party lines..." See: "NISO Digital Talking Books (DTB)."
[November 11, 2002] "Automatic Numbering, Part 1." By Bob DuCharme. From XML.com. November 06, 2002. ['The November edition of Bob DuCharme's "Transforming XML" column. In this installment, Bob digs into using <xsl:number> for adding automatic numbering to your document.'] "XSLT's xsl:number instruction makes it easy to insert a number into your result document. Its value attribute lets you name the number to insert, but if you really want to add a specific number to your result, it's much simpler to add that number as literal text. When you omit the value attribute from an xsl:number instruction, the XSLT processor calculates the number based on the context node's position in the source tree or among the nodes being counted through by an xsl:for-each instruction, which makes it great for automatic numbering. Eight other attributes are available to tell the XSLT processor how you want your numbers to look... Next month, we'll see how to number the different chapter, sect1, and sect2 elements as 1., 1.1, 1.1.1, and so forth, with numbering levels restarting at appropriate places. We'll also see how to number the pictures in the book automatically, both as one sequence that never restarts at '1' and also as a sequence that restarts with each new chapter..." For related resources, see "Extensible Stylesheet Language (XSL/XSLT)."
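The hierarchical numbering the column previews (1., 1.1, 1.1.1, with deeper levels restarting) is what xsl:number computes with level="multiple" from ancestor positions; the same counter logic can be sketched outside XSLT. The flat tag sequence below is a hypothetical stand-in for a DocBook-like document.

```python
# A document flattened to a sequence of section-start tags (illustrative).
doc = ["chapter", "sect1", "sect2", "sect2", "sect1", "chapter"]

levels = ["chapter", "sect1", "sect2"]
counters = [0, 0, 0]
numbered = []
for tag in doc:
    depth = levels.index(tag)
    counters[depth] += 1
    for d in range(depth + 1, len(counters)):
        counters[d] = 0          # restart numbering at deeper levels
    numbered.append(".".join(str(c) for c in counters[:depth + 1]))
# numbered is now ["1", "1.1", "1.1.1", "1.1.2", "1.2", "2"]
```

In XSLT proper, `<xsl:number level="multiple" count="chapter|sect1|sect2" format="1.1.1"/>` derives the same multi-level number directly from the node's ancestors.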
[November 11, 2002] "Corel Simplifies SoftQuad XMetaL Administration." By Mark Walter. In The Seybold Report Volume 2, Number 15 (November 11, 2002). ISSN: 1533-9211. ['Corel's XMetaL Central manages all the DTDs, style sheets, macros and configuration files for large installations.'] "Recognizing that successful implementation of structured authoring requires managing a host of configuration files, Corel has introduced XMetaL Central, a server and companion Windows client that makes it much easier to centrally administer hundreds or even thousands of XMetaL licenses. When editing an XML file, XMetaL looks for three associated files: a DTD or schema, a style sheet, and a configuration file that specifies the arrangement of the menus. In addition, one might load a set of macros specific to a class of documents. These three to four files must be present for each type of document, and keeping all of these files synchronized across dozens or hundreds of desktops poses a significant management challenge. Corel's response is to store all of these ancillary files on a server. When XMetaL opens a document, a small Windows application (running in the system tray) checks the server to see if the local schema, style sheet and configuration files match those on the server. If not, XMetaL Central downloads the newer versions to the user's computer. The configuration files may be stored on a separate application server or inside a content-management system..." See the announcement "Corel Corporation Introduces Corel XMetaL-Central. New Server Application Drives XML Adoption by Simplifying the Management of Customized XML Authoring Environments."
[November 11, 2002] "Vignette's Complex System Grows Larger in V7. Introduces 'low-end' packaging but retains position as premier supplier of Web-centric content management." By Luke Cavanagh and Mark Walter. In The Seybold Report Volume 2, Number 15 (November 11, 2002). ISSN: 1533-9211. Content Management. ['An established supplier in high-end Web-content management, Vignette now needs to work its way downmarket by making its systems easier for small shops to buy and use. Version 7 offers some of the right features, but not the right price strategy.'] "Last month at a user conference focused on content management that Vignette hosted in its hometown of Austin, TX, the company introduced Version 7 of its software. The new release features innovations in workflow and content integration and completes the product's transition to the J2EE architecture. It will be available in three tiers, including, for the first time, an introductory package designed for departmental deployments... V7 introduces a visual tool, the Content Integration Workbench, for creating connections between external sources (just about any application with a COM or Java programming interface) and the Vignette repository. Vignette has about 75 connectors completed and offers its customers a developer's kit if they want to write their own... Vignette has modified the user interface and workflow-design facets of the system. The new 'Command Center' interface unifies what had been several client interfaces for administration, reporting and end-user functions... According to Vignette, systems administrators will find Command Center also easier to configure for different roles. It is the first application written on top of the V7 'extensibility framework,' Vignette's new XML-based middleware layer that replaces the Vignette Application Framework of release 5... 
The [workflow] system can now dynamically determine what groups or individuals are available to receive new tasks (though it does not yet monitor individual task lists and route material according to people's workloads). At the same time, now that V7 workflow definitions will be written in XML, they'll be more in keeping with emerging standards... Several days after the release of V7, Vignette raised eyebrows with the surprise announcement that it intends to acquire portal-software supplier Epicentric for $32 million in cash and equities. The merger, expected to close by the end of 2002, gives Vignette a set of tools that none of its competitors offer. Vignette is the first of the enterprise content-management vendors to acquire a portal-software company..."
[November 11, 2002] "XML Zooms Onto Government Tech Agenda." By Lia Steakley. In Wired News (November 11, 2002). "As improbable as it may seem, declining sales among U.S. automakers have clinched government support for XML standards. The American automotive slump continued in October as Ford (F), General Motors (GM) and DaimlerChrysler (DCX) all reported a 30 percent drop in sales. The federal government hopes to rescue carmakers and several other industries with the Enterprise Integration Act of 2002, signed into law last week. Rep. Jim Barcia (D-Mich.) drafted the act after reading a report by the National Institute of Standards and Technology that showed interoperability problems caused by data-quality errors within the automotive supply chain were costing the industry $1 billion a year. Seeking to cure industries' interoperability woes, protect U.S. companies' profits and shave billions of dollars off manufacturing costs, the law authorizes the director of the standards institute to harness the Internet as a tool for manufacturers. And XML could be the key technology in making that happen. 'The dream of seamless interoperability has been chased for many years,' said Ric Jackson, a director at the NIST. 'XML is the latest and one of the most successful approaches. It may be the one that leads us to the grail.' Barcia had that in mind when he drafted the bill. 'One of the reasons we pushed for this at this time is because XML can deliver the pipelines needed to transmit the three-dimensional drawings that are so crucial to product development,' said Barcia staffer Jim Turner. The law calls for $47 million to be dispensed over the next three years to standards groups to accelerate projects in progress... RosettaNet, a consortium working to create and implement industry-wide open e-commerce standards, is one of many groups hoping to get a helping of the funds. 
Paul Tearnen, RosettaNet vice president of standards management, said two projects currently underway -- the consortium's technical dictionary and compliance programs -- stand to benefit from the act... One of the act's goals is to allow the U.S. auto industry to keep pace with Europe and Asia, where carmakers have already embraced XML and the Internet..." See "RosettaNet."
[November 11, 2002] "GIS Group Advances Info-Sharing Project. Open GIS Consortium, Census Bureau Work on Prototypes for Sharing Geospatial Data." By Brian Robinson. In Federal Computer Week (November 10, 2002). "The Open GIS Consortium Inc. (OGC) this month expects to launch the next stage of an initiative to help federal, state and local governments share information about systems of vital interest to national security. OGC expects to announce participants for the second phase of the pilot program of its Critical Infrastructure Protection Initiative (CIPI), with hopes of having systems to demonstrate by April. Through CIPI, OGC is developing a network via which different jurisdictions can share geospatial information about power plants, telecommunications networks and other core systems. The first CIPI phase, CIPI-1, began in October and is focused on creating an underlying system for CIPI applications, called the Critical Infrastructure Collaborative Environment. CIPI-2, sponsored by the U.S. Census Bureau, will result in two prototype applications: WebBAS, an online Boundary and Annexation Survey (BAS) that updates information on government boundaries collected from state, county and local governments; and a server solution for delivering Topologically Integrated Geographic Encoding and Referencing (TIGER) data via the Web for use by the public and organizations in compiling their own versions of maps... TIGER data, which is used to build maps, is currently delivered online, he said, but uses a proprietary format that has to be updated every few years, which is a cumbersome process. An OGC-compliant server solution will use open standards such as Geography Markup Language (GML)... [said David Sonnen, senior consultant for spatial data management at IDC]: 'The issues that OGC is tackling will show how GML and other GIS-specific geometry and text formats will manage that translation,' he said, 'and it's not a trivial thing to do'..." 
See: (1) the Open GIS Consortium website, and (2) "Geography Markup Language (GML)."
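GML encodes geospatial features as XML, which is what makes an open, standards-based TIGER delivery feasible. As an illustration only, the sketch below reads a GML 2-style point coordinate with Python's standard library; the `Facility` element and the coordinate values are invented for the example, and only the `gml` namespace URI is the real one:

```python
import xml.etree.ElementTree as ET

GML = "http://www.opengis.net/gml"

# A minimal GML 2-style feature; the feature element is illustrative.
doc = f"""
<Facility xmlns:gml="{GML}">
  <name>Substation 12</name>
  <gml:location>
    <gml:Point>
      <gml:coordinates>-77.03,38.89</gml:coordinates>
    </gml:Point>
  </gml:location>
</Facility>
"""

root = ET.fromstring(doc)
# GML 2 coordinates are a comma-separated "x,y" pair.
coords = root.find(f".//{{{GML}}}Point/{{{GML}}}coordinates").text
lon, lat = (float(v) for v in coords.split(","))
```

Because the geometry is plain XML in a well-known namespace, any consumer can extract it without the proprietary format-version churn the article describes.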
[November 11, 2002] "XML Data Islands and Persistence in ASP.NET." By Rahul Guha (Senior Software Engineer, Intel e-Business Architecture Group). Intel Technical Report. November 2002. 12 pages. ['This article illustrates using persistent data islands.'] "Sometimes it's advantageous to separate data representation from the data itself, creating 'data islands.' For example, you can separate the user interface (display elements) in an XSLT file and transform the XML file (stream) through that XSLT file, which builds the HTML that is understood by browsers. A single page can access multiple data islands (such as an XML file, database source, and so forth) that each require a different XSLT file. The beauty of this design is that if you need to change any aspect of the data representation, you simply change a specific XSLT without touching the logic or main code block. This is a clean and easy way to separate presentation from data handling. Once you have data represented in the page from multiple sources of data through separate XSLTs, you may need to persist the XML data blocks (islands), make changes to the data, and save the changes in one process when the user submits the page... The Microsoft .NET Framework includes the XML Control that specifically supports data representation. Use DOMDocument (Document Object Model) to persist the data islands in the client and make changes. Submit the updated XML stream to the server where it is saved to the database. ... The example uses one Container page that includes two XML Web server controls. The controls are attached to separate XML strings, which are created from datasets filled from two separate database calls. The database calls can be to any type of data source. The XML streams are persisted in the client page in the form of hidden variables so that they can be accessed from the client-side code. 
Two separate XSLT files are attached to the XML controls, which are independent of each other and can incorporate any user interface elements... This solution offers the following advantages: (1) Separates the data from the presentation code, which provides flexibility. (2) Supports multiple data sources integrated into one page using the same or different presentation code. (3) Saves trips to the server every time the user makes a change. Data access is handled in one shot when the page is submitted. (4) The server processes only changed data..." For related resources, see "Extensible Stylesheet Language (XSL/XSLT)."
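The article's client-side round trip (persist an XML island in a hidden field, edit it through the DOM, submit the serialized result in one shot) uses MSXML's DOMDocument in the browser; the same load-edit-serialize cycle can be sketched with Python's stdlib DOM, with the `order` document invented for illustration:

```python
from xml.dom.minidom import parseString

# A hypothetical data island as it might sit in a hidden form field.
island = '<order id="42"><status>pending</status><qty>3</qty></order>'

# Load the island into a DOM, as client script would with DOMDocument.
dom = parseString(island)

# The user edits the page; reflect the change in the island's DOM.
status = dom.getElementsByTagName("status")[0]
status.firstChild.data = "approved"

# Re-serialize for submission back to the server in one round trip.
updated = dom.documentElement.toxml()
```

The server then persists only the changed island, which is the "process only changed data" advantage the article lists.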
[November 11, 2002] "SAML Approval Brings Secure Web Services a Step Closer." By Ray Wagner, John Pescatore, and Terry Allan Hicks. From Gartner News Analysis. Gartner FirstTake, #FT-18-7741. 7-November-2002. ['The standards body OASIS approved Security Assertion Markup Language (SAML). Its widespread use will aid the creation of secure, interoperable Web services, but remaining challenges will require significant investments.'] "OASIS announced that it has approved SAML, an XML security standard. SAML enables cross-domain authentication and authorization and single sign-on, and forms the technical basis for the Liberty Alliance federated identity initiative. The newly approved SAML standard will play a central role in Web services deployments because it supports complex workflow and new business models. In addition, SAML can encapsulate complex information for the multiple domains that characterize emerging Web services models. Most Web services vendors have announced plans to support SAML in the near future, and this widespread acceptance will simplify security integration across heterogeneous Web services environments. Although Gartner forecasts rapid adoption of SAML, enterprises implementing Web services will still face serious security challenges, particularly in managing the public and private keys required to implement signing and encryption. SAML and the other leading Web services security initiatives... all assume that keys or digital certificates and the infrastructure to manage them are readily available. This is not yet the case for most enterprises, however. The XML Key Management Specification (XKMS) does offer a simplified approach to integrating public key management capabilities with applications. However, enterprises and vendors must still create the infrastructure for effective long-term management of keys and certificates within the enterprise. 
The failure of public-key infrastructure to achieve significant market penetration means that enterprises typically lack the capacity to effectively use Web services platforms that apply the new standards. Enterprises should plan to make investments in the necessary base infrastructure and should demand that vendors' Web services offerings support XKMS public-key management capabilities as well as SAML, XML encryption and signing, and WS-Security (when approved)..." See: (1) the announcement: "Security Assertion Markup Language (SAML) Ratified as OASIS Open Standard. Authentication and Authorization Standard Enables Single Sign-On for Web Services."; (2) "Security Assertion Markup Language (SAML)." [Also in HTML format]
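As a rough illustration of what a SAML-consuming service does with an assertion, the sketch below pulls the issuer and subject out of a schematically minimal SAML 1.x assertion. Only the namespace URI is normative; a real assertion also carries conditions, timestamps, statement attributes, and (per WS-Security and XML Signature) a digital signature that must be verified before the contents are trusted:

```python
import xml.etree.ElementTree as ET

SAML = "urn:oasis:names:tc:SAML:1.0:assertion"

# Schematic fragment of a SAML 1.x authentication assertion;
# deliberately incomplete, for structure only.
assertion = f"""
<Assertion xmlns="{SAML}" Issuer="https://idp.example.com">
  <AuthenticationStatement>
    <Subject>
      <NameIdentifier>alice@example.com</NameIdentifier>
    </Subject>
  </AuthenticationStatement>
</Assertion>
"""

root = ET.fromstring(assertion)
issuer = root.get("Issuer")                      # who vouches
name_id = root.find(f".//{{{SAML}}}NameIdentifier").text  # for whom
```

Cross-domain single sign-on reduces to exactly this exchange: the relying party accepts `name_id` because it trusts `issuer`, which is why the key- and certificate-management gaps the analysts describe matter so much.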
[November 11, 2002] "Interactive Dialog Among Governments, Business Sector and Civil Society." By Patrick J. Gannon (President & CEO, OASIS). Presented at the WSIS Pan-European Regional Conference Plenary Meeting, Bucharest, Romania, November 7-9, 2002, organized by the Government of Romania, United Nations Economic Commission for Europe, and International Telecommunication Union. The presentation addresses topics on Multi-Stakeholder Commitments, Definition of e-Strategies, The Government's Role in Standards, and Initiatives in Public-Private Sector Partnering. "[A key strategic approach dictates that] Governments, business, and civil society should collaborate in defining requirements and conducting implementations of open global standards for information exchange and that Governments adopt open, global standards for information exchange. To drive standards effectively, Government needs to interface with private industry, interface with Voluntary Standards Development Organizations (VSDO), and endorse standards developed through open process with market adoption. [The paper presents] a Standards-Based Framework for a Single European Electronic Market (SEEM). A Standards Framework for SEEM should provide: (1) Open, global standards, protocols and interfaces; (2) Interoperable applications and platforms; (3) A trusted and sustainable infrastructure; (4) Compatibility between business practices [catalog information exchange, payment methods, security]... One example of a Public-Private Sector Partnering initiative is an 'E-Government Business Repository': exchanges, e-markets, and supply chains need combinations of multiple Web services. The E-Government Business Repository will build upon the existing legal process for registering businesses within the defined governmental territory. 
In auction, pricing, tax computation, customs, RFQ, order management, and content management, no two exchanges, e-markets, or supply chains will use the same service combination and particular service packages. The complexity of this requires a common foundation of business data and open interfaces (standard interface for globally distributed repositories, minimal engineering effort to connect different sites). The E-Government Business Repository offers multiple implementation benefits: (1) E-Gov Business Repository provides single, trusted, sustainable resource for basic information on every registered business within a governmental region; (2) Through ebXML or UDDI interfaces, company info can be shared inter-operably with other organizations (Other governmental repositories, Chambers of Commerce, Certificate Authorities - PKI Digital Certificates, Trade Associations, E-Marketplaces); (3) Business repositories provide a foundation for rapid adoption of electronic business..." The Bucharest WSIS Pan-European Regional Conference is one of four regional preparatory conferences in advance of the first World Summit on the Information Society (WSIS) to be held December 10-12, 2003, in Geneva, Switzerland. See also the summary on "e-Government: The Role of Governments in the Development and Adoption of International Standards for the Information Society." [source .PPT]
[November 08, 2002] "Web Services Standardization Advances." By Paul Krill. In InfoWorld (November 08, 2002). "Web services standardization is advancing on several fronts, including choreography and intellectual property concerns, a World Wide Web Consortium (W3C) official said in an interview this week... Dave Hollander, chairman of the W3C Web Services Architecture and XML Schemas working groups, stressed that standardization is needed to avoid a situation similar to what happened with the Web browser face-off between Microsoft and Netscape of a few years ago. Once the industry settles on standardization for Web services ingredients such as messaging and security, the industry can move forward with advanced concepts such as the semantic Web, a more intelligent Web that requires less human intervention for transactions, he said... Hollander said that between SOAP, WSDL, and UDDI, there already is a good foundation for having systems communicating with each other over the business Internet... The XML Schema Working Group has just finished the second edition of the XML Schema Description language, which cleans up technical errors in documentation. The group also is looking at requirements for Version 1.1 of XML Schema, according to Hollander. XML Schemas express shared vocabularies and allow machines to carry out rules made by people, according to the W3C. The schemas provide a means for defining the structure, content, and semantics of XML documents..."
[November 08, 2002] "Plan to Use XML Namespaces, Part 1. The Best Ways to Use XML Namespaces to Your Advantage." By David Marston (Engineer, IBM Research). From IBM developerWorks, XML Zone. November 2002. ['This article introduces XML namespaces, explores their practical benefits, and shows you how they are used in the standard XML formats and tools defined by the W3C. Several W3C specifications are mentioned, notably XML Schema and XSLT, which offer useful ideas for using namespaces to your advantage. Best practices range from terminology usage up through system-wide design. This document mentions changes proposed up through the September 2002 "Last Call Working Draft" of version 1.1 of Namespaces in XML.'] "Most business and communications problems that XML can solve require a combination of several XML vocabularies. (You may read tag and attribute sets in place of the term XML vocabularies if you wish.) XML has a mechanism for qualifying names to be allocated into different namespaces, such as namespaces that apply to different industries. A company or (better yet) an industry consortium can assign names to elements and use common words like 'title' or 'state' without worrying that those names will clash with the same names used in another vocabulary. XML namespaces also allow names to evolve over time. After using the first version of a vocabulary, your real-world experience may lead you to devise an enhanced vocabulary. The new version can be assigned to a different namespace, and you can use XSLT to transform data from one vocabulary to the other... XML namespaces also allow various tools that process XML, such as a stylesheet-driven XSLT processor, to pick out the instructions they should obey and treat instructions for other processors as just more data. The processor is set up to consider elements from a particular namespace (or two) to be the instructions. 
Elements that have no namespace are data, as are all elements that have a namespace other than those recognized as instructions... Part 2 provides more depth on the best way to establish your own XML vocabularies. In Part 2, you'll also see renaming techniques that are namespace-aware..." See "Namespaces in XML."
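The clash-avoidance point can be made concrete: two vocabularies may both define a `title` element, and namespace-qualified names keep them distinct for any processor. The two namespace URIs below are made up for the example:

```python
import xml.etree.ElementTree as ET

# Two illustrative vocabularies that both define a 'title' element.
BOOK = "http://example.org/ns/books"
JOB = "http://example.org/ns/hr"

doc = f"""
<record xmlns:b="{BOOK}" xmlns:j="{JOB}">
  <b:title>XML in a Nutshell</b:title>
  <j:title>Senior Engineer</j:title>
</record>
"""

root = ET.fromstring(doc)
# The prefix is only a local alias; lookup is by namespace URI.
book_title = root.find(f"{{{BOOK}}}title").text
job_title = root.find(f"{{{JOB}}}title").text
```

The same mechanism is what lets an XSLT processor treat elements in the XSLT namespace as instructions while passing every other element through as data.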
[November 08, 2002] "Make Your XML RDF-Friendly." By Bob DuCharme and John Cowan. From XML.com. October 30, 2002. ['The authors describe how to structure XML documents so that they can be used by RDF processors. The authors explain: "as RDF interest and application development grows, there's an increasing payoff in keeping RDF concerns in mind along with the other best practices as you design document types".'] "Suppose you're designing an XML application or maybe just writing a DTD or schema. You've followed various best practices about element and attribute names, when to use elements versus attributes, and other design issues, because you want your XML to be useful in the widest variety of situations. As RDF interest and application development grows, there's an increasing payoff in keeping RDF concerns in mind along with the other best practices as you design document types. Your documents store information, and small tweaks to their structure can allow an RDF processor to see that information as subject-predicate-object triples, which it can make good use of. Making your documents more 'RDF-friendly' -- that is, more easily digestible by RDF applications -- broadens the range of applications that can use your documents, thereby increasing their value. A lot of XML RDF documents look like they were designed purely for RDF applications, but that's not always the case. The frequent verbosity of RDF XML, which often intimidates RDF beginners, is a by-product of the flexibility that makes RDF easy to incorporate into your existing XML. By observing eight guidelines when designing a DTD or schema, you can use this flexibility to help your documents work with RDF applications as well as non-RDF applications. Some of the guidelines are easy, while some involve making choices based on trade-offs. But knowing what the issues are gives you a better perspective on the best ways to model your data... 
As RDF tools become more widely available and easy to use, you'll have more resources available to do improved metadata management for your own data. Even if you're not ready to build serious RDF applications just yet, making more of your own data RDF-friendly will do more than widen the number of applications that can use it. For many people, the kinds of things that RDF is good at become clearer to them when used with data that is important to their business or important to them personally, such as an address or appointment file. Using RDF tools to play with your own data will help you understand the strong points of RDF and, perhaps, even the strong points of your own data better..." See W3C Resource Description Framework (RDF) website and Semantic Web Activity Statement. General references in "Resource Description Framework (RDF)."
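The subject-predicate-object reading of "striped" RDF/XML can be sketched with a toy extractor (not a conforming RDF parser): each `rdf:Description` names a subject, each child element is a predicate, and its text content is the object. The document content is invented; the `rdf` and `dc` namespace URIs are the standard ones:

```python
import xml.etree.ElementTree as ET

RDF = "http://www.w3.org/1999/02/22-rdf-syntax-ns#"
DC = "http://purl.org/dc/elements/1.1/"

doc = f"""
<rdf:RDF xmlns:rdf="{RDF}" xmlns:dc="{DC}">
  <rdf:Description rdf:about="http://example.org/report.xml">
    <dc:title>Quarterly Report</dc:title>
    <dc:creator>J. Smith</dc:creator>
  </rdf:Description>
</rdf:RDF>
"""

# Walk the striped syntax: resource (subject), child element
# (predicate), text content (object).
triples = []
for desc in ET.fromstring(doc).findall(f"{{{RDF}}}Description"):
    subject = desc.get(f"{{{RDF}}}about")
    for pred in desc:
        triples.append((subject, pred.tag, pred.text))
```

A document designed along the authors' guidelines lets an RDF processor recover triples like these directly, while a non-RDF application still sees ordinary, well-formed XML.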
[November 08, 2002] "Community and Specifications." By Kendall Grant Clark. From XML.com. October 30, 2002. ['The advent of XML 1.1 and other new specifications from the W3C has prompted new rounds of vigorous analysis. Kendall Clark reports this week, in his XML-Deviant column, on the effects of XML 1.1 on SAX, and investigates what the criteria are for a successful specification.'] "Some of the changes which generated developer interest in XML 1.1 include Unicode character normalization, new permissible control characters, and new line-ending rules. The pressures to migrate to XML 1.1 are likely to be greatest in the case of XML applications which primarily consume XML. Finally, I suggested that the key to XML 1.1 migration is the XML infrastructure vendors, particularly XML parser providers. In this week's column I return to pick up a bit more of the community's debate about XML 1.1 before reviewing several other matters, including XInclude security and what processes or methodologies make for good XML specifications..."
[November 08, 2002] "XML and Web Sites." By John E. Simpson. From XML.com. October 30, 2002. "John Simpson's XML Q&A column considers what is required to build an XML-based web site. John provides a useful set of links to web sites, books and software, including some interesting pointers even experienced XML hands may not have seen before."
[November 08, 2002] "Straight-Through Processing and Orchestration of Web Services. A Paradigm to Achieve Internal and External STP." By Gunjan Samtani and Doron Sherman. In Web Services Journal Volume 2 Issue 11 (November 2002), pages 10-14. "This article presents a paradigm to achieve internal and external STP through the orchestration of Web services. We discuss the fundamentals of STP, introduce the concept of orchestration, relate how business-critical STP processes can be orchestrated as Web services, and envision building the entire STP model over a service-oriented architecture (SOA)-based framework... Straight-through processing, a solution that automates the end-to-end processing of transactions for all financial instruments, from initiation to resolution, is positioned to revolutionize the financial industry. STP encompasses a set of internal and external applications, business processes, and standards that will redefine the settlement and processing paradigm within the capital markets industry. It aims to make trade processing as automated as possible, allowing STP-related business processes to be carried out without unnecessary human intervention, thereby reducing to a minimum the overall processing lead time and the related risks, including inevitable human errors... The key orchestration requirements for STP will include identity management, stateful asynchronous interactions, flow coordination, business transaction management, and activity monitoring. Applying orchestration effectively to STP requires a dramatic reduction in the complexity associated with the above requirements needed for deploying and managing distributed business processes. This reduction in complexity will be achieved through the use of an orchestration server, which will be essential for addressing the critical requirements of orchestration for STP... 
An SOA-based framework can enable financial companies to achieve their business goals by providing a service-based platform to integrate new and existing applications and systems with STP functionality, implementing industry standards, and building an infrastructure that would support the dynamic financial messaging required for continuous processing for all types of financial instruments. Service-oriented architecture can provide the foundation and Web services can provide the building blocks for application architecture in order to achieve seamless trade processing... An SOA-based framework can provide support for multiple XML standards, such as ISO15022 and FpML, at the same time, and provide additional standards support without significant redevelopment effort. Using Web services as an enabling technology, STP-related problems and issues will shift from connectivity among different applications in-house and with trading partner applications to the content and structure of the information that is exchanged. The analogy here will be that Web services will define the standard postal mechanism along with the envelope and addressing format for exchanging letters. What is inside the envelope (the content of the letter) will be defined by the XML-based business process standard, such as ISO 15022 XML..." On STP, see (1) "swiftML for Business Messages"; (2) "Financial Information Exchange Protocol (FIX)"; (3) "Financial Products Markup Language (FpML)." [alt URL]
[November 08, 2002] "Sun ONE Architecture: Why Care? All the Standards, Technologies, and Products Needed to Support the Pioneering Web Services That Are Being Built Today." By Karen Thure and Frank Lacombe (KnowledgeTree Systems Inc). In Web Services Journal Volume 2 Issue 11 (November 2002), pages 22-25. "A software developer will find it useful to learn the Sun ONE architecture and discover how it can solve integration and interoperability problems. Its associated technologies and products provide a wide range of automated services that significantly lower the costs of building Web applications and clients. This article provides an overview of the architecture... The Sun ONE architecture consists of an integrated stack of standards and technologies in three layers. In the top layer are the elements used to create, assemble, deploy, and test Services on Demand. At the bottom are identity, security, policy, and management, along with hardware/software platform support. The middle of the stack is the center of the Sun ONE architecture: Service Delivery (presentation logic), Service Container (business logic), and Service Integration (back-end data-access logic). A major part of Sun ONE security consists of Identity and Policy Services, which are near the bottom layer of the stack. Interacting with most of the higher-level components, they include these broad categories of services: (1) Identities, roles, and security for users, groups of users, and other system objects; (2) Federated identity systems such as the Liberty Alliance Project; (3) Management services, which include both systems and applications management. Within the Sun ONE architecture, the Sun ONE Identity Server provides an identity system that includes access management, identity administration, and directory services. The Sun ONE Directory Server serves as a central repository for storing and managing identity profiles, access privileges, and application and network resource information. 
Built on top of the Directory Server, the UDDI-based Registry Server lets enterprises register Web services, thus allowing their services and business processes to be identified, described, and integrated on the Internet. Other Sun ONE platform services provide authentication, Web single sign-on, identity and policy management, logging, and audit... For a hands-on introduction to building Web services, many experienced Java technology developers are downloading the Java Web Services Developer Pack (Java WSDP). This contains the Java XML APIs along with a set of ready-to-use tools necessary for building, testing, and deploying Web applications, XML Web applications, and Web services on the Java platform..." [alt URL]
[November 08, 2002] "Web Services Security, Part II. The Key is Unification." By Lakshmi Hanspal (Quadrasis - Hitachi Computer Products). In Web Services Journal Volume 2 Issue 11 (November 2002), pages 18-21. "In the first part of this series (WSJ Volume 2, Issue 10), I discussed traditional approaches to securing Web services and the shortfalls of these approaches, demonstrating the need for a comprehensive, standards-based security framework with a purpose-built solution, such as SOAP Content Inspection, to completely secure Web services. In this article, I further explore the world of federation in B2B Web services. Distributed-component computing allows the sharing of information among enterprises. But enterprise security policies are likely to be different (say, between a hospital and a bank), which means that data sharing requires translations between enterprise policies. We will also discuss how a security framework approach using the Web Services Proxy unifies federated security... Enterprise Application Security Integration (EASI) is a standards-based, vendor-neutral security framework that unifies the patchwork of security products and services deployed within the enterprise. Current and emerging XML Web services security standards include: (1) WS-Security for SOAP message security; (2) SAML (Security Assertion Markup Language) for exchange of user credentials between components; (3) The Liberty Alliance Project for federated network identity and single sign-on and sign-out. A key service of the EASI Framework is the Web Service Proxy (WS Proxy), a flexible, standards-based solution that secures SOAP-based transactions for a range of enterprise B2B applications. The WS Proxy uses services provided by the underlying EASI Framework. Its combination of services supports the three principles of Web services security: [trust no one, enable interoperability, and modularize security]... Web services demand attention to federation of services and security. 
Developing federated security policy support for Web services is an evolving part of EASI. Comprehensive SOAP/XML message security is more than encryption. It requires an application-level security gateway as a flexible, transparent solution that performs message analysis as well as authentication, authorization, and auditing services to protect business-critical Web services. The EASI Framework, along with the WS Proxy, enables organizations to leverage their existing investments and easily develop new business relationships with suppliers, vendors and customers, while ensuring adherence to corporate policies to achieve unified security in federated environments. WS Proxy provides comprehensive inspection of messages (message validation, message integrity, and message-origin authentication). Critical integration with enterprise security services (authentication, authorization, audit, SAML interoperability) and SOAP security ensure end-to-end protection and assurance..." [alt URL]
[November 06, 2002] "Q&A: Dr. Ivan Walks on the Need For Info-Sharing." By Dan Verton. In Computerworld (November 06, 2002). ['Last month, Los Angeles-based E-Team Inc., a developer of collaboration software for emergency management, hired Dr. Ivan Walks, the former chief health officer for Washington, to help the company develop a comprehensive IT-based biodefense program for government and the private sector. Walks directed the response to last year's anthrax attacks, and in an interview with Computerworld, he talked about the challenges of sharing information during terrorist incidents and crises.'] "[with respect to incidents and crises] very tough choices are being made about shoring up existing infrastructure vs. development of new infrastructure. And information-sharing tends to fall in the new infrastructure category. When you look at information-sharing and the practical use of technology, there also needs to be agreement on what the data set looks like. Now we're back to the cultural divide between the first responders and the other folks that need to be involved, such as public health, property managers and the private sector... There is tremendous innovation going on with respect to health technology. One of the things that E-Team has done is to bring together an e-XML [EM-XML] Consortium to promulgate new standards for data-sharing. If you talk to people who do emergency management for a living, they will tell you that by the time you are finished with the first 24 hours of any major disaster, you need to look outside of your own agency for 80% to 90% of their resources and information. The private sector clearly plays a major role, and as such, needs to be a technology partner with the public sector..."
[November 05, 2002] "XQuery: An XML Query Language." By Donald Chamberlin. In IBM Systems Journal Volume 41, Number 4 (2002), pages 597-615 (with 20 references). "The World Wide Web Consortium has convened a working group to design a query language for Extensible Markup Language (XML) data sources. This new query language, called XQuery, is still evolving and has been described in a series of drafts published by the working group. XQuery is a functional language comprised of several kinds of expressions that can be nested and composed with full generality. It is based on the type system of XML Schema and is designed to be compatible with other XML-related standards. This paper explains the need for an XML query language, provides a tutorial overview of XQuery, and includes several examples of its use... XQuery expression-types include path expressions, element constructors, function calls, arithmetic and logical expressions, conditional expressions, quantified expressions, expressions on sequences, and expressions on types. XQuery is defined in terms of a data model based on heterogeneous sequences of nodes and atomic values. An instance of this data model may contain one or more XML documents or fragments of documents. A query provides a mapping from one instance of the data model to another instance of the data model. A query consists of a prolog that establishes the processing environment, and an expression that generates the result of the query. Currently, XQuery is defined only by a series of working drafts, and design of the language is an ongoing activity of the W3C XML Query Working Group. The working group is actively discussing the XQuery type system and how it is mapped to and from the type system of XML Schema. It is also discussing full-text search functions, serialization of query results, error handling, and a number of other issues. 
It is likely that the final XQuery specification will include multiple conformance levels; for example, it may define how static type-checking is done but not require that it be done by every conforming implementation. It is also expected that a subset of XQuery will be designated as XPath Version 2.0 and will be made available for embedding in other languages such as XSLT... Just as XML is emerging as an application-independent format for exchange of information on the Internet, XQuery is designed to serve as an application-independent format for exchange of queries. If XQuery is successful in providing a standard way to retrieve information from XML data sources, it will help XML to realize its potential as a universal information representation..." See: (1) W3C XML Query website; (2) "XML and Query Languages." [cache]
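To make the "query maps one data-model instance to another" idea concrete, here is a rough, hypothetical rendering of a working-draft-style XQuery (shown in comments) using only Python's standard-library ElementTree: a path expression selects nodes from an input instance, and an element constructor builds the output instance. The bibliography data and all names are invented for illustration.

```python
# Hypothetical sketch: emulating an XQuery path expression plus element
# constructor with Python's standard-library ElementTree. Data invented.
import xml.etree.ElementTree as ET

bib = ET.fromstring("""
<bib>
  <book year="1994">
    <title>TCP/IP Illustrated</title>
    <publisher>Addison-Wesley</publisher>
  </book>
  <book year="2000">
    <title>Data on the Web</title>
    <publisher>Morgan Kaufmann</publisher>
  </book>
</bib>""")

# Roughly:  for $b in /bib/book
#           where $b/publisher = "Addison-Wesley"
#           return <result>{ $b/title/text() }</result>
result = ET.Element("results")
for book in bib.findall("book"):
    if book.findtext("publisher") == "Addison-Wesley":
        ET.SubElement(result, "result").text = book.findtext("title")

print(ET.tostring(result, encoding="unicode"))
```

The input and output are both instances of the same tree-shaped data model, which is the essential property the XQuery working drafts formalize.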
[November 05, 2002] "XTABLES: Bridging Relational Technology and XML." By John E. Funderburk, Gerald Kiernan, Jayavel Shanmugasundaram, Eugene Shekita, and Catalina Wei. In IBM Systems Journal Volume 41, Number 4 (2002), pages 616-641 (with 35 references). "XML (Extensible Markup Language) has emerged as the standard data-exchange format for Internet-based business applications. These applications introduce a new set of data management requirements involving XML. However, for the foreseeable future, a significant amount of business data will continue to be stored in relational database systems. Thus, a bridge is needed to satisfy the requirements of these new XML-based applications while still using relational database technology. This paper describes the design and implementation of the XTABLES middleware system, which we believe achieves this goal. In particular, XTABLES provides a general framework to create XML views of relational data, query XML views, and store and query XML documents using a relational database system. Some of the novel features of the XTABLES architecture are that it (1) provides users with a single XML query language for creating and querying XML views of relational data, (2) executes queries efficiently by pushing most computation down to the relational database engine, (3) allows users to query seamlessly over relational data and meta-data, and (4) allows users to write queries that span XML documents and XML views of relational data... XTABLES exposes relational data as an XML view and also allows users to view native XML documents as XML views. Users can then query over these XML views using a general-purpose, declarative XML query language (XQuery), and they can use the same query language to create other XML views. Thus, users of the system always work with a single query language and can query seamlessly across XML views of relational data and XML documents. They can also query relational data and meta-data interchangeably. 
In addition to providing users with a powerful system that is simple to use, the declarative nature of user queries allows XTABLES to perform optimizations, such as view composition and pushing computation down to the underlying relational database system... We believe that the XTABLES system architecture can serve as the foundation for pursuing various avenues of future research. One such area is providing support for emerging XML query language features, such as updates. We have made some initial progress toward this goal by providing the ability to store or 'insert' XML documents into a certain class of XML views. However, the development of a general theory of 'updatable XML views' is an open research problem. Another interesting problem that arises in the context of XML query languages is providing support for information retrieval style queries. These are especially important for querying native XML documents." See also "From Data Management to Information Integration: A Natural Evolution," by Mary Roth and Dan Wolfson (DBTI for e-Business, IBM Silicon Valley Lab, June 2002). [cache]
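A minimal sketch of the push-down idea, assuming a toy orders table: the selection runs inside the relational engine (sqlite3 standing in for a production database), and only qualifying rows are tagged up as an XML view. XTABLES derives such plans from XQuery automatically; the hand-written SQL and all table/element names here are invented.

```python
# Hypothetical sketch of an XML view over relational data: computation
# (here just a WHERE filter) is pushed down to the relational engine,
# and the result surfaces as XML. Table/element names are invented.
import sqlite3
import xml.etree.ElementTree as ET

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, customer TEXT, total REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(1, "Acme", 120.0), (2, "Globex", 75.5), (3, "Acme", 310.0)])

# The filter executes in SQL, not in the middleware...
rows = conn.execute("SELECT id, total FROM orders WHERE customer = ?",
                    ("Acme",)).fetchall()

# ...and the qualifying rows are exposed to the application as XML.
view = ET.Element("orders", customer="Acme")
for oid, total in rows:
    ET.SubElement(view, "order", id=str(oid), total=str(total))

print(ET.tostring(view, encoding="unicode"))
```

Doing the filtering in the engine rather than in middleware is what lets this style of system scale to large relational datasets.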
[November 05, 2002] "The XPointer content-type() Scheme." By Simon St.Laurent (O'Reilly & Associates); [WWW]. IETF Network Working Group, Internet-Draft. Reference: 'draft-stlaurent-content-type-frag-00.txt'. October 28, 2002, expires April 28, 2003. "This document specifies a content-type() scheme for use in XPointer-based fragment identifiers. This scheme, like other XPointer Framework schemes, is designed primarily for use with the XML Media Types defined in RFC 3023, to identify locations within a given XML representation of a resource, though it may potentially be used to mix schemes for XML and non-XML representations. The content-type() scheme notifies an XPointer processor whether the creator of the XPointer intended for a particular pointer part to apply to a resource representation which uses a particular MIME content type identifier..." See also the updated "XPointer xpath1() Scheme."
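For a sense of the syntax, a fragment identifier using the proposed scheme might look like the following; this is a sketch based on the draft's description (with the companion xpath1() scheme), and the URI and names are invented:

```
http://example.com/report#content-type(application/xml)xpath1(/report/section[2])
```

Here the content-type() pointer part would tell the processor that the xpath1() part that follows was written against an application/xml representation of the resource.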
[November 05, 2002] "The XPointer xmlns-local() Scheme." By Simon St.Laurent (O'Reilly & Associates); [WWW]. IETF Network Working Group, Internet-Draft. Reference: 'draft-stlaurent-xmlns-local-frag-00.txt'. October 28, 2002, expires April 28, 2003. "This document specifies an xmlns-local() scheme for use in XPointer-based fragment identifiers. This scheme, like other XPointer Framework schemes, is designed primarily for use with the XML Media Types defined in RFC 3023, to identify locations within a given XML representation of a resource. The xmlns-local() scheme notifies an XPointer processor that it should include all of the namespace binding context defined for the element containing the XPointer in the namespace binding context for the XPointer... Justification: The XPointer Framework permits the creation of namespace binding contexts to be used with later parts of the XPointer, but the xmlns() scheme used to support those contexts combines some pointer part overhead with already verbose URIs to create fragment identifiers that can easily be longer than the rest of the URI reference. In situations where multiple namespaces are needed to identify components in an XPointer, the result can be grotesque if not simply unreadable. This specification allows XPointer to take advantage of a feature (some would say mistake) of the Namespaces in XML specification: an understanding of an inherited namespace context. A set of bindings between namespace URIs and prefixes can be defined for any given element in a document, using the namespace declarations in that element and in all of its parents. By reusing the existing namespace context, XPointers can be much smaller than is currently possible..."
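To see the verbosity the draft is addressing, compare a fragment identifier written with the existing xmlns() scheme against the compact form xmlns-local() would permit when the containing element already declares the needed binding. The pointer-part syntax is a sketch based on the XPointer Framework drafts, and the namespace URI and element names are invented:

```
#xmlns(po=http://example.com/2002/schemas/purchase-order)xpath1(/po:order/po:item[2])

#xmlns-local()xpath1(/po:order/po:item[2])
```

In the second form, the po prefix is resolved from the namespace context inherited from the element containing the XPointer, rather than being redeclared inside the fragment identifier.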
[November 05, 2002] "UDDI Spec Needs What LDAP Has." By Anne Chen. In eWEEK (November 04, 2002). "You may think LDAP was yesterday's news, but the spotlight is hitting it again as vendors work to tie it to the UDDI standard. Developed in 2000 by Ariba Inc., IBM, Microsoft Corp., Intel Corp. and SAP AG, the Universal Description, Discovery and Integration specification describes Web-based services and the interface required so that businesses and business applications can interact across the Internet. Much work is being done to tie LDAP to UDDI, especially since LDAP's strengths -- which include maturity, reliability, scalability and security -- are key attributes lacking in current Web services standards, experts say. On the standards front, Novell Inc. issued a submission to the Internet Engineering Task Force aiming to formalize the role of LDAP in Web services by extending the schema of LDAP and outlining how to represent UDDI data in an LDAP directory. On the product front, Sun Microsystems Inc.'s Sun Open Net Environment, or Sun ONE, Registry Server will allow IT managers to store UDDI repositories in an LDAP directory. Novell also has plans to deliver a UDDI server product as part of its next-generation eDirectory platform sometime next year, said Alan Nugent, Novell's chief technology officer, in Cambridge, Mass. ... Although few will argue that UDDI is the next big thing, not everyone sees LDAP as the key to UDDI's success. Industry heavyweights such as IBM and Microsoft, for example, both have a major implementation of UDDI running on top of their respective relational databases. IBM's WebSphere UDDI Registry relies on the company's DB2 database, while Microsoft's integration of a UDDI server into its .Net server software relies on SQL Server..."
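A hypothetical LDIF sketch of what "representing UDDI data in an LDAP directory" could look like; the objectClass and attribute names are illustrative only and are not quoted from the Novell submission:

```
# Hypothetical LDIF entry for a UDDI businessEntity in an LDAP directory.
# All names, the DN layout, and the key value are invented.
dn: uddiBusinessKey=ba744ed0-3aaf-11d5-80dc-002035229c64,ou=uddi,o=example
objectClass: uddiBusinessEntity
uddiBusinessKey: ba744ed0-3aaf-11d5-80dc-002035229c64
uddiName: Acme Widgets
uddiDescription: Supplier of widgets and widget Web services
```

Mapping each UDDI structure to an LDAP entry like this is what would let registries inherit LDAP's replication, access control, and scalability.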
[November 05, 2002] "Java Execs Eye This Week's J2EE 1.4 Beta Release." By Vance McCarthy. From Integration Developer News (November 04, 2002). ['This week, Sun makes the beta release of the J2EE 1.4 specification available for download. While the core spec sports many enhancements, the key to this release is Java's support for web services standards SOAP, XML and WSDL. With this version, Java developers should get ready to be looking at their development projects in a new way, say Java professionals -- both from Sun and at a variety of independent ISVs and consulting firms... To give developers a sense for how J2EE 1.4 will shape the future of Java developers in a web services world, Integration Developer News spoke in depth with two of Sun's leading voices on J2EE: George Grigoryev, J2EE senior product manager, and Glen Martin, senior marketing manager for J2EE and web services.'] "... We will be releasing the J2EE 1.4 beta this week. Anybody can get it at http://java.sun.com/j2ee. Our goal is that there will be one beta. The FCS [First Customer Ship] will come in Q1 next year, when tests are complete. There are 25,000 tests for 1.4, compared to 15,000 for Java 1.3. The FCS includes a binary of the apps [server?] code and a text of the spec. The folks that download will develop little things and try them out, as well as read and review the documentation of the spec... The purpose of the web services developer pack (WSDP) now being added to J2EE 1.4 is to bring Java developers to J2EE. In J2EE 1.4, the web service developer pack will have [standard] portability. There are a lot of vendors implementing web services now [in advance of J2EE 1.4], but we need a way to retain the portability of those approaches across platforms. The Java Web Services Developer Pack is designed as an SDK for developers to try technologies, and it is designed to deliver APIs. We're not in a position to be driving protocols. Are we going to come up with a JAX-RPC before SOAP? Of course not..."
[November 05, 2002] "Web Services on Wall Street -- Inside STP." By Gunjan Samtani. From Integration Developer News (November 04, 2002). Case Study. "Straight-through Processing (STP), a solution that automates the end-to-end processing of transactions for all financial instruments from initiation to resolution, is set to revolutionize the financial industry. STP will streamline back-office activities, leading to reduced failures, lower risks, and significantly lower costs per transaction. It encompasses a set of applications, business processes, and standards that will redefine the settlement and processing paradigm within the capital markets industry... A SOA-based framework is capable of providing support for multiple XML standards, such as ISO 15022 [see ISO Working Group 10 - ISO WG 10] and FpML, at the same time, and adding additional standards support without significant redevelopment effort... With the use of Web Services as an enabling technology, STP-related problems and issues will shift from connectivity among different applications in-house and with trading partner applications to the content and structure of the information that is exchanged. The analogy here will be that Web Services will define the standard postal mechanism along with the envelope and addressing format for exchanging letters. What is inside the envelope (the content of the letter) will be defined by the XML-based business process standard, such as ISO 15022 XML..." See: (1) "swiftML for Business Messages"; (2) "Financial Information Exchange Protocol (FIX)"; (3) "Financial Products Markup Language (FpML)."
[November 05, 2002] "YAML Ain't Markup Language (YAML) 1.0." Working Draft 31-October-2002. Edited by Oren Ben-Kiki, Clark Evans, and Brian Ingerson. Version URL: http://yaml.org/spec/31oct2002.html. "YAML (rhymes with 'camel') is a straightforward machine-parsable data serialization format designed for human readability and interaction with scripting languages such as Perl and Python. YAML is designed for data serialization, formatted dumping, configuration files, log files, Internet messaging and filtering. This specification describes the YAML information model and serialization format. Together with the Unicode standard for characters, it provides all the information necessary to understand YAML Version 1.0 and to construct programs to process YAML information. YAML document streams encode in a textual form the native data constructs of modern scripting languages. Strings, arrays, hashes, and other user-defined data types are supported. A YAML document stream consists of a sequence of characters, some of which are considered part of the document's content, and others that are used to indicate document structure. YAML information can be viewed in two primary ways, for machine processing and for human presentation. A YAML processor is a tool for converting information between these complementary views. It is assumed that a YAML processor does its work on behalf of another module, called an application. This specification describes the required behavior of a YAML processor. It describes how a YAML processor must read or write YAML document streams and the information structures it must provide to or obtain from the application. This specification is a working draft and reflects consensus reached by the members of the yaml-core mailing list. Any questions regarding this draft should be raised on this list. With this release of the YAML specification, we now encourage development of YAML processors, so that the design of YAML can be validated. 
The specification is still subject to change; however, such changes will be limited to polish and fixing any logical flaws and bugs. Therefore, this is last call for changes; if you have a pet feature now is the very last time that it can be proposed before Release Candidate status."
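A small illustrative YAML stream (invented data, following the 1.0 working-draft syntax) shows the structures the specification names: indentation expresses nesting, '-' introduces sequence entries, 'key: value' pairs form mappings, and '>' introduces a folded multi-line scalar.

```yaml
invoice: 34843
date: 2002-11-05
bill-to:
  given: Chris
  family: Dumars
items:
  - part: A4786
    quantity: 4
  - part: E1628
    quantity: 1
comments: >
  Late afternoon delivery
  is preferred.
```

Note that the only structural indicators are indentation and a handful of punctuation characters, which is the readability argument the specification makes.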
[November 05, 2002] "XML Matters: YAML Improves on XML. YAML Ain't Markup Language." By David Mertz, Ph.D. (Alternator, Gnosis Software, Inc). From IBM developerWorks, XML zone. October 2002. ['In this article, David introduces you to YAML, a data serialization format that can be easily read by humans and is well-suited to encoding the data types used in dynamic programming languages. In contrast to XML, YAML uses clean and very minimal structural indicators, relying largely on indentation of nested elements. More important, for many tasks, than the superior syntax of YAML is the far better semantic fit between YAML and 'natural' data structures.'] "Although it is no less general than XML, YAML is a great deal simpler to read, edit, modify, and produce than XML. That is, anything you can represent in XML, you can represent (almost always more compactly) in YAML. Namespaces represent one case in which some special pleading is necessary -- you can 'bolt them on' to YAML, but they are not part of the current spec... XML tries to be a document format, a data format, a message packet format, a secure RPC channel (SOAP), and an object database. Moreover, XML piles on APIs for every style of access and manipulation: DOM, SAX, XSLT, XPATH, JDOM, and dozens of less common interface layers; I have contributed a few of my own in the gnosis.xml.pickle, gnosis.xml.objectify and gnosis.xml.validity packages. The remarkable thing is that XML does all these things; the disappointing part is that it does none of them particularly well. YAML focuses more narrowly on cleanly representing the data structures and data types you encounter in dynamic programming languages like Perl, Python, Ruby, and to a lesser extent, Java programming. Bindings/libraries currently exist for those languages. A number of other languages have data models that will play nice with YAML, but no one has written libraries yet. These include Lisp/Scheme, Rebol, Smalltalk, xBase, and AWK. 
Less dynamic languages would not fit as well with YAML. The syntax of YAML combines the contextual typing of Perl, the indentation structure of Python, and a few conventions from MIME standards. In much the same way that Python is sometimes praised as executable pseudocode, YAML's concrete syntax comes very close to what you might use to informally explain a data structure to a class or work group..."
[November 05, 2002] "OASIS Steps Up Security Agenda." By Brian Fonseca. In InfoWorld (November 04, 2002). "OASIS is on tap to execute two high-profile moves this week that should bolster the standards consortium's growing influence within the nascent Web services security realm. On Monday [2002-11-04], OASIS announced that it has expanded its organization to include the PKI Forum as its newest Member Section. In addition, OASIS could officially ratify the first version of SAML (Security Assertion Markup Language) as early as Wednesday, accelerating adoption and cross-industry use of the authentication and authorization protocol, according to OASIS officials. The marriage between OASIS and the three-year-old PKI Forum security advocacy group will allow OASIS to concentrate future development into the use of PKI as a vital and trusted cog to enable secure e-business transactions involving Web services applications, said Patrick Gannon, president and CEO of OASIS. 'We think by OASIS providing a home [for PKI Forum] it will increase confidence for organizations and companies in the deployment of PKI,' said Gannon. 'It will provide a way for people to view a more seamless adoption of PKI infrastructure and how that fits within the expanding e-business and Web services world.' Dogged by complexity, integration difficulties, and user apathy, PKI -- and vendors such as Entrust, RSA, and Baltimore Technologies that have championed the technology -- have discovered the buyer market to be unkind thus far. However, security experts see future promise for PKI in the assertion signing and management challenges Web services will pose. 'Most of these XML-based security protocols being developed [for Web services] talk about encryption, signing [and] assertions,' said Gerry Gabel, analyst at The Burton Group, in Salt Lake City. 'This cries out for key management and there may be a role for PKI to step out and actually provide value there... 
However, Gabel said it remains to be seen if the security provider community can pull off the Herculean task of making customers forget about the failed history or shelf-ware remnants of PKI. To find success, he notes, PKI must be woven into the background of customers' security operations, where they wouldn't have to install a certificate authority into a directory, create a certificate authority, or install any form of client software, but rather have it bundled in with a larger security or application offer. According to Gannon, members of the PKI Forum gain OASIS membership status and are eligible to contribute to technical work being done within the standards consortium. In turn, OASIS members can actively participate in PKI committee work..." See: (1) 2002-11-05 news item "PKI Forum Continues Security Advocacy as an OASIS PKI Member Section"; (2) "XML and Security."
[November 05, 2002] "OASIS Adds PKI Forum to Security Arsenal." By Clint Boulton. From InternetNews.com (November 4, 2002). "With security at a premium -- especially for Web services -- e-business standards group OASIS moved to bolster its ability to create safer specifications Monday when it added the PKI Forum as its newest member section. Vital for ensuring secure transactions on the Internet, PKI, short for public key infrastructure, is a system of digital certificates, certificate authorities, and other registration authorities that verify and authenticate the validity of each party involved in a Web transaction. They essentially query a user about his or her identity, and serve as gatekeepers that monitor e-commerce exchange. The PKI Forum is the main group responsible for aggregating PKI proprietors, end users and developers, and will continue in that capacity under the aegis of OASIS... OASIS President and CEO Patrick Gannon told internetnews.com aligning PKI alongside other Web services and e-business security standards at OASIS 'makes it easier for every company with a serious stake in the security agenda to be represented and involved in this work'..." See: (1) 2002-11-05 news item "PKI Forum Continues Security Advocacy as an OASIS PKI Member Section"; (2) "XML and Security."
[November 05, 2002] "Web Services Security (WSS) Core Specification." Draft (work in progress) version 03. November 3, 2002. Reference: WSS-Core-03-1103. 46 pages. Posted 2002-11-04 to the OASIS Web Services Security TC mailing list by Anthony Nadalin. See also the initial draft versions of XML schemas for the WSS Core, posted by Chris Kaler. "This specification describes enhancements to SOAP messaging to provide quality of protection through message integrity, message confidentiality, and single message authentication. These mechanisms can be used to accommodate a wide variety of security models and encryption technologies. This specification also provides a general-purpose mechanism for associating security tokens with messages. No specific type of security token is required; it is designed to be extensible (e.g., support multiple security token formats). For example, a client might provide one format for proof of identity and provide another format for proof that they have a particular business certification. Additionally, this specification describes how to encode binary security tokens, a framework for XML-based tokens, and describes how to include opaque encrypted keys. It also includes extensibility mechanisms that can be used to further describe the characteristics of the tokens that are included with a message." XML Schemas: utility.xsd, secext.xsd. The TC website references other WSS documents. See: "Web Services Security Specification (WS-Security)."
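An illustrative SOAP envelope carrying a security token in a header block shows the general shape the draft describes; the element names follow the working drafts, but the namespace URI and credential values here are placeholders, not normative text:

```xml
<SOAP-ENV:Envelope
    xmlns:SOAP-ENV="http://schemas.xmlsoap.org/soap/envelope/">
  <SOAP-ENV:Header>
    <!-- namespace URI below is illustrative of the draft-era value -->
    <wsse:Security
        xmlns:wsse="http://schemas.xmlsoap.org/ws/2002/07/secext">
      <wsse:UsernameToken>
        <wsse:Username>zoe</wsse:Username>
        <wsse:Password>placeholder</wsse:Password>
      </wsse:UsernameToken>
    </wsse:Security>
  </SOAP-ENV:Header>
  <SOAP-ENV:Body>
    <!-- application payload; may be signed and/or encrypted -->
  </SOAP-ENV:Body>
</SOAP-ENV:Envelope>
```

Because the Security header is just another SOAP header block, other token formats (binary tokens, XML-based tokens such as SAML assertions) can be carried in the same place, which is the extensibility point the specification emphasizes.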
[November 05, 2002] "OASIS-LISA Global e-Biz Survey." By Patrick Gannon (OASIS President and CEO). Summary of results from the OASIS-LISA Global eBusiness Survey. Source posted to the OASIS XML Localization Interchange File Format TC mailing list by Jonathan Clark. Slides presented at the XLIFF TC face-to-face meeting November 4, 2002. The subject of the survey was Global eBusiness and Web Services requirements, with particular reference to Language Processing Standards. The survey, designed to "determine the impact of multi-lingual technologies on global e-business," is the latest product of cooperation between two international organizations developing complementary standards for localization. "The survey is structured in two parts: The Business Process section is for managers responsible for international product, services, sales or support operations; the Localization Expert section is designed for product, web services, development and standards professionals." See the announcement: "OASIS and LISA Collaborate on Global e-Business Survey. New Study to Determine Impact of Multi-Lingual Technologies." See "XML Localization Interchange File Format (XLIFF)." [source .PPT]
[November 05, 2002] "Look at Storage Issues Before You Leap Into XML." By Kevin Dick (Kevin Dick Associates). In Application Development Trends Volume 9, Number 11 (November 2002), pages 45-49. Adapted from Chapter 5 of XML: A Manager's Guide, Second Edition, by Kevin Dick, Addison-Wesley. ['Organizations can avoid missteps by first selecting the right storage model for a project: a DBMS, a content management system or a native XML store.'] "XML documents are data that can be either at rest or in transit. Therefore, enterprises that want to successfully deploy XML must figure out how to manage XML in both of these states. For XML at rest, developers must first decide on the type of store to use. For XML in transit, they must first decide on the server infrastructure to deploy. It is not uncommon for projects using XML to stall while figuring out how to address the storage issue. The confusion stems from the fact that there are three vastly different choices: a database management system (DBMS), a content management system (CMS), or a native XML store. The appropriate choice depends on the characteristics of your XML data. What if you use XML as a data interchange format? In this case, a source application encodes data from its own native format as XML, and a target application decodes the XML data into its own native format. XML is an intermediate data representation. Both the source and target applications already have persistent storage mechanisms, almost certainly DBMSs of one sort or another. There is really no need to store the XML documents persistently themselves, except perhaps for logging purposes..." See: "XML and Databases."
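The interchange case can be sketched in a few lines of Python (standard library only, field names invented): the source application encodes native data as XML, the target decodes it back, and the XML itself is transient, so no XML store is required at all.

```python
# Hypothetical sketch of XML as a pure interchange format: the source
# application encodes native data as XML, the target decodes it back,
# and the XML in transit need not be persisted. Field names invented.
import xml.etree.ElementTree as ET

def encode(record):    # source application: native dict -> XML text
    elem = ET.Element("customer")
    for key, value in record.items():
        ET.SubElement(elem, key).text = value
    return ET.tostring(elem, encoding="unicode")

def decode(xml_text):  # target application: XML text -> native dict
    return {child.tag: child.text for child in ET.fromstring(xml_text)}

record = {"name": "Acme Corp", "region": "NY"}
wire = encode(record)           # what travels between applications
assert decode(wire) == record   # round-trip; persisting it is optional
```

When the XML must instead live at rest (document repositories, content fragments), that is where the DBMS / CMS / native-XML-store decision the article describes comes into play.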
[November 05, 2002] "Structuring XML Documents." By Richard V. Dragan. In PC Magazine (November 5, 2002). "As XML standards and tools mature, the traditional way of designing new XML document types, using Document Type Definitions (DTDs), is giving way to something new and more powerful: XML Schemas. (Version 1.0 of the XML Schema specification was approved by the W3C in May 2001; Version 1.1 is currently in the works.) Certainly, no one expects DTDs to disappear any time soon, and there are good reasons to use both standards for the foreseeable future. But even if you don't design new document types yourself, you will want to be able to make sense of this new standard in order to work with today's new XML-based standards, like Web services, that increasingly make use of XML Schemas. The Document Type Definition is original equipment in XML. Based on the primordial Standard Generalized Markup Language (SGML), a DTD lets designers define the structure of a class of documents by specifying the number, kind, and order of elements and attributes in XML fields or entities; these specifications become the rules for creating a valid document based on the DTD... It should be noted that there are alternatives to defining XML documents beyond XML Schema (for example Schematron and Relax NG). But XML Schema has gotten a lot of traction as an official W3C standard. Equally important, the proliferation of XML Web services very probably means an XML solution to designing documents has a good chance of taking off. By allowing separate servers -- regardless of platform -- to communicate across the Internet in XML, Web services offer a potentially better way to make distributed computing happen. Two XML-based standards, Web Services Description Language (WSDL) and Simple Object Access Protocol (SOAP), are an integral part of Web services. 
A WSDL document describes the functions that can be called within a particular Web service, while SOAP packages function calls in XML, sending them to a remote server, which then sends an answer back using more XML. Naturally, each side of the conversation requires building a new document type, and the documents are based on an XML Schema, not on DTDs. Web services tools need to generate new XML data types to manage the conversation between servers, and it's only natural that they speak in XML. Whether you are working with content-based systems, getting XML documents from different organizations or divisions to coexist peacefully, or building Web services, XML Schema has a big role to play. Though mastering the technology is more complicated than that used with old-fashioned DTDs, XML Schemas can offer a more powerful and extensible way to define documents, and they are invaluable for applications that interact using XML..."
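To see what the shift from DTDs to XML Schemas looks like in practice, compare the same content model in both notations (element names invented for illustration): a book with one title and one or more authors.

```xml
<!-- DTD version -->
<!ELEMENT book (title, author+)>
<!ELEMENT title (#PCDATA)>
<!ELEMENT author (#PCDATA)>

<!-- Roughly equivalent XML Schema version -->
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:element name="book">
    <xs:complexType>
      <xs:sequence>
        <xs:element name="title" type="xs:string"/>
        <xs:element name="author" type="xs:string" maxOccurs="unbounded"/>
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>
```

The schema version is more verbose, but because it is itself XML and supports datatypes (xs:string here could be xs:date, xs:decimal, and so on), it can express constraints DTDs cannot, which is precisely why WSDL and SOAP tooling build on it.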
[November 04, 2002] "BEA Pours Liquid Data." By Paul Krill. In InfoWorld (November 04, 2002). "BEA Systems on Monday will unveil Liquid Data for WebLogic, a product that provides real-time access to data aggregated from multiple sources. Functioning with the company's WebLogic Server 7.0 application server, the product fits in with BEA's intentions to become a leader in enterprise integration, according to company officials... Inner workings of Liquid Data include using XML as its language for describing relationships between data, and use of XQuery as a query language, said Ajey Patel, senior director of product management in the new technologies division at BEA. To assist with accessing information from systems such as mainframes, BEA offers adapters for system connectivity. Liquid Data utilizes WebLogic Server's security, scaling, clustering, and database connection pooling features... Liquid Data provides an abstraction layer on top of data sources, according to BEA... Initially, Liquid Data can connect to database applications, Web services and XML files and access structured and unstructured data. Within the next six months, BEA expects to expand the product to include access to unstructured content from content management systems from vendors such as Documentum and Interwoven. A typical beta project with Liquid Data has taken about two to three weeks for integrating three to five data sources..." See the announcement: "New Software from BEA Greatly Simplifies Getting Fast Answers To Complex Business Questions. Available Immediately, BEA Liquid Data for WebLogic Integrates Real-Time Information Across Enterprise Applications and Data Sources."
[November 04, 2002] "XML Offshoot Aims For Accuracy. Stock Exchange Gathers Industry Insight into VRXML Initiative To Simplify Vendor Reporting." By Cristina McEachern. In InformationWeek (November 04, 2002). "The New York Stock Exchange has said it will use the Vendor Reporting Extensible Markup Language in an effort to simplify its vendor-reporting requirements. The language was created to help improve the accuracy of reporting for stock exchanges, vendors that resell market data such as stock prices and volume, and financial-services companies. The NYSE is the first exchange to set forth an XML-based standard for reporting, and it envisions extending VRXML into other areas as well. Under the current Vendor Automated Reporting System, market-data vendors send data files of their reporting requirements to TCB Data Systems, an NYSE spin-off, which processes and extracts the data, then passes the files along to the stock exchange. In general, vendor-reporting requirements include information such as customer location, product identification, and ZIP codes for state-tax calculations. As a direct-bill exchange, the NYSE also invoices the customer -- typically brokers, asset managers, or business publications -- using the data provided by the vendor... The VRXML initiative looks to simplify the vendor-reporting process by letting vendors report directly to the exchange or continue to report through the Vendor Automated Reporting System, which would convert the files into the VRXML format and pass them along. 'The NYSE is saying you need to report to us in our format, and our format is VRXML,' says Mike Atkin, VP of the software association's financial information services division. 'The exchange doesn't really care how they do that, but that will be the obligation.' He adds, though, that the stock exchange is being flexible in the transition to the VRXML format. 
Everybody wants accuracy to 'make it easy to meet the obligations, so the NYSE is working with the industry to make sure they can adopt it and what the transition might be,' he says... The financial information services division of the software association has sought industry feedback to make sure the draft version of the VRXML format meets user requirements and to determine if it can be extended for internal inventory management, what kind of conversion-process problems are possible, and what areas might need some work..." See: "Vendor Reporting Extensible Markup Language (VRXML)."
[November 04, 2002] "IM Compatibility Closer to Reality." By Robert Lemos. In CNet News.com (November 01, 2002). "The Internet's governing technical body quietly gave its stamp of approval Thursday to a group intent on creating an open standard for instant messaging. The Internet Engineering Task Force (IETF), the group that sets the technical standards for the Internet, gave the go-ahead to the creators of open-source instant-messaging application Jabber to create a working group based on that technology. Such working groups plan the specific implementations of the technologies that make up the Internet... Called the Extensible Messaging and Presence Protocol (XMPP), the group's instant messaging standard gives Internet users hope of one day being able to send messages to anyone on the Net, no matter what software they are using. The group is also charged with adding security -- including authentication, privacy and access control -- to instant messaging, according to the group's charter. Currently, AOL Time Warner, Yahoo and Microsoft divide nearly all instant messenger users among them. Yahoo and Microsoft, as well as smaller IM clients, have in the past called on America Online to open its instant messaging system to rivals... Calls for interoperability have quieted among consumers, but business users have become more earnest in their exhortations for a single standard. In October, seven brokerage firms formally announced the Financial Services Instant Messaging Association to promote standards in the instant messenger industry... The new working group could have some competition from IBM and Microsoft, which have promoted a separate standard known as SIMPLE (SIP for Instant Messaging and Presence Leveraging Extensions) based on SIP or the Session Initiation Protocol. SIP is a way of signaling applications on the Internet to enable conferencing, telephony, presence, events notification and instant messaging.
The IETF approved SIMPLE as a proposed standard in September. That technology is being developed by the Session Initiation Protocol Working Group..." See details in the 2002-11-04 news item: "IETF Charters Extensible Messaging and Presence Protocol (XMPP) Working Group", and general references in "Extensible Messaging and Presence Protocol (XMPP)."
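XMPP's wire format is streaming XML: a chat message travels as a small XML "stanza". A minimal sketch of a Jabber-style message stanza, built with Python's standard library (the addresses are hypothetical examples, not real accounts):

```python
# Build a minimal Jabber/XMPP-style <message/> stanza.
# Addresses and body text are illustrative only.
import xml.etree.ElementTree as ET

message = ET.Element("message", {
    "to": "romeo@example.net",
    "from": "juliet@example.com",
    "type": "chat",
})
body = ET.SubElement(message, "body")
body.text = "Art thou not Romeo, and a Montague?"

stanza = ET.tostring(message, encoding="unicode")
print(stanza)
```

In a real XMPP session this stanza would be sent inside a long-lived `<stream:stream>` element over a TCP connection, which is what makes the protocol "streaming XML" rather than document-at-a-time exchange.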
[November 02, 2002] "WSIA - WSRP Core Specification." OASIS Working Draft 0.8. 21-October-2002. Draft "early version" of the public specification. Document identifier: Draft-WSIA-WSRP_Core_Interface-v0.8. 83 pages. From the OASIS WSIA and WSRP technical committees. Edited by Alan Kropp (Epicentric, Inc.), Carsten Leue (IBM Corporation), and Rich Thompson (IBM Corporation). Contributors include: Chris Braun (Novell), Jeff Broberg (Novell), Mark Cassidy (Netegrity), Michael Freedman (Oracle Corporation), Timothy N. Jones (CrossWeave), Thomas Schaeck (IBM Corporation), and Gil Tayar (WebCollage). "Integration of remote content and application logic into an End-User presentation has been a task requiring significant custom programming effort. Typically, vendors of aggregating applications, such as a portal, had to write special adapters for applications and content providers to accommodate the variety of different interfaces and protocols those providers used. The goal of this specification is to enable an application designer or administrator to pick from a rich choice of compliant remote content and application providers, and integrate them with just a few mouse clicks and no programming effort. This specification is a joint effort of two OASIS technical committees. Web Services for Interactive Applications (WSIA) and Web Services for Remote Portals (WSRP) aim to simplify the integration effort through a standard set of web service interfaces allowing integrating applications to quickly exploit new services as they become available. The joint authoring of these interfaces by WSRP and WSIA allows maximum reuse of user facing, interactive web services while allowing the consuming applications to access a much richer set of standardized web services. This joint standard layers on top of the existing web services stack, utilizing existing web services standards and will leverage the emerging web services standards (such as security) as they become available. 
The interfaces are defined using the Web Services Description Language (WSDL)..." [Source: see the posting]
[November 02, 2002] "Building Interactive Web Services with WSIA and WSRP." By Eilon Reshef (WebCollage). Draft article posted to the OASIS WSRP-WSIA list. 5 pages. ['Draft of document to be a WSJ feature. Provides background on the upcoming WSIA/WSRP standards. The article was written prior to the formal closing of the spec, so one may eventually expect minor discrepancies between the two, although the article does not go into the level of detail in which current issues are being debated. I tried to present the topics as objectively as possible, although naturally also to present my own views on the protocols and how they fit into what organizations are doing today.' - author's note in post] "Web Services for Interactive Applications (WSIA) and Web Services for Remote Portals (WSRP) are standards for user-facing, presentation-oriented, interactive Web services intended to simplify the creation of distributed interactive applications. With WSIA and WSRP, Web services include presentation and multi-page, multistep user interaction. This lets service users plug them into sites and portals without custom development and to leverage new functionality available in future versions of the service without the need for additional custom development... WSIA and WSRP stem from parallel efforts initiated in the middle of 2001. Portal vendor Epicentric spearheaded the Web Services User Interface (WSUI) initiative to address the lack of a standard user-interface layer as part of the Web services stack. IBM initiated its own effort: Web Service eXperience Language (WSXL). WebCollage, a software vendor providing a platform for integrating interactive Web applications, wanted to standardize its customer implementations based on an initiative called Interactive Web Services. 
In parallel, many portal vendors recognized the need to address the same problem in the context of portal toolkits: how to quickly plug remote interactive services (called "portlets") into a portal without custom programming for each remote service. The efforts were consolidated into two working groups under the umbrella of OASIS, the organization behind ebXML and other XML-related standards. WSIA focuses on the framework for creating interactive Web services, and WSRP defines the interfaces to include portal-specific features. Today the working groups include more than 30 industry-leading vendors from different industry segments: application server vendors (BEA, IBM, Oracle, Novell, Sun), pure-play portals (Epicentric, Plumtree), interactive application integration vendors (WebCollage, Kinzan, Citrix) and enterprise application providers (Peoplesoft, SAP). Version 1.0 of the specifications includes the interfaces common to the two groups, and is currently in a review process... WSIA and WSRP define a set of APIs that allow developers to produce and consume remote interactive Web services. They define three types of actors: (1) Producer: Represents the service provider hosting the remote interactive Web service (for example, weather.com as a weather service provider). (2) Consumer: Represents the entity integrating the remote service into its Web application, oftentimes using a portal toolkit (for example, Yahoo Weather or a corporate portal). (3) End User: Represents the person that comes to the Consumer Web site to use the Producer's application in the Consumer's context. In a nutshell, WSIA and WSRP fulfill the following roles: (1) Define the notion of valid markup fragments based on the existing markup languages such as HTML, XHTML, VoiceXML, cHTML, etc. (2) Provide a set of standard calls to enable a Consumer to request these markup fragments from a Producer based on the existing state. 
(3) Supply a set of calls that support the concept of multi-step user interaction and preservation of state across those calls. There are four central parts to the WSIA and WSRP APIs: (1) Retrieving markup fragments (encapsulated in the getMarkup() call); (2) Managing state; (3) Handling user interaction (encapsulated in the performInteraction() call); (4) Supporting customization and configuration..." Source: see the posting and mail archives URL.
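The Producer/Consumer/End-User roles and the getMarkup()/performInteraction() calls described above can be caricatured in a few lines of Python. This is a toy sketch only: the class names, session handling, and markup are invented stand-ins, while the real interfaces are SOAP operations defined in the WSRP/WSIA WSDL:

```python
# Toy sketch of the WSRP/WSIA actor model. All names are illustrative.

class WeatherProducer:
    """Hypothetical Producer: hosts a user-facing, presentation-oriented service."""

    def __init__(self):
        self.sessions = {}  # session-id -> navigational state

    def get_markup(self, session_id):
        # Analogue of getMarkup(): return a valid markup fragment
        # (here HTML) reflecting the current state of the interaction.
        city = self.sessions.get(session_id, {}).get("city", "unknown")
        return f'<div class="portlet">Weather for {city}</div>'

    def perform_interaction(self, session_id, form_data):
        # Analogue of performInteraction(): apply user input and
        # preserve state across the multi-step interaction.
        self.sessions.setdefault(session_id, {}).update(form_data)


class PortalConsumer:
    """Hypothetical Consumer: aggregates Producer fragments into its own page."""

    def __init__(self, producer):
        self.producer = producer

    def render_page(self, session_id):
        fragment = self.producer.get_markup(session_id)
        return f"<html><body>{fragment}</body></html>"


producer = WeatherProducer()
portal = PortalConsumer(producer)
producer.perform_interaction("s1", {"city": "Boston"})  # End User submits a form
page = portal.render_page("s1")
print(page)
```

The point of the standard is that the Consumer needs no service-specific adapter code: any compliant Producer can be plugged in because the markup-retrieval and interaction calls are uniform.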
[November 02, 2002] "Extending Groove." By Jon Udell. In InfoWorld (November 01, 2002). "The first release of GWS (Groove Web Services), due later this year, will wrap SOAP/WSDL interfaces around the core elements of the Groove architecture: accounts, identities, contacts, shared-space membership, and presence. It will also encapsulate the most common tools used in shared spaces: Discussion, Files, and Calendar. To export access to these Web services, a SOAP server runs alongside the Groove client. A SOAP client running on the same machine can hit these services directly. A remote client will reach them through a gateway that, like Groove's existing relay server, provides queuing and through-the-firewall connectivity. In the first release, the Groove security model will not extend to SOAP clients. Remote use of GWS will therefore be considered a developers' preview. Phase two of the project, due next year, aims to leverage the WS-Security framework to authenticate users and devices, and secure the SOAP traffic. That's also when SOAP interfaces will be provided for Groove's messaging subsystem and for more tools, including Projects, Meetings, and Forms... the first and best use of GWS will be to empower .Net programmers or Java programmers, as well as scripters using Perl, Python, and Ruby, to interact with shared spaces, combine them with other local and remote data, display Groove information to the local user, and push it out to destinations such as enterprise portals. The GDK (Groove Development Kit) is optimized for professional programmers who create polished tools that plug into the Groove transceiver or who splice Groove DNA into other commercial applications. There hasn't been an easy way to do simple things such as adding a Groove presence indicator to a directory on the company intranet. Solving that kind of problem is, on balance, an even more vital mission for GWS than downsizing Groove services for PDAs... 
Developers will find GWS to be a textbook example of a style that has come to be called 'RESTful SOAP.' REST (Representational State Transfer) describes the architectural style of the Web. What that means for SOAP, as recommended in the W3C working draft for SOAP 1.2, is that interfaces should 'use URIs [Universal Resource Identifiers] in a Web-architecture compatible way -- that is, as resource identifiers.'... Six months ago, at the O'Reilly Emerging Technology Conference, advocates of REST (Representational State Transfer) pointedly asked a panel of SOAP experts: 'What does Web services have to do with the Web?' The blunt answer was, to paraphrase, 'Nothing.' As we noted then ('Hyperlinks matter'), a linked web of XML documents is a potent architecture that Web services can and should exploit. Happily, GWS does..."
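The "URIs as resource identifiers" idea at the heart of RESTful SOAP can be illustrated with a toy dispatcher: each resource in a linked web of XML documents gets its own URI, which is dereferenced to a representation. The URI layout and resource names below are invented for illustration and are not Groove's actual GWS interface:

```python
# Toy illustration of addressing resources by URI rather than routing
# every request through a single opaque RPC endpoint.
from urllib.parse import urlparse

# A linked web of XML documents: each resource has its own URI path.
resources = {
    "/spaces/design-team": "<space><name>Design Team</name></space>",
    "/spaces/design-team/members": "<members><member>alice</member></members>",
}

def get_resource(uri):
    """Dereference a URI to its XML representation."""
    path = urlparse(uri).path
    return resources.get(path)

doc = get_resource("http://example.org/spaces/design-team/members")
print(doc)
```

The contrast is with RPC-style SOAP, where the URI names only the endpoint and the resource being acted on is buried inside the message body.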
[November 02, 2002] "Building a Better Groove." By Stan Gibson and John McCright. In eWEEK (September 23, 2002). ['Ray Ozzie is chairman and CEO of Groove Networks Inc., a maker of peer-based collaboration software. The company launched the first version of Groove in October 2000 and is now readying Version 2.5 for shipment. Faced with a tight economy, the company is intent on proving that its software can deliver tangible value and that it can survive in a difficult environment for startups. Groove received a big boost last October when Microsoft Corp. invested $51 million in the company. Ozzie spoke with eWeek Executive Editor Stan Gibson and Department Editor John McCright at Groove's headquarters in Beverly, Mass.'] "... The Relay Server has always been a piece of the architecture because otherwise you'd have no way to work offline. Relay Server, Enterprise Integration Server and Enterprise Management Server -- those originated about the same time. They were the big push in Version 2 of Groove. But this is not centralization of the application or the data. The fourth server, which we haven't talked a lot about, is an audit server. It's key to pharmaceutical and financial services users because it maintains a verifiable audit history of what people on those clients are doing, for regulatory reasons. The audit server will be in 2003: tested in the first half of the year, gold later in the year... another [server] we're calling Web Services Access Point. Consistent with the Groove philosophy, there's no applications logic on it. The Groove client is where the applications live. Each application is called a tool set and exposes Web services on the client. The Web Services Access Point is told by the clients where they are. That is going into beta in the fourth quarter... The major things are on the infrastructure side, including continued bandwidth enhancements. It has a number of features that do more advanced topology management.
Groove conserves bandwidth by figuring out an optimal network topology. It can send copies only where they need to go. Also, there's a thing called 'asymmetric files.' When you drop something into a shared space, people get it only when they need it. On the user interface side, there is Microsoft integration with SharePoint and Outlook..." Note: the Groove Online Forum moderator said of v2.5: "The release is planned for later in Q4. Groove Workspace v2.5 will offer integration with Outlook calendar and contacts. This will be an initial implementation of these features, but we do have an overall commitment to offering tight, seamless, two-way synchronization with Outlook." See also "Web Services Key to Groove 2.5."
[November 02, 2002] "W3C Close to Ratifying SOAP 1.2." By Paul Krill. In InfoWorld (November 01, 2002). "SOAP 1.2, the proposed revision of the Simple Object Access Protocol Web services specification, may be finalized by the World Wide Web Consortium (W3C) within two to three months, a W3C representative said on Friday. But intellectual property rights could hinder the process. The 393 issues raised by industry participants around the development of SOAP 1.2 since June 2001 have been narrowed down to just 11, said W3C representative Janet Daly. The W3C XML Protocol Working Group met in Bedford, Mass., October 29-31 to deliberate on SOAP 1.2. Among the remaining issues to be resolved is intellectual property, with the group believing 'there may be significant intellectual property issues with the SOAP specification,' according to the working group's statement. 'Multiple companies have said they believe they may have relevant patents. Further investigation is required here before the specification should proceed to PR [proposed recommendation] phase,' W3C said in noting the outstanding issues. Proposed recommendation is the ratification stage. Two vendors, webMethods and Epicentric, which has just been acquired by Vignette, have stated they may have possible patents pertaining to SOAP 1.2, with webMethods stating as of September 11 that it is not willing to waive its patent rights and Epicentric saying it has not been given permission to make a public statement, as of the same date. There have been no changes in those statements as of Friday but that may happen next week, Daly said... Some other vendors participating in the working group, including IBM and Tibco, report in the W3C statement that they have not identified any intellectual property rights related to the SOAP 1.2 effort and thus have not declared any.
Tibco, for one, said in its statement it has not identified any intellectual property rights in the current XML Protocols activity, but that it may own patents or have other intellectual property rights in this activity or may identify subsequent contributions to the W3C as containing intellectual property rights. Microsoft, which also has not declared any patents pertaining to SOAP 1.2, is granting royalty-free use of its copyrights for use of the specification. Other outstanding issues in SOAP 1.2 include the scope of the specification itself, the lack of both a dedicated conformance section and a conformance clause in the specification, and the de-referencing of URIs (universal resource identifiers) inside SOAP. Additional issues include implementation technicalities and issues with specific technical terms and characterization of white space..." See SOAP Version 1.2 Part 1: Messaging Framework [W3C Working Draft 26-June-2002] and SOAP Version 1.2 Part 2: Adjuncts [W3C Working Draft 26-June-2002]; also from 2002-09-24 Last Call Working Draft for SOAP 1.2 Attachment Feature. The XML Protocol Working Group is part of the W3C Web Services Activity. See "Simple Object Access Protocol (SOAP)."
[November 01, 2002] "JCP [Java Community Process] 2: Process Document." Version 2.5. October 29, 2002. From the Java Community Process. Describes the formal procedures for using the Java Specification development process. From the text of the announcement: "On behalf of the members of the Java Community Process[sm] the JCP Program Management Office (PMO) today launched a new iteration of the program, JCP 2.5. The new version enhances the JCP by enabling the participation of a wider variety of contributors and by increasing the flexibility of the process by which compatible implementations of Java technology are created. As a result of the new program enhancements, JCP is evolving into a more effective forum for serving the three million Java developers and for supporting the activities of an increasingly broader membership base... The new agreement and governing rules give more freedom and equal standing to all Java community participants enabling them to implement compatible Java specifications under a license of their own choosing including open source licenses. The Java Specification Participation Agreement (JSPA) requires all Java specifications to allow for development and distribution of compatible independent implementations, make specification products available separately and offer Technology Compatibility Kits (TCK) free of charge to qualified non-profits, educational organizations and individuals. The JCP 2.5 process document focuses on the implementation of the new agreement and on the continued availability of Java APIs as part of or independent of platform specifications. 'JCP 2.5 breaks new ground by making open source licensing possible for those who work on Java specifications and those who create compatible independent implementations of the specifications,' said Jason Hunter, vice-president, Apache Software Foundation and JCP EC member. 
'In addition the cost structure has been changed to allow smaller developer groups and individual developers to gain broader access to Java specifications, often times free of cost.' Going forward all new Java Specification Requests are required to operate under the new agreement..." See Java Specification Participation Agreement [Reference: JSPA - October 2002]. Related news stories from: InfoWorld, eWEEK, and InternetNews.com.
[November 01, 2002] "Structure and Design Issues for Developing, Implementing, and Maintaining a Justice XML Data Dictionary." [Edited by] Mark Kindl and John Wandelt. Report of the Justice XML Structure Task Force. Office of Justice Programs, U.S. Department of Justice. August 23, 2002. 66 pages. [Noted by David J. Roberts, Chair of the OASIS LegalXML Integrated Justice TC.] "This report addresses structure and design in implementing a Justice XML Data Dictionary that facilitates effective information exchange across a diverse professional community. It documents a starting point for, and marks the progress of the Justice XML Structure Task Force (XSTF). It does not necessarily address all issues, nor does it present the only solutions. It addresses major issues of requirements, design, structure, and implementation considered by the XSTF. This report is intended to: (1) Expose some of the important structural and design issues associated with implementing a standard XML data dictionary, an associated XML schema, and a potential registry/repository. (2) Identify some potential solutions and best practices from collective experiences encountered in other projects. (3) Identify and discuss the advantages and disadvantages of potential solutions. (4) Identify critical decision points for the Global Infrastructure/Standards Working Group (ISWG) XML Committee. (5) Recommend courses of action for these decision points. (6) Identify design and structural issues that may require further attention: research, experimentation, evaluation, discussion, review, and/or decision. What is meant by structure in the context of standards for information exchange? Fundamentally, effective information exchange requires common syntax and semantics. Syntax defines the standard rules for the arrangement of the data and information items in exchange instances. The Extensible Markup Language (XML) establishes a formal standard syntax specification. 
Semantics are usually recorded in a data dictionary (DD) and establish the standard meaning of the data and metadata to be exchanged. Nonetheless, even with standard XML syntax and well-defined semantics, effective communication and information exchange require structure; that is, an information model and rules that constrain syntax to ensure semantics are represented consistently and unambiguously." Document background: "This report was prepared by Mark Kindl and John Wandelt, Georgia Tech Research Institute (GTRI), on behalf of the Justice XML Structure Task Force. It is a product of the Office of Justice Programs (OJP), United States Department of Justice, and supports OJP integrated justice information sharing initiatives. The Justice XML Structure Task Force (XSTF) is a component of the Justice XML Subcommittee, which is a part of the Global Justice Information Network Advisory Committee (GAC) Infrastructure/Standards Working Group. The GAC was created to promote broad-scale sharing of critical justice information by serving as an advisory body to the Assistant Attorney General, OJP, and the United States Attorney General. The XSTF was created to build on the work of the Justice XML Subcommittee Reconciliation Data Dictionary (RDD). Research for this report was provided by GTRI. For other background documents, please visit the Web site of SEARCH, The National Consortium for Justice Information and Statistics." [source .DOC]
- XML Articles and Papers October 2002
- XML Articles and Papers September 2002
- XML Articles and Papers August 2002
- XML Articles and Papers July 2002
- XML Articles and Papers April - June, 2002
- XML Articles and Papers January - March, 2002
- XML Articles and Papers October - December, 2001
- XML Articles and Papers July - September, 2001
- XML Articles and Papers April - June, 2001
- XML Articles and Papers January - March, 2001
- XML Articles and Papers October - December, 2000
- XML Articles and Papers July - September, 2000
- XML Articles and Papers April - June, 2000
- XML Articles and Papers January - March, 2000
- XML Articles and Papers July-December, 1999
- XML Articles and Papers January-June, 1999
- XML Articles and Papers 1998
- XML Articles and Papers 1996 - 1997
- Introductory and Tutorial Articles on XML
- XML News from the Press