The OASIS Cover Pages: The Online Resource for Markup Language Technologies
Last modified: June 30, 2003
XML Articles and Papers June 2003

XML General Articles and Papers: Surveys, Overviews, Presentations, Introductions, Announcements

Other collections with references to general and technical publications on XML:

June 2003

  • [June 30, 2003] "Gaining Perspective On Digital Identities." By Paul Desmond. In Network World (June 30, 2003). "You can build it, you can buy it, but you can't escape the need for identity management... Identity management involves dealing with individuals in an online world. Ideally, it provides a single view of every individual across IT systems throughout the organization. Experts agree that the problem is the same whether those individuals are employees, customers or business partners. The goal is to 'understand who you're working with and what they need,' says Joe Duffy, global leader of PricewaterhouseCoopers' (PwC) Security and Privacy Practice... Most identity management strategies start with some form of directory services integration, says Kevin Kampman, senior consultant with Burton Group. The idea is to have a single 'authoritative source' for each piece of data. Multiple authoritative sources might be associated with the same individual, depending on the data in question. For example, the human resources application would be the authoritative source for fiduciary employee records, while Active Directory holds e-mail addresses. One option is a metadirectory, which provides a consolidated view of the data held in the corporation's various directories. Largely homogeneous organizations might implement an all-encompassing enterprise directory, but it's unlikely you'll ever get down to just one... With the directory in order, integrating applications can begin. Centralized access management can be implemented in a number of ways, but generally, when a user attempts to log on to a Web application, the logon request is routed to the access management engine. There the user is properly authenticated, with at least a username and password. Often some form of software-based security token that denotes the user's credentials is then passed to the application. 
Should the user later want to access other applications, the token can be shuttled around as necessary behind the scenes, so the user doesn't have to log on to each new application. A number of vendors, including IBM, Netegrity and Oblix, sell Web access management products that provide authorization. T. Rowe Price uses IBM Tivoli Access Manager, and CUNA Mutual uses Oblix's NetPoint. While such systems easily can hook into Web-based applications, integration with client/server applications likely will require more time-consuming and costly custom integration work..." See also the sidebar "Identity Management Tips: Eight Suggestions On How to Implement Identity Management Smoothly."
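The token flow described above can be sketched in a few lines: an access management engine authenticates the user once, mints a signed token, and each downstream application verifies the token instead of prompting for another logon. This is a minimal illustration only; the function names, the shared-secret HMAC scheme, and the JSON payload are assumptions for the sketch, not the API of any of the products named above (which typically use PKI certificates or SAML assertions instead).

```python
import base64
import hashlib
import hmac
import json

# Hypothetical secret shared between the access management engine and each
# protected application; real products use PKI or SAML assertions instead.
SECRET = b"shared-engine-key"

def issue_token(username):
    """Called by the access engine after the user authenticates once."""
    payload = base64.urlsafe_b64encode(json.dumps({"user": username}).encode())
    sig = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return payload.decode() + "." + sig

def verify_token(token):
    """Called by each downstream application instead of prompting for a login."""
    payload, _, sig = token.partition(".")
    expected = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return None  # tampered or foreign token: force a real logon
    return json.loads(base64.urlsafe_b64decode(payload))["user"]

token = issue_token("jdoe")
print(verify_token(token))        # the user is recognized without a second logon
print(verify_token(token + "x"))  # a tampered token is rejected (None)
```

The point of the sketch is the division of labor: only the engine authenticates; the applications merely verify, which is what lets the token "be shuttled around as necessary behind the scenes."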

  • [June 30, 2003] "Sun Drives New Security Offering." By Dennis Fisher. In eWEEK (June 30, 2003). "Sun Microsystems Inc. and PeopleSoft Inc. are set to announce a new identity management solution that will tie human resources and other back-office systems into the IT security infrastructure. The goal: to help enterprises cut costs and maintain tighter control over who accesses their networks. The joint offering will be a standards-based solution built on Sun ONE Identity Server and use PeopleSoft's broad portfolio of products in the HR and human capital management categories. The goal is to automate and streamline the process of establishing accounts for new employees and deleting them for people leaving the company -- all the while ensuring that each person has access only to the resources to which he or she is entitled. Sun and PeopleSoft are not alone in spotting this opportunity. A smaller security vendor, M-Tech Information Technologies Inc., this week will introduce a new version of its ID-Synch software, which performs many of the same functions and includes support for a broad range of platforms and authentication methods. Version 2.0 of ID-Synch has many new features, which should enable it to compete head-on with the Sun-PeopleSoft offering, which will be announced July 9, 2003. Sun and PeopleSoft are also bringing Waveset Technologies Inc. into the fold as part of their solution. Waveset, based in Austin, Texas, will provide the core provisioning and identity management technologies that will be under the covers of the solution... M-Tech, based in Calgary, Alberta, has added a number of new capabilities to ID-Synch. The biggest addition is the automated access management feature. This enables the software to monitor a system of record, such as PeopleSoft, and look for changes in the database. 
For example, if an employee in the accounts payable department transfers to accounts receivable, ID-Synch will see that change in the system and automatically revoke or grant access to various applications and systems based on the user's new role. These changes are handled by an authorization workflow that then passes the requests to the system's proprietary fulfillment engine. The engine supports both SOAP (Simple Object Access Protocol) and XML and is set up as a Web service to execute the changes and adjustments that have been authorized. ID-Synch 2.0 also includes a delegated management mode -- in addition to support for centralized management -- that enables departmental or regional administrators to manage local users..." See also the M-Tech announcement: "M-Tech Announces Availability of ID-Synch v2.0. Industry's Most Flexible Access Management Solution Delivers Broadest Functionality."
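The transfer scenario in the article reduces to a set computation: compare the entitlements of the old role against the new role, revoke what is no longer covered, and grant what is newly required. A sketch of that delta, with a hypothetical role-to-entitlement table (a product like ID-Synch would read such rules from configuration, not a hard-coded dict):

```python
# Hypothetical role-to-entitlement mapping for the accounting example above.
ROLE_ACCESS = {
    "accounts_payable": {"ap_ledger", "vendor_portal"},
    "accounts_receivable": {"ar_ledger", "billing_portal"},
}

def provision_changes(old_role, new_role):
    """Compute which entitlements to revoke and which to grant on a transfer."""
    old = ROLE_ACCESS.get(old_role, set())
    new = ROLE_ACCESS.get(new_role, set())
    return {"revoke": sorted(old - new), "grant": sorted(new - old)}

# The accounts payable -> accounts receivable transfer from the article:
print(provision_changes("accounts_payable", "accounts_receivable"))
```

In a real provisioning system this delta would feed the authorization workflow and fulfillment engine the article mentions; the sketch shows only the role-comparison step that triggers it.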

  • [June 30, 2003] "Using the Jakarta Commons, Part 1." By Vikram Goyal. From O'Reilly ONJava.com (June 25, 2003). ['Ever find yourself thinking "Someone's surely solved this problem before?" That's the beauty of open source. In this first of three articles, Vikram Goyal explores the Jakarta Commons, mature and well-defined reusable Java components.'] "The Jakarta Commons is a Jakarta subproject that creates and maintains independent packages unrelated to any other framework or product. The packages are a collection of components that serve small, useful purposes in their own right, and are usually server-centric. The Commons project is divided into two parts: the Sandbox and the Commons repository. The Sandbox is a test bed for trying out ideas by the Jakarta committers. This article explains the components that make up the repository. It will show you when to use a component in each repository, where to get it, and how to use it with a basic example... Reusability is the name of the game for the Jakarta Commons project. Packages that form part of this project are conceived with the aim of making them reusable... To be really reusable, each package needs to be independent of any other bigger framework or project. Thus, each package in the Commons project is largely independent, not only from other projects, but mostly of other packages as well. Deviations exist, but mostly to the extent of using well-set APIs. For example, the Betwixt package depends on the use of XML APIs. The primary aim though, is for these packages to work straight out of the box using a set of well-defined interfaces. The brevity of most packages, however, has led to a brevity of documentation, poor maintenance, and lack of support. Some suffer from incorrect links and very sparse documentation. With most packages, you are left to figure out for yourself how to use them or, in some cases, why to use them. Hopefully this article will answer some of those questions... 
XML-related components include, for example, Betwixt, Digester, Jelly, and JXPath. In this article, I will cover the Web-related and the Trivial categories; my next article will cover the XML-related and Packages categories. The final article will describe the components in the Utilities category... The reason the CLI, Discovery, Lang, and Collections packages are categorized [here] as Trivial is that they each serve a very small, yet very useful purpose... In the next installment of this article, I will cover the XML and Packages categories. The final installment will cover the Utilities package. Note: Jakarta Commons is different from Apache Commons; the latter is a top-level project of the Apache Software Foundation..."

  • [June 30, 2003] "FTP Transport for Secure Peer-to-Peer Business Data Interchange over the Internet." By Terry Harding and Richard Scott (Cyclone Commerce). IETF EDIINT Working Group, Internet Draft. Reference: 'draft-ietf-ediint-as3-00.txt'. March 2003, expires October 2003. "This document describes how to exchange structured business data securely using FTP transfer for XML, Binary, Electronic Data Interchange, (EDI - either the American Standards Committee X12 or UN/EDIFACT, Electronic Data Interchange for Administration, Commerce and Transport) or other data describable in MIME used for business to business data interchange. The data is packaged using standard MIME content-types. Authentication and privacy are obtained by using Cryptographic Message Syntax (S/MIME) security body parts. Authenticated acknowledgements make use of multipart/signed replies to the original HTTP message. An FTP upload operation is used to send appropriately packaged EDI, XML, or other business data. The receiving application will poll the FTP server for inbound messages, unpackage and handle the message data, and generate a reply for the originator that contains a message disposition acknowledgement within a multipart/report that is signed or unsigned. This request/reply transactional interchange provides secure, reliable, and authenticated transport for EDI or other business data using FTP. The security protocols and structures used also support auditable records of these transmissions, acknowledgements, and authentication... The purpose of these specifications is to ensure interoperability between B2B Electronic Commerce user agents, invoking some or all of the commonly expected security features. This document is also NOT limited to strict EDI use, but applies to any electronic commerce application where business data needs to be exchanged over the Internet in a secure manner..." [IETF source]

  • [June 30, 2003] "Management Standards: Keeping an Open Mind." By Denise Dubie. In Network World (June 30, 2003). "Customer demand for open software is driving countless vendors, such as HP, IBM and Microsoft, to work more closely with industry organizations to develop common protocols, languages and industry standards for network and systems management. Management standards, such as SNMP and Common Information Model (CIM) -- now in Version 3 and 2.7, respectively -- came into being years ago. Yet widespread excitement over standards work has remained lackluster -- until recently. In the past two years, the poor high-tech economy and demand for new Web-based technologies caused the Distributed Management Task Force (DMTF) and its peer standards groups to stir the pot a bit and start working on multipurpose standards that could help network executives get control of today's distributed applications. Several industry organizations work with corporate end users as well as their member vendor companies and developers to create a means to an end -- the end being open, interoperable and manageable information systems. 'Many technologies can make up a standard, and there are a lot of variables we need to consider,' says Patrick Gannon, president and CEO of the Organization for the Advancement of Structured Information Standards (OASIS). 'But ever since the dot-com bust, end users have been more particular about where they spend their money, and using standards-based products can increase the value of their current and future IT investments'... Microsoft, IBM and VeriSign last year published a proprietary specification for Web services specifically, while OASIS continues to work on an open standard for Web services security. The Global XML Web Services Architecture is a framework that Microsoft and IBM are developing (along with BEA Systems, RSA Security, SAP and VeriSign), to give Web services higher-level abilities for security and reliability. 
Last summer some 17 vendors submitted the code-named Bluefin specification to the Storage Network Industry Association (SNIA), which is expected to announce Version 1.0 of the newly named Storage Management Initiative Specification (SMI-S) next month. SMI-S proposed to give storage customers a way to manage multiple storage appliances from different vendors. Before the proposed specification, enterprise storage managers would have had to manage each storage appliance with vendor-specific tools and work to integrate the disparate information manually. 'Customers want interoperability, and vendors on their own can't deliver that,' says Ray Dunn, marketing manager for SNIA's Storage Management Forum. 'Today, vendors are willing to work together to create a baseline for interoperability and then add their differentiation on top for a competitive advantage.' But unlike one of the oldest -- and probably most ubiquitous -- management standards, SNMP, the specifications under development today can't be called simple by any means..." See related references in: (1) "DMTF Common Information Model (CIM)"; (2) "XML-Based Provisioning Services"; (3) "Management Protocol Specification."

  • [June 30, 2003] "New Java Aims to Simplify." By Stephen Shankland. In CNET News.com (June 30, 2003). "Sun Microsystems released a new version of its Java for desktop computers on Friday that aims to make the software faster, more familiar in appearance, and less daunting for nonprogrammers. Among other changes, the new version 1.4.2 of Java 2 Standard Edition will include buttons, menu bars, and other graphical elements that match the feel of Windows XP or the Gnome interface to Linux. Version 1.4.2 also offers a new control panel, an automatic update feature, and a swifter response when taking actions such as displaying a list of files stored on a hard disk. The friendlier interface is part of an effort by Java inventor Sun to make Java software that average consumers will recognize and demand. Earlier this month at its JavaOne trade show, Sun announced a multimillion-dollar branding campaign to try to etch the Java logo and value into consumers' minds... At JavaOne, Sun said version 1.4.2 was imminent. Its importance has increased with the announcement that the top two PC makers, Dell Computer and Hewlett-Packard, will bundle Java on their laptops and desktops. Those deals have taken on added significance for Sun since a court decision last week that overturned a previous requirement that Microsoft include Java with Windows. Java lets a program run on many types of computers, such as those running Mac OS, Windows or Linux, without having to be changed for each one. That universality could undermine the power of Microsoft, whose Windows operating system is the foundation for most desktop computer software. Though Java initially was released as a desktop computing software product, its successes to date have been in areas where Microsoft isn't as strong: servers and cell phones. To tout Java's advantages, Sun established a new Java.com Web site..."

  • [June 30, 2003] "Web-based XML Editing with W3C XML Schema and XSLT, Part 2." By Ali Mesbah and Arjan Vermeij. From XML.com (June 25, 2003). ['Generating instance document-editing GUIs from schemas is a commonly requested feature; this week we have a follow-up to Ali Mesbah's first article about using XSLT and W3C XML schema to create HTML GUIs for editing. This latest installment covers what was missed from the first part: how to create a new document from scratch, and how to add elements into a document.'] "In an earlier article we talked about an approach to automatic, form-based GUI generation based on XML Schema -- an approach which uses a single XSLT stylesheet, through which editing XML instance documents is made possible. Open issues remained as to how to add new elements to an instance document and how to create an initial instance of our schema. One of the most common uses for schemas is to verify that an XML document is valid according to a set of rules. A schema can be used to specify the order in which elements must appear and their types. We are also able to define the cardinality of the elements in the schema. This information can be used to insert (or delete) elements to (or from) an instance document while keeping the instance document valid... This current article describes a methodology by which elements can be inserted into an XML instance document through an auto-generated, form-based GUI, based on the XML Schema of the instance document and XSLT. The capability of editing and inserting or removing elements using the corresponding XML Schema makes it a complete and functional approach for the implementation of an XML web-based editor..."
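The cardinality idea the authors describe can be shown in miniature: before inserting a child element, consult the schema's occurrence constraints so the instance stays valid. This sketch is not the authors' XSLT implementation; the cardinality table is a hypothetical stand-in for the maxOccurs facets a real implementation would read from the W3C XML Schema document.

```python
import xml.etree.ElementTree as ET

# Hypothetical occurrence limits distilled from a schema's maxOccurs facets.
MAX_OCCURS = {"phone": 2, "email": 1}

def insert_element(parent, name, text):
    """Insert a child only if the schema's maxOccurs still allows one more."""
    if len(parent.findall(name)) >= MAX_OCCURS.get(name, 1):
        return False  # adding it would make the instance invalid
    ET.SubElement(parent, name).text = text
    return True

contact = ET.fromstring("<contact><email>a@example.org</email></contact>")
print(insert_element(contact, "phone", "555-0100"))      # allowed: no phone yet
print(insert_element(contact, "email", "b@example.org")) # refused: maxOccurs=1 reached
```

Deletion works the same way against minOccurs: remove a child only if enough siblings remain, which is what keeps the instance document valid throughout editing.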

  • [June 30, 2003] "Finding IDs." By John E. Simpson. From XML.com (June 25, 2003). ['In this installment of his monthly XML Q&A column John explains how to use XPath with XML IDs.'] "Q: 'Is there a syntax for [addressing ID attribute values] in XPath 1 or one being considered for XPath 2.0?' A: "Instead of a simple 'named anchor'-style selection, use the id() function to locate the element in question. It takes one argument, the ID value(s) you're looking for... if the ID attributes are undeclared you can still locate the right element, assuming no two elements share the same ID value... if the argument is a node-set, the id() function behaves quite differently. Rather than returning a single node, it returns a node-set containing all element nodes whose ID-type attributes match any of the string-values of nodes in the passed node-set. Thus the id() function can actually locate more than one node, which seems to be a contradiction... There's more than one way to obtain these results. Instead of using the id() function, for instance, you could use keys to locate the desired nodes. This is absolutely the way to go if the attributes in question aren't ID-type attributes in the first place. See Bob DuCharme's 'Declaring Keys and Performing Lookups' for more details. Still, if you've got ID-type attributes you might as well take advantage of their uniqueness..."
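The key-based alternative Simpson mentions amounts to indexing elements by the attribute yourself, which also works when the attributes are not declared as ID types. Here is a sketch of that lookup in Python's ElementTree (which has no id() function); the document and element names are invented for illustration, and the dictionary plays the role an xsl:key declaration plays in XSLT.

```python
import xml.etree.ElementTree as ET

doc = ET.fromstring("""
<chapters>
  <chapter id="intro"><title>Introduction</title></chapter>
  <chapter id="types"><title>ID-Type Attributes</title></chapter>
</chapters>
""")

# Build the index once, much as <xsl:key name="k" match="*" use="@id"/> would.
by_id = {el.get("id"): el for el in doc.iter() if el.get("id") is not None}

def lookup(*ids):
    """Like id('intro types'): return every element matching any requested ID."""
    return [by_id[i] for i in ids if i in by_id]

print([el.find("title").text for el in lookup("intro", "types")])
```

Like XPath's id() with a node-set argument, the lookup can return several elements at once; unlike id(), it enforces no uniqueness, which is exactly why declared ID-type attributes remain preferable when you have them.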

  • [June 30, 2003] "How (Not) to Grow a Technology." By Kendall Grant Clark. From XML.com (June 25, 2003). ['Kendall Clark ponders how to grow a technology, especially an XML one. Do you opt for death-by-committee, or obscurity via community chaos? Into this discussion, Kendall covers the new community initiative afoot to throw away the troubled RSS specifications and reinvent one which has true consensus.'] "RSS -- in its endless variations and versions, both preceding and following version 1.0 -- has been the subject of interminable and painfully annoying debates and shouting matches, during which various rhetorical combatants have (or have not, depending on who you believe) muttered death threats. In other words, 'the market' can be a very hostile, ugly, and cruel way to do anything, much less to evolve a technology or grow a technology standard. [Roger] Costello's choice of RSS 1.0 has been made even more interesting by what appears to be yet another attempt by 'the market' to throw out RSS, and its bathwater, and to start over afresh and anew. Though the new effort doesn't yet, as of this writing, have an official name, I'm putting my money on 'Echo'. 'The market' has a curious habit of starting over from scratch from time to time, and the only consolation that it offers to people who've built various kinds of cottage industry around the 'old way of doing things' -- industries which are toppled by the sudden urge to create de novo -- is that maybe the new way will be better. But probably it will just be different. Echo seems likely to be a mixture of both the 'just different' and the 'better', with a few dashes of vendor-imposed but politically necessary perversity thrown in for good measure... Whatever else can be said about RSS, in whichever version you care about or hate most, it's been for the most part a very grassroots effort. And that's been one of the things about RSS which I liked the most. 
I'm not alone in being attracted to RSS because of its grassroots character... Except for the earliest Netscape versions, RSS hasn't been something that the Really Big tech companies cared about or even noticed -- and, let's face it, despite appearances to the contrary, in the early days of the Web, most of us didn't even think of Netscape as a corporation at all. Userland Software has been involved in RSS from early days, but Userland is a tiny speck compared to corporate behemoths like Microsoft, IBM, and Sun. But with the 'rise of weblogs' (please, don't get me started), together with the relatively sudden appearance and then flourishing of RSS newsreaders as HTML browser semi-competitors (or competing supplements, if you prefer), suddenly there are all sorts of noises and rumblings from the Really Bigs that RSS has been noticed. And that can't be a very good thing in the long term for most of the toolmakers and cottage builders who cluster around RSS. It certainly isn't a good thing for lookers-on and users, like me, who like RSS because it is a grassroots affair..." On the RSS/Echo topic, see references in Tim Bray's ongoing blog entry "I Like Pie." General references in "RDF Site Summary (RSS)."

  • [June 30, 2003] "Software Patent Vote Delayed." By Matthew Broersma. In CNET News.com (June 30, 2003). "The European Parliament, facing mounting controversy over U.S.-style software-patenting legislation, has delayed a key vote until September. A Monday vote on a controversial software patents proposal in the European Parliament has been put back until September, amid criticism that the legislation would institute a U.S.-style patent regime that would be detrimental to European small businesses and open-source software developers. The proposed software-patenting legislation is the result of a European Commission effort to clarify patenting rules as they apply to 'computer-implemented inventions,' a term that includes software. The patent offices of various EU member states currently have different criteria for accepting the validity of software-related patents, a situation that the commission's proposal aims to remedy. However, opponents of the suggested legislation charge that its ambiguity would effectively allow most software to be patented, a situation that currently exists in the United States. Critics have compared that situation to allowing a monopoly on the ideas in novels... The Foundation for a Free Information Infrastructure, a software developer lobbying group, hailed the delay as a victory for the democratic process..." General references in "Patents and Open Standards."

  • [June 30, 2003] "Interview: Java Apps Revolve Around Sun. Sun's Rich Green, Jeff Anders Discuss Project Relator, JavaFirst." By Mark Jones and Jeff Anders. In InfoWorld (June 29, 2003). "From Project Relator to an internal program called JavaFirst, Sun believes it's time developers started treating mobile devices like real computers. Mark Jones, InfoWorld's executive news editor, spoke with Sun's Rich Green, vice president of developer tools, and Jeff Anders, group marketing manager, at June's JavaOne show." [Green:] "I think massive steps are being taken between the Java community, WSI, and others. All the JAX-RPC (Java API for XML-Based Remote Procedure Calls) stuff, all the J2EE 1.4 functionality, [and] Web services are probably the single most powerful bit of glue and protocol capability to build services-based systems. The WSI components, the WSI basic profile -- which is also going to be part of 1.4 -- add a lot more in terms of XML standardization and protocols for that. What we're going to do is keep evolving the Java platform and the interface model to deal with that. And in some respects, Java is further ahead because Java knows about transactions. Java really knows about security. Now it's possible through the EJB container model to build highly available applications just by enabling that at the container model. The question is, are there standards to express those between services? They're not all there yet. That's a Web services thing and not a Java thing. So you can build services-based architectures extremely effectively now with pure Java, not even using SOAP and XML, because of the fundamental capabilities of the platform and the protocols that it supports. But as the world converges on Web services, Web services has to be evolved to be able to express all of these notions that are there in the Java platform between Web services, regardless of the implementation technique... 
The whole point of JavaFirst is taking all these Web services systems and if you want to sort of squint and look into the future, it's about full services-based architectures. We want to make them available to mobile devices... So you have a big Web services system, you run the tool against it, and what you get is essentially a port that supports all these services. And you get a client that you download onto your mobile device that can talk to it. And then you just attach the interface to it and you're done..." [Anders:] "We have what we call the Sun ONE Studio Mobile Edition, which is a development environment for developers who want to build MID-P midlets, and that's really just a collection of plug-ins or modules that go into our Sun ONE Studio product... [In] Project Relator there's the ability to take something like a rich client and pick any of your favorite design tools to create sort of a graphic: Adobe, Macromedia, Illustrator. It allows you to take that tool and essentially create hot spots... What Relator allows you to do is create the hot spots and then tie that to pieces of Java code on the back end that, when you mouse over it or you click on it, then invokes an EJB or does whatever action you've specified on the back end. So that's one thing that we're doing to bring together the front-end client to something on the back end... In Javon the goal is to tie together the front-end client with the back end, to a J2EE type of server..."

  • [June 30, 2003] "The Future of Mozilla Application Development." By David Boswell and Brian King. From O'Reilly Mozilla DevCenter (June 27, 2003). ['Recently, mozilla.org announced a major update to its development roadmap. Some of the changes in the new document represent a fundamental shift in the direction and goals of the Mozilla community. In this article, David Boswell and Brian King analyze the new roadmap, and demonstrate how to convert an existing XPFE-based application into an application that uses the new XUL toolkit. David and Brian are the authors of O'Reilly's Creating Applications with Mozilla.'] "The biggest news in the roadmap update is that mozilla.org intends to stop development of the application suite it currently produces in favor of stand-alone applications. When Netscape created the Mozilla open source community, it released the code for its Communicator browser suite, which included a web browser, a mail and news client, an HTML editor, and an address book. Over the past five years the community has rewritten the code base and has added many new features and other applications to the suite. The community itself has changed over this time and producing a single monolithic set of applications is no longer its main goal. In place of the browser suite, development will focus on a stand-alone Mozilla Browser (based on the Mozilla Firebird project, formerly called 'Project Phoenix') and a Mozilla Mail application (based on the Thunderbird project, formerly called 'Project Minotaur'). Both of these applications represent a second generation of Mozilla application development... Developers who have been using Mozilla as a platform for creating their own XUL-based applications or add-ons will also have some changes to get used to. Although there will most likely be some pain in the transition, the changes in the roadmap represent positive steps that mozilla.org is taking to encourage Mozilla application development. 
The new roadmap is a strong endorsement of the concept of XUL and of Mozilla application development in general. There is a possibility for confusion when you take a closer look at what exactly the roadmap is talking about changing. In the transition from Mozilla as an application suite to Mozilla as a platform for application development, the specific XUL-based toolkit that has been in use to date is being replaced with a new XUL-based toolkit taken from the Phoenix project. XUL itself is not being replaced though, just upgraded... [This article] provides details about how the xFly application has been converted from the old toolkit to the new one. There is good news and there is bad news about the work involved in the transition from a Mozilla package to a Firebird extension. The good news is that the basic architecture of the application does not change -- XUL is still used in conjunction with other technologies such as JavaScript, CSS and XBL. The bad news is that there are some changes to be aware of -- there are minimal changes in XUL and XBL, but the biggest changes involve how things are packaged, registered and launched..." See also "Extensible User Interface Language (XUL)."

  • [June 30, 2003] "Architecting Security for Web Services." By Mark O'Neill (CTO, Vordel Limited). In JavaPro Magazine [Fawcette Technical Publications] Volume 7, Number 8 (August 2003). ['Take a look at the security challenges of Web services and how to address them with security architecture, including what it can offer going forward when XML traverses firewalls.'] "... There are three broad architectural options for Web services security: XML Gateway, Interceptor, and custom coded. XML Gateways (sometimes called XML firewalls or XML proxies) are software packages or appliances that filter XML traffic upstream of a Web service and block unauthorized traffic before it can reach a protected Web service. An XML Gateway enforces access control rules by processing security tokens contained within incoming SOAP messages, and by ensuring that the XML format and content is appropriate for the target. It may use SAML to establish the authentication status of an end user or to request attribute information, which is used to make an access-control decision. It is important that XML Gateways contain security adapters to existing security technology such as LDAP directories, traditional firewalls, and PKI. Otherwise, the overhead of rekeying rules and user profiles into an XML Gateway would be prohibitive. As much as possible, an XML Gateway should reuse security infrastructure that has preconfigured users, groups, and roles. A single XML Gateway can protect many Web services platforms. An XML Gateway does not have to share an operating system with the target Web service, because they are on separate host machines. This separation means that a Linux-based XML Gateway appliance can protect a mixture of .Net-based and Java 2 Platform, Enterprise Edition (J2EE)-based Web services. XML Gateways often perform transformation functionality as well as security. In the XML world, this feature amounts to implementing XSLT. 
An alternative architectural option is to deploy Interceptors on Web services platforms. These lightweight Interceptors make use of platform-specific hooks such as ISAPI Filters, JAX-RPC handlers, or Apache Axis handlers. The Interceptor (sometimes called an 'agent') then communicates with a centralized XML security server, which performs the processing of security rules. This architecture puts the security enforcement closer to the Web services application, when compared with the XML Gateway option. It also centralizes security processing just like in the XML Gateway scenario, meaning that the Web services application does not get bogged down with processor-intensive functionality such as cryptography. Also, like the XML Gateway architecture, the Interceptor architecture provides a central point of management and reporting for SOAP traffic. The Interceptor architecture differs from the Gateway architecture in that security is enforced at the Web service end point itself, rather than requiring XML traffic to travel through an infrastructural device to be secured... Whatever solution is chosen, it's important that it is not limited to first-generation, inside-the-enterprise Web services and can also be applied to B2B Web services when the appropriate time comes. Inside-the-enterprise deployments may be useful for learning about Web services, but if a different security solution is to be used for B2B Web services, then the learning is in vain. Web services present novel security issues, but new industry standards and new XML security products are addressing these issues. There are architectural choices to be made when rolling out XML security, and making these decisions sooner rather than later is important..." [alt URL]

  • [June 30, 2003] "Producing Multiple Outputs from an XSL Transformation." By Oleg Tkachenko (Software Engineer, MultiConn International Ltd). In [Microsoft] Extreme XML (June 23, 2003). 17 pages. ['Oleg Tkachenko explains how you can postprocess XSL transformation results into multiple documents using the XslTransform and XmlTextWriter classes in the .NET Framework. Accompanying source requires that the Microsoft .NET Framework 1.0 be installed.'] "The W3C XSL Working Group has created a requirement that the next version of the XSLT language support multiple output documents, and accordingly the W3C XSLT 1.1 Working Draft (officially frozen) has introduced a notion of a subsidiary result document in order to meet this requirement. The W3C XSLT 2.0 Working Draft has gone even further -- the concept of principal and subsidiary result documents has been removed and all possible result documents have been given equal status. So although the story sounds promising for future versions of XSLT, current users of XSLT 1.0, such as users of the System.Xml.Xsl.XslTransform class or MSXML, do not have access to this functionality. To work around this lack of functionality, a number of alternate solutions are usually recommended in XSLT-related MSDN newsgroups, including: (1) Preprocessing of the transformation input by breaking it into chunks and performing the transformation on each chunk in turn; (2) Creating result-tree fragments and writing them aside using extension functions or extension objects; (3) Postprocessing of the transformation result by splitting it into chunks and writing each chunk as a separate document... In this article, I'll show how the latter solution can be easily and effectively implemented in the .NET Framework using the XslTransform class and a customized XmlTextWriter class. 
Postprocessing the Result of XSL Transformation: The idea behind implementing multiple outputs by postprocessing is to introduce an additional custom layer, where further transformation occurs between the XSLT processor and the consumer of the XSL Transformation results. This additional layer is where the multiple output logic is implemented. Let's name that layer the redirecting layer. This way the XSLT processor still produces a single result document, which contains both the main result document and optional subsidiary result documents marked as such by some redirecting instructions. The redirecting instructions are elements that indicate the subsidiary result documents, as well as define where and how to redirect them. The redirecting layer passes the main result document through untouched, but redirects subsidiary result documents to the specified destination according to the redirecting instructions... the demonstrated approach also has some additional virtues. First of all, it shares multiple output semantics with the XSLT 1.1 and XSLT 2.0 Working Drafts, as well as with other XSLT processors that support multiple output through an extension element. This effectively means a solution based on such an implementation can be easily ported to another XSLT processor and that it will be compatible with future XSLT 2.0 models. Moreover, even now such a solution may benefit from using the exsl:document element as defined by the EXSLT initiative, making it compatible with all XSLT processors that support the exsl:document extension element..."
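The redirecting layer described above is language-neutral, even though the article implements it in .NET with the XslTransform class and a customized XmlTextWriter. As a rough Python sketch of the same idea, suppose the stylesheet marks subsidiary documents with a redirect element carrying an href attribute (these names are illustrative stand-ins, not part of any standard):

```python
import xml.etree.ElementTree as ET

REDIRECT = "redirect"  # hypothetical redirecting instruction element

def split_result(xslt_result):
    """Split a single transformation result into the main document plus
    subsidiary documents keyed by the href of each redirect element."""
    root = ET.fromstring(xslt_result)
    subsidiaries = {}
    # Collect (parent, redirect) pairs first so the tree is never
    # mutated while it is being iterated.
    pairs = [(parent, child)
             for parent in root.iter()
             for child in list(parent)
             if child.tag == REDIRECT]
    for parent, child in pairs:
        # Each redirect element wraps exactly one subsidiary root element.
        subsidiaries[child.attrib["href"]] = ET.tostring(child[0], encoding="unicode")
        parent.remove(child)
    return ET.tostring(root, encoding="unicode"), subsidiaries

combined = """<result>
  <main>kept in the principal document</main>
  <redirect href="extra.xml"><doc>subsidiary content</doc></redirect>
</result>"""
main_doc, extras = split_result(combined)
```

In a real pipeline each entry of `extras` would be written to the file named by its href, which is essentially what the article's customized XmlTextWriter does on the fly.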

  • [June 30, 2003] "Unlocking Information in Microsoft Office 2003 Using XML." By Chris Kunicki and Charles Maxson (OfficeZealot.com), and Frank C. Rice (Microsoft Corporation). Posted June 30, 2003; created May 30, 2003. 17 pages. ['Learn about a sample Web application built using ASP.NET that demonstrates how Microsoft Office documents can be processed and used external to the application that created them. The article provides a technical review of the Unlocking Information sample solution which comes with the Microsoft Office 2003 Beta 2 Content Development Kit (CDK). The sample is a Web-based application built using ASP.NET to show how Microsoft Office 2003 documents can be processed externally to their respective applications. Additionally, the sample demonstrates how Office documents can be accessed or repurposed for uses that range beyond Microsoft Office.'] "... Since many companies share Word and Excel documents with their client base, Office is the common tool of choice to produce the documents. Additionally, many people need to generate and collaborate using these documents, so centralizing the application on a Web server makes the most sense. In the past, that meant manipulating Word and Excel on the server, which was possible but less than optimal for many reasons. Now, with the XML capabilities and native file formats offered by Office 2003, developers can unlock Office data through a variety of methods. In the case of the Unlocking Information sample application, documents that have been created with Office 2003 are saved in an XML file format and loaded onto a Web server where they await processing. When a user makes a request for the documents, the application loads the files as XML, prepares them on the server by inserting line-of-business and user-driven data and saves them out as XML files adhering to the XSD Schema requirements that Word and Excel have defined. 
Upon completion, the user can request to view and work with those files in Word or Excel directly from the server, just as you would expect, even though they are still XML-based. What's more, since the Office documents simply reside in open standard XML files on the server, the user doesn't even require Office to view or edit the content. Behind the scenes, the application begins a procedure where it loads each of the Office-created XML files as an in-memory XML Document object. In turn, each of the XML Documents is individually processed to insert the user-provided data by using XPath to query the appropriate nodes within the XML Document. The application contains several different routines designed with specific XPath queries that target individual elements within Office XML documents. These routines include logic to populate items such as the Document Properties, Defined Ranges, Bookmarks and Mapped XML Ranges found inside the XML Documents..."
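The XPath-driven server-side step the authors describe can be sketched in a few lines. In this Python sketch the namespace URI, the bookmark element, and the fill_bookmarks helper are all placeholders invented for illustration, not the actual WordprocessingML vocabulary or the sample's own routines:

```python
import xml.etree.ElementTree as ET

# "urn:example:wordml" is a placeholder namespace, not the real
# WordprocessingML namespace shipped with Office 2003.
NS = {"w": "urn:example:wordml"}

def fill_bookmarks(doc_xml, values):
    """Mirror the article's XPath-driven routines: find each w:bookmark
    node and insert the user-provided value for its name, if any."""
    root = ET.fromstring(doc_xml)
    for bookmark in root.findall(".//w:bookmark", NS):
        name = bookmark.get("name")
        if name in values:
            bookmark.text = values[name]
    return ET.tostring(root, encoding="unicode")

template = """<w:doc xmlns:w="urn:example:wordml">
  <w:bookmark name="customer"/>
  <w:bookmark name="total"/>
</w:doc>"""
filled = fill_bookmarks(template, {"customer": "Contoso", "total": "42.00"})
```

The sample's real routines do the same kind of thing against the Word and Excel schemas: a query targets a node (a bookmark, a defined range, a document property), and line-of-business data is written into it before the document is served.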

  • [June 30, 2003] "BizTalk Server 2004 Stakes New Territory." By Jim Rapoza. In eWEEK (June 30, 2003). "The core piece of Microsoft's forthcoming Jupiter e-business suite, BizTalk Server 2004, looks like it will also be one of the premier business process management platforms available. Microsoft's upgrade offers many significant improvements, including finally supporting the XML Schema standard, which should expedite business process integration. eWEEK's review found very good support for XML and Web services standards, with excellent design and mapping tools. The BizTalk Server developer tools now require Visual Studio .Net, which can be confusing at first... Probably the biggest, most positive change is that BizTalk Server 2004 will be based completely on the World Wide Web Consortium's XSD (XML Schema Definition). This means that companies using BizTalk Server for business integration should find it much easier to integrate with partners, Web services and other business process management systems. Microsoft has also made major changes in the BizTalk Server tools -- the biggest being that almost all the tools are now integrated into Visual Studio .Net. While this might be a little confusing for some users at first, it makes sense because the developers most likely to use these tools are probably already using Visual Studio .Net. Also, while these tools have been placed inside Visual Studio, they are still mainly the same, so developers shouldn't have to do a lot of retraining. One new tool that is integrated into Visual Studio .Net is the Pipeline Designer. This tool provided us with a drag-and-drop interface for building business process assemblies of how applications and components would react within an integration process. Also, a new Business Rules application made it possible to define rules for how processes would react to dynamically changing conditions. 
It is now easier for nondevelopers to use tools such as Visio for initial orchestration design and then have a developer build the orchestration using the integrated tools in Visual Studio .Net. Also, some business forms and processes can be built using InfoPath and then imported into BizTalk Server. InfoPath can also be used to deliver content to business workers. Other new features include good monitoring and reporting capabilities, which made it possible for us to track how our business process transactions were running and to test and debug our processes... In addition to its support for XSD, BizTalk Server 2004 also has good support for several other XML and Web services standards, including XSLT (Extensible Stylesheet Language Transformations), WS-I (Web Services-Interoperability) and BPEL (Business Process Execution Language). Of course, Web services is still a big part of what BizTalk Server does, both in managing and consuming them, and the new Visual Studio integration made it simple to add Web services to our business processes..." See the BizTalk Server 2004 Beta website.

  • [June 30, 2003] "Smooth Talkers: Speech Integration Technology Gives Customers and Employees Convenient Access to Back-End Data." By John Edwards. In CIO Magazine (July 2003). Emerging Technology. "Enterprises looking into speech integration face two basic speech technology choices. The oldest and simplest type of speech integration ('directed dialogue' products) prompts callers with a series of questions and recognizes only a limited number of responses, such as 'yes' and 'no,' specific names and numbers. A new and more sophisticated approach -- 'natural language' -- to speech integration handles complete sentences and aims to engage callers in lifelike banter with a virtual call center agent. The technology is also more forgiving of word usage. Directed dialogue tools, while less expensive than natural language systems, suffer from their limited recognition capabilities. As a result, they are mostly used for simple applications, such as automated switchboard attendants or credit card activators... A pair of emerging technologies -- VoiceXML and Speech Application Language Tags (SALT) -- are also helping to advance voice integration. Both specifications rely on Web technology to make it easier to develop and deploy speech integration applications. VoiceXML is an XML extension for creating telephone-based, speech-user interfaces. The specification lets developers create directed dialogue speech systems that recognize specific words and phrases, such as names and numbers. That style of interface is well suited to callers who have no screen from which to select options. SALT, on the other hand, provides extensions to commonly used Web-based markup languages, principally HTML and XHTML. It makes such applications accessible from GUI-based devices, including PCs and PDAs. A user, for example, might click on an icon and say, 'Show me the flights from San Francisco to Boston after 7 p.m. on Saturday,' and the browser will display the flights. 
Both specifications aim to help developers create speech interfaces using familiar techniques. 'You don't have to reinvent the wheel and program a new interface to get speech recognition access to your data,' says Brian Strachman, a speech recognition analyst at technology research company In-Stat/MDR..."
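A directed-dialogue interface of the kind described above looks roughly like the following minimal VoiceXML 2.0 form, which prompts the caller and accepts only a yes/no answer via the built-in boolean grammar. The form name and prompt wording are invented for illustration:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<vxml version="2.0" xmlns="http://www.w3.org/2001/vxml">
  <form id="transfer">
    <field name="confirm" type="boolean">
      <prompt>Do you want to transfer funds? Please say yes or no.</prompt>
      <filled>
        <if cond="confirm">
          <prompt>Transferring you now.</prompt>
        <else/>
          <prompt>Returning to the main menu.</prompt>
        </if>
      </filled>
    </field>
  </form>
</vxml>
```

A natural-language system replaces the constrained field grammar with a much broader statistical or semantic grammar, which is what makes it both more flexible and more expensive.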

  • [June 30, 2003] "Memorandum for Multi-Domain PKI Interoperability." By Masaki Shimaoka (ECOM Trust.net Co., Ltd; Japan PKI Forum). IETF Network Working Group. Request for Comments, Draft. Reference: 'draft-shimaoka-multidomain-pki-00.txt'. June 2003. 14 pages. "This memo is used to share the awareness necessary for the deployment of multi-domain PKI. The scope of this memo is to establish trust relationships and interoperability between plural PKI domains. Both single-domain PKI and multi-domain PKI are established by the trust relationships between CAs. Typical and primitive PKI models are specified as single-domain PKI. Multi-domain PKI established by plural single-domain PKIs is categorized as multi-trust point model and single-trust point model. The multi-trust point model is based on the trust list model, and the single-trust point model is based on cross-certification..." See general resources referenced from the OASIS PKI Member Section website. [IETF source]

  • [June 30, 2003] "SPML Eases Information Exchange." By Darran Rolls (Waveset Technologies, Inc). In Network World (June 30, 2003). "Provisioning is the process of managing the allocation of system resources to employees, partners and contractors as part of identity management... Service Provisioning Markup Language (SPML) is an XML-based framework for exchanging user, resource and service provisioning information between organizations. The framework is expected to establish an open, standard protocol for the integration and interoperability of service provisioning requests. Developed by the OASIS Provisioning Technical Service Committee (PTSC), SPML 1.0 is slated for ratification in summer [2003]. PTSC interprets provisioning to mean the upfront preparation of IT system materials or supplies required to carry out pre-defined business activities. The committee goes beyond the initial contingency of providing resources to encompass the entire life-cycle management of these resources. This includes provisioning of digital services such as user accounts and access privileges on systems, networks and applications, as well as the provisioning of non-digital or physical resources such as cell phones and credit cards. The sole purpose of a provisioning service in a network is to execute and manage provisioning requests. A given requesting authority, or client, sends the provisioning service a set of requests via a well-formed SPML document (an XML document that conforms to the SPML standard). Based on a pre-defined service execution model, the provisioning service takes the operations specified within the SPML document and executes provisioning actions on a pre-defined set of service targets or resources. The general model for SPML is one in which clients perform protocol operations on servers. In this model, a client issues an SPML request describing the operation to be performed at a given service point or endpoint. 
The service point is then responsible for performing the necessary operations to implement the request. Once the operation is complete, the service point sends the client an SPML response detailing results or errors... As more infrastructure becomes identity-centric and companies start to build and deploy Web services, SPML will be a critical element of an end-to-end standards-based identity management strategy..." See: (1) "OASIS Member Companies Host SPML Identity Management Interoperability Event"; (2) general references in "XML-Based Provisioning Services."
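The request/response model described above can be sketched schematically. The element names and document shape below follow the article's description (an add request naming an identifier and a set of attributes) but are placeholders, not the ratified SPML 1.0 vocabulary:

```python
import xml.etree.ElementTree as ET

def make_add_request(user_id, attributes):
    """Build a schematic provisioning request asking the service point
    to create an account with the given attributes."""
    request = ET.Element("addRequest")
    identifier = ET.SubElement(request, "identifier")
    identifier.text = user_id
    attrs = ET.SubElement(request, "attributes")
    for name, value in attributes.items():
        # One attr element per provisioned attribute.
        attr = ET.SubElement(attrs, "attr", name=name)
        attr.text = value
    return ET.tostring(request, encoding="unicode")

spml_request = make_add_request("jdoe", {"mail": "jdoe@example.com", "role": "staff"})
```

A requesting authority would send a document like this to the provisioning service, which executes the operation against its service targets and returns a corresponding response document with results or errors.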

  • [June 30, 2003] "The Open Applications Group Integration Specification. OAGIS is a Practical Use of XML to Enable Integration." By Michael Rowell (Chief Architect, Open Applications Group). From IBM developerWorks, XML zone. June 30, 2003. ['The Open Applications Group Integration Specification (OAGIS) is an effort to provide a canonical business language for information integration. It uses XML as the common alphabet for defining business messages, and for identifying business processes (scenarios) that allow businesses and business applications to communicate. Not only is OAGIS the most complete set of XML business messages currently available, but it also accommodates the additional requirements of specific industries by partnering with various vertical industry groups.'] "OAGIS provides the definition of business messages in the form of Business Object Documents (BODs) and example business scenarios that provide example usages of the BODs. The business scenarios identify the business applications and components being integrated and the BODs that are used. The current release, OAGIS 8.0, includes 200 business messages and 61 business scenarios that can be used to integrate business applications... First and foremost, everything in OAGIS begins with the business process. OAGIS currently includes 61 business processes, which provide examples that show what is possible using the standard. Likewise, when you need to integrate businesses or applications using OAGIS the first place to start is with the business scenarios which can help you find the solution that most closely matches your needs. These business scenarios provided by OAGIS were used to define OAGIS and are provided as examples to help the user understand how to work with OAGIS. They identify the business applications and components that are being integrated along with the BODs used to pass information. 
The business scenarios also capture the sequence in which the messages are intended to occur, the dependencies, the scope, and the error handling that has been addressed. OAGi provides these example scenarios as a starting point for any new implementation... OAGIS 8.1 will be released this summer [2003], and will include: 70 business scenarios for integration, More than 400 BODs, 70 nouns, support for ebXML's CoreComponent Type 1.90, and OAGIS's submission to the UN/CEFACT CoreComponent Harmonization committee. In addition, OAGi will provide direction on how to enable OAGIS through Web services by providing guidelines and WSDL that can be used to develop your own Web services. OAGi has already publicly announced that it will support harmonized components that result from the UN/CEFACT CoreComponent Harmonization work group. OAGi has been enabling integration for a long time. OAGIS currently includes 61 integration scenarios and more are coming. OAGIS provides a standard canonical business language that enables streamlined communication between businesses and/or business applications..." Article also in PDF format. General references in "Open Applications Group."
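A BOD pairs a verb with a noun inside a common envelope. The schematic sketch below follows that pattern (an ApplicationArea identifying the sender plus a DataArea carrying the verb and noun) but simplifies the element content rather than copying the OAGIS 8.0 schemas:

```xml
<!-- Schematic OAGIS-style BOD: the verb Process applied to the noun
     PurchaseOrder. Element content is simplified for illustration. -->
<ProcessPurchaseOrder>
  <ApplicationArea>
    <Sender>
      <Component>OrderEntry</Component>
    </Sender>
    <CreationDateTime>2003-06-30T12:00:00Z</CreationDateTime>
  </ApplicationArea>
  <DataArea>
    <Process/>
    <PurchaseOrder>
      <Header>
        <DocumentId>PO-1001</DocumentId>
      </Header>
    </PurchaseOrder>
  </DataArea>
</ProcessPurchaseOrder>
```

A business scenario then strings BODs like this together, specifying which application sends which message to which, in what order, and how errors are handled.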

  • [June 30, 2003] "Microsoft to Discuss ID Management Plans." By Paul Roberts. In InfoWorld (June 30, 2003). "Microsoft will be making announcements about its strategy for managing user identities this week that could well end speculation about its plans for implementing federated identity technology into its products... A spokeswoman for Oblix said that the company would be 'part of [the] plan' Microsoft announces on Wednesday. Microsoft has long-standing relationships with independent software vendors (ISVs) like Oblix and OpenNetworks Technologies Inc. The company calls on Oblix's NetPoint and OpenNetworks DirectorySmart to tie Windows networks using Microsoft's Active Directory service to other non-Windows directory systems that rely on user authentication technology such as Kerberos, according to John Pescatore, an analyst at Gartner. At stake may be the future of Microsoft's 'TrustBridge' federated identity technology. Microsoft announced TrustBridge just over a year ago, saying that the new technology would enable businesses using Windows to share user identity information and interoperate across heterogeneous environments using Web services protocols such as Kerberos and SOAP (Simple Object Access Protocol). The technology was supposed to be released in 2003, but was left out of Windows Server 2003 and Microsoft has had little to say about its status... Among other things, Microsoft needs to clarify its intentions regarding the adoption of SAML (Security Assertion Markup Language), the XML-based authentication framework. The company backed XRML (Extensible Rights Markup Language) for access control, but will need to support SAML as well to be fully interoperable with non-Windows environments, Pescatore said. Pescatore anticipates that Microsoft will probably offer new guidance on the TrustBridge initiative, perhaps fleshing the technology out or providing clearer benchmarks for its identity management strategy. 
Few companies are clamoring for the cross-enterprise, federated identity systems that TrustBridge, .Net Passport or the Liberty Alliance are promising, according to Pescatore... Despite the lack of demand, however, Microsoft and its adversaries in the Liberty Alliance are still jockeying for control of the identity management space, Pescatore said..."

  • [June 30, 2003] "Web Services Taking Root in the Enterprise." By Peter Coffee. In eWEEK (June 30, 2003). "eWEEK Labs recently conducted a roundtable discussion on Web services. Moderated by Technology Editor Peter Coffee, the roundtable comprised key players in several areas of the Web services space. The discussion centered on where Web services are as a technology, where the industry is in achieving effective consensus on standards, and how the business models are developing for using those standards and technologies." Roundtable participants included: Gregg Bjork (Systinet Corp.), Neil Charney (Microsoft Corp.), Ted Farrell (Oracle Corp.), Tyson Hartman, Karla Norsworthy (IBM), Benjamin Renaud (BEA Systems Inc.), and Gordon Van Huizen (Sonic Software Corp). What's the wrong way to think about web services? [1] It's not the platform: Neither Java nor .Net makes it easy. [2] It's not the tools and technologies: Services must be aligned with business models. [3] It's not rip and replace: Services should extend and assist in integrating current applications. [4] It's not a future scenario: Specifications, tools and opportunities are available now..."

  • [June 30, 2003] "XML and Content Management Systems." By James Robertson (Step Two Designs). From Step Two Designs' KM Column (July 2003). "With the rise in popularity of both XML and content management systems (CMS), there are an increasing number of tenders specifying 'CMS must be built using XML'. What does this mean in practice? This article explores the role of XML in the context of content management systems, focusing specifically on the business issues; its goal is to 'demystify' the area, and provide organisations with more information on which to base their CMS selection processes and criteria... There are a number of key areas in a content management system where XML has a role to play, including: authoring, communication, interoperability, storage, and publishing... [Authoring] To realise many of the benefits offered by XML in a CMS, content must be captured in a structured way. This means moving away from unstructured information sources, to an environment where the content is authored in a more controlled way. This is a complex and evolving area, but a few key realisations have become apparent: (1) Organisations must move away from using tools such as word processors to author content, as there is no automated way to convert unstructured sources to XML. (2) There are benefits to be gained by having authors use an XML-aware editor as part of a CMS. (3) The users must not be exposed to the complexity of XML. The fact that the content is stored as XML should be invisible to authors and editors. (4) Capturing content as HTML is not desirable, as this severely limits the potential benefits delivered by XML elsewhere in the CMS. (5) While some systems claim XML-compliance through the use of XHTML (the XML version of HTML), this provides no practical benefits over using plain HTML... A number of vendors have developed solutions around the use of XML-based editing environments, and these show promise. 
There is, however, no consistent approach, and most XML-based solutions are relatively immature... In general, organisations should determine their specific business requirements, and list these in the CMS tender. If there are specific requirements for XML capabilities, these should be outlined in detail. Without the further development of XML standards specifically relating to content management, the use of XML to support interoperability is currently limited. Overall, it is therefore not meaningful to specify that a CMS should 'support XML'. At present, it is more important to select a product that meets all the organisation's business requirements, than to choose a system that offers XML features..." Paper also in PDF format.

  • [June 24, 2003] "Presence Information Data Format (PIDF)." By Hiroyasu Sugano (Fujitsu Laboratories Ltd), Shingo Fujimoto (Fujitsu Laboratories Ltd), Graham Klyne (Nine by Nine), Adrian Bateman (VisionTech Limited), Wayne Carr (Intel Corporation), Jon Peterson (NeuStar, Inc). IETF Network Working Group, Internet Draft. Reference: 'draft-ietf-impp-cpim-pidf-08.txt'. May 2003, expires November 2003. 27 pages. Work product of the IETF Instant Messaging and Presence Protocol Working Group (IMPP). "This memo specifies the Common Profile for Presence (CPP) Presence Information Data Format (PIDF) as a common presence data format for CPP-compliant Presence protocols, and also defines a new media type 'application/pidf+xml' to represent the XML MIME entity for PIDF. The Common Profiles for Instant Messaging (CPIM) and Presence (CPP) specifications define a set of operations and parameters to achieve interoperability between different Instant Messaging and Presence protocols which meet RFC 2779. This memo further defines the Presence Information Data Format (PIDF) as a common presence data format for CPP-compliant presence protocols, allowing presence information to be transferred across CPP-compliant protocol boundaries without modification, with attendant benefits for security and performance. The format specified in this memo defines the base presence format and extensibility required by RFC 2779. It defines a minimal set of presence status values defined by the IMPP Model document A Model for Presence and Instant Messaging (RFC 2778). However, a presence application is able to define its own status values using the extensibility framework provided by this memo. Defining such extended status values is beyond the scope of this memo. Note also that this memo defines only the format for a presence data payload and the extensibility framework for it. How the presence data is transferred within a specific protocol frame would be defined separately in a protocol specification." 
Section 4 'XML-encoded Presence Data Format' "defines an XML-encoded presence information data format (PIDF) for use with CPP compliant systems. A presence payload in this format is expected to be produced by the PRESENTITY (the source of the PRESENCE INFORMATION) and transported to the WATCHERS by the presence servers or gateways without any interpretation or modification. A PIDF object is a well formed XML document. It must have the XML declaration and it should contain an encoding declaration in the XML declaration, e.g., <?xml version='1.0' encoding='UTF-8'?>. If the charset parameter of the MIME content type declaration is present and it is different from the encoding declaration, the charset parameter takes precedence. Every application conformant to this specification MUST accept the UTF-8 character encoding to ensure the minimal interoperability... XML Encoding Decision: "The Presence Information Data Format encodes presence information in XML (eXtensible Markup Language). Regarding the features of PRESENCE INFORMATION discussed above, such that it has a hierarchical structure and it should be fully extensible, XML is considered as the most desirable framework over other candidates such as vCard (vCard MIME Directory Profile, RFC 2426)." See: "Presence Information Data Format (PIDF)." [IETF Source: draft-ietf-impp-cpim-pidf-08.txt]
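A minimal PIDF document along the lines the draft describes might look like this. The entity address and tuple content are invented for illustration, and the namespace URI shown is the one eventually registered for PIDF, which this particular draft revision may render differently:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<presence xmlns="urn:ietf:params:xml:ns:pidf"
          entity="pres:alice@example.com">
  <!-- One tuple per communication address; "basic" status is open or closed. -->
  <tuple id="t1">
    <status>
      <basic>open</basic>
    </status>
    <contact priority="0.8">im:alice@example.com</contact>
  </tuple>
</presence>
```

The PRESENTITY produces a payload of this shape, and servers or gateways carry it to WATCHERS unmodified, as the draft requires.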

  • [June 24, 2003] "WS-Trust: Interoperable Security for Web Services." By Paul Madsen. From O'Reilly WebServices.xml.com (June 24, 2003). "Released at the same time as WS-SecurityPolicy, WS-Trust is a proposal that enables security token interoperability by defining a request/response protocol by which SOAP actors can request of some trusted authority that a particular security token be exchanged for another. WS-Trust is motivated by more than enabling interoperability between the multiple formats for security tokens that might be used in a WS-Security protected message. It also addresses the issue of trust interoperability. Even if a given security token's format is acceptable to a recipient of a WS-Security-protected SOAP message, interoperability at the syntax level is no guarantee that the recipient will be able to trust the token. We demonstrate how WS-Trust can enable interoperable WS-Security based message-layer security. The client and service in our scenario were insulated from the fact that they share neither a common security token format nor direct trust through the intervention of the gateway and STS in message processing on their behalf. The benefit of this model is more transparent security to the client and service. The burden of understanding security token formats and managing trust relationships with external entities is removed from their shoulders (actually those of their developers and administrators) and placed on the STS (a service dedicated to this role). This model of dedicated security services accessed through SOAP-based interfaces is also relevant for other security processing (e.g. digital signature creation, data encryption, authorization decisions etc.) With the burden of such functionality removed, the business applications can concentrate even more on the business logic and leave security to the infrastructure. 
Additionally, because security processing is performed in a centralized manner, the model offers advantages for ensuring these security protections are applied in a consistent and policy-respecting manner..." See: (1) "Web Services Trust Language (WS-Trust)"; (2) the 2002-12 news item "Microsoft and IBM Publish Six New Web Services Security and Policy Specifications."
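The token-exchange request at the heart of WS-Trust is a RequestSecurityToken element carried in a SOAP body. In the sketch below only the general element shape is taken from the specification; the wst namespace URI and the token/request type URIs are placeholders, not the published identifiers:

```xml
<s:Envelope xmlns:s="http://schemas.xmlsoap.org/soap/envelope/"
            xmlns:wst="urn:example:ws-trust"> <!-- placeholder URI -->
  <s:Body>
    <wst:RequestSecurityToken>
      <!-- The kind of token the requestor wants back. -->
      <wst:TokenType>urn:example:tokens#SAMLAssertion</wst:TokenType>
      <wst:RequestType>urn:example:ws-trust#Exchange</wst:RequestType>
      <!-- The token being presented for exchange goes here. -->
      <wst:Base/>
    </wst:RequestSecurityToken>
  </s:Body>
</s:Envelope>
```

The STS answers with a response carrying the issued token, which the client then embeds in the WS-Security header of its message to the target service.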

  • [June 24, 2003] "Rendezvous with Web Services." By Massimiliano Bigatti. From O'Reilly WebServices.xml.com (June 24, 2003). "Zeroconf is a technology, being developed in the IETF, to enable Zero Configuration IP Networking. This means you could network computers together without the usual headache of complex configuration. Zeroconf also allows computers to find services, such as printers, without a directory server. Apple is branding ZeroConf as Rendezvous, and using this emerging technology as a substitute for the old AppleTalk standard, using it in Mac OS 10.2 Jaguar and in some iApps like Safari. This article will introduce Zeroconf and demonstrate how it can be used with web services. Functionality similar to ZeroConf has already been provided, in one form or another, by AppleTalk, Novell IPX, and Microsoft Netbios. The main problem is that these proprietary protocols don't work on IP networks and are suited only to local networks. Zeroconf aims at filling this gap, providing a royalty-free, open standard. Zeroconf defines four areas of functionality: (1) Automatic Interface Configuration; (2) Automatic Multicast Address Allocation; (3) Name-to-Address Translation and vice versa; (4) Service Delivery... Rendezvous is an amazing technology that brings simplicity to networking, making it easy to create local networks with multiple devices, services, and data sharing. Jrendezvous makes this possible with Java. Although new, ZeroConf already has a fair bit of support; for example in Captain FTP, for instant discovery of Rendezvous enabled FTP Servers, or in Hydra, for cooperative editing of source files in an Extreme Programming way. Rendezvous and web services are a natural combination: once services are discovered with Rendezvous, SOAP and XML provide a flexible and standard platform for communication..."

  • [June 24, 2003] "WSDL Tales From The Trenches, Part 2." By Johan Peeters. From O'Reilly WebServices.xml.com (June 24, 2003). "In Part 1 of 'WSDL Tales From the Trenches' I painted a big picture of web services design: Web Services Description Language (WSDL) only defines the syntax of how a web service may be invoked; it says nothing about its semantics. I will observe this distinction in what I say in this article about WSDL. The version of WSDL being most widely used now, 1.1, is published as a W3C Note. It is not an official standard. WSDL 1.1 offers a lot of latitude for invoking web services. Tool support tends to be patchy. WSDL gets a lot of bad press because it got the trade-off wrong between expressivity and flexibility on the one hand and verbosity and complexity on the other. Before we proceed, I will come clean and tell you that I am at a loss to tell you how to write truly clear, crisp WSDL... it helps to use an XML-aware editor when you write WSDL, preferably one with the capability to validate the WSDL document. When retrofitting WSDL to existing web services, I find it very useful to be able to generate, send, and receive messages from the editor... [As to modular web service descriptions:] Using the import keyword, you can separate a WSD into modular documents. An example in the W3C Note uses three documents, which contain, respectively, data type definitions, abstract definitions, and specific service bindings. The specific or concrete service definitions depend on the abstract service definitions, which in turn depend on the data type definitions. Apart from improving readability, this technique also improves opportunities for certain types of extension and reuse: the same data type definitions can be used across many abstract services, and the same abstract services can be offered through many different bindings, at many addresses. Initially, a set of web services may be represented as a set of three documents. 
As services grow, however, this may evolve into a tree of documents with the data type definitions at its root, branching into several abstract services documents, and further fanning out to concrete services... The draft available at the time of writing says that the 'targetNamespace attribute information item defines the namespace affiliation of top-level components defined in this definitions element information item. Messages, port types, bindings and services are top level components.' Whether WSDL 1.2 will support implementing several interfaces in a single service is hotly debated right now. The WSDL 1.2 draft states explicitly that the same rules against sharing namespaces with imported documents apply as in XML Schema. On the other hand, an alternative mechanism for modularizing descriptions is provided via an include element modeled on XML Schema's include element that does allow sharing of namespaces..." General references in "Web Services Description Language (WSDL)."
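The three-document layering described above can be sketched as follows. This is an illustrative fragment, not the W3C Note's own example: the service name, namespaces, and URLs are invented, and the binding is pared down to a skeleton. The concrete document pulls in the abstract definitions with import, and those in turn import the data type definitions:

```xml
<!-- stockquote-service.wsdl: concrete bindings and addresses -->
<definitions name="StockQuoteService"
    targetNamespace="http://example.com/stockquote/service"
    xmlns="http://schemas.xmlsoap.org/wsdl/"
    xmlns:tns="http://example.com/stockquote/service"
    xmlns:defs="http://example.com/stockquote/definitions"
    xmlns:soap="http://schemas.xmlsoap.org/wsdl/soap/">

  <!-- the abstract messages and port types live in their own document,
       which itself imports the data type definitions -->
  <import namespace="http://example.com/stockquote/definitions"
          location="http://example.com/stockquote/stockquote-definitions.wsdl"/>

  <binding name="StockQuoteSoapBinding" type="defs:StockQuotePortType">
    <soap:binding style="document"
        transport="http://schemas.xmlsoap.org/soap/http"/>
  </binding>

  <service name="StockQuoteService">
    <port name="StockQuotePort" binding="tns:StockQuoteSoapBinding">
      <soap:address location="http://example.com/stockquote"/>
    </port>
  </service>
</definitions>
```

The same abstract document could equally be imported by a second concrete document offering a different binding at a different address, which is exactly the reuse the article describes.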

  • [June 24, 2003] "A Session Initiation Protocol (SIP) Event Package for Modification Events for the Extensible Markup Language (XML) Configuration Access Protocol (XCAP) Managed Documents." By Jonathan Rosenberg (dynamicsoft). IETF SIMPLE Working Group, Internet Draft. Reference: 'draft-ietf-simple-xcap-package-00'. June 23, 2003; expires December 22, 2003. 17 pages. "This specification defines a Session Initiation Protocol (SIP) event package for finding out about changes to documents managed by the Extensible Markup Language (XML) Configuration Access Protocol (XCAP). XCAP allows a client to manipulate XML documents on a server which contain configuration information for application protocols. Multiple clients can potentially access the same document, in which case the other clients would like to be notified of a change in the document made by another. This event package allows a client to do that..." Section 5 describes the relevant application/xcap-change+xml MIME Type. See: (1) "IETF Publishes Internet Drafts for XML Configuration Access Protocol (XCAP)"; (2) IETF SIP for Instant Messaging and Presence Leveraging Extensions (SIMPLE) Working Group. [IETF source]

  • [June 24, 2003] "Oracle Intros New Collaboration Applications. Offerings Include Project Management, Project Collaboration and Project Intelligence." By Scarlet Pruitt. In InfoWorld (June 24, 2003). "Oracle has introduced three new collaboration applications and an enhanced product lifecycle management tool for its E-Business Suite as part of what the company has called a second wave of e-business offerings that offer companies deeper integration... As part of this strategy, Oracle released three new collaboration applications: Oracle Project Management, Oracle Project Collaboration and Oracle Project Intelligence. The Project Management application allows project managers to plan and schedule projects, create progress reports, staffing plans and other documents which can be accessed through a Web interface. The Project Collaboration application tool is aimed at giving project team members the ability to see each other's information, such as work plans, change orders and status reports, while the Project Intelligence product offers metrics and analytics for projects, as well as the ability to do opportunity bookings and resource utilization, Oracle said. Oracle also expanded its product lifecycle management applications Tuesday, rolling out a new Advanced Product Catalog tool that centralizes all product and component information into a central catalog..."

  • [June 24, 2003] "W3C Issues Key Web Services Standard." By Paul Festa. In CNET News.com (June 25, 2003). "The Web's leading standards group this week put its stamp of approval on a key Web services protocol. The World Wide Web Consortium (W3C) said it has published the Simple Object Access Protocol (SOAP) version 1.2 as a formal standard. SOAP is one of a handful of standards behind the industry move toward building Web services software. The protocol originated several years ago as an informational document within the Internet Engineering Task Force (IETF), another standards group, as a way of executing so-called remote procedure calls. Microsoft then jump-started work on the protocol as a way of letting business software applications communicate over the Web, regardless of what programming language they're written in... With the release of the SOAP recommendation, both commercial software developers and information technology workers within businesses can now use the standard without fear of incompatibilities -- as long as they adhere to the W3C's definition of SOAP. The W3C stressed that, despite its numbering, SOAP 1.2 was in a sense the first of its kind. 'There were significant technical changes between 1.0 and 1.1, but SOAP 1.1 was never ratified by any independent organization,' W3C representative Janet Daly wrote in an e-mail exchange. 'SOAP Version 1.2 is the first SOAP spec to go through any kind of independent development and review -- one could say it's the first SOAP standard.' The W3C also took the opportunity to emphasize its role as an arbiter of Web services standards that will govern the infrastructure required to let commercial applications communicate and interact over the Web. 'Web services make good on the promise of interoperable applications only when the technical foundations are shared, robust, and achieve expected performance,' Web inventor Tim Berners-Lee, the W3C's director, said in a statement. 
'Today W3C members have endorsed...the first version of SOAP to have undergone rigorous testing and implementation and to support a full complement of Web standards'..." See details in "SOAP Version 1.2 Published as a W3C Recommendation."

  • [June 24, 2003] "W3C Ratifies SOAP 1.2. Web Services Specification is Finalized." By Paul Krill. In InfoWorld (June 24, 2003). "Version 1.2 of the SOAP specification, a foundational technology for Web services, has been released by the World Wide Web Consortium (W3C), with vendors pledging support in products. The revised specification, used for exchanging structured information in distributed Web services environments, features improvements such as better error handling and internationalization, an upgraded processing model, and alignment with the W3C Web architecture. 'I think we did a better job [with Version 1.2] of defining the underpinnings of what is the SOAP model,' said David Fallside, chairman of the W3C XML Protocol Working Group, which devised SOAP 1.2. Fallside also is a senior technical staff member at IBM, in San Jose, Calif. Version 1.2 consists of a Messaging Framework, featuring a processing model, or rules for processing a SOAP message. The processing model removes ambiguities found in SOAP 1.1. Also featured is Adjuncts, which covers representing remote procedure calls, encoding SOAP messages, and describing SOAP features and bindings. Adjuncts also provide a standard binding of SOAP to HTTP, enabling SOAP messages to be exchanged using the mechanisms of the Web. Other components of Version 1.2 include Specification Assertions and Test Collection, providing a set of tests drawn from assertions in the Messaging Framework and Adjuncts. The tests show whether the assertions are implemented in a SOAP processor and are designed to foster interoperability between different implementations of the SOAP 1.2 specification, according to W3C..." See details in "SOAP Version 1.2 Published as a W3C Recommendation."
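For reference, a minimal SOAP 1.2 message is shown below. The 1.2 envelope namespace, http://www.w3.org/2003/05/soap-envelope, replaces SOAP 1.1's http://schemas.xmlsoap.org/soap/envelope/, which is the quickest way to tell the two versions apart on the wire. The body contents here are invented for illustration:

```xml
<?xml version="1.0"?>
<env:Envelope xmlns:env="http://www.w3.org/2003/05/soap-envelope">
  <env:Header>
    <!-- optional header blocks, possibly targeted at intermediaries, go here -->
  </env:Header>
  <env:Body>
    <m:GetQuote xmlns:m="http://example.com/quotes">
      <m:symbol>XYZ</m:symbol>
    </m:GetQuote>
  </env:Body>
</env:Envelope>
```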

  • [June 24, 2003] "XML Data Binding with JAXB and UBL Source Code. Process XML Documents Without SAX or DOM." By Ed Mooney and Joseph Fialli (Sun Microsystems). In Java Developer's Journal Volume 8, Issue 6 (June 2003), pages 46-50. "XML data binding relieves the pain of any Java programmer who has ever winced at having to work with a document-centric processing model. Unlike SAX and DOM, which force you to think in terms of a document's structure, XML data binding lets you think in terms of the objects the structure represents. It does so by realizing that structure as a collection of Java classes and interfaces... This is especially valuable when lots of applications use the same document schemas. Then the data binding approach yields a set of standard classes and interfaces that are reused across all the applications. This saves work since you don't have to write, debug, and maintain code to extract data from XML. There are even more savings if you're developing an application for one of the many industries that have agreed on standard XML Schemas for business data interchange: finance, travel, auto, and retail, to name just four. This article will look at two new standards: JAXB and UBL... Java Architecture for XML Binding (JAXB) was developed in Java Specification Request (JSR) 31. It was written by an industry expert group under the auspices of the Java Community Process. By standardizing the XML data binding interface and providing a conformance test, JAXB allows you to choose among different XML binding implementations without having to rewrite your application. JAXB also comes with a standard implementation, which we'll use to show you how to bind the UBL schema to Java objects... Universal Business Language (UBL) is an XML-based business language built upon existing EDI and XML business-to-business vocabularies. It's the product of the UBL Technical Committee of OASIS. 
The committee intends to have UBL become an international standard for electronic commerce. If you're a J2EE programmer, there's a good chance UBL will be a part of your future. The latest UBL 0.7 release contains schema, sample XML documents, specifications, and documentation. It's perfect for experimenting with UBL applications. We're going to do just that using Java bindings generated by JAXB from the UBL schema..." With source code. See general references in "Universal Business Language (UBL)." [alt URL]
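JAXB's generated classes come from a schema compiler and cannot be reproduced briefly here; the Python sketch below only illustrates the data-binding idea the authors describe, namely working with typed objects instead of navigating a document tree. The binding is hand-written and the element names are invented, not real UBL:

```python
import xml.etree.ElementTree as ET
from dataclasses import dataclass

# A document-centric (DOM-style) consumer would walk this tree node
# by node; data binding turns it into a typed object instead.
DOC = "<order><id>42</id><total>19.95</total></order>"

@dataclass
class Order:
    id: int
    total: float

    @classmethod
    def from_xml(cls, text):
        # The "binding": map element text to typed fields. A JAXB-style
        # tool generates this mapping from the schema automatically.
        root = ET.fromstring(text)
        return cls(id=int(root.findtext("id")),
                   total=float(root.findtext("total")))

order = Order.from_xml(DOC)
assert order.id == 42 and order.total == 19.95
```

The payoff the article points to is that every application sharing the schema reuses the same generated classes rather than re-implementing the extraction code.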

  • [June 24, 2003] "A Brief History of Tags: Using a Tags-Based Approach." By Rich Rosen (Application Architect, Wall Street Journal). In Java Developer's Journal Volume 8, Issue 6 (June 2003), pages 10-22. With source code. "Custom tags in JavaServer Pages have come a long way since their inception. Now that Sun has provided some standards for these tags in the form of JSTL (and the up-and-coming JavaServer Faces), and has promised additional support for these standards in JSP 2.0, let's look at how we got to this point in tag history, and where we're going in the future. In addition, we'll look at how we can use the JSTL taglibs and the Struts Taglibs that support the JSTL expression language right now... Tag-based approaches to Web application development are nothing new. Their origins can be traced back to HTML (since they mimic HTML's syntax), and are represented by such varied approaches as SSI, Macromedia's ColdFusion, Microsoft's Active Server Pages (ASP), and, of course, JSP... [Consider] JSP: Model 1 vs Model 2: One of the big problems with JSP Model 1 was that it lent itself to bloated 'monolithic JSPs' that combine programming logic and presentation format in one module. Monolithic JSPs violate the principle of 'Separation of Content from Presentation,' to be sure. It's only when you have to maintain such JSPs in production applications that you begin to understand the importance of that principle in practice. JSP Model 2 is an approach to Web application development that adheres to the Model-View-Controller (MVC) paradigm. Sun's vision for Model 2 is that the controller would be a servlet, the model would be represented by JavaBeans (or EJBs in more sophisticated applications), and the view would be comprised of JSPs that contain only presentation formatting constructs (i.e., no code). 
The presence of Java code in a JSP leads to the previously mentioned 'monolithic JSP' syndrome, where data access and manipulation logic that belongs in the controller or model component of the application finds its way into the view component (the JSP). The intermixing of code with presentation formatting constructs results in a cluttered, unwieldy page that is not only difficult to maintain but also leaves it unclear who is supposed to maintain it. Custom JSP tags, a feature added to the JSP specification in version 1.1, make it possible to achieve this desired separation of code from formatting. By encapsulating functionality in a single atomic entity that can perform complex processing that would have required a substantial amount of Java code, tags reduce (if not eliminate) the amount of code within a JSP page... Well-designed tags allow a page designer to address and access data from the model that's constructed and manipulated by the controller. The decision about which data goes into the model (and which JSP view to employ for presentation) is in the hands of the controller. Thus, all a JSP developer needs to worry about is the layout of data already accessed and organized for presentation... In an MVC approach to Web application development, separation of content from presentation is critical. The key to this separation is a clear definition of responsibilities, with programmers responsible for model and controller components, and page designers responsible for view components. Code embedded in view components, as found in monolithic Model 1 JSPs, makes it impossible for designers to have true autonomous responsibility for those components. A tag-based approach that eliminates code from JSPs facilitates the separation of responsibilities and enables each group, designers and programmers, to do their job without stepping on each other's toes..." [alt URL]
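The contrast the article draws shows up clearly in a small view fragment: a scriptlet-free JSTL page that only lays out model data the controller has already placed in request scope. The taglib URI is the JSTL 1.0 core URI of the era; the bean and attribute names are invented for illustration:

```jsp
<%@ taglib prefix="c" uri="http://java.sun.com/jstl/core" %>
<table>
  <%-- 'items' was placed in request scope by the controller servlet;
       the page contains layout only, no Java code --%>
  <c:forEach var="item" items="${items}">
    <tr>
      <td><c:out value="${item.name}"/></td>
      <td><c:out value="${item.price}"/></td>
    </tr>
  </c:forEach>
</table>
```

Everything a monolithic Model 1 page would have done in scriptlets (fetching, filtering, organizing the data) stays in the controller and model, leaving the designer free to restructure this markup without touching code.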

  • [June 24, 2003] "Geopriv Authorization Policies." By Hannes Tschofenig and Jorge R. Cuellar (Siemens AG, Corporate Technology, Munich, Germany). Internet Engineering Task Force Internet Draft. Reference: 'draft-tschofenig-geopriv-authz-policies-00.txt'. June 2003, expires December 2003. 16 pages. Submitted to the IETF Geographic Location/Privacy Working Group. "This document describes authorization policies for use with Geopriv. It suggests using the eXtensible Access Control Markup Language (XACML). XACML provides functionality required to express policies for access to location information... Geopriv provides Location Information in a secure and private way. A critical role is played by user-controlled Privacy Rules, which describe the restrictions imposed or permissions given by the Rule Maker. The Privacy Rules specify the necessary conditions that allow a Location Server to forward Location Information to a Location Recipient, and the conditions under which and purposes for which the Location Information can be used. One type of Privacy Rule specifies in particular how location information should be filtered, depending on who the recipient is. Filtering is the process of reducing the precision or resolution of the data. A typical rule may be of the form: 'my location can only be disclosed to the owner of such credentials in such precision or resolution' (e.g., 'my co-workers can be told the city I am currently in'). The Location Object should be able to carry a limited but core set of Privacy Rules. The access to location information (as XML objects) can be controlled by XACML policies. The same is true for writing and deleting Geopriv rules themselves. The Geopriv working group can benefit from reusing existing work on access control." See details in the news story "Six New Internet Drafts from the IETF Geographic Location/Privacy Working Group." [IETF source URL]
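The quoted example rule ('my co-workers can be told the city I am currently in') might look roughly like this as an XACML 1.0 rule. This is only a hedged sketch of the approach the draft suggests: the standard XACML namespace, function, and datatype URNs are real, but the two urn:example attribute identifiers and the matching values are invented, since the draft defines its own vocabulary:

```xml
<Rule xmlns="urn:oasis:names:tc:xacml:1.0:policy"
      RuleId="coworkers-get-city-only" Effect="Permit">
  <Target>
    <Subjects>
      <Subject>
        <!-- recipient must present the "co-worker" role/credential -->
        <SubjectMatch MatchId="urn:oasis:names:tc:xacml:1.0:function:string-equal">
          <AttributeValue
              DataType="http://www.w3.org/2001/XMLSchema#string">co-worker</AttributeValue>
          <SubjectAttributeDesignator
              AttributeId="urn:example:geopriv:recipient-role"
              DataType="http://www.w3.org/2001/XMLSchema#string"/>
        </SubjectMatch>
      </Subject>
    </Subjects>
    <Resources>
      <Resource>
        <!-- only the city-level (filtered) view of the location -->
        <ResourceMatch MatchId="urn:oasis:names:tc:xacml:1.0:function:string-equal">
          <AttributeValue
              DataType="http://www.w3.org/2001/XMLSchema#string">city</AttributeValue>
          <ResourceAttributeDesignator
              AttributeId="urn:example:geopriv:location-resolution"
              DataType="http://www.w3.org/2001/XMLSchema#string"/>
        </ResourceMatch>
      </Resource>
    </Resources>
    <Actions><AnyAction/></Actions>
  </Target>
</Rule>
```

The filtering aspect shows up as a match on the resolution of the requested resource: the same Location Server could carry a sibling rule permitting full-precision access to, say, family members.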

  • [June 24, 2003] "Updated Eclipse Toolkit Shines. Eclipse SDK 2.1 Leverages Java's Strengths in IDE Toolkit, But Beware of Too Much Expansion." By Rick Grehan. In InfoWorld (June 20, 2003). "Eclipse is written in Java, and Java suits Eclipse's extensibility well. But such easy extensibility is fertile ground for overgrown features. Eclipse is the umbrella name for three related projects: the Eclipse Project, the Eclipse Tools Project, and the Eclipse Technology Project. The Eclipse Project oversees development of the Eclipse IDE platform and the JDT (Java Development Tooling) -- a Java development environment built on the Eclipse platform. It also sets the code and specifications for the plug-in development environment... The Eclipse Tools Project coordinates many development environment implementations on the Eclipse platform, including CDT (C/C++ Development Tool), which creates a C/C++ IDE out of Eclipse; a GEF (Graphical Editing Framework); and -- hold your breath -- a COBOL IDE. Further along than the GEF and the COBOL IDE, the CDT already ships with an editor, a debugger, and a number of wizards. Finally, the Eclipse Technology Project scouts future directions for the Eclipse platform... Most of the additions to the Eclipse platform and the JDT are associated with editing, but there are also a host of incremental modifications in code generation (specific to the JDT) and enhancements to the PDE. Many of the latter additions provide tools that help developers discover and navigate among plug-in dependencies. These tools will be most helpful to those who want to write Eclipse plug-ins, but not to the larger group who will use Eclipse as an IDE. Individually, each addition or enhancement is reasonable, even downright clever. But each adds its volume to an already densely-packed and increasingly daunting UI, which can become overwhelming if you don't monitor its growth. 
It gets a solid thumbs-up, but Eclipse will have to extend itself cautiously, lest it begin to stagger under the weight of all its spiffy features..."

  • [June 24, 2003] "Intel, Universities Create World Network." By Michael Kanellos. In CNET News.com (June 23, 2003). "Intel, Princeton University, the University of California at Berkeley, and a host of other academic and industrial heavyweights have banded together to take the lag out of getting data from halfway around the world. PlanetLab is an experimental network that sits on top of the Internet and will allow researchers and others to test and build applications that can essentially span the globe. Work accomplished at PlanetLab is expected eventually to permit sites to broadcast video from computers located around the world in a coordinated fashion to swarms of users simultaneously without bogging down access. Similarly, virus hunters should be able to detect the spread of new viruses or denial-of-service attacks early... When completed, the network will consist of 1,000 servers geographically dispersed around the world. Participating researchers will then be able to use slivers of the network as a test bed for optimizing applications to run on multiple computers or pull data out of multiple storage systems. A substantial portion of the work will revolve around trying to compensate for traffic patterns and delays that crop up on the Internet... Creating a network segregated from the mass of the Net also allows researchers to examine solutions for structural problems with the Web itself. 'The Internet has become more rigid and far more brittle,' said David Culler of Intel Research. 'That structure limits how much you can morph or change... Typically, applications are built on a few massive servers.' The network, which was conceived in March 2002, consists of 160 computers dispersed to 65 sites in 16 countries. The machines run a modified version of Red Hat's Linux. The consortium expects to have 300 machines running by the end of the year. The full 1,000-computer network is expected to be complete in a few years..."

  • [June 24, 2003] "FDIC Project Tries Out XBRL." By Diane Frank. In Federal Computer Week (June 18, 2003). "Extensible Business Reporting Language (XBRL) is largely an untested standard for government and industry in the United States, but a new modernization project at the Federal Deposit Insurance Corp. could give XBRL some momentum. The FDIC's Call Report Modernization Project will use XBRL -- a variation on the standard Extensible Markup Language agencies use to enhance electronic transactions and communications. The FDIC is turning to the format to ensure that the agency and the banks it oversees are working with the same forms and processes, said Phil Walenga, assistant director of the FDIC's bank technology group. He was speaking June 17, 2003 at the Making E-Government Count: The New Era of Financial Management and Reporting conference hosted by the Council for Excellence in Government in Washington, D.C. Agencies are familiar with XML's advantages when it comes to interoperability and exchanging data. Some of the cross-agency e-government initiatives, such as Grants.gov, are developing their own community-specific XML schemas. XBRL includes the ability to add extra 'modules' to schemas, such as definitions of business processes in addition to data definitions, so it has the potential to serve many more needs, Walenga said. The federal Joint Financial Management Improvement Program endorsed XBRL in late 2001 for financial reporting, and the Call Report Modernization project already includes the Federal Reserve and the Office of the Comptroller of the Currency. But the standard is intended for any business function, stressed John Turner, vice chairman of the domain working group on the XBRL International Steering Committee. The FDIC also is talking to other agencies, including the Census Bureau, about the benefits of using XBRL, Walenga said..." 
See: (1) the announcement "Federal Banking Regulators Award Unisys Outsourcing Contract to Transform the Collection of Bank Data. Agencies to Employ Web Services and XBRL to Streamline Bank Call Report Processing."; (2) general references in "Extensible Business Reporting Language (XBRL)."
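For orientation, an XBRL instance document pairs reported facts with contexts (which entity, which period) and units, which is what would let a regulator like the FDIC validate Call Report data mechanically. The sketch below uses the XBRL 2.1 instance syntax (finalized later in 2003); the call: taxonomy namespace, the element name, and the identifier scheme are all invented for illustration, not the actual Call Report taxonomy:

```xml
<xbrl xmlns="http://www.xbrl.org/2003/instance"
      xmlns:link="http://www.xbrl.org/2003/linkbase"
      xmlns:xlink="http://www.w3.org/1999/xlink"
      xmlns:iso4217="http://www.xbrl.org/2003/iso4217"
      xmlns:call="http://example.com/call-report">
  <link:schemaRef xlink:type="simple" xlink:href="call-report.xsd"/>
  <context id="q2-2003">
    <entity>
      <identifier scheme="http://example.com/fdic-cert">12345</identifier>
    </entity>
    <period><instant>2003-06-30</instant></period>
  </context>
  <unit id="usd"><measure>iso4217:USD</measure></unit>
  <call:TotalDeposits contextRef="q2-2003" unitRef="usd"
      decimals="0">250000000</call:TotalDeposits>
</xbrl>
```

The 'modules' Walenga mentions correspond to the linkbases an XBRL taxonomy can attach to such elements: labels, references to regulatory definitions, and calculation relationships among line items.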

  • [June 24, 2003] "Service-Oriented Architecture Should Guide Migration to Web Services." By Gerrie Swart. From ITWeb (June 24, 2003). "Web services has been the subject of much discussion, hype and promotion by the software industry and analysts, but companies that are adopting these technologies may encounter unexpected complexity if their architecture is not closely controlled. That's the view of Gerrie Swart, product manager at Compuware SA, who says service-oriented architecture (SOA) is necessary for the manageable and accountable roll-out of Web services. He explains that SOA is a set of structural features that go deeper into the corporate IT system than Web services. 'Where Web services address the communications mechanisms to allow functions to be made public, SOA addresses the layering and structure of the software stack. By moving towards SOA through exposing core services in a loosely coupled way, companies can reduce complexity, improve re-use and enable agility,' he says. Key features of SOA include platform-independent interfaces, loose coupling of applications, business level granularity, and discoverability of applications. 'Business processes are comprised of a number of steps. By increasing the granularity of these steps, extra functionality can be realised as each sub-process can be used by different applications through Web services,' says Swart. For example, checking stock levels is a sub-process of order processing, and can also be used for stock replenishment functions and potentially other business processes... Compuware's OptimalJ product is an application development environment that fully implements a model-driven approach to application design and development, while encapsulating SOA and Web services standards. Model-driven tools start with a logical business model and automatically transform this into an application. 
In doing this, OptimalJ automatically generates applications closely aligned to the business model while conforming to SOA and supporting Web services..."

  • [June 24, 2003] "Trusting ID Management Technology. The Escalating Need for Identity Management Systems is Driving Privacy Concerns to the Forefront." By Jack McCarthy and Brian Fonseca. In InfoWorld (June 20, 2003). "When companies forge partnerships with suppliers, clients, and customers, they expose their systems to security breaches not only by their own employees but their partners' employees as well. How can a chief technologist gain control over access to a company's secure resources? The answer seems to lie in a robust identity management system, which gathers and manages employees' personal data, ensures the approval of those whose data is being used, and offers ironclad security. On the surface, identity management offers many protections, but lurking beneath are the many thorny issues still surrounding privacy and trust... Two groups are driving the push for identity management. The Liberty Alliance, which now includes more than 160 technology vendors and end-user companies, is building open technical specifications that enable information-sharing relationships among employees, customers, and partners. The second 'group' stems from an informal arrangement Microsoft and IBM have with each other to also work out federated network identity security standards based instead on Web services standards such as WS-Security. Although the Liberty Alliance and the Microsoft-IBM joint venture both endorse a federated identity management model of creating trusted groups of partners and clients, the two camps differ in their approach. Microsoft and IBM are building a set of Web services security standards for operating systems and applications servers and platforms. They are also pursuing SSO (single sign-on) security systems such as in Microsoft's .Net Passport and IBM's Tivoli Identity Manager. In contrast, the Liberty Alliance is developing its own set of open specifications and solutions for federated identity management that securely share applications. 
In the future, the Liberty Alliance hopes to also construct open specifications that will link its efforts to the Web services being developed by IBM and Microsoft..."

  • [June 24, 2003] "J2EE 1.4 Eases Web Service Development. Java's New Web Service Client and Server Programming Models." By Frank Sommers. In Java World (June 20, 2003). "The latest J2EE (Java 2 Platform, Enterprise Edition) specification, version 1.4, makes Web services a core part of the Java enterprise platform. A set of JSRs (Java Specification Requests) in the Java Community Process defines how J2EE components can become Web services and how existing enterprise Java applications can invoke Web services, and adds new interoperability requirements for J2EE containers... Perhaps the most significant, and most consequential, additions to J2EE are the new interoperation requirements. The requirements prescribe support for SOAP (Simple Object Access Protocol) 1.1 in the J2EE presentation layer to facilitate XML message exchange. J2EE 1.4-compliant containers must also support the WS-I (Web Services Interoperability Consortium) Basic Profile. Since XML message exchange in J2EE depends on JAX-RPC, the JAX-RPC specifications now mandate WS-I Basic Profile support as well. The result is that a J2EE 1.4-based application can be invoked as a Web service, even from applications not written in the Java programming language. While that is an evolutionary step for J2EE, since the platform has long embraced non-Java based systems, it is possibly the most direct way to facilitate interaction with Windows-based technologies that rely on .Net. A J2EE-based service's client does not have to be aware of how a service is implemented. Rather, that client can use the service by relying entirely on the service's WSDL (Web Services Description Language) definition... While the J2EE specs do not spell out the exact mechanics of such interaction, J2EE 1.4's embrace of the WS-I Basic Profile, which Microsoft also claims to follow, will likely make J2EE-.Net interaction common..."

  • [June 24, 2003] "Slowly Weaving Web Services Together." By Alex Salkever and Olga Kharif. In BusinessWeek (June 24, 2003). As of the summer of 2003, "Web services remains a buzzword, but hardly of the stature Bill Gates implied when he called .Net possibly a 'bet the company' initiative. According to a report issued in June by consultancy and full-time Colossus of Redmond watcher Directions on Microsoft: 'Three years later, most of the hopes behind the .NET initiative have not been realized.' The intelligent software that was supposed to weave all types of devices into the fabric of human life hasn't taken hold. Fear of Microsoft has scuttled most of the giant's plans to host customers' critical data. True, businesses are starting to adopt the Web-services model for linking their internal systems and talking to customers. Yet even in that respect, this approach is proving to be more evolutionary than revolutionary. In fact, Web services has become mostly about making software integration easier, faster, and cheaper. That alleviates a pain point for many companies, but it's hardly the dawn of a new era... Instead of exploding, the Web Services movement to help disparate computer systems easily communicate is gaining in fits and starts. Still, it'll likely have a powerful impact. The key piece of Web services is XML, a format for transmitting data that any Internet-connected device can read. Web services has its dark sides, of course; it leaves company networks more porous. As larger numbers of outsiders log into key systems, sifting out friends and foes becomes a more difficult task. From inauspicious beginnings, Web services will emerge as one of the Net's most meaningful contributions..."

  • [June 24, 2003] "Xerox Previews DocuShare Add-On. Company Adds Collaboration Features to Document Management Software." By Marc Ferranti. In InfoWorld (June 23, 2003). "Xerox used the CeBIT America show last week to stage a sneak-peek demonstration of a new add-on program designed to give workgroup and collaboration features to the company's DocuShare Web-based document and content management software. The new add-on program, dubbed Sparrow, is intended to let geographically dispersed users share and annotate documents over the Internet, according to Colman Murphy, DocuShare product manager. It will be available when DocuShare 3.1 is released in September, he added... Once users gain access to Sparrow over the Web, they see documents placed within Sparrow-formatted templates. The templates are designed to let users click on documents, annotate and edit them. The changes are immediately reflected in the documents in Sparrow. Like DocuShare, Sparrow is based on the Java programming language, and it has automated features designed to let users convert HTML (Hypertext Markup Language) documents and insert them into program templates, Liang said. Sparrow uses the DocuShare database to store documents, and requires DocuShare to run. As such it is mainly targeted at the current crop of 40,000 DocuShare users. However, some potential users may be attracted to DocuShare specifically because of Sparrow. For example, beta-tester Phyllis Jacobson, a consultant in Sacramento, Calif., who works for the state's Commission on Teacher Credentialing, said the state agency will acquire DocuShare on the basis of its test project with Sparrow..."

  • [June 24, 2003] "Microsoft Readies Kit for Security Initiative. Developers to Get Early Look at NGSCB." By Paul Krill. In InfoWorld (June 19, 2003). "At the October Microsoft Professional Developers Conference, Microsoft plans to release a preliminary software development kit for its Next-Generation Secure Computing Base (NGSCB) security technology, also known as Palladium. The kit will give developers an early opportunity to work with the NGSCB code in preparation for developing applications that take advantage of the technology, according to Microsoft. The company hopes to introduce NGSCB itself in the Longhorn version of the Windows client operating system, which is due in 2005... The kit will be an API set that functions with 'standard' programming languages, Microsoft officials said. NGSCB is intended to provide for trusted operations on a PC and requires changes to the Intel CPU architecture, meaning users would need to buy new PCs to take advantage of the technology. Microsoft is working with Intel on redesign of some CPU, chipset, and I/O components that would be required to accommodate NGSCB, Juarez said. NGSCB focuses on enabling strong process isolation, sealed storage, a secure I/O path to and from the user, and attestation. Attestation, according to Microsoft, is the ability for a piece of code to digitally sign or attest to a piece of data and further assure the signature recipient that the data was constructed by an unforgeable, cryptographically identified software stack... NGSCB provides an environment for building a trusted infrastructure, he said. It is initially eyed for Windows clients, with servers to be a focus afterward, he added. But the technology has been criticized as potentially curtailing users' control over their own PCs, potentially eroding fair-use rights for digital music and movie files.
Juarez said Microsoft's intention is not to build an overarching digital rights management scheme with NGSCB, but acknowledged that it could be used for that purpose. NGSCB is first intended for enterprise business and government use and will not make its way to home or consumer use for some time after that, said Juarez... An analyst said NGSCB may have a limited market, for applications such as financial and government systems, but it is not about digital rights management. 'I don't think this is a backhanded, sneaky attempt to foist DRM on the market,' said the analyst, Matt Rosoff, of Directions on Microsoft, an independent research firm in Kirkland, WA..."

  • [June 24, 2003] "CSS 3 Selectors." By Russell Dyer. From XML.com (June 18, 2003). "Although the promise of Cascading Style Sheets (CSS) has been wondrous, the progress has been wanting. As with all W3C standards, there is the lengthy discussion process conducted by the related working group, then the problem of implementation by web browser vendors, and finally the unpredictable period of time for people to update to new versions of their browser. These steps can take a year or two each. To expedite the process, the CSS working group has started grouping related features into modules and releasing them separately, without waiting for completion of all modules. This allows browser vendors to proceed with implementation of CSS updates with the confidence that standards won't be changed by the time the full release of CSS is approved. Ideally the result will be to reduce the process by a year or more. One CSS module that has recently been moved to Recommendation (that is, finished) status by the CSS working group is the Selectors module. Although it was completed just a few weeks ago, much of it has already been implemented by some of the browser vendors. The vendors seem very keen to expedite the Selectors module since it can improve HTML and XML document design markedly and with fairly little effort. As CSS grows, more selectors are adopted, although some can be pruned. This article will review all currently approved selectors. It will address the surviving CSS1 and CSS2 selectors in brief and the new CSS3 selectors in more detail... A cascading style sheet is used to set or change the styles of a markup document like a web page. Style sheets contain rules that web browsers follow when loading a web page, an XML document, or any document written in a compatible markup language. A style sheet rule is composed of two basic components: a selector and a declaration. The selector is an identifier of a markup tag..."
See also: (1) Cascading Style Sheets: The Definitive Guide, by Eric Meyer; (2) W3C Cascading Style Sheets (CSS) Home Page; (3) general references in "W3C Cascading Style Sheets."
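The selector-plus-declaration structure of a rule described above can be made concrete with a small sketch. This is an illustrative stdlib-Python parse of one rule; the rule text and the regex are invented for the example and are not part of the CSS specification:

```python
import re

# A CSS rule pairs a selector with a declaration block:
#   selector { property: value; property: value; }
rule = "p.note > a:first-child { color: navy; text-decoration: none; }"

# Split the rule into its two components (a toy parse, not a full CSS parser).
match = re.match(r"\s*(?P<selector>[^{]+?)\s*\{(?P<body>[^}]*)\}", rule)
selector = match.group("selector")
declarations = {
    prop.strip(): value.strip()
    for prop, value in (
        part.split(":", 1) for part in match.group("body").split(";") if part.strip()
    )
}

print(selector)      # p.note > a:first-child
print(declarations)  # {'color': 'navy', 'text-decoration': 'none'}
```

Here `p.note > a:first-child` is the selector (using the CSS2 child combinator and a pseudo-class), and the braces hold the declarations the browser applies to matching elements.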

  • [June 24, 2003] "Transforming XML with PHP." By Bruno Pedro. From XML.com (June 18, 2003). "This article compares two methods of transforming XML in PHP: PEAR's XML_Transformer package and the W3C XML transformation language XSLT. I will first describe the PEAR project and its philosophy, with a focus on its XML transformation techniques. I will then give a brief introduction to XSLT and the way to use it from PHP. PEAR's main goal is to become a repository for PHP extensions and libraries. Its members try to standardize the way developers write portable and re-usable code. PEAR offers a wide variety of packages ready for use by PHP developers. Most PEAR packages are subclasses of the standard base classes. One of these packages is the XML_Transformer. This package was created to help you transform existing XML files with the help of PHP code. XSLT stands for 'Extensible Stylesheet Language Transformations' and is a W3C Recommendation. As most readers know, it is a powerful implementation of a transformation language for converting XML into either XML, HTML or simple text. While you need PEAR to use XML_Transformer, XSL transformations can be processed internally by PHP. PHP offers XSLT functionality at its core, making it easy to incorporate transformation features into existing code. As you can see, both technologies can transform XML files... While PEAR::XML_Transformer gives you greater flexibility through the use of PHP, XSLT is easier to use by non-programmers. XML_Transformer's approach lets you associate an XML element's opening and closing tags with specific functions. XSLT's transformation is tightly coupled with the XML tree. If you plan to build your own set of namespaces and associated PHP libraries, then I think XML_Transformer is the way to go. If you want to give other people the ability to create custom transformations, then I recommend XSLT..."
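The tag-to-function mapping that distinguishes XML_Transformer from tree-oriented XSLT is easy to sketch. The following is an illustrative Python analogue, not PEAR code; the element names and handler functions are invented for the example:

```python
import xml.etree.ElementTree as ET

# In the spirit of PEAR::XML_Transformer: element names are mapped to
# handler functions that rewrite the element in place (names invented).
def transform_bold(elem):
    elem.tag = "strong"  # rewrite <bold> as <strong>

def transform_link(elem):
    elem.tag = "a"
    elem.set("href", elem.attrib.pop("url", "#"))

handlers = {"bold": transform_bold, "link": transform_link}

def transform(root):
    for elem in root.iter():  # visit every element once
        handler = handlers.get(elem.tag)
        if handler:
            handler(elem)
    return root

doc = ET.fromstring('<p>See <link url="http://example.org">this</link> <bold>now</bold></p>')
out = ET.tostring(transform(doc), encoding="unicode")
print(out)  # <p>See <a href="http://example.org">this</a> <strong>now</strong></p>
```

An XSLT stylesheet would express the same rewrite declaratively as templates matched against the tree, which is why the article recommends it when non-programmers author the transformations.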

  • [June 24, 2003] "Archivists Say Computers Have No Sense of History." By Kevin Coughlin. In Star-Ledger News (June 19, 2003). "A scientific journal is warning that electronic record-keeping is no match for paper and ink when it comes to preserving history... The bottom line: Future historians may be clueless about the past without a systematic and reliable way of keeping electronic records. 'The pen may be mightier than the sword, but a single mouse-click can destroy products of inestimable value,' a trio of scientists contends in the current issue of Nature. John Carlin, archivist of the United States, claims the republic hangs in the balance. A 'democracy without open access to its government's records is no longer a democracy,' he told computer scientists in 2001. How to store, retrieve and interpret electronic documents in perpetuity -- regardless of changing technologies -- is an urgent question for the National Archives, the Library of Congress, and a global consortium called InterPARES. That's short for International research on Permanent Authentic Records in Electronic Systems. They are searching for 'electronic amber,' a digital equivalent of the resin that has preserved fossilized insects for millions of years... One way or another, the government is obligated to hold on to certain records for a very long time. Carlin said it may be necessary to keep tabs on radioactive materials from nuclear plants for 100,000 years. Authenticating electronic documents and preserving the appearance of original materials are key challenges. (The Declaration of Independence loses something as a simple text file.) Officials from the National Archives and Records Administration are working with the San Diego Supercomputer Center and the Georgia Tech Research Institute as well as the National Science Foundation, the Department of Defense and the Patent and Trademark Office. They hope to have a prototype system for 'persistent archives' in a year or two. 
One promising approach involves XML, or Extensible Markup Language. It's a system that encodes digital tags for describing and sorting electronic documents..."

  • [June 21, 2003] "Mobile Subset of XML Schema Part 2." ISO Document for information and review. Produced by SC34 Japan for ISO/IEC JTC 1/SC34/WG1: Information Technology -- Document Description and Processing Languages -- Information Presentation. Project 19757-5 (Project Editor, Martin Bryan). ISO Reference: ISO/IEC JTC 1/SC34 N 0410. April 22, 2003. Excerpts: "We propose to create a compact and reliable subset of W3C XML Schema Part 2 and publish it as an ISO standard. The main target of this subset is mobile devices (such as cellular phones). Mobile devices are expected to use XML in the near future. Small XML parsers have been developed already. Validators for schema languages are expected to follow, and a prototypical validator for RELAX NG on mobile phones has been developed. Such parsers and validators will hopefully be used for implementing XForms and Web Services on mobile devices. Part 2 of W3C XML Schema provides a set of datatypes and facets. Although it might not be perfect, it is likely to be widely used by many XML applications including mobile applications. We just cannot believe that an incompatible set of general-purpose datatype (e.g., int) libraries will be accepted by the market. However, datatypes and facets of W3C XML Schema Part 2 are too complicated for mobile devices. Some specifications such as XForms have already created their own subsets of W3C XML Schema Part 2. However, if different specifications introduce different subsets, interoperability will be significantly impaired. It would be much nicer if one subset is internationally standardized... [In the] choice of datatypes we omit: (1) datatypes requiring infinite precision; (2) datatypes that do not have obvious mapping to J2ME; (3) archaic datatypes such as IDREFS, ENTITY, ENTITIES, and NOTATION; (4) unsolid datatypes -- dateTime and so forth; (5) datatypes such that validity depends on namespace declarations...
[In the] choice of facets we omit: [1] the pattern facet, which requires the property list of Unicode characters; [2] whitespace, which does not affect validity but controls PSVI; [3] totalDigits and fractionDigits. Implementation considerations: We have studied the source code of Jing implementation by James Clark. We believe that if the above restrictions are accepted, an implementation of the remaining datatypes and facets will require less than 20KB as the size of a JAR file." See details for the proposed list of datatypes and facets in 'Table 1: The list of datatypes' and 'Table 2: The list of facets'. See: (1) XML Schema Part 2: Datatypes, W3C Recommendation 02-May-2001; (2) general references in "XML Schemas." [cache]
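The "obvious mapping to J2ME" criterion above can be illustrated with xs:int, a datatype the subset would keep because its value space is exactly the 32-bit range of a J2ME int. The following is a minimal, illustrative Python validity check, not code from the proposal:

```python
import re

# xs:int per XML Schema Part 2: an optionally signed decimal integer
# whose value fits in 32 bits, i.e. -2147483648 .. 2147483647.
INT_LEXICAL = re.compile(r"[+-]?\d+\Z")

def is_valid_xsd_int(text):
    """Check both the lexical space and the 32-bit value space of xs:int."""
    if not INT_LEXICAL.match(text):
        return False
    return -2**31 <= int(text) <= 2**31 - 1

print(is_valid_xsd_int("2147483647"))  # True: the largest xs:int
print(is_valid_xsd_int("2147483648"))  # False: overflows 32 bits
print(is_valid_xsd_int("1.5"))         # False: not in the lexical space
```

A datatype like xs:integer, by contrast, requires arbitrary precision, which is exactly the kind of requirement the proposal omits for small devices.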

  • [June 17, 2003] "Documents Revisited: eCommerce for Everyone." By Jon Bosak (Sun Microsystems). Presented at the "Technology Roadmap Seminar: Extracting IT from XML and Web Services," hosted by Enterprise Ireland (Glasnevin, Dublin), 12-May-2003. This document represents a slightly revised version of the slide presentation used in Bosak's XML Europe 2003 Opening Keynote Address "The Universal Business Language (UBL)." Available also in OpenOffice format. 24 pages. "[Here is] The Big Picture: the UN Millennium Goal Number 8 is to 'develop further an open trading and financial system that is rule-based, predictable and non-discriminatory and includes a commitment to good governance, development and poverty reduction -- both nationally and internationally.' A fair and open global electronic marketplace would: allow big companies to extend their electronic trading relationships to small and medium-size companies; allow small companies to participate on an equal basis; put small nations on an equal commercial footing with big ones; and solve a lot of economic and social problems. So... how do we do this? The problem is the cost of entry: most e-commerce is B2B, most B2B is EDI, and EDI is expensive, so most small businesses are locked out... The addition of a few key enablers to the ubiquitous Internet can bring small and medium-sized businesses into global e-commerce: (1) A document-centric architecture; (2) A standard royalty-free XML B2B tag set; (3) A standard royalty-free B2B infrastructure; (4) A standard royalty-free office productivity format; (5) Free open-source software to make it all happen. A document-centric architecture envisions that business is built on the concept of standard, legally binding documents. Putting business as we understand it online means putting those documents online... Business is about intent and intent is about meaning. The hard part of e-commerce is the definition of shared semantics.
Documents are the universally understood way to convey semantic information. Documents keep humans in the loop [in terms of] error handling, legal action, and records management (audit trail). The system has to remain transparent to humans. XML document standardization is the way to do this... Enabler #2 is a common business tag set, viz., UBL, defining a library of standard electronic business documents. It plugs directly into existing traditional business, legal, and records management practices and eliminates re-keying of data in existing fax- and paper-based supply chains; it fills the payload slot in B2B frameworks such as the UN/OASIS ebXML initiative and various 'web services' schemes, extending global trade to businesses of all sizes..." [adapted; see the full discussion of the five key enablers.] See also: (1) "Bosak on Universal Business Language" by Simon St.Laurent, and similarly (2) "Reports from XML Europe 2003." General references in "Universal Business Language (UBL)."

  • [June 17, 2003] "On Querying Geospatial and Georeferenced Metadata Resources in G-Portal." By Zehua Liu, Ee-Peng Lim, Wee-Keong Ng, and Dion Hoe-Lian Goh (Nanyang Technological University, Singapore). Presented in Session 8b, "Designing and Accessing Scientific Digital Libraries" at JCDL 2003. In Proceedings of the Third ACM/IEEE Joint Conference on Digital Libraries (JCDL 2003), Houston, Texas, USA, May 27 - 31, 2003. With 17 references. "G-Portal is a web portal system providing a range of digital library services to access geospatial and georeferenced resources on the Web. Among them are the storage and query subsystems that provide a central repository of metadata resources organized under different projects. In G-Portal, all metadata resources are represented in XML (Extensible Markup Language) and they are compliant with some resource schemas defined by their creators. The resource schemas are extended versions of a basic resource schema making it easy to accommodate all kinds of metadata resources while maintaining the portability of resource data. To support queries over the geospatial and georeferenced metadata resources, an XQuery-like query language known as RQL (Resource Query Language) has been designed. In this paper, we present the RQL language features and provide some experimental findings about the storage design and query evaluation strategies for RQL queries. There are several digital library projects focusing on geospatial and georeferenced information available on the Web. Among them are Alexandria Digital Library (ADL), Alexandria Digital Earth ProtoType (ADEPT), Digital Library for Earth System Education (DLESE), Geospatial Knowledge Representation System (GKRS) and GEOREP. G-Portal is quite distinct compared to these systems due to the following novel features: (1) It supports a storage subsystem for metadata resources represented in XML. The resources are organized into projects and they are compliant with some resource schemas.
The adoption of XML and schemas ensures that G-Portal can support a wide variety of metadata resources. (2) G-Portal provides both map-based and classification-based user interfaces to access the metadata resources. The former is suitable for geospatial resources with location information while the latter caters for all kinds of resources with or without location information. The two interfaces coexist and are synchronized. That is, resources selected using the map-based interface will also be highlighted in the classification-based interface, and vice versa. (3) G-Portal allows different classification hierarchies to be defined for metadata resources according to the needs of different digital library user communities. This is in stark contrast to traditional digital libraries adopting only a single classification hierarchy for cataloging data. (4) G-Portal provides user annotation facilities on the metadata resources to support better knowledge sharing. By treating annotations as a type of metadata resource, the annotations can be accessed in the same way as other metadata resources..." [cache]

  • [June 17, 2003] "Intel and Sun Add X Factor to Mobile Java." By Rob Bamforth (Bloor Research). In The Register (June 17, 2003). "There was a time when Java was seen by many as too slow for desktop applications, let alone mobile devices... Over the years two things have happened to greatly improve performance. The general implementation of Java has evolved and matured. Secondly, widespread adoption has led to unexpected collaborations. This is especially true where the caffeine hits the silicon, with Java optimised for hardware. However the most recent collaboration is as surprising as it is important. Intel and Sun, who are direct competitors in several areas, have jointly announced a collaborative effort to optimise Java for the Intel XScale silicon technology. XScale technology is designed for low milliwatt-to-MIPS ratios, giving longer battery life with good compute performance for mobile devices. XScale processors include the PXA255 and PXA26x plus the recently introduced PXA800F cellular processor. Devices based on XScale technology include Palm Tungsten C, Fujitsu Siemens Pocket LOOX, HP iPAQ 5400, Motorola A760, Dell Axim X5, Toshiba e550 series and several of the Sony CLIEs. Sun and Intel are optimising the Hotspot virtual machine (VM) for Sun's Java definition aimed at mobile devices known as the Connected Limited Device Configuration (CLDC). CLDC is one of the two configurations of the Java 2 Platform, Micro Edition (J2ME), the lighter-weight specification of Java for smaller devices. The Hotspot VM is designed to try to get the best software performance for Java by a number of optimisation techniques aimed at the key areas (hot spots) of code which will affect performance the most... This collaboration is significant because of Intel's recognition of Java as a popular and widespread software platform for real applications that demand the best performance out of the underlying hardware.
It's good news for the developers of mobile Java applications, as it will give them more options for innovation in performance-hungry multimedia applications..."

  • [June 17, 2003] "Manipulate XML Data Easily with the XPath and XSLT APIs in the .NET Framework." By Dino Esposito. In Microsoft MSDN Magazine (July 2003). "XPath is emerging as a universal query language. With XPath, you can identify and process a group of related nodes in XML-based data sources. XPath provides an infrastructure that is integral to XML support in the .NET Framework. The XPath navigation model is even used under the hood of the XSLT processor. In this article, the author reviews the implementation details of the XPath navigator and the XSLT processor and includes practical examples such as asynchronous transformations, sorted node-sets, and ASP.NET server-side transformations... He analyzes XPath as the language of choice for performing XML queries in managed applications and discusses several aspects of its implementation. In the .NET Framework, the XPath runtime provides a common infrastructure for other pieces of the puzzle -- the first of which is XSLT. He also examines key aspects of the XSLT processor and supplies a few interesting applications of it such as asynchronous processing and the ASP.NET control... One of the key advantages of XML is that it lets you mark portions of text with tags and attributes. You tag data inside text because you plan to retrieve it later. So how do you do this? You use XPath. Though it doesn't have an XML-based syntax, XPath is the language defined to address parts of an XML document in a compact, relatively straightforward way. More importantly, XPath defines a common syntax so you can retrieve nodes from within classes that implement the XML document object model (DOM) as well as from XSLT. In the Microsoft .NET Framework, the XPath query language is fully supported through the classes defined in the System.Xml.XPath namespace. The .NET Framework implementation of XPath is based on a language parser and an evaluation engine. The overall architecture of an XPath query is similar to that of a database query.
Just as with SQL commands, you prepare XPath expressions and submit them to a runtime engine for evaluation. The query is parsed and executed against an XML data source. Next, you get back some information representing the resultset of the query. An XPath expression can return a node-set (that is, an ordered collection of nodes), a Boolean value, a number, or a string. In this article, I'll show how XPath integrates with the XmlDocument class and XSLT in the .NET Framework. I'll also dig into the intricacies of the XPathNavigator class, used by the .NET Framework to walk through XML documents..."
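The node-set result type described above is easy to see in miniature. This sketch uses Python's standard-library xml.etree.ElementTree, whose findall() evaluates a limited XPath subset; it stands in for, and is not, the .NET System.Xml.XPath classes the article covers, and the document is invented:

```python
import xml.etree.ElementTree as ET

doc = ET.fromstring("""
<orders>
  <order id="1"><total>120</total></order>
  <order id="2"><total>80</total></order>
</orders>
""")

# An XPath expression with a predicate, evaluated against the tree,
# returns a node-set: the ordered list of matching elements.
nodes = doc.findall("./order[@id='2']/total")
print([n.text for n in nodes])  # ['80']
```

A full XPath engine like the .NET XPathNavigator can additionally return numbers, booleans, and strings from expressions such as `count(//order)`; ElementTree's subset handles only location paths with simple predicates.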

  • [June 17, 2003] "X2EE." By Steven P. Punte. x2ee White paper. June 2003. "It's all about abstraction: conveying the most concepts with the fewest number of words... Programming software languages are artifacts of our ubiquitous von Neumann CPU architectures, expressing their control constructs originally with binary machine languages and evolving from there. The question still remains: 'What is software programming?' In other words, is the purpose of a software language to instruct a machine? Or is it to capture a solution? This article stipulates that the evolutionary destination of software programming languages is to capture business problem solutions (here the term 'business' is used abstractly to mean any software application problem) in the most concise form possible. Simply put, it is what we want computers to do. Very few people are actually concerned with symbolic variables, stack conventions, encoding formats, etc... Instead the true purpose of software is to simply express solutions. X2EE is an evolutionary step towards this goal. X2EE is not yet a standard nor a body of software you can download, rather it is a crystallizing vision of where the current omnipresent XML-centric trend is leading. In this article, we explore how emerging software platforms are already proceeding down the path of expressing entire software solutions to an ever-greater extent in XML... X2EE is a next generation software vision that relinquishes the object-oriented paradigm and adopts a pure XML perspective for software design and solutions. All data is viewed as XML documents, and all processes are viewed as XML transformations. The common denominator throughout this framework and consistent with many software efforts is the 'XML Message Event' -- an XML document being delivered from one software component to another as an event. Like its earlier procedural language namesake cousin J2EE, X2EE's target market is enterprise business-caliber solutions.
Unlike its predecessor, X2EE is not an extension of a procedural software language, but rather a bottom-up approach based on the patterns and success of J2EE. In a nutshell, X2EE is: (1) An anthology of standardized XML Component Definitions whose behaviors are specified by XML business logic documents: i.e. 'XML Directed Software.' (2) A framework on which XML centric components are assembled. (3) A collection of sufficient open definitions and standards for third-party vendors to develop compliant components and frameworks and for users to construct portable multi-vendor solutions... XML is serving as a vehicle for more condensed, abstract, and expressive solutions for a wide variety of products and platforms. However, while there is significant effort underway... to develop agreed-upon messaging ontologies, on the whole there is not yet any widely accepted XML Component standard or a Component Assembly framework. An opportunity exists to unify this new abstraction domain..."
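The 'XML Message Event' idea, a document in and a transformed document out, can be caricatured in a few lines. Since X2EE defines no concrete API, every name in this Python sketch is invented for illustration:

```python
import xml.etree.ElementTree as ET

# Toy X2EE-style component: it consumes one XML message event and
# emits a new XML document. Component and element names are invented.
def price_quote_component(event_xml):
    """Consume an XML message event, return a transformed XML document."""
    event = ET.fromstring(event_xml)
    reply = ET.Element("QuoteResponse")
    for item in event.findall("Item"):
        line = ET.SubElement(reply, "Line", sku=item.get("sku"))
        line.text = str(int(item.get("qty")) * 10)  # flat unit price of 10
    return ET.tostring(reply, encoding="unicode")

response = price_quote_component('<QuoteRequest><Item sku="A1" qty="3"/></QuoteRequest>')
print(response)  # <QuoteResponse><Line sku="A1">30</Line></QuoteResponse>
```

The point of the vision is that components like this would be wired together by XML business-logic documents rather than by procedural glue code, with the framework routing each emitted document to the next component as a fresh event.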

  • [June 17, 2003] "Becoming NetCentric: Leveraging an Information Network of Communities of Interests, Architectures, and Ontologies." By Bruce Peat (eProcess Solutions). 7 pages. Also available from the Defense Finance and Accounting Service website. "Organizations need to take information asset management up a notch. To do this, they need to be opportunistic with their information. Organizations need to take a closer look at how to foster communication among users by promoting the clustering of context-based communities or Communities of Interests (CoIs). Collectively CoIs are a critical information network topology that reaches beyond the scope of the organization to include partners and stakeholders. CoIs allow information to scale globally, persist indefinitely and be distributed to almost any community for adoption. With a netCentric approach, these communities are scoped and managed, allowing for a scalable alternative to today's typical broad swipes of enterprise architecture and standard language development. These communities allow for alignment of concepts by leveraging the common features and mitigating the differences within a proper size and scope... With the netCentric view, the metadata market is a network of CoIs all interacting in the Network Economy. If your organization has joint ventures, partnerships, outsourcing arrangements, licensing agreements, and/or supply-chain exchanges defined as extended relationships, then your organization should begin to move toward managing your information resources with the network economy model. In this model the individuals are partners establishing relationships as part of the information value-chain. This shift in thinking to this netCentric view is critical in understanding alliances within the organization's real environment. In this model, the metadata supporting the CoIs and business agreements or contracts takes prime importance...
Synthesis as opposed to analytical decomposition is a particularly critical part of strategic thinking. Synthesis permits the discovery of the whole that is greater than the sum of the parts. The Business-Centric Methodology (BCM) exploits the synthesis of Communities of Interests, architectures and ontologies to harness tacit knowledge to facilitate communication, sharing and innovation. Understanding how to use this synthesis and the steps outlined to extract order from an organization's chaos through a methodology in a proactive rather than a reactive manner is a means to an organization's success..." See: (1) OASIS BCM TC website; (2) "Business-Centric Methodology"; (3) "OASIS Forms Business-Centric Methodology Technical Committee."

  • [June 17, 2003] "XML and Unicode: Mix With Care." By Paul Festa. In CNET News.com (June 16, 2003). "The character set that lets computers write in every language from Czech to Chinese could make Web browsers tongue-tied, two standards groups warned... Published by the Unicode Consortium, Unicode is a standard character set for computers that aims to assign a number for every character in every written language. XML (Extensible Markup Language), a World Wide Web Consortium (W3C) recommendation for marking up digital documents and creating new markup languages for specific tasks or industries, relies on Unicode and closely tracks its revisions. But a technical report released by the Unicode Consortium -- and simultaneously published as a note by the W3C's internationalization activity -- warns document authors that some Unicode features are going to cause XML applications, HTML browsers, and other programs to choke. Conflict arises between Unicode and Web markup languages from the fundamentally different philosophies that underlie the character set and Web standards. While Unicode produces a one-for-one, linear correspondence for every character on the page, XML and its Web-based relatives are more flexible in that they let authors assign different style and functional attributes to a single character, word or page. For example, Unicode provides what's called 'compatibility characters,' separate numbers to designate superscript or subscript numerals or letters. With HTML or XML, by contrast, the author would use the basic character and then style it as superscript or subscript. All things being equal, the W3C advises authors to use the markup alternatives. Compatibility characters are 'just not the long-term, sound way to do things,' said Martin Duerst, the W3C's internationalization activity lead and a visiting scientist at the Massachusetts Institute of Technology's Laboratory for Computer Science. 
'We're urging authors to use Unicode in a responsible and adequate way when it's used with XML.' Many times, authors know that their Unicode is destined to be read by Web browsers and other XML applications. But some of the conflicts crop up as a surprise when XML applications are fed information from older databases and information repositories..." See details in the news story: "Updated Unicode Technical Report Clarifies Characters not Suitable for Use With Markup."
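A concrete case of the compatibility characters discussed above is U+00B2 SUPERSCRIPT TWO: Unicode compatibility normalization folds it to a plain digit, discarding the superscripting, which is exactly the information that markup (styling the base character as superscript) preserves. A short check with Python's standard unicodedata module:

```python
import unicodedata

# U+00B2 SUPERSCRIPT TWO is a compatibility character: NFKC
# normalization replaces it with the base digit '2', losing the
# superscript presentation that markup would have carried instead.
s = "x\u00b2"
print(s)                                 # x²
print(unicodedata.normalize("NFKC", s))  # x2
```

This is why the W3C advises authors to write the base character plus markup rather than rely on compatibility characters: the marked-up form survives normalization and processing unchanged.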

  • [June 17, 2003] "OASIS Provisioning Services Technical Committee SPML V1.0 Interoperability Event." Technical and Operations Plan. Edited by Darran Rolls (Waveset Technologies) for the OASIS SPTC. Committee Working Draft. 06-June-2003. 20 pages. "This document describes the message exchanges to be tested during the Burton Catalyst interoperability event in San Francisco, July 9-11, 2003. This interoperability test is designed to show the interoperation of service subscription and provisioning based on the draft SPML V1.0 specification. This interop event is based around a defined scenario intended to test the interoperability of different implementations performing a common set of SPML operations, to test the soundness of the specification, and to clarify the mutual understanding of its meaning and application in a given business scenario. Note the scenario and context of this interop is not intended to represent a definitive implementation of the SPML V1.0 specification... The interop scenario is based on interactive attendee participation. Interop Users (IUs) will be directed through a defined scenario, in which they input 'New Hire' user data into a PeopleSoft HRMS system. This action will cause a set of SPML protocol exchanges to create service subscriptions at each vendor station participating at the interop. The business scenario is based around a fictional company SPML Contractors Inc. When a new employee starts at SPML Contractors, an SPML enabled system is used to manage account subscriptions with a defined set of SPML Contractors' customers. New employees are added to the SPML Contractors PeopleSoft HRMS using the standard PeopleSoft web based interface. The creation of records within HRMS is used to trigger SPML service subscription requests to be sent to each PV at the interop. In this scenario PeopleSoft HRMS will be acting in the role of SPML Contractors Inc. and will be functioning as an SPML Requesting Authority (RA).
Mycroft will be providing an integration 'SPML multiplexer' module that takes the SPML request from PeopleSoft and creates individual SPML service requests for each of the PVs. Each of the PVs will be modeled as SPML Contractors Inc customers and will receive, process and respond to their own service requests in accordance with their own systems models and PSP/PST implementations... The SPML Contractors Inc PeopleSoft HRMS installation will be running a centralized server, accessible and available to all of the PVs. By employing the PeopleSoft HRMS web based user access model, new SPML Contractors Inc employees will be able to be added from any of the workstations at the interop event room. This will prevent a bottleneck from forming at the PeopleSoft workstation and allow an IU to approach the scenario from any PV, thus making more staff available to help IUs with questions and generally spread the traffic more evenly across the event..." See also: (1) "OASIS TC Releases Committee Specifications for Service Provisioning Markup Language (SPML)"; (2) the Burton Catalyst Interoperability Event Overview; (3) general references in "XML-Based Provisioning Services."
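The "SPML multiplexer" pattern described above can be sketched in miniature: one add request fanned out to several provisioning vendors (PVs). The element names below follow the general shape of draft SPML 1.0 requests (`addRequest`, `identifier`, `attributes`) but are illustrative rather than schema-exact, and the endpoint names are invented; a real multiplexer would POST each payload over SOAP/HTTP.

```python
# Sketch of an "SPML multiplexer": fan one add request out to several
# provisioning vendors. Element names are modeled loosely on draft SPML 1.0
# and are illustrative, not schema-exact; endpoint names are hypothetical.
import xml.etree.ElementTree as ET

def build_add_request(user_id: str, attrs: dict) -> str:
    req = ET.Element("addRequest")
    ident = ET.SubElement(req, "identifier",
                          type="urn:oasis:names:tc:SPML:1:0#UserID")
    ET.SubElement(ident, "id").text = user_id
    attributes = ET.SubElement(req, "attributes")
    for name, value in attrs.items():
        attr = ET.SubElement(attributes, "attr", name=name)
        ET.SubElement(attr, "value").text = value
    return ET.tostring(req, encoding="unicode")

def multiplex(user_id: str, attrs: dict, endpoints: list) -> dict:
    # One request per provisioning vendor; here we only collect the payloads.
    return {ep: build_add_request(user_id, attrs) for ep in endpoints}

requests = multiplex("jdoe", {"email": "jdoe@example.com"},
                     ["pv-alpha", "pv-beta", "pv-gamma"])
```

Each PV then processes its own copy of the request according to its own PSP/PST implementation, which is exactly the independence the interop scenario is designed to exercise.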

  • [June 17, 2003] "OWL Web Ontology Language XML Presentation Syntax." By Masahiro Hori (Kansai University, formerly IBM Tokyo Research), Jérôme Euzenat (INRIA Rhône-Alpes), and Peter F. Patel-Schneider (Bell Labs Research, Lucent Technologies). W3C Note 11-June-2003. Latest version URL: http://www.w3.org/TR/owl-xmlsyntax/. This document from the W3C Web Ontology Working Group "specifies XML presentation syntax for OWL, which is defined as a dialect similar to OWL Abstract Syntax [OWL Semantics]. It is not intended to be a normative specification. Instead, it represents a suggestion of one possible XML presentation syntax for OWL. The OWL language provides three increasingly expressive sublanguages: OWL Lite, OWL DL, and OWL Full. This document provides XML Schemas for XML presentation syntax corresponding to the three sublanguages. Each of the sublanguages is an extension of its simpler predecessor, and the following relations hold but the inverses do not. [That is:] Every document valid against the XML Schema of OWL Lite is valid against the XML Schema of OWL DL; Every document valid against the XML Schema of OWL DL is valid against the XML Schema of OWL Full. Three Schemas are defined for the sublanguages: OWL Lite, OWL DL, and OWL Full; definitions of these Schemas are given in Appendix A..." Appendix B provides OWL Examples in XML Syntax. General references in "OWL Web Ontology Language."

  • [June 17, 2003] "Two Modes of Implementing an XML-Based Localization Pack: Embed and Extend. A Globalization Technique for Supporting Multiple Languages." By Bei Shu (Globalization Certification Laboratory, IBM China Software Development Lab). From IBM developerWorks, XML Zone. June 2003. ['IBM software engineer Bei Shu shows you how to enable multiple language support in your Web applications using different XML technologies from the architect's perspective. She presents two approaches to implementing XML-based localization pack managers using XPath and XSLT -- embed and extend'.] "The localization pack is one of the key elements in the globalization architecture. XML is the recommended source form for localization packs because it is cross-platform, Unicode-encoded, and flexibly structured. This article presents two approaches to implementing XML-based localization pack managers using XPath and XSLT: embed and extend. With embed, the localization pack manager module is embedded in the main programs, while with extend, the localization pack manager module works outside the main programs. Both have their advantages and can be applied to actual development according to different requirements... In today's global e-business environment, businesses need to support customers and suppliers from many different countries. It is no longer sufficient for software to support one language and territory at a time, as defined by the former national language support model. In the software world, the trend is toward globalization, especially for e-business systems... A localization pack is a standardized approach for software to support multiple languages and locales through a single executable. Language- and culture-dependent elements are separated from the core logic of the software at the source code level, as well as the compiled and static-linked module level. 
The language- and culture-independent portion is called the core module, and the language- and culture-dependent portions are called localization packs. In situations where the software is required to support multiple locales simultaneously, the software needs locale-dependent services that can be switched based on a locale-ID that's designated explicitly by the user or implicitly by the application... Because a localization pack needs a platform-independent format and an all-in-one character repository, IBM's globalization organization recommends using XML as the source format for localization packs. XML is cross platform and Unicode encoded, and thus capable of holding multilingual data without data loss. It also provides a flexible, tree-like structure to accommodate the need for various kinds of structured locale data. XML is a W3C recommendation and an Internet standard, so it can meet most Web application requirements..." See "SGML/XML Markup and Multilingualism."
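The "embed" approach described above can be sketched with a manager that the main program calls directly: an XML pack holds the locale-dependent strings, and an XPath-style lookup selects a message by locale-ID and key. The pack format here is invented for illustration; a real project would define its own schema.

```python
# Minimal "embed"-style localization pack manager: the manager is embedded
# in (called directly from) the main program. The pack format below is an
# assumption for illustration, not IBM's actual schema.
import xml.etree.ElementTree as ET

PACK = """<pack>
  <locale id="en-US"><msg key="greeting">Hello</msg></locale>
  <locale id="zh-CN"><msg key="greeting">&#20320;&#22909;</msg></locale>
</pack>"""

class LocalizationPack:
    def __init__(self, xml_text: str):
        self.root = ET.fromstring(xml_text)

    def lookup(self, locale_id: str, key: str) -> str:
        # XPath-style selection: the locale is chosen by the caller-supplied
        # locale-ID, as the article describes for multi-locale services.
        msg = self.root.find(f"./locale[@id='{locale_id}']/msg[@key='{key}']")
        return msg.text if msg is not None else key  # fall back to the key

pack = LocalizationPack(PACK)
print(pack.lookup("en-US", "greeting"))  # Hello
```

The "extend" variant would move this same lookup outside the main program, e.g., into an XSLT transform applied to page templates before delivery.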

  • [June 17, 2003] "Enterprises Take Early Lead in Web Services Integration Projects." By Carol Sliwa. In Computerworld (June 16, 2003). "It's not hard to find companies that have dipped their toes into the water to explore how Web services might help address some of their nagging integration problems. But few have launched major initiatives of the scope at Eastman Chemical Co. and Merrill Lynch & Co. One of the distinguishing characteristics that separates these early adopters from the mere dabblers is the systematic approach they take to building the sort of service-oriented development architecture that experts say they'll need to realize the full benefits of Web services. Some of the biggest challenges they've faced so far have been finding the right tools and establishing best practices without a well-established road map... Eastman Chemical, a Kingsport, Tenn.-based maker of chemicals, fibers and plastics, is plotting the rollout of a service-oriented architecture across key legacy systems to give users more visibility and control over their business processes. Like a number of other companies, Eastman got started with Web services by focusing on a key project that would help its IT department learn about the new technology. Developers created a simple read-only Web service to give customers access to technical data in its product catalog. The product catalog Web service, which went live about a year ago, eliminated the need for customers to screen-scrape data from Eastman's site or to download a monolithic catalog to spreadsheets. Customers instead can now go to the Web site and make a request that causes the system to send an XML-based message using SOAP over HTTP to Eastman's Saqqara Systems Inc. database. The latter then does the data retrieval and sends back the information via XML and SOAP... 
Integration headaches drove Merrill Lynch to turn to Web services about one year ago as a cheaper and more efficient alternative to the middleware it uses to enable its thousands of mainframe applications to talk to its middle-tier and Web-based front-end systems. The challenge confronting the New York-based financial services company is far more expansive than most companies will ever encounter. Merrill Lynch has 23,000 CICS programs running on its mainframes, and it's very difficult to integrate those programs with Microsoft's .Net development platform, IBM's WebSphere or any other platforms or tool sets, notes Jim Crew, director of the infrastructure and data services group. Exposing those CICS applications with language-agnostic Web services interfaces and sending data using interoperable XML held great appeal. So Merrill Lynch created its own tool set, called X4ML, to help its mainframe programmers build interfaces and run Web services without need of XML, Java, Visual Basic or Web services skills... So far, Merrill Lynch has used X4ML in more than 20 applications running in production and several more in development. The tool has also been used to build Web services interfaces to about 350 CICS programs during the past year, according to Crew. The company has created an X4ML directory with capabilities similar to the UDDI standard to document its Web services, but it plans to migrate to UDDI later this year..."

  • [June 17, 2003] "IBM Sprinkles Cinnamon on Content Manager. Big Blue to Bolster XML in Forthcoming Version." By Tom Sullivan. In InfoWorld (June 13, 2003). "IBM is working on ways to make XML documents and data easier to pull into its content management software, and to index and search the data once it is in there... Until now, within content management and IBM's DB2 database, the handling of XML documents has been focused on being able to receive XML documents that are set in different DTD schemas and have them be, in effect, mapped into rows in a database; so that is a kind of parsing, extraction, and flattening action to be able to take XML documents from different sources and have them be added in with values out of the XML documents populated into certain columns, [Jim Reimer] said... In IBM's latest Content Manager, Version 8, the company made extensions to what could be represented in a data collection, such as the primitives, the data modeling services, or whatever can be expressed in an XML document, including multi-valued attribute sets, arbitrary hierarchy, links, and relationships. 'The challenge if you have such documents is how to get them into CM and, secondly, how to deal with the landscape where you have evolving DTDs and schemas over time and different authors, writing in different DTDs and schemas, that are giving you content,' Reimer explained. The underlying technology aimed at this mapping, administration, and adaptation problem of dealing with evolving schemas is a project also within IBM research, dubbed Clio, and part of the overall eXperanto effort. Cinnamon, then, is IBM's effort to extend that technology base to permit users to take complex XML documents, whatever might be expressed in an XML document and the associated DTDs and schemas, and then manage the oversight of the mapping task that defines how to project that into the full data modeling services of CM. 
Secondly, from a runtime perspective the goal is to handle the automatic ingesting of those documents including all the parsing, extraction, and projection into the new data model, Reimer explained..."

  • [June 17, 2003] "IBM Spices Up Its Content Manager." By Lisa Vaas. In eWEEK (June 17, 2003). "IBM next year will release a tool 'code-named Cinnamon' that will enable its Content Manager software to more easily index and search XML data. Cinnamon is a result of Clio, an ongoing research project at IBM's Almaden Research Center that aims to develop a tool for automatically creating mappings between different forms of data. Cinnamon will allow users to define how XML documents will get mapped into IBM's DB2 database, making it easier to store and manage content, according to IBM executives. It will ship as part of the tools package that arrives with Content Manager's next release, which does not yet have a version number or name, said IBM executives... Cinnamon's raison d'être has to do with current problems in placing queries to XML-tagged data. Querying now requires proprietary code that either doesn't take full advantage of the XML format or cannot be used consistently, said IBM executives... Tools like Clio, [IBM's Jim] Reimer said, have focused on taking XML documents, peeling out searchable properties and then populating those properties into rows in a database. Cinnamon takes that one step further, permitting users to automatically map documents and send them into a relational database or vice versa. It does that by automatically placing an XML document into a row in a database and, conversely, by taking terms out of an XML database and populating them into columns in a relational database..."
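The XML-to-relational "shredding" that both articles describe — peeling searchable properties out of an XML document and populating them into columns — can be hand-written in a few lines; tools like Clio and Cinnamon aim to generate such mappings automatically. The document shape, mapping, and table schema below are assumptions for illustration.

```python
# Hand-written sketch of XML "shredding": a mapping projects elements of an
# XML document into columns of a relational table. Clio/Cinnamon-style tools
# automate exactly this kind of mapping; the schema here is an assumption.
import sqlite3
import xml.etree.ElementTree as ET

DOC = """<catalog>
  <product><sku>A100</sku><name>Widget</name><price>9.95</price></product>
  <product><sku>B200</sku><name>Gadget</name><price>24.50</price></product>
</catalog>"""

# XML child path -> relational column (identity here, but it need not be).
MAPPING = {"sku": "sku", "name": "name", "price": "price"}

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE product (sku TEXT, name TEXT, price REAL)")
for prod in ET.fromstring(DOC).findall("product"):
    row = tuple(prod.findtext(path) for path in MAPPING)
    conn.execute("INSERT INTO product VALUES (?, ?, ?)", row)

rows = conn.execute("SELECT sku, price FROM product ORDER BY sku").fetchall()
```

The hard part, which the fixed mapping above sidesteps, is the one Reimer highlights: keeping such mappings working as DTDs and schemas evolve across authors and over time.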

  • [June 17, 2003] "Developers Expect Nod for UML 2.0 Standard." By Darryl K. Taft. In eWEEK (May 26, 2003). "The Object Management Group will meet in Paris next week to vote on Version 2.0 of Unified Modeling Language, a language that supports analysis and design in a variety of tools and promises to open new horizons for developers. The first UML 2.0 specifications were adopted as OMG standards in March -- covering Infrastructure, Object Constraint Language and Diagram Interchange Protocol. A fourth specification, Superstructure, is expected to be voted on at the meeting next week, completing the recommendation process for the latest UML version. Few developers will be looking forward to UML 2.0 more than IBM. Sridhar Iyengar, a Distinguished Engineer with IBM, in Raleigh, N.C., and a member of the OMG Architecture board, said IBM researchers are looking into several innovations using the new specification. IBM will be looking to build a UML profile for testing. This work will lead to 'using modeling not just for analysis and design but for testing,' Iyengar said. 'We expect this technology will become a standard,' he said. IBM's approach to modeling signals a race with Microsoft Corp., which is warming up to the OMG for similar purposes. Microsoft will support modeling in its upcoming Jupiter e-business suite, which will compete with IBM's WebSphere... In addition to its use of the MDA (Model Driven Architecture) specification, IBM is pushing toward a new area, which Iyengar calls Model Driven Business Integration, while the company also has a focus on model-driven tool integration and model-driven application development, he said. MDA allows developers to design, build, integrate and manage applications throughout the life cycle while separating technology and business concerns, Iyengar said. EMF (Eclipse Modeling Framework) is the glue that holds together IBM's modeling strategy. 
'EMF is the technology that unifies the world of modeling in WebSphere and DB2,' Iyengar said. 'The use of EMF will increase within IBM and externally,' among members of the IBM-sponsored Eclipse.org organization, which oversees the Eclipse open-source development platform, he said..." General references in "OMG Model Driven Architecture (MDA)."

  • [June 17, 2003] "UML 2.0 Full Suite: MetaObjects Standards Complete." By Vance McCarthy. In Integration Developer News (June 13, 2003). "Modeling may get easier for developers to implement and support, as the full suite of upgraded standards was adopted last week by the Object Management Group, a collection of more than 50 software vendors focused on object and XML-based modeling. OMG members recommended adoption of the final piece to the major UML 2.0 (Unified Modeling Language) upgrade. This last piece, called the Superstructure specification, completes the definition of the UML 2.0, which has been in its final phase since this spring. The UML 2.0 recommendations for the other three (3) main components -- Infrastructure, Object Constraint Language and Diagram Interchange Protocol -- were done in March 2003. In addition, OMG recommended new MetaObject Facility (MOF 2.0) specifications for the MOF Core and XML Metadata Interchange (XMI) to update the repository foundation upon which UML tools are built. According to an OMG statement, 'Alignment of the UML 2.0 metamodel with the MOF metamodel will simplify model interchange via XMI, and cross-tool interoperability.' More than 50 companies participated in the UML upgrade..." General references in "OMG Model Driven Architecture (MDA)."

  • [June 17, 2003] "Automating Interoperability Testing." By Demir Barlas. In Line56 E-Business News (June 17, 2003). "The Drummond Group, a company that specializes in offering interoperability testing for various integration and document exchange software vendors, has debuted a testing product called InSitu designed to make that testing easier. [Drummond is] making life easier for e-business software vendors who want to conform to AS2 testing... The problem the Drummond Group addresses is simple: When a supplier involved in the supply chain of a large, important partner (say, Wal-Mart's suppliers) is directed to use EDI-INT AS2 (a current standard for Internet-based electronic data interchange) or ebXML, they have to make certain that they choose a vendor whose product is interoperable with the partner's technology stack and perhaps with other products in the supply chain. However, suppliers don't have to bother with the testing, as their partners, software vendors, and standards bodies conduct interoperability tests. InSitu is designed for the use of vendors -- so far including bTrade, Inc., Cleo Communications, Cyclone Commerce, Hewlett-Packard and IPNet Solutions, Inc -- who want to be ready for the next round of AS2 interoperability testing as sponsored by the Uniform Code Council and the HIPAA Conformance Certification Organization... Versions of InSitu for AS2 have been beta tested while tests for ebXML Messaging are in the works..." Note: according to the company announcement, Drummond Group Inc. "expects the InSitu interoperability system will reduce the software testing companies' internal costs to approximately one-tenth of what it was before automation. 'As a major endeavor for DGI, InSitu will have a positive impact on industries supporting B2B commerce,' said Rik Drummond, DGI's chief executive officer and chief scientist. 
'A remarkable number of software products and services are asking for certification, and this cutting-edge technology will allow numerous products, services and time zones to be interoperable. InSitu revolutionizes the process as it accelerates further adoption of new technical standards to support an interoperable global marketplace.' InSitu technology introduces a new level of flexibility in interoperability testing. With less time and staffing needed for testing, InSitu technology automatically reacts to test commands defined for each scenario..." See details in "Drummond Group Unveils InSitu, the First Automated Interoperability System for Global Testing Support. InSitu Designed to Decrease Testing Hours, Related Costs for Software Testing."

  • [June 10, 2003] "Uniform Resource Identifier (URI): Generic Syntax." Edited by Tim Berners-Lee (World Wide Web Consortium); Roy T. Fielding (Day Software); Larry Masinter (Adobe Systems Incorporated). IETF Network Working Group, Internet-Draft. Reference: 'draft-fielding-uri-rfc2396bis-03'. June 6, 2003, expires December 5, 2003. 60 pages. Updates: RFC 1738 (if approved). Obsoletes: 2732, 2396, 1808 (if approved). Appendix D: Summary of Non-editorial Changes. Discussion of this draft and comments to the editors should be sent to the uri@w3.org mailing list; see the public archive. An issues list and version history is available online. Abstract: "A Uniform Resource Identifier (URI) is a compact string of characters for identifying an abstract or physical resource. This specification defines the generic URI syntax and a process for resolving URI references that might be in relative form, along with guidelines and security considerations for the use of URIs on the Internet. The URI syntax defines a grammar that is a superset of all valid URIs, such that an implementation can parse the common components of a URI reference without knowing the scheme-specific requirements of every possible identifier. This specification does not define a generative grammar for URIs; that task is performed by the individual specifications of each URI scheme." [cache]
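The generic syntax the draft defines — splitting a URI into scheme, authority, path, query, and fragment components, plus a process for resolving relative references — is implemented by Python's standard `urllib.parse` module, so the behavior can be sketched directly:

```python
# Generic URI component parsing and relative-reference resolution, as
# standardized by the RFC 2396 line of specifications, via urllib.parse.
from urllib.parse import urlsplit, urljoin

# Scheme-independent parsing of the five generic components.
parts = urlsplit("http://example.com/a/b?x=1#frag")
print(parts.scheme, parts.netloc, parts.path, parts.query, parts.fragment)
# http example.com /a/b x=1 frag

# Resolving a relative reference against a base URI ("../c" climbs one
# path segment from /a/b before appending c).
print(urljoin("http://example.com/a/b", "../c"))  # http://example.com/c
```

Note this scheme-independence is exactly the point of the abstract: the parser needs no knowledge of `http:`-specific rules to extract the common components.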

  • [June 10, 2003] "SOAP 1.2." By Rich Salz. From O'Reilly WebServices.xml.com (June 10, 2003). ['SOAP is a technology that has had much attention and been through numerous revisions. Now that SOAP 1.2 has reached "Proposed Recommendation" status at the W3C, it's all almost over. Rich Salz guides us through SOAP 1.2, highlighting the major changes and improvements from SOAP 1.1.'] "SOAP 1.2 has reached Proposed Recommendation (PR) status, which means that the W3C XML Protocol Working Group believes that it's done. A few PR issues have been raised. It can be fun to look at the list, as the issues range from 'I found a typo' to 'this isn't full XML, send it back to the committee'. If you're pedantically inclined -- and everyone in the computer field must have at least some inclination in that direction -- now is the time to break out the coffee and the reading glasses in order to find issues of your own. Within the next week or two, we should know if Tim Berners-Lee approves the PR, making it a formal W3C Recommendation (i.e., a standard in everything but name); or, if he identifies issues that need to be resolved, requiring another round of work. The results could be interesting: there are, as usual, a number of political issues involved... The biggest difference between SOAP 1.1 and SOAP 1.2 is that the 1.2 specification is built around the Infoset... The descriptions of SOAP message processing are no longer based on syntax but on the information that the message carries. An implementor can determine what information items are important and must be preserved. It's now feasible to understand how to 'tunnel' a SOAP message safely over any appropriate transport. Following the logic, you can now see that SOAP is now much more transport-neutral. 
With any luck we'll see a wide variety of environments supporting SOAP messages in a very powerful way; as much as possible, they will look like native messages within the environment, but they'll be able to cross that environment and be safely converted to 'classic HTTP/SOAP' without any information loss. In other words, I can have HTTP-based servers at each end of a processing pipeline, but have MQ, SMTP, CORBA/IIOP, or other intermediaries doing parts of the processing along the way. If the middleware vendors follow through on that promise, we really will have a universal distributed messaging infrastructure. I'm actually fairly optimistic about this, since it seems the only way that proprietary middleware stands a chance of not being wiped out completely by HTTP plumbing and WS-xxx headers commoditizing their value-add for items like reliable transport. The relationship between SOAP, MIME, and HTTP has been cleaned up. Most SOAP developers will probably never notice this, but it's a good thing. Hopefully SOAP vendors will make it easy to generate and parse these new formats: having multilingual server error messages can be a big win, although knowing what language to output might require close integration with the transport layer. From the developer's perspective, however, at least initially the major difference between SOAP 1.1 and SOAP 1.2 faults is that SOAP 1.2 does not use HTTP status code 500 (server error) to indicate a fault. As far as HTTP is concerned, a fault is a normal HTTP response message and is an accommodation to the principles of REST. It will have interesting implications for programmers who write their own SOAP stacks..." See details in "W3C Publishes SOAP Version 1.2 as a Proposed Recommendation." General references in "Simple Object Access Protocol (SOAP)."
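The SOAP 1.2 fault structure Salz refers to can be sketched concretely. The envelope namespace and the `Code`/`Reason` element structure (with an `xml:lang` on the reason text, which is what enables the multilingual error messages mentioned above) come from the SOAP 1.2 specification; the fault content itself is a made-up example.

```python
# Sketch of a SOAP 1.2 fault envelope. The namespace and Code/Reason
# structure are from the SOAP 1.2 spec; the fault content is invented.
# Over HTTP, SOAP 1.2 lets such a fault travel as a normal response rather
# than forcing HTTP status 500 as SOAP 1.1 did.
import xml.etree.ElementTree as ET

ENV = "http://www.w3.org/2003/05/soap-envelope"
ET.register_namespace("env", ENV)

envelope = ET.Element(f"{{{ENV}}}Envelope")
body = ET.SubElement(envelope, f"{{{ENV}}}Body")
fault = ET.SubElement(body, f"{{{ENV}}}Fault")
code = ET.SubElement(fault, f"{{{ENV}}}Code")
ET.SubElement(code, f"{{{ENV}}}Value").text = "env:Sender"
reason = ET.SubElement(fault, f"{{{ENV}}}Reason")
text = ET.SubElement(reason, f"{{{ENV}}}Text")
text.set("{http://www.w3.org/XML/1998/namespace}lang", "en")
text.text = "Malformed request"

fault_xml = ET.tostring(envelope, encoding="unicode")
print(fault_xml)
```

A server could emit additional `Text` children with other `xml:lang` values, which is the "multilingual server error messages" win Salz describes.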

  • [June 10, 2003] "Structured Writing, Structured Search." By Jon Udell. From O'Reilly WebServices.xml.com (June 10, 2003). ['Jon Udell explores further some ideas he's been pursuing with regard to structured search of content marked up with XML. Instead of making these searches web services, relying on the XPath or XQuery interfaces to databases, Jon experiments with moving the querying to the client side, and doing it with JavaScript and XSLT in a web browser.'] "... In 'The Semantic Blog' I suggested that we could achieve much more with simple XHTML content than we currently do. Two months down the road, the picture's a bit clearer than it was. In that column, I said that I was going to start including an <xhtml:body> element in my RSS feed. Now that it's been in place for two months, I admit it hasn't been entirely smooth sailing. I did work out how to use HTML Tidy to clean up the stuff I post through Radio UserLand. But in the end, that's not quite satisfactory. If the goal is to produce clean XHTML, you want more interaction than Tidy affords. Currently I wind up checking my work in an XML-savvy browser: IE or Mozilla. I'd like to be able to toggle between XML and HTML modes, but haven't sorted that out yet. We are still in desperate need of lightweight WYSIWYG editing components that make XHTML useful to non-emacs-using (i.e., normal) people... I keep hearing about new efforts -- most recently, Mozile -- but so far, I've seen nothing that delivers the user experience I enjoyed back in 1996 in the Netscape mail/news client... With all the capability packed into modern browsers, it struck me that we ought to be able to use XPath much more simply and interactively. So I took another look at my OSCOM slideshow and added an XPath search to it... Sooner or later, I'll be using a real XML database to enjoy this level of control over the XHTML content I post to my weblog and that others post to theirs. 
With a little luck, I won't have to provide that service myself. Somebody will build one that latches onto my XHTML feed and others..."
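Udell runs his XPath searches client-side in the browser, but the same kind of structured query over clean XHTML can be sketched server-side in a few lines. The XHTML fragment and the `class` values below are invented for illustration; the point is that well-formed XHTML makes content queryable by structure, not just by keyword.

```python
# Structured search over XHTML: an XPath-style query selects paragraphs by
# attribute, the kind of query Udell runs in the browser with JavaScript
# and XSLT. The fragment and class names here are invented examples.
import xml.etree.ElementTree as ET

XHTML = """<body xmlns="http://www.w3.org/1999/xhtml">
  <p class="quote">First quoted passage</p>
  <p>Ordinary prose</p>
  <p class="quote">Second quoted passage</p>
</body>"""

NS = {"xh": "http://www.w3.org/1999/xhtml"}
root = ET.fromstring(XHTML)
# Select only the paragraphs marked as quotations.
quotes = [p.text for p in root.findall(".//xh:p[@class='quote']", NS)]
print(quotes)
```

This only works if the feed really is well-formed XHTML, which is why the column dwells on Tidy and WYSIWYG editing components.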

  • [June 10, 2003] "How Web Services Helped Israel Deal With Bad Drivers." By Whit Andrews. Gartner Research Note. Case Studies, CS-19-9807. June 03, 2003. ['Israeli courts used Web services to collect unpaid traffic fines. The project was tied to an amnesty period, so the system had to be built quickly. The payoff for two weeks' work was an inflow of several million dollars.'] "The Israeli Judicial Authority found itself with a project that was clearly appropriate for Web services. The authority wished to convert a number of outstanding traffic fines to cash. Motorists were to be given a period of leniency where they could pay their fines at a reduced rate, and with no questions asked. These fines could be paid for online through the authority's Web site. The period was to be limited, which demanded quick development. The scalability of the project was important, because it was sure to generate a peak of activity at the beginning and possibly another at the end. Web services can cope with scaling issues, but they require greater support to do so when file sizes or access volumes grow. However, in this case, file sizes would remain small. This was therefore an ideal project for Web services... Three major factors led the Judicial Authority to select a strategy based on Web services: (1) The project was time critical, but the systems integrator was confident that Web services could be implemented quickly; (2) The authority expected to use the interface again for other applications, so it was important to use standards as much as possible; (3) The data to be exchanged between applications could be sent in very small files -- 4KB or less. A Web site was the easiest and the most convenient way of providing users with direct access to the relevant data. The Web site was set up and an application sequence put in place. Consultants from IBM Global Services completed the project in two weeks. 
A Web site user keys in his or her national ID number, along with other relevant numbers to confirm identification. The resulting datum is generally fewer than 30 characters, which is passed to a proxy object. That object issues a Simple Object Access Protocol (SOAP) call to a SOAP listener defined in a Sabratec ApplinX legacy integration server, and documented in a Web Service Definition Language (WSDL) file, which then executes a procedure to verify the user... More than 1 million users have accessed this data and 85,000 have used the new system to pay fines. Initially, the Web site received up to 30,000 hits a day and as many as 5,000 hits per hour and the installation achieved this scale effectively..."

  • [June 10, 2003] "Connecting with Java Web Services. BEA WebLogic, IBM WebSphere, JBoss, and Sybase EAServer Plug In to the Next Wave of Web Apps." By Oliver Rist and David Aubrey. In InfoWorld (June 06, 2003). Review. "Commercial J2EE platform vendors have also jumped on the bandwagon, weaving special Web services capabilities into their development tools and application servers. How important are these features in smoothing the creation and deployment of Web services? How real are their advantages over open source? Which J2EE application server is the best platform for Web services? To find out, we rounded up the two leading commercial J2EE servers, BEA Systems' WebLogic and IBM's WebSphere, plus a solid also-ran, Sybase's EAServer, and the most popular open source J2EE server, JBoss, and we put them to the test. Deploying Web services on each of these platforms, we evaluated their related management capabilities as well as their support for the core Web services standards, SOAP, XML-RPC (Remote Procedure Call), WSDL, and UDDI. We also looked for flexible configuration and a set of features, such as support for JMX (Java Management eXtensions), JNDI (Java Naming and Directory Interface), JMS (Java Messaging Service), and JTA (Java Transaction API), that we would expect in any enterprise-class Java platform. In our test scenario, we implemented a multi-tier supply chain composed of four Web services. One service allowed a retail customer to buy a product from a retailer. A second service allowed the retailer, in turn, to purchase wholesale goods from a supplier. A third allowed the wholesaler to purchase raw materials from a parts supplier. Finally, a fourth service allowed all of these parties to track their shipments. We coded all of the business logic in Java, and created adapters to implement each component as a Web service... 
Deploying, modifying, redeploying, and ensuring that your Web services are always available are the key ingredients of a successful Web services recipe. Therefore, our test focused on how smoothly our four solutions handled these tasks... For our tests, we developed four Web services using Metrowerks CodeWarrior and Eclipse. We wrote the business logic and business objects in Java. Then, to implement each component as a Web service, we created adapters to the business layer. Using compliance verification tools from IBM and The Mind Electric, we made sure that all our test applications adhered strictly to the four core Web service standards: SOAP, XML-RPC, UDDI, and WSDL..."

  • [June 10, 2003] "Java and the Model Driven Architecture." By Peter Varhol (Compuware Corporation). In JavaPro Magazine Volume 7, Number 6 (June 04, 2003), pages 40-41. ['Applications whose architectures are tied closely to supporting dynamic technologies don't have to become obsolete before their time. Employ MDA's core modeling techniques to separate architecture from implementation and extend the shelf life of distributed applications.'] "One software development error that has long-term ramifications, especially for large distributed applications, is that the application architecture is often written with a specific hardware and operating system set in mind. This situation may present no problem initially, but over time both hardware and operating systems change, often in unpredictable ways. Then there's other software you may depend on, such as the application server, browser, or database management system. In many cases, these applications tend to be even more dynamic than the underlying hardware and OS. The end result is that underlying technology changes often require significant changes in the application for it to continue working. In some cases, the architecture of the application is tied so closely to its supporting technologies that as these technologies change, the entire application has to be scrapped and redesigned. Enter MDA, the Model Driven Architecture. MDA is a way to separate the architecture of an application from its implementation. By doing so, its proponents hope that changes to supporting software and hardware won't render an otherwise still-useful enterprise application obsolete before its time. More important, by decoupling the application architecture from its execution environment, MDA could result in better designs that will have a longer useful life and can more easily be ported to other underlying platforms. 
As you might expect, MDA is based on the Unified Modeling Language (UML), along with the Meta-Object Facility (MOF) and Common Warehouse Metamodel (CWM). In addition, MDA consists of a few core models, or profiles -- one for enterprise development and another for real-time development (real time in the sense that hardware/software systems must have predictable response times). Other core models will be developed and offered up as standards as time progresses. The entire MDA structure and process is a standard of the Object Management Group (OMG), the same standards body that maintains the CORBA and UML standards... UML lets developers work at a higher level of abstraction, while letting software do the detail work. In the past, this was good enough for producing generic code or for targeting a specific platform and set of underlying technologies. MDA represents the next stage in application development, one that promises to further improve the sometimes-chaotic process of building J2EE applications..." "OMG Model Driven Architecture (MDA)."

  • [June 10, 2003] "XML Catalogs." OASIS Committee Specification 1.0. Edited by Norman Walsh (Sun Microsystems, Inc). June 03, 2003. 36 pages. Produced by members of the OASIS Entity Resolution Technical Committee. See the source PDF; also in HTML format. Other details provided in the news item: "OASIS Entity Resolution TC Approves XML Catalogs Specification for Public Review." Abstract: "The requirement that all external identifiers in XML documents must provide a system identifier has unquestionably been of tremendous short-term benefit to the XML community. It has allowed a whole generation of tools to be developed without the added complexity of explicit entity management. However, the interoperability of XML documents has been impeded in several ways by the lack of entity management facilities: (1) External identifiers may require resources that are not always available. For example, a system identifier that points to a resource on another machine may be inaccessible if a network connection is not available. (2) External identifiers may require protocols that are not accessible to all of the vendors' tools on a single computer system. An external identifier that is addressed with the ftp: protocol, for example, is not accessible to a tool that does not support that protocol. (3) It is often convenient to access resources using system identifiers that point to local resources. Exchanging documents that refer to local resources with other systems is problematic at best and impossible at worst. The problems involved with sharing documents, or packages of documents, across multiple systems are large and complex. While there are many important issues involved and a complete solution is beyond the current scope, the OASIS membership agrees upon the enclosed set of conventions to address a useful subset of the complete problem. 
To address these issues, this Committee Specification defines an entity catalog that maps both external identifiers and arbitrary URI references to URI references..." Lauren Wood on XML-DEV 2003-06-04: This document contains "a few editorial changes to the previous version of the Committee Specification. The Entity Resolution TC is doing an internal review for two weeks, then we'll start the formal process of applying for OASIS Standard... the TC thinks it's finished but the OASIS membership hasn't had a chance yet to say whether they think it is worth calling an OASIS standard..." Note also "XML Catalog Implementation on Unix-like Systems", edited by Mark Johnson, 23-April-2003 or later. The document provides a sample implementation based on the draft policy for the Debian GNU/Linux implementation of XML catalogs. A snapshot version was posted to the OASIS Entity Resolution TC web site by Lauren Wood because the TC is "considering writing an implementation guide or tutorial, and this [XML Catalog Implementation document] is input to that discussion..."
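The catalog format the Committee Specification defines is compact. A minimal catalog illustrating the mappings described above might look like this sketch (the DocBook public identifier is a real one; the example.org system identifiers and local paths are placeholders):

```xml
<?xml version="1.0"?>
<catalog xmlns="urn:oasis:names:tc:entity:xmlns:xml:catalog">
  <!-- Map a public identifier to a local copy of the DTD -->
  <public publicId="-//OASIS//DTD DocBook XML V4.2//EN"
          uri="docbook/docbookx.dtd"/>
  <!-- Map a single remote system identifier to a local resource -->
  <system systemId="http://www.example.org/dtds/memo.dtd"
          uri="local/memo.dtd"/>
  <!-- Rewrite a whole family of remote system identifiers -->
  <rewriteSystem systemIdStartString="http://www.example.org/schemas/"
                 rewritePrefix="local/schemas/"/>
</catalog>
```

A catalog-aware parser consults these entries before attempting network retrieval, which addresses the offline-access and protocol-availability problems the abstract enumerates.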

  • [June 10, 2003] "IBM Unites Enterprise Development. WSED 5.0 Neatly Supports Legacy Apps, Java, XML, and Web Services Projects." By Maggie Biggs. In InfoWorld (June 06, 2003). "In the majority of enterprises, legacy applications are still alive, well, and powering some of the most mission-critical business functions within the organization. With its integrated development environment, WSED 5.0 (WebSphere Studio Enterprise Developer 5.0), IBM has nicely blended development support for mission-critical legacy technologies, such as Cobol and PL/1, together with tools that support Web technologies. There are a number of benefits that can be derived from the all-inclusive nature of IBM's IDE. Developers can more easily maintain existing legacy assets using WSED. But that existing legacy code can also be extended and transformed into new or improved applications by blending legacy assets with Web-related technologies. Further, it saves companies the expense of purchasing multiple development tools to support legacy, Java, and Web technologies. WSED is a superset of WSAD (WebSphere Studio Application Developer), the IBM IDE that supports Java/J2EE, Web applications, XML, and Web services development. While WSED ($7,500 per processor) adds support for mainframe assets, WSAD ($3,499) is a solid IDE for organizations that don't need to include mainframe support. IBM also supplies a superset version of WSAD that supports its iSeries (AS/400) platform. I've tested the WSDC (WebSphere Development Studio Client for iSeries) and, as with WSED, have found that it too does a good job of supporting both existing midrange assets, such as RPG (Report Program Generator) code, as well as Web technologies... WSED nicely implements the open source Struts framework. Enterprise developers creating Web applications will find working with the Struts model, view, and controller paradigm a breeze. 
Companies that depend on legacy assets and leading technologies, such as J2EE and Web services, should consider deploying WSED as a way to gain the best of both worlds while minimizing costs..."

  • [June 10, 2003] "Microsoft Enhances FrontPage, SQL Server. Users Can Build Data-Driven Web Sites." By Paul Krill. In InfoWorld (June 10, 2003). "Microsoft this week announced efforts to support development of XML-based Web sites in Office FrontPage 2003 and to extend SQL Server database capabilities to more than 50 proprietary databases and files. Part of the Microsoft Office System, FrontPage 2003 will serve as a WYSIWYG Extensible Stylesheet Language Transformations (XSLT) editor in which users can work with live data to develop interactive, dynamic Web sites, according to Microsoft. This streamlines the process of sharing information on the Web, according to the company. With the WYSIWYG editor, users can build XML data-driven Web sites connecting to XML files, Web services, and OLE DB data sources, according to Microsoft. It is no longer necessary to program with server-side scripting tools such as the Visual Basic development system, Visual C#, or Visual Basic Scripting Edition or Java to develop data-driven Web sites, according to Microsoft. FrontPage 2003 also offers coding tools such as a Split Screen view for viewing code and the design view simultaneously. Beta 2 of Microsoft Office FrontPage 2003 is available as part of the Microsoft Office System..." See: (1) details in the announcement: "Microsoft Reinvents FrontPage, Tapping Into the Power of XML To Build Live Data-Driven Web Sites. Microsoft Office FrontPage 2003 Provides More Power to Connect People To Dynamic Information on the Web." (2) the FrontPage 2003 Overview, Product Guide, and FAQ document.

  • [June 10, 2003] "FrontPage to Embed Full XSLT Editor, Blogging Tools." By Barbara Darrow. In CRN (June 10, 2003). "Microsoft isn't ready to ship FrontPage 2003 or any of the new Office Systems applications, but it continues to mete out feature info. FrontPage 2003, which Microsoft already has said will target professional Web developers, will incorporate a full WYSIWYG editor for Extensible Stylesheet Language (XSLT), the company said Monday. That editor promises to enable Web developers to more easily incorporate live data into their Web sites, the company said... XSLT is typically used to convert data in XML format to HTML for display in Web browsers. Microsoft said EDS is using FrontPage's new capabilities to exchange XML documents with legacy engineering data in back-end change control systems. With such capabilities Microsoft hopes to attract developers of dynamic, data-driven sites. FrontPage was previously intended for novice Web-site designers. The new package will also include prebuilt functionality to ease creation of Web logs or blogs, Microsoft said. The goal of the iWave launch was to rationalize a group of related products that would help prove the worth of Windows Server 2003 and Office 2003 upgrades. Given all the staggered press releases, that strategy now appears somewhat tattered. Microsoft has been diligent about rebranding the various component applications with the Microsoft Office tag, but the well-orchestrated launch of all the offerings is not happening, observers said..." See references in the preceding entry.
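XSLT's role in this workflow -- turning XML data into HTML for the browser -- can be sketched with a minimal stylesheet. The products document it transforms is hypothetical, invented for illustration:

```xml
<xsl:stylesheet version="1.0"
                xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <!-- Render a hypothetical <products> document as an HTML list -->
  <xsl:template match="/products">
    <html>
      <body>
        <ul>
          <xsl:for-each select="product">
            <li><xsl:value-of select="name"/></li>
          </xsl:for-each>
        </ul>
      </body>
    </html>
  </xsl:template>
</xsl:stylesheet>
```

A WYSIWYG XSLT editor of the kind described lets designers build this transformation visually against live sample data instead of writing the stylesheet by hand.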

  • [June 10, 2003] "UDDI Rises From Basement to Board Room." By Peter Coffee. In eWEEK (June 02, 2003). "It's important to understand that UDDI is not just an enabling technology for taking your existing IT stack and exposing relevant functions across departmental boundaries to integrate business processes. It's not even just a means of making your IT assets available to supply-chain partners, any more than your enterprise Web site is only there to serve up in-house reference documents to employees or provide online catalog data to your existing customers. Just as it's the job of your Web site today to attract, engage and serve new customers, so is it the potential of UDDI to elevate your offerings above the commodity fray by turning them into service-based solutions -- perhaps for customers you may never have known you were missing the opportunity to serve. UDDI 2.0, just ratified this month as the highest-level OASIS Open Standard, is already supported in platforms such as Microsoft's Windows Server 2003 and by Microsoft's Visual Studio .Net tool set. Microsoft's implementation of UDDI relies on Windows infrastructure, to be sure, but availability on the Windows platform often marks a technology's crucial transition from possibility to ubiquity -- followed soon by expectation that a facility will be provided as a matter of course... UDDI Version 3 will make UDDI keys more convenient to use (in much the way that DNS names, like www.eweek.com, are more convenient than simple IP addresses); it will incorporate digital signature mechanisms for greater confidence in using UDDI with external parties; and it will offer policy management, a subscription interface and other mechanisms for making UDDI a channel for marketing rather than merely exhibiting the services that an organization offers... 
Not everyone greets with joy the prospect of UDDI becoming the Main Street of the Internet: The existence of a UDDI registry, and its mediation by a complex standard with frequent revisions, looks to some people like a means of putting the effective definition of the Internet into the hands of a few self-serving providers. It's a concern worth keeping in mind, as the function of searching for facilities in a dynamic environment becomes ever more important to what we used to call an operating system -- but which may need a new name as it deals with a new, much less static set of resources..."
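The registry lookups that tools such as Visual Studio .Net perform against a UDDI node are ordinary SOAP calls. As a sketch drawn from the published UDDI Version 2 inquiry API (the business name queried here is made up), a name-based search is a single element in the SOAP body:

```xml
<find_business generic="2.0" xmlns="urn:uddi-org:api_v2">
  <name>Example Widgets Inc</name>
</find_business>
```

The registry responds with a businessList of matching businessInfo entries, from which a client can drill down to the service bindings it needs.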

  • [June 10, 2003] "Introducing Examplotron: The Fastest Road to Schema." By Uche Ogbuji (Principal Consultant, Fourthought, Inc). From IBM developerWorks, XML zone. June 10, 2003. ['A zoo of XML schema languages is out there, and although some of the beasts are bigger than others none is as friendly as Examplotron. With Examplotron, your example XML document is your schema, for the most part. It requires you to learn very little new syntax, and most of the core features of XML can be specified by providing representative examples in the source. In this article, Uche Ogbuji introduces Examplotron, providing plenty of examples.'] "At first XML had the Document Type Definition (DTD). XML 1.0 came bundled with the schema technology inherited from SGML. However, numerous XML users complained about DTDs including the fact that they use a different syntax from XML itself. The W3C developed a successor technology to DTD, W3C XML Schema, but some complained that it was too complex, and that it showed every sign of design-by-committee. Separate groups developed schema technologies that became RELAX NG and Schematron. These technologies all have their strengths and weaknesses, and their attendant factions. But for the developer with deadlines to mind, crafting schemata is often too much of an additional burden. Without a doubt, it is always a good idea to develop a schema. If for no other reason, it provides documentation of the format. But in the real world, the most common course for harassed developers is to develop a sample of the XML format to serve all purposes of a proper schema. But what if the example itself could provide the benefits of a formal schema? In particular, what if the example could be used to validate documents? Eric van der Vlist set out to develop a system that allows example documents to serve as formal schemata, and his invention is Examplotron. In this article, I introduce Examplotron. 
This system is simple to use, so I encourage you to follow along by downloading Examplotron 0.7 (compile.xsl) and use your favorite XSLT and RELAX NG processors... On a recent project, a client who had many XML formats hired me, through my company Fourthought, to develop schemata for documentation and validation for these XML formats. All they had to start with were sample XML documents for each format. Using Examplotron to generate the production RELAX NG schemata from these sample documents saved me perhaps over a hundred hours of effort, and thus saved them tens of thousands of dollars. I did have to augment Examplotron with document generation and other refinement code; I hope to cover the non-proprietary aspects of this refinement code in a future article. Examplotron produces RELAX NG schemata, but if you must produce W3C XML Schema, all is still well: You can use James Clark's excellent Trang tool to convert RELAX NG to WXS. I know from my overall consulting experience that sample documents are the most common form of schema in the real world, so I expect that Examplotron will be of great help to a lot of folks right away." Also in PDF format.
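A minimal Examplotron schema, sketched from the 0.x conventions the article describes (the library vocabulary and the particular eg:occurs annotations shown are illustrative, not from the article), is simply an annotated sample document:

```xml
<library xmlns:eg="http://examplotron.org/0/">
  <!-- eg:occurs="*" allows zero or more book elements -->
  <book eg:occurs="*">
    <title>XML in a Nutshell</title>
    <!-- eg:occurs="+" requires one or more authors -->
    <author eg:occurs="+">Elliotte Rusty Harold</author>
    <isbn>0596002920</isbn>
  </book>
</library>
```

Running compile.xsl over such a document yields a RELAX NG schema that accepts any instance shaped like the example, which is the workflow Ogbuji describes using on his client project.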

  • [June 10, 2003] "Scripting with Jython Instead of XML." By Jonathan Simon. In Java Today (June 10, 2003). "XML is a popular medium for representing, transforming, and sharing data among applications. Common examples include configuration files for web applications, inter-application messaging using SOAP, and user interface transformation using XSLT. Some say that XML is so great because it's 'everything and nothing.' This has led to applications of XML to areas outside of its element. XML is often used as the basis of a custom scripting language. As a result, common programmatic constructs are duplicated and the code is difficult to understand; XML tools are not designed to support this. In this article, I will show a few examples from the Java community using XML as a scripting language. I will also show those same examples in Jython, in order to demonstrate the usefulness of a scripting language. There are many other Java-based scripting languages. Jython is simply Python implemented in Java (lagging a few releases behind). Jython and Java objects can communicate without any special bindings. This means Jython objects can extend Java objects and more. Jython syntax will feel familiar, with notable exceptions, such as: lines do not end with semicolons, and tabs, rather than curly braces, are used to delineate code blocks. This article does not suggest that Jython is better than XML, but rather that each is particularly suited for solving different problems. This article explores some common tasks performed with XML and how they might be better implemented in a Java scripting language. A future article will focus on the actual implementation of one of these tasks, showing in detail the benefits of Java scripting... My bias is that XML is a great tool that is misapplied when it is used as a programming language. 
You have seen a few examples where the XML markup/programming language boundary has been crossed, and where a scripting language like Jython may be a better alternative. Jython is easier to read and write compared to XML, because of its similar syntax to Java code. Additionally, Jython can communicate with all of your other Java code. So the only real difference between Java code and Jython is syntax, rather than the complete incompatibility between Java and XML. There is also some hesitation in moving from XML to scripting languages partly because of perceived limits in portability. A build script written in Jython must be rewritten to work in JRuby. If we switch from ANT to another tool, we can always write an interpreter that allows us to continue to use the existing XML scripts with our new tool. This is also a hesitation of developers to use Java scripting languages, since they are arguably less mature than Java and XML and have fewer development tools. But maybe, if we as a community begin to get vocal about when a Java scripting language should be used, the open source community and our vendors will develop better and stronger tools to support Java scripting languages..." Note: Java Today is a publication of the Java.net portal; see the news item "Sun Microsystems Launches Java.net Portal for Java Technology Collaboration."
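The article's point can be made concrete with a small sketch: a file-copying step of the kind usually written as an Ant XML target, expressed instead as ordinary Jython-compatible code. The backup_sources function and the directory names are invented for illustration; the Ant snippet in the comment is only an approximation of the equivalent build file:

```python
import os
import shutil

# The rough Ant XML equivalent would be:
#   <target name="backup">
#     <copy todir="backup"><fileset dir="src" includes="**/*.java"/></copy>
#   </target>
# As a script, the same task is plain code: loops, conditionals,
# and reusable functions come for free instead of via custom tags.

def backup_sources(src_dir, dest_dir, suffix=".java"):
    """Copy every file ending in `suffix` from src_dir into dest_dir."""
    copied = []
    for root, _dirs, files in os.walk(src_dir):
        for name in files:
            if name.endswith(suffix):
                if not os.path.isdir(dest_dir):
                    os.makedirs(dest_dir)
                shutil.copy(os.path.join(root, name),
                            os.path.join(dest_dir, name))
                copied.append(name)
    return copied
```

Because Jython is Python implemented in Java, the same function could call Java classes directly where the build needs them, which is the interoperability argument the article makes.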

  • [June 10, 2003] "Design and Test of the Cross-Format Schema Protocol (XFSP) for Networked Virtual Environment." By Ekrem Serin (Lieutenant Junior Grade, Turkish Navy, B. S., Turkish Naval Academy, 1997). Thesis submitted in partial fulfillment of the requirements for the degree of Master of Science in Computer Science from the Naval Postgraduate School. March 2003. 149 pages. Referenced by Don Brutzman in a posting to the W3C 'www-tag' list. The thesis presents a "description, design and implementation detail of Cross Format Schema Protocol (XFSP)... With this work we show that a networked simulation can work for 24 hours a day and 7 days a week with an extensible schema based networking protocol and it is not necessary to hard code and compile the protocols into the networked virtual environments. Furthermore, this thesis presents a general automatic protocol handler for schema-defined XML document or message. Additionally, this work concludes with the idea that protocols can be loaded and extended at runtime, and can be created with different-fidelity resolutions, resulting in swapping at runtime based on distributed state... This general XML compression scheme can be used for automatic production of binary network protocols, and binary file formats, for any data structures specified in an XML Schema... Ordinarily, XML is not a compact way to express the data. Messages written in XML are much larger than a binary equivalent. The technique that is used to overcome this problem is replacing tags with binary tokens. When an XML tree is parsed to serialize into an output stream, the tags that mark up the data are replaced with their binary equivalents. The end result is a more compact serialized XML tree. As discussed earlier, the basic idea behind XFSP was XML-Serialization. 
With this approach, XFSP can be used in any application which needs transactions via XML documents such as XML-RPC (XML Remote Procedure Call), XKMS (XML Key Management Services), XML-DSig (XML Digital Signatures) and XML-Enc (XML Encryption). XFSP can present those transactions in a more compact way... For the XFSP project, semantics is not targeted to be solved and is generally considered to be NP-Hard, because the semantic definition needs a knowledge domain and AI generation. As described before, the run-time extensible syntax is pointed out as the research question and targeted to be solved. To solve this problem, XML Schema is used to define the application-layer protocol between users and serialized XML data is sent as the payload..." Other references in "XML and Compression." [cache]
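The tag-tokenization idea described above -- replacing repeated tag strings with small binary tokens -- can be sketched in a few lines of Python. The table layout and the (token, text) pair stream here are illustrative only, not the actual XFSP wire format:

```python
# Sketch of tag tokenization: each distinct element name gets a small
# integer token, and the serialized stream carries tokens instead of
# repeated tag strings, shrinking the message.
import xml.etree.ElementTree as ET

def build_tag_table(root):
    """Assign a unique integer token to every element name in the tree."""
    table = {}
    for elem in root.iter():
        if elem.tag not in table:
            table[elem.tag] = len(table)
    return table

def serialize(root, table):
    """Depth-first serialization of the tree as (token, text) pairs."""
    return [(table[elem.tag], (elem.text or "").strip())
            for elem in root.iter()]

# A toy entity-state message of the kind a networked simulation might send
doc = ET.fromstring(
    "<msg><pos><x>1.0</x><y>2.0</y></pos><pos><x>3.0</x><y>4.0</y></pos></msg>"
)
table = build_tag_table(doc)
stream = serialize(doc, table)
```

In a schema-driven protocol like XFSP the table would be derived from the XML Schema rather than from the instance, so both ends agree on the tokens without exchanging them at runtime.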

  • [June 09, 2003] "Java.net: The JCP Alternative? Critics Say Group is Losing Momentum." By Robert McMillan. In InfoWorld (June 09, 2003). "When Jason Hunter first began work on standardizing his libraries, called JDOM (Java Document Object Model), he believed that following the JCP seemed like a good way to make his software more popular, he said. As an official Java standard, it would have a greater chance of being included as part of Sun's Java Developer Kit or perhaps as part of the Java 2 Enterprise Edition (J2EE) specification, he said. But by the time he had settled the Apache dispute, Hunter was simply too exhausted to go to work on his own standard. Now, a year later, JDOM has become so popular on its own that he no longer sees a compelling reason to follow through with JCP standardization, he said. Hunter's JDOM is one of a growing list of projects that are becoming popular outside of the Java Community Process. In the last few years, the Struts Web application framework, the Log4J logging tool and the Ant developer tool have all become widely adopted without being based on JCP standards. 'I definitely think that the JCP has broken down for some people,' said Tim O'Reilly, president of O'Reilly & Associates. He said that the success of IBM's Eclipse, which uses a graphical interface toolkit called the Standard Widget Toolkit (SWT) that has not been standardized through the JCP, has caused some partners to think twice before contributing code to Java's standards body. 'They're just saying, 'we don't necessarily get anything from it,' O'Reilly said. This is why observers are saying that Sun's new Java.net open source portal, which the company will unveil at JavaOne this Tuesday, may prove to be a strategically important move as Sun seeks to remain a vital force in the development of Java standards. 
O'Reilly, whose company is co-developing the network of Web sites in partnership with Sun and collaborative tools maker CollabNet, said that in Java.net, Sun is creating 'a space that they don't completely control,' in the hope of encouraging other vendors to become more involved. As the focus shifts to Java.net, however, the JCP may become less important, O'Reilly said. 'The community is to some extent routing around the JCP, and this site will to some extent accelerate the process,' he explained..." See the news item "Sun Microsystems Launches Java.net Portal for Java Technology Collaboration."

  • [June 09, 2003] "Bell Tolling for PNG Graphics Format?" By Paul Festa. In CNET News.com (June 09, 2003). "A patent underlying one of the Web's most popular graphics formats is set to expire later this month, raising the question of whether a rival, open format, created as a royalty-free alternative, will become obsolete. The situation has also rekindled debate about patents, innovation and the freedom of communication. In the United States, the patent for the Lempel-Ziv-Welch, or LZW, compression algorithm expires June 20, 2003. LZW forms the basis of the popular GIF (Graphics Interchange Format) design. LZW patent owner Unisys said it has no plans to apply for an extension to the U.S. patent, or to patents in Canada, Japan, the United Kingdom, Germany, France and Italy -- though it will enforce those latter patents for another year, until they too expire. That development means that the rival PNG (pronounced 'ping'), or Portable Network Graphics, format will soon lose its original reason for being... Last week, the World Wide Web Consortium (W3C) announced its proposed recommendation for the second edition of the PNG format, with a call for public comment through June 23, 2003. 'The original impetus for designing PNG was indeed because something needed to be done urgently,' said Chris Lilley, graphics activity lead for the W3C. 'Everyone had been using this format (GIF) and then suddenly we couldn't use it anymore. (But) you always want something a bit better and never get around to it...(and) this provided the impetus -- and PNG is better than GIF.' 'The big issue is not whether you use GIF or PNG,' said Don Marti... 'the big issue is whether you let a patent holder become a censor for your communications.' Conflict over patents has roiled standards organizations, with the W3C recently repudiating the use of patented technologies in its recommendations, and the Internet Engineering Task Force (IETF) reserving the right to implement them. 
Marti, who in 1999 organized a 'Burn All GIFs Day' protest against Unisys, compared the enforcement of software patents on communications software to the colonial-era Stamp Act. That act, passed in 1765 by the British Parliament, imposed a tax on paper and other writing materials used in the American colonies and was an impetus for the eventual revolution. 'A patent on communications, or on a format or a standard for communicating, is just like a stamp act,' said Marti. 'As soon as you decide to use a patented format to communicate, you give the patent holder a dangerous level of power over you'..." See: (1) Unisys GIF patent story; (2) "W3C Approves Patent Policy Supporting Development of Royalty-Free Web Standards"; (3) general references in "Patents and Open Standards."

  • [June 09, 2003] "Portable Network Graphics (PNG) Specification. Second Edition." ISO Reference: Information technology -- Computer graphics and image processing -- Portable Network Graphics (PNG): Functional specification. ISO/IEC 15948:2002 (E). W3C Reference: W3C Proposed Recommendation 1-October-1996, revised 20 May 2003. Second Edition edited by David Duce (Oxford Brookes University). Latest version URL: http://www.w3.org/TR/PNG. Produced by ISO/IEC JTC1 SC24 and the PNG Group as part of the Graphics Activity within the W3C Interaction Domain. "This document describes PNG (Portable Network Graphics), an extensible file format for the lossless, portable, well-compressed storage of raster images. PNG provides a patent-free replacement for GIF and can also replace many common uses of TIFF. Indexed-color, grayscale, and truecolor images are supported, plus an optional alpha channel. Sample depths range from 1 to 16 bits. PNG is designed to work well in online viewing applications, such as the World Wide Web, so it is fully streamable with a progressive display option. PNG is robust, providing both full file integrity checking and simple detection of common transmission errors. Also, PNG can store gamma and chromaticity data for improved color matching on heterogeneous platforms. This specification defines an Internet Media Type image/png. Note one of the stated design goals for this International Standard: Freedom from legal restrictions: no algorithms should be used that are not freely available. Status (excerpts): "As of this publication, the PNG Group are not aware of any royalty-bearing patents they believe to be essential to PNG... This document is the 20 May 2003 Proposed Recommendation of the PNG specification, second edition. It is also in the final stages of standardization at ISO as an International Standard, ISO/IEC 15948. 
The two documents have exactly identical content except for cover page and boilerplate differences as appropriate to the two organisations. This International Standard is strongly based on the W3C Recommendation 'PNG Specification Version 1.0' which was reviewed by W3C members, approved as a W3C Recommendation and published in October 1996. This second edition incorporates all known errata and clarifications. W3C Advisory Committee Representatives are invited to send formal review comments by following the instructions in the Call for Review... A complete review of the document has been done by ISO/IEC/JTC 1/SC 24 in collaboration with W3C and the PNG development group (the original authors of the PNG 1.0 Recommendation) in order to transform that Recommendation into an ISO/IEC international standard. A major design goal during this review was to avoid changes that will invalidate existing files, editors, or viewers that conform to W3C Recommendation PNG Specification Version 1.0." See: (1) archives for the W3C mailing list 'png-group'; (2) general references in "Patents and Open Standards."

  • [June 05, 2003] "Microsoft Roads All Lead to Longhorn." By Martin LaMonica. In CNET News.com (June 05, 2003). ['Microsoft this week gave customers a sneak peek at forthcoming development and management tools which are part of the company's multiyear plans for the product.'] "In interviews with CNET News.com at the company's TechEd customer conference, Microsoft executives sketched out its product release plans for next year. High on the agenda were products designed to work with the next major release of the Windows desktop operating system, code-named Longhorn... Longhorn has become the centerpiece of Microsoft's future product strategy. Underscoring the importance of the operating system upgrade, CEO Steve Ballmer, in a memo sent to Microsoft employees on Wednesday, said that Longhorn was 'even bigger, perhaps, than the first generation of Windows.' Also at TechEd, Microsoft executives acknowledged the existence of a long-rumored programming language research project, called X# (pronounced X sharp). Microsoft is working on the building blocks, or 'language constructs,' of a programming language that can handle XML data more effectively than current languages, according to company executives. Microsoft's Visual Studio.Net development tool is already XML-savvy, and the software maker is betting heavily on XML-based Web services to glue together its disparate products. The X# work is still in the early development phase, and Microsoft has no immediate plans to incorporate X# into specific products, said Paul Flessner, senior vice president of Microsoft's server platform division... In a version of Visual Studio.Net due in 2004, code-named Whidbey, Microsoft will introduce some language enhancements to speed up development time, according to company executives. People will also be able to more easily develop for Microsoft's server applications, including its BizTalk integration software and its Commerce Server e-commerce software, with new tools... 
The enhancements in Whidbey are closely tied to an update of Microsoft's SQL Server database, code-named Yukon and due for release in the summer or fall, according to Microsoft executives... One of the goals of Yukon is to give a broader number of people access to back-end databases to do data analysis, said Stan Sorensen, director of SQL Server product management. Yukon will include software to generate reports, and Microsoft is improving SQL Server so that it presents data in a variety of formats, including XML. That will make it easier for people familiar with Microsoft's Office desktop applications to query back-end data sources, Sorensen said. Microsoft will also extend the data analysis, or 'data mining,' capabilities included in SQL Server with Yukon..."

  • [June 05, 2003] "e-Business Messaging Interchange Assessment." From the OASIS/CEFACT Joint Marketing Team. White Paper. May 23, 2003. 19 pages. Supplied by David Webber (Acting Chair, OASIS/CEFACT JMT). Evaluates: FAX, Dial-in IVR, EDI VAN (sftp), AS2 EDIINT, Email, Dial-in modem, Web Pages, SOAP, and ebMS. "Today's medium to large enterprises face a bewildering array of interchange format and mechanism choices. This document attempts to provide analysis tools that can help decipher the optimum choices for a given set of business needs. To achieve this goal, it is necessary to assess typical current interchange needs, and then compare those against the technologies available today... This report looks at appropriate factors in making such business determinations of the most common denominators between trading partners and provides a cross-tabulation according to the appropriate communications technologies. The combined result of these should be a simple and effective set of solutions that are affordable for businesses and individuals to utilize for their e-Business needs today. Into the future these parameters also provide metrics to assess emerging technologies, for example PDA and cell-phone based text-messaging exchanges as options to providing business connectivity... In addition to the base set of communications solutions selected there are others that could be considered such as RosettaNet messaging, and then non-internet based solutions such as FIX and SWIFT messaging. However, our criterion for technology inclusion was to concentrate on horizontal solutions that span industry sectors. Those considered are typically deployed on open networks (i.e., Internet or telephone dial-up connections) and are based on approved public specifications..." [source .DOC, cache]

  • [June 05, 2003] "WebDAV Ordered Collections Protocol." By Jim Whitehead (UC Santa Cruz, Dept. of Computer Science) and Julian F. Reschke (greenbytes GmbH, editor). IETF WEBDAV Working Group, Internet-Draft. Reference: 'draft-ietf-webdav-ordering-protocol-08'. May 12, 2003, expires November 10, 2003. 43 pages. "This specification extends the WebDAV Distributed Authoring Protocol to support server-side ordering of collection members. Of particular interest are orderings that are not based on property values, and so cannot be achieved using a search protocol's ordering option and cannot be maintained automatically by the server. Protocol elements are defined to let clients specify the position in the ordering of each collection member, as well as the semantics governing the ordering. This specification builds on the collection infrastructure provided by the WebDAV Distributed Authoring Protocol, adding support for the server-side ordering of collection members. There are many scenarios where it is useful to impose an ordering on a collection at the server, such as expressing a recommended access order, or a revision history order. The members of a collection might represent the pages of a book, which need to be presented in order if they are to make sense. Or an instructor might create a collection of course readings, which she wants to be displayed in the order they are to be read. Orderings may be based on property values, but this is not always the case. The resources in the collection may not have properties that can be used to support the desired ordering. Orderings based on properties can be obtained using a search protocol's ordering option, but orderings not based on properties cannot. These orderings generally need to be maintained by a human user. The ordering protocol defined here focuses on support for such human-maintained orderings.
Its protocol elements allow clients to specify the position of each collection member in the collection's ordering, as well as the semantics governing the ordering. The protocol is designed to allow support to be added in the future for orderings that are maintained automatically by the server..." Note: "The IESG has received a request from the WWW Distributed Authoring and Versioning Working Group to consider WebDAV Ordered Collections Protocol as a Proposed Standard. The IESG plans to make a decision in the next few weeks, and solicits final comments on this action. Please send any comments to the iesg@ietf.org or ietf@ietf.org mailing lists by 2003-6-19." See also: (1) WWW Distributed Authoring and Versioning Working Group Charter; (2) UCI IETF WEBDAV Working Group reference page; (3) Webdav.org resources; (4) W3C mailing list archives for 'w3c-dist-auth'; (5) general references in "WEBDAV (Extensions for Distributed Authoring and Versioning on the World Wide Web)." [cache]
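The draft's human-maintained ordering semantics (place a member first, last, or before/after a sibling) can be sketched in miniature, leaving aside the wire protocol entirely. All class and method names below are hypothetical:

```python
# Illustrative sketch, NOT the WebDAV wire protocol: a server-side
# ordered collection whose order is set explicitly by clients,
# mirroring the "first / last / before / after" position semantics
# the draft describes for human-maintained orderings.
class OrderedCollection:
    def __init__(self):
        self.members = []          # ordering maintained on the server

    def add(self, name):
        self.members.append(name)  # new members go last by default

    def reposition(self, name, where, anchor=None):
        """Move `name` to 'first', 'last', or 'before'/'after' `anchor`."""
        self.members.remove(name)
        if where == "first":
            self.members.insert(0, name)
        elif where == "last":
            self.members.append(name)
        else:
            i = self.members.index(anchor)
            self.members.insert(i if where == "before" else i + 1, name)

# The instructor's course-readings scenario from the excerpt:
readings = OrderedCollection()
for page in ["intro.html", "chapter2.html", "chapter1.html"]:
    readings.add(page)
readings.reposition("chapter1.html", "before", "chapter2.html")
print(readings.members)  # ['intro.html', 'chapter1.html', 'chapter2.html']
```

The point of the sketch is that the order is not derivable from any property of the members, which is exactly why a search protocol's ordering option cannot reproduce it.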

  • [June 04, 2003] "Consortium Brings 'Spirit' to IP Integration." By Michael Santarini. In EETimes (June 04, 2003). "A new silicon intellectual property (IP) consortium, called Spirit (Structure for Packaging, Integrating, and Re-using IP within Tool Flows) was launched on June 2, 2003 at the Design Automation Conference (DAC)... The consortium aims to ensure IP metadata can be easily transferred between IP vendors, EDA companies, semiconductor and systems companies... Ralph von Vignau, the chairman of the consortium and director of the platform infrastructure department at Philips Semiconductors, said the consortium will focus on developing two standards. First, it will work toward developing a standard IP metadata description in XML code that aims to capture all the IP design, test and integration information and views needed to transfer and exchange IP between companies, so IP users don't have to waste time trying to characterize IP to fit it into their design flows. Second, the consortium will work to develop a standard API that will allow various companies in the IP chain to integrate Spirit IP descriptions into the various EDA vendor and semiconductor in-house tools. To date, said von Vignau, there has not been a standard format that contains all the critical IP metadata needed to integrate IP into design flows quickly... Von Vignau said that all interested parties are welcome to buy a membership for Spirit consortium. The membership will allow members to review and voice opinions about what should be in the standards. They will also get access to the format and the API after it is finished. But the founding consortium members -- ARM, Beach Solutions, Cadence Design Systems, Mentor Graphics, Royal Philips Electronics, STMicroelectronics, and Synopsys -- will have the final say in what the standard will be. The companies put this restriction on it to speed up the creation of the standard... 
Beach Solutions and Mentor already have proposed and donated their respective XML formats to the consortium, and a working group has already been assigned to evaluate each and perhaps meld the two together to create the final standard. The standard will encapsulate important information about IP blocks, such as clocks, signals, and test strategies -- and perhaps more -- in XML so that the broad spectrum of vendor and in-house tools can draw the IP data those tools need to work properly. The format will accompany or be embedded in IP blocks. Another working group will look at devising a versatile API so all the EDA tools can be adjusted to accept whatever XML based format the consortium devises..." See other details in the announcement: "Industry Leaders Announce New Consortium to Develop Standards for IP-Based Design. SPIRIT Consortium Focused on Increasing Efficiency Through Standards for IP Packaging and IP Tools."
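No SPIRIT schema had been published at this point, but the kind of XML metadata record the article describes (clocks, signals, test strategy) might look roughly like the following sketch. Every element and attribute name here is invented:

```python
# Hypothetical sketch of an IP-block metadata record of the sort the
# SPIRIT consortium aims to standardize. The element names are made up;
# the point is that the design/test/integration facts travel with the
# block as machine-readable XML.
import xml.etree.ElementTree as ET

ip = ET.Element("ipBlock", name="uart16550", vendor="ExampleCorp")
clocks = ET.SubElement(ip, "clocks")
ET.SubElement(clocks, "clock", name="pclk", freqMHz="100")
signals = ET.SubElement(ip, "signals")
ET.SubElement(signals, "signal", name="tx", direction="out")
ET.SubElement(ip, "testStrategy").text = "scan"

print(ET.tostring(ip, encoding="unicode"))
```

A vendor-neutral record like this is what would let EDA tools pull in the clock, signal, and test facts they need without the per-block characterization work the article complains about.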

  • [June 04, 2003] "Microsoft Schedules Project Update." By David Becker. In CNET News.com (June 04, 2003). "Microsoft hopes to make project management a substantial part of office routines with a new version of software that it announced Wednesday. Microsoft Office Project 2003 can be used with other applications in the upcoming version of the company's market-leading Office productivity package, making it easier for office workers and managers to keep track of large-scale projects, said Giovanni Mezgec, group product manager at Microsoft. The new application is Microsoft's entry into the developing market for project management software, which managers use to herd assignments involving multiple stages and co-workers. Typical tasks for such software include reminding workers when a certain piece is due and compiling time-line and budget estimates based on the complexity of the effort. Microsoft Project started out as a desktop-only application, but the company expanded its reach last year with a server version that gives workers more options for communicating with each other... Project 2003, like the main Office 2003 applications it will link with, will allow workers to output data in Extensible Markup Language (XML), the rapidly spreading standard for exchanging data between disparate computing systems. Outputting in XML means data in a Project database can be shared by other applications, such as software for tracking corporate expenses, Mezgec said. 'The idea is to make it easier for a broader number of people in an enterprise to work with Project-related information,' he said..." See the Project Standard 2003 Overview and the Project 2003 Frequently Asked Questions.

  • [June 04, 2003] "Microsoft's Browser Play." By Paul Festa. In CNET News.com (June 04, 2003). ['Purveyors and consumers of Web content and software, already unsettled by the pact between archrivals Microsoft and AOL Time Warner, may be in store for an even more radical upset: the end of Microsoft's standalone Internet Explorer browser.'] "Brian Countryman, IE program manager, said in a May 7 Web chat posted to Microsoft's Web site that the software maker is phasing out standalone versions of its Web browser. Since then, Microsoft has struggled to reconcile Countryman's remarks with promises that current users of the standalone version of IE will be provided with upgrades. Countryman did not return calls. A Microsoft representative pressed for clarification of Countryman's comments acknowledged that the company did not, in fact, know what it was going to do... That ambiguity leaves an array of possible outcomes, including forced upgrades to the next client version of Windows, code-named Longhorn, for users of older versions of the operating systems who want to patch security holes or other bugs in IE. 'Lack of updates for older Windows operating systems such as XP or 2000 would...require customers to upgrade to Longhorn to gain the latest browser functionality,' wrote Jupiter analyst Michael Gartenberg... From a legal and strategic perspective, Web users now face a situation in which the dominant browser, which achieved that dominance in large part by being offered free of charge, will now only be available as part of an operating system that costs $199 for the 'Home' edition and $299 for the 'Professional' edition. Upgrades for Windows XP Home and Professional cost $99 and $199, respectively... The removal of IE as a free, downloadable software application could have a profound effect on the Web and the development of Web standards.
One possibility is that its removal could benefit makers of standalone browsers, such as Norwegian software company Opera Software (which charges for one version of its browser and gives away an ad-supported one) or Netscape Communications, a unit of AOL Time Warner..."

  • [June 03, 2003] "Java Management Extensions Protocol." By Ward K. Harold (IBM Tivoli Software). Reference: 'draft-harold-jmxp-00'. IETF Network Working Group, Internet-Draft. May 8, 2003, expires November 6, 2003. 65 pages. XML Schemas are provided in Section 10: The JMXP Schema, MBEANSERVER Element Schema, MBEAN Element Schema, NOTIFICATION Element Schema, Common Element Schema. ['This document describes a protocol, composed of a set of BEEP profiles, that provides access to the attributes, operations, and notifications supported by the MBeans registered with a JMX Agent.'] "The Java Management Extensions (JMX) specification defines a Java technology based architecture for instrumenting and managing applications and devices. The JMX architecture has three layers: (1) Instrumentation, (2) Agent, (3) Distributed Services. The instrumentation layer is composed of "Managed Beans", or MBeans, that represent JMX manageable resources that management applications monitor and manipulate. The agent layer is made up of an MBeanServer and a set of management services. The MBeanServer's purpose is two-fold: it serves as a registry for MBeans, and it provides a common interface to those MBeans. The agent layer's management services, which are themselves MBeans, include monitoring, relationship management, primitive scheduling, and component loading. As of version 1.2 of the specification the content of the distributed services layer is undefined. Its stated intent, however, is to define the management interfaces and components that can operate on agents or hierarchies of agents. One essential requirement of the distributed services layer is remote access to elements of the agent and instrumentation layers. The JMX Protocol (JMXP) presented here defines a simple, language independent mechanism that satisfies the remote access requirement. This specification does not define any specific language bindings to JMXP.
Such bindings are clearly important since management applications and consoles written in a variety of programming languages will need remote access to management information supplied by new JMX-instrumented applications and devices. Future documents will specify JMXP bindings for particular programming languages as the need arises..." [cache]
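The core idea of a language-independent XML encoding for MBean access can be illustrated with a toy request format. The element names below are invented for the sketch, not taken from the draft's actual schemas:

```python
# Toy, language-neutral sketch of the idea behind JMXP: encode an MBean
# attribute request as XML so that non-Java clients can talk to a JMX
# agent. Element/attribute names here are hypothetical, not the draft's.
import xml.etree.ElementTree as ET

def attribute_request(object_name, attribute):
    """Serialize a 'read this MBean attribute' request as XML text."""
    req = ET.Element("mbean", name=object_name)
    ET.SubElement(req, "attribute", name=attribute)
    return ET.tostring(req, encoding="unicode")

def parse_request(xml_text):
    """Decode the request on the agent side, in any language with XML."""
    root = ET.fromstring(xml_text)
    return root.get("name"), root.find("attribute").get("name")

msg = attribute_request("java.lang:type=Memory", "HeapMemoryUsage")
print(parse_request(msg))
```

Because the payload is plain XML, the same request could be produced or consumed by a management console written in any language, which is precisely the remote-access requirement the draft sets out to satisfy.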

  • [June 03, 2003] "Web Services Are Key to Future, Says Microsoft." By Paul Krill. In ComputerWeekly (June 03, 2003). "A Microsoft official stressed the importance of web services in building the future of IT at the Tech Ed 2003 conference in Dallas... [Paul] Flessner provided details on Microsoft's solutions based on the Windows platform, covering aspects ranging from federated identity to XML and application management. Microsoft is proposing its 'Windows Server System' as its IT platform, the foundation of which is Windows Server 2003. Flessner acknowledged the delay of the next version of the SQL Server database, codenamed Yukon. It is now scheduled for release in the second half of 2004. The delay will enable Microsoft to synchronise plans for putting the company's Common Language Runtime (CLR) in both its database and development tools, said Stan Sorensen, Microsoft director of SQL Server product management... CLR is intended to make it easy to design components and applications in which objects interact across applications. Yukon is also expected to feature improvements in areas such as business intelligence and security. A private beta release of Yukon to a select group of 1,000 users is planned for the end of this month. A larger, public beta is planned for 2004. The company also unveiled a number of products including a beta release of BizTalk Server 2004, the Jupiter e-business suite, which combines the BizTalk Server integration system, Content Management Server and Commerce Server and SharePoint Portal Server Version 3. The 'Whidbey' version of the Visual Studio development tool is also planned for release and expected to feature integration with Yukon as well as improved IDE productivity and extended support for XML Web services and Office programmability. In 2006 Microsoft will release the next version of Windows Server, which will complement the SQL Server database.
The 'Kodiak' version of Exchange, will support web services and runs on top of SQL Server, also is planned for 2006. Kodiak will be supported automatically within Visual Studio..."

  • [June 03, 2003] "Business Process Execution Language for Web Services Version 1.1. [BPEL4WS DIFF Version.]" June 02, 2003. This diff-marked version provides a comparison between the BPEL4WS "Version 1.1" document dated March 31, 2003 and the BPEL4WS "Version 1.1" document dated May 5, 2003. Posted by Diane Jordan to the WSBPEL list on June 02, 2003. 151 pages. "This document defines a notation for specifying business process behavior based on Web Services. This notation is called Business Process Execution Language for Web Services (abbreviated to BPEL4WS in the rest of this document). Processes in BPEL4WS export and import functionality by using Web Service interfaces exclusively." Note also the updated license information supplied by IBM, BEA, and Microsoft for IPR related to BPEL4WS Version 1.1: [1] BEA: Announcement and online license. [2] IBM: Announcement and online license (source1 .DOC). [3] Microsoft: Announcement and online license; see also the announcement for the FAQ. General references in "Business Process Execution Language for Web Services (BPEL4WS)." The OASIS TC's IPR page was updated with a "Statement concerning Intellectual Property Rights submitted to OASIS 16 May 2003." [source].

  • [June 03, 2003] "TechEd Panelists Debate SOAs. Paradigm Held Up Against Mainframes." By Paul Krill. In InfoWorld (June 03, 2003). "SOAs (Service-Oriented Architectures) were pitted against legacy mainframes in a debate at TechEd here Tuesday featuring Microsoft officials responding to audience queries. Panelists participating in a session on enterprise architectures also fielded questions pertaining to whether SOAs, in which applications are treated as easily integrated components, are synonymous with Web services. One audience member questioned how services billing would be done in an SOA... Scott Woodgate, technical product manager for Microsoft BizTalk Server, said work is being done in this area in standards bodies, such as with the Business Process Execution Language (BPEL), which is under the jurisdiction of OASIS. But this issue will not be resolved in three years, he said. Woodgate added that mainframes represent a different paradigm from a different time... Maarten Mullender, Microsoft solutions architect for the .Net Platform Strategy group, said telecommunications companies and cell phone vendors would devise business models for billing services for SOAs. Another attendee said he wanted to dispel the myth that SOAs equal Web services. 'SOA is not equivalent to Web services. It is an underlying implementation of how we achieve the SOA, but we may achieve it through other means,' the attendee said..."

  • [June 03, 2003] "WS-I Basic Profile: Why Wait?" By Darryl K. Taft. In eWEEK (June 03, 2003). "While the Web Services Interoperability Organization (WS-I) is working toward a basic profile for interoperability of Web services, customers should not wait but should use tools that are available today. During a session at the Microsoft TechEd 2003 conference here, Yasser Shohoud, a program manager on the Microsoft XML Messaging team, said he does not know when the industry will see a basic profile from WS-I, but it should not matter. 'I don't know when the WS-I Basic Profile will be ready, but does it matter?' Shohoud said. 'I think the world should not wait for that.' Shohoud said, 'The bottom line is when you're trying to interoperate you have tools that need to work together,' but there are few tools that support the broad spectrum of interoperability issues. In the interim, while a WS-I Basic Profile is being hashed out, 'people should avoid things in the basic profile that are not widely used,' Shohoud said. 'If you are ready to build Web services today, you should not be waiting for any profile.' The WS-I Basic Profile 1.0 was approved as a draft specification in April, but has not been approved as a final document yet..." See: (1) "Basic Profile Version 1.0 Board Approval Draft", 2003/03/28; (2) general references in "Web Services Interoperability Organization (WS-I)."

  • [June 03, 2003] "Microsoft Ties Security to VeriSign, Certifications." By Thor Olavsrud and Mark Berniker. From InternetNews.com (June 03, 2003). "Microsoft moved to bolster its code-securing effort called Trustworthy Computing Initiative by announcing two security initiatives Tuesday. Microsoft and VeriSign said they would jointly develop improved solutions for authentication security, digital rights management (DRM) and other online security enhancements. Financial terms of the deal were not disclosed. The new security products from Microsoft-VeriSign are aimed at achieving improvements in existing software, while providing automated renewal of digital certificates, secure e-mail and digital signatures. The alliance also plans to help improve network security with reliable access to wireless LANs or virtual private networks. The two partners also said they plan to help customers embed PKI (public key infrastructure) security into desktop and networked applications. Microsoft also announced the availability of a new security certification program for system administrators and systems engineers: MCSA: Security and MCSE: Security. These programs will give IT professionals training to improve enterprise security... The pact is expected to improve upon existing security use of digital signatures for Microsoft's Windows Server 2003. Digital signatures provide some authentication security, but with the recent security problems associated with Microsoft's Passport product, the company is moving to improve security software within its products. The deal aims to provide improved online security, especially for remote access. The two companies will build the security solutions into not only Microsoft's Windows Server 2003, but also VeriSign's Managed PKI (public key infrastructure) Services..." See: (1) details in the announcement: "Microsoft and VeriSign Teaming Up to Provide Next-Generation Security Solutions for Enterprise Customers. 
Windows Server 2003 and VeriSign Trust Network to Ease Certificate Deployment and Management for a Range of PKI Services."; (2) "XML Digital Signature (Signed XML - IETF/W3C)"; (3) general references in "XML and Digital Rights Management (DRM)."

  • [June 03, 2003] "Parsing, Indexing, and Searching XML with Digester and Lucene. These Open Source Projects Can Ease Your XML-Handling Tasks." By Otis Gospodnetic (Software Engineer, Wireless Generation, Inc). From IBM developerWorks, Java technology. June 3, 2003. ['Java developers can use the SAX interface to parse XML documents, but this process is rather complex. Digester and Lucene, two open source projects from the Apache Foundation, cut down your development time for projects in which you manipulate XML. Lucene developer Otis Gospodnetic shows you how it's done, with example code that you can compile and run.'] "If you've ever wanted to parse XML documents but have found SAX just a little difficult, this article is for you. In this article, we examine how to use two open source tools from the Apache Jakarta project, Commons Digester and Lucene, to handle the parsing, indexing, and searching of XML documents. Digester parses the XML data, and Lucene handles indexing and searching. You'll first see how to use each tool on its own and then how to use them together, with sample code that you can compile and run... Commons Digester is a sub-project of the Commons project, which is one of the initiatives developed by the community of developers who create open source software under the Apache Jakarta umbrella. Digester offers a simple and high-level interface for the mapping of XML documents to Java objects. When Digester finds developer-defined patterns in XML, it will take developer-specified actions. Digester requires a few additional Java libraries, including an XML parser compatible with either SAX 2.0 or JAXP 1.1... Lucene is another Apache Jakarta project. Like Digester, it is a Java library and not a stand-alone application. Behind its simple indexing and search interface hides an elegant piece of software capable of handling many documents. In this article, we use Digester to parse a simple XML file, then illustrate how Lucene creates indices.
Then we marry the two tools to create a Lucene-generated index from our sample XML document, and finally use Lucene classes to search through that index... The approach described in this article should satisfy the simple XML indexing and searching needs of most developers. You should also take a look at the Sandbox subproject of Lucene, which includes examples of indexing of XML documents using SAX 2 and DOM parsers..." Article also in PDF format.
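The Digester-plus-Lucene pipeline the article describes can be imitated in miniature with nothing but standard-library tools: map XML elements to records, then build a small inverted index over their text and search it. This is an analogy to illustrate the pipeline, not a use of either library's actual API:

```python
# Miniature analogue of the article's pipeline: a "Digester" step that
# maps XML patterns to records, then a "Lucene" step that builds an
# inverted index and answers term queries. Illustration only.
import xml.etree.ElementTree as ET
from collections import defaultdict

XML = """<catalog>
  <book id="b1"><title>XML parsing with SAX</title></book>
  <book id="b2"><title>Searching text with an index</title></book>
</catalog>"""

# "Digester" step: pattern (catalog/book) -> (id, title) records.
books = [(b.get("id"), b.findtext("title"))
         for b in ET.fromstring(XML).iter("book")]

# "Lucene" step: term -> set of matching document ids.
index = defaultdict(set)
for doc_id, title in books:
    for term in title.lower().split():
        index[term].add(doc_id)

def search(term):
    """Return the sorted ids of documents whose title contains `term`."""
    return sorted(index.get(term.lower(), set()))

print(search("xml"))
```

The real libraries add what this toy omits: Digester's rule-driven object construction, and Lucene's analyzers, scoring, and on-disk index format.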

  • [June 03, 2003] "Intelligent Architectures for Service-Oriented Solutions. Moving Toward a Dramatic Reduction in Cost and Time." By Abdul Kayam and Steve Bailey (hyfinity). In Web Services Journal Volume 3, Issue 6 (June 2003), pages 36-42 (with five figures). "This article will define the essential characteristics for software development for Web services. None of the characteristics presented are new; in fact some of them date back more than 20 years. The fundamental difference is the 'morphing' of these elements into a revolutionary software development approach for service-oriented solutions... The three main considerations of any software development approach are the communication mechanism, the business logic, and the representation of information and data. The ubiquitous nature of the Internet and its related specifications make it the obvious choice for a communication mechanism. In order to make effective use of these standards (which include SOAP, WSDL, XML, and HTTP), they should be incorporated directly into the development process. By using XML over HTTP as a standard communication mechanism, business can take advantage of peer-to-peer architecture. This peer-to-peer approach requires autonomous, intelligent software agents, and these agents will in turn need to be able to react to the content of the incoming messages to be able to function efficiently. The most effective way for the agents to do this would be to make use of XML-based rules using open standards such as XPath. By making these business rules declarative, we also gain business logic visibility. If we also ensure that the rules are directly activated by the incoming XML document content, we also gain a human- and machine-readable representation of information and data. 
The authors examine what they believe to be the key characteristics required for a service-oriented development approach, and highlight why this approach delivers a dramatic reduction of the cost and timescales for business integration. E-business projects should be driven by business needs and not by the coding capability of technical specialists. In fact, as more and more software elements are Web services-enabled, the cost of business integration will decrease further. Traditional software development and middleware approaches adopted by the majority today simply cannot deliver effective, intelligent service-oriented architectures. A Web service-oriented architecture must be document-based, loosely coupled collaborations of autonomous intelligent software agents. A single development and runtime platform that supports all the above characteristics in a cohesive architecture is inherently more agile and productive. As business needs change and grow, this platform must also be able to evolve with them, making the software development approach outlined in this article the only suitable choice to take software development into the future..." [alt URL]
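The authors' notion of declarative, XPath-driven rules activated by incoming document content can be sketched using the limited XPath subset in Python's standard library. The rule table and action names here are hypothetical:

```python
# Sketch of "declarative rules directly activated by incoming XML
# document content": each rule pairs an XPath pattern with an action,
# and an agent dispatches on whichever patterns match the message.
# The rules and actions below are invented for illustration.
import xml.etree.ElementTree as ET

RULES = [
    # (XPath pattern, action taken when the pattern matches)
    (".//order[total]", "route-to-billing"),
    (".//order/urgent", "escalate"),
]

def dispatch(xml_text):
    """Return the actions triggered by an incoming XML message."""
    doc = ET.fromstring(xml_text)
    return [action for path, action in RULES if doc.findall(path)]

msg = "<batch><order><total>120</total><urgent/></order></batch>"
print(dispatch(msg))
```

Because the rule table is data rather than code, it is both the agent's behavior and a human-readable statement of the business logic, which is the "business logic visibility" the article claims for the approach.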

  • [June 03, 2003] "BEA WebLogic Workshop 8.1. J2EE-Based Web Services Development Made Easy." By Joseph A. Mitchko. In Web Services Journal Volume 3, Issue 6 (June 2003), page 34. ['Last year, BEA introduced WebLogic Workshop, a revolutionary product based on declarative annotations that took away most of the pain and aggravation of developing J2EE-based Web services on the WebLogic Application Server platform. Not being satisfied with just Web services, BEA extended this technology with Workshop 8.1 to include Web applications, portals, and other J2EE integration-based applications.'] "For development of loosely coupled applications that can maintain their public contract while underlying data structures change, WebLogic Workshop 8.1 now includes support for XML Schema and XQuery Mapping. Based on the XQuery XML standard, the visual mapping tool allows you to map XML elements to Java data elements by simply performing point-and-click operations. In addition to straight one-to-one mapping, you can also use a number of built-in XQuery functions such as 'concat,' allowing you to combine various fields into one. All of the hard work of handling the complex data transformations is performed automatically. In addition to XQuery, Workshop 8.1 provides support for XMLBeans, a strongly typed Java object interface for XML data that allows a developer to manipulate raw XML data using the productivity and flexibility benefits of the Java language... For Web services, WebLogic Workshop 8.1 now supports both the RPC and document-literal style of SOAP requests, making it easy to integrate with .NET-based Web services... WebLogic Workshop 8.1 includes a number of new Java Controls to help you connect to various IT assets, including FTP, e-mail, Tuxedo, Portal Server, Integration Server, and more. Remember that as a developer, interacting with a Java Control is the same for all types of back-end services.
All you need to do is set various property settings and set up event handlers; the control itself handles all the hard stuff..." See the BEA WebLogic Workshop 8.1 website for details and download. [alt URL]
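The kind of declarative XML-to-field mapping with a 'concat'-style combiner described above can be approximated in a few lines. The mapping format here is invented for illustration; it is not Workshop's XQuery Mapping:

```python
# Rough analogue of a declarative XML-to-field mapping with a
# concat-style combiner. The mapping table format is hypothetical:
# a plain path maps one element, a ("concat", ...) tuple joins several.
import xml.etree.ElementTree as ET

MAPPING = {
    "id":        "customer/id",
    "full_name": ("concat", "customer/first", "customer/last"),
}

def apply_mapping(xml_text, mapping):
    """Produce a flat record from an XML document per the mapping table."""
    doc = ET.fromstring(xml_text)
    out = {}
    for field, rule in mapping.items():
        if isinstance(rule, tuple) and rule[0] == "concat":
            out[field] = " ".join(doc.findtext(p, "") for p in rule[1:])
        else:
            out[field] = doc.findtext(rule, "")
    return out

xml = ("<rec><customer><id>42</id>"
       "<first>Ada</first><last>Lovelace</last></customer></rec>")
print(apply_mapping(xml, MAPPING))
```

Keeping the mapping as data is what lets the public contract (the output record) stay stable while the underlying XML structure changes: only the mapping table needs editing.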

  • [June 03, 2003] "E-services: Fulfilling the Web Services Promise." By Heather Kreger (Web Services Lead Architect, IBM Emerging Technologies, IBM Software Solutions Division, Research Triangle Park, NC, USA). In Communications of the ACM (CACM) Volume 46, Number 6 (June 2003), pages 29-34. With 12 references and sidebar "What Are Web Services," by Christopher Ferris and Joel Farrell. "This article reviews the set of technologies necessary for Web services, looks at where they are being specified and standardized, and identifies the current technical challenges the Web services community is attacking. It concludes with recommendations for the use of Web services today and suggestions that prepare for Web services tomorrow. We can represent the technologies that must be standardized in order to implement the Web services-oriented architecture in the conceptual Web services stack... the three sections of the stack are corollaries to the roles in the Web services-oriented architecture: interact, description, and discovery agency. At the base is the wire section that captures the technologies required to transport messages from a service requester over the network to the service provider. The transport layer addresses network connectivity via the ubiquitous TCP-IP base. The packaging layer defines how the payload is encoded in the message to be transported. The extensions layer defines the extensible set of features expressed as headers on the message. These layers must support the XML information set (infoset). SOAP and HTTP are the most widely supported standards for these layers, but other bindings are possible. The technology choice for this section will determine the potential client base for a service. The next layer to be standardized is the description layer. All type descriptions are specified and expressed using the XML Schema language. 
The interface and implementation description define the mechanics of interacting with a Web service, which includes the operations and messages supported, how to serialize those messages onto the wire, and where to send the messages. A policy description layer will be used to describe service-specific information beyond mechanics, such as owning business, taxonomy, security requirements, timeouts, costs, and quality of service parameters. The presentation layer describes how a user interface is generated for this service. These four layers fully describe a service. The next two layers describe relationships and interactions between services. Related services may be expressed in the composition layer. This layer includes groupings, containment, dependencies, and parent-child relationships. The orchestration layer encompasses ordering of operations, choreography, workflows, and business processes. The final two layers describe agreements between the service requestor and provider. The service-level agreement layer defines the specific performance, usage, costs, metrics, and thresholds to which a service is expected to adhere. The business-level agreement layer describes a contractual agreement between the two business partners who will be transacting business using Web services. Discovery agencies, the next section of the stack, encompass the technologies that enable service descriptions to be published, support the discovery of service descriptions, and provide inspection of sites for the descriptions of hosted services. Publish is very loosely defined as any means to make a service description available to a requester -- from email to registries. Discovery is defined just as loosely, ranging from accessing a description in a file system to sophisticated searches of service registries at either development or runtime... The vision of fully dynamic, ad hoc business partnerships is not yet viable for a number of reasons. 
First, the infrastructure standards outlined here must be finished, productized, and widely deployed. Second, industry standard Web services interfaces, or portTypes, must be defined for the various aspects of business-to-business relationships. Finally, XML languages that can describe legally binding business and service-level agreements must be defined and standardized. This is more than a technical challenge; it may be a cultural challenge as well because business relationships often span legal, cultural, language, and national boundaries. Of the infrastructure standards that must be completed, the most important set is security. Currently four of the six specifications in the Web Services Security Roadmap are available along with sample implementations... current products and implementations may need to evolve as the industry comes to agreement on the final standards. Even so, it is now safe to use Web services for integration projects, since most development and middleware products will shield customer implementations from specification changes..." [sub URL]
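The wire section of the stack described above (transport, packaging, extensions) can be made concrete with a minimal SOAP 1.1 envelope. The namespace is SOAP 1.1's; the header and body content are invented for the sketch:

```python
# The wire section of the Web services stack in miniature: a payload
# packaged in a SOAP 1.1 envelope (packaging layer), with an extension
# expressed as a header (extensions layer), ready to be carried by the
# transport layer (e.g., HTTP POST). Header/body content is invented.
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
ET.register_namespace("soap", SOAP_NS)

env = ET.Element(f"{{{SOAP_NS}}}Envelope")
header = ET.SubElement(env, f"{{{SOAP_NS}}}Header")    # extensions layer
ET.SubElement(header, "TransactionID").text = "abc-123"
body = ET.SubElement(env, f"{{{SOAP_NS}}}Body")        # packaging layer
ET.SubElement(body, "GetQuote").text = "IBM"

message = ET.tostring(env, encoding="unicode")
print(message)
```

The separation matters for the article's argument: headers carry the extensible features (security, transactions, routing) that the stack's upper layers standardize, while the body carries the service payload, and the whole envelope is independent of whether HTTP or some other binding transports it.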

  • [June 03, 2003] "E-services. The Web Services Debate: J2EE vs. .NET." By Joseph Williams (Master Architect, Sun Professional Services, Fort Collins, CO, USA). In Communications of the ACM (CACM) Volume 46, Number 6 (June 2003), pages 58-63. ".NET is Microsoft's attempt to develop a comprehensive execution environment that includes everything a company would need to implement Web services on Windows. In many ways .NET is Microsoft's Windows-centric response to J2EE. Much of the hype around .NET is just rebranding of existing products. The current centerpiece of .NET is the .NET development platform. A huge product (the zipped file is 1.8GB and expands to more than 24,000 files), Visual Studio .NET is an update to Visual Studio 6 that directly supports the new languages Microsoft is promoting for developing Web services: C#, which is a new language that looks a lot like Java; J#, an even closer implementation of Java syntax and libraries; and VB.NET, a substantial update to Visual Basic... The bottom line on .NET is that it only runs on Windows servers. Thus, it is completely a Microsoft-centric technology. Yes, it does support dozens of different languages so it is a very rich development platform (of course, each language module must be individually licensed at considerable cost). However, the deployment platform is pure Microsoft, meaning the technology has limited reach in the kinds of heterogeneous environments that persist at most enterprises... The Web services promise of loosely coupled components existing in a build once/use-many environment finds its natural expression in J2EE. Although J2EE is Java-centric it does have the ability to run on any operating system; thus it represents the flip side of .NET. J2EE is one language, many platforms. 
.NET is many languages, one platform (although, arguably, that actually isn't true since most of the powerful tools for .NET are designed specifically for C#; thus, it really is one full-featured language on one platform). The bottom line on J2EE is that on a feature-by-feature basis it compares very favorably with .NET... Moreover, all of the specifications that define the J2EE platform are published and reviewed publicly, and numerous vendors offer compliant products and development environments. This simply isn't true with .NET. J2EE is more mature than .NET and even though XML is not yet native (it will be in JDK 1.5) much of the vision implemented by Web services was presaged in J2EE... Both Sun and Microsoft are articulating clear visions around Web services. J2EE and .NET are different tools embodying different strategies for implementing Web services. It is pointless to argue currently that one tool is superior to the other. Instead, the focus needs to be on ensuring the right tool is deployed; for most enterprise shops J2EE offers more flexibility, greater robustness, and a proven pedigree..." [sub URL]

  • [June 03, 2003] "E-services. The Web Services Debate: .NET vs. J2EE." By Gerry Miller (Chief Technology Officer, Microsoft U.S. Central Region). In Communications of the ACM (CACM) Volume 46, Number 6 (June 2003), pages 64-67. "At its very root, a Web service is nothing other than a server that listens for and replies with SOAP, generally via HTTP. In practice, a Web service will support WSDL to describe its interfaces, and should also be listed in a UDDI registry. Of course, at this point Web services are deceptively simple. So far the technology industry has not coalesced around Web services standards for security, transactions, state management, and workflow. However, nearly every member of the technology community has a vested interest in developing these standards, so they should come to fruition within a few years. Most Web services development is being accomplished today using either Microsoft .NET or Sun Microsystems' J2EE specification. This is interesting, considering the undisputable fact that J2EE has no support for Web services. The J2EE specification will not contain any native support for Web services until J2EE 1.4, which Sun delayed recently by six months, so commercial implementations are unlikely before 2004. Developing Web services with J2EE today means using either extensions to J2EE that are not part of the specification, or doing lots of XML parsing in code... Interestingly enough, despite a common perception to the contrary, J2EE is in fact not an open specification. While Sun contends that J2EE is beholden to the Java Community Process, the JCP really only defines a process for community members to suggest updates and changes to the specification. Final specifications can only be approved by Sun. Sun has more than once publicly committed to turn the technology over to a standards body, and each time has reneged on this promise. 
A core part of Microsoft .NET, on the other hand, is the Common Language Infrastructure (CLI), control of which Microsoft officially relinquished to a true standards body, the European Computer Manufacturers Association (ECMA). Web services are clearly critical for the next wave of enterprise computing -- integrating disparate business systems for more effective use of information. Companies like Microsoft and Sun Microsystems should be commended for their clear commitment to work together toward a common industry standard for all our customers' benefit. There should always be more than one choice of development environment for customers, because there will never be one solution always appropriate for everyone in every situation. The world is big enough for both Microsoft .NET and J2EE. Still, for the reasons outlined herein, Microsoft .NET will generally be a better choice for most companies in most situations..." [sub URL]
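The article's definition -- "a Web service is nothing other than a server that listens for and replies with SOAP, generally via HTTP" -- can be made concrete with a few lines of standard-library Python. The sketch below builds the SOAP 1.1 envelope a client would POST, then parses it the way a receiving service would; the GetQuote operation and its namespace are hypothetical placeholders, not part of any real service.

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
APP_NS = "http://example.com/quotes"  # hypothetical application namespace

# Build the request envelope a client would POST over HTTP.
envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
call = ET.SubElement(body, f"{{{APP_NS}}}GetQuote")
ET.SubElement(call, f"{{{APP_NS}}}symbol").text = "SUNW"
request = ET.tostring(envelope)  # bytes, ready to send as an HTTP POST body

# The receiving service parses the Body to discover the requested operation.
parsed = ET.fromstring(request)
op = parsed.find(f"{{{SOAP_NS}}}Body")[0]
print(op.tag)  # the namespace-qualified operation name
```

In practice a toolkit (on either the .NET or J2EE side) generates this plumbing from a WSDL description, which is exactly the layer of the debate the two articles are arguing over.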

  • [June 03, 2003] "Blue Titan Knits Fabric for Web Services." By David Rubinstein. In Software Development Times (June 03, 2003). "Blue Titan Software Inc. has released version 2.0 of its Network Director for Web services, a set of distributed software components and network fabric services for managing Web services. The new version includes a set of so-called fabric services that allows policy-based management of Web services. Those services, which provide access control, version control, and prioritization, are themselves exposed as Web services for use in what Blue Titan (www.bluetitan.com) is calling adaptive policy execution. Those policies can implement workflow rules that can change in real time based on application activity, said Sam Boonin, vice president of marketing. Also new to version 2.0 is a publish/subscribe messaging service based on SOAP. Version 2.0 also supports the WS-Policy, WS-Security and WS-Reliable Messaging specs, Boonin said. 'We provide gear for people to create their own Web services network,' Boonin said. 'Any Web service can be registered in Blue Titan, and we essentially become the WSDL of record. The end point becomes our control point' from which messaging and routing occur..." See details in the Blue Titan announcement "Blue Titan Network Director 2.0 Enables Pragmatic Adoption of Enterprise Service-Oriented Architectures. New Generation of Web Services Networking Software Delivers Adaptive, Event-Driven Control for Enterprise SOAs."

  • [June 03, 2003] "Volvo, Enigma Sign Service Life-Cycle Management Deal. The Carmaker Will Use Enigma's 3C Platform." By Linda Rosencrance. In Computerworld (June 02, 2003). "Volvo Car Corporation has signed a multimillion-dollar deal to use software from Enigma Inc. to simplify relationships with its 18,000 dealers and independent repair shops across the globe, the companies said today. Volvo, a wholly owned subsidiary of Ford Motor Corp., will use Burlington, Mass.-based Enigma's 3C Platform (for content, commerce and collaboration) to build an application that ties together Volvo's existing XML-based service manuals, electronic parts catalogs and service bulletins, as well as its diagnostic and software downloads for onboard automotive control systems, the companies said in a statement. The application will be deployed globally in 17 languages via the Web and CD/DVD. With the new system in place, technicians and mechanics will be able to more quickly access the information they need to repair any Volvo car, the companies said. In addition, the application will be tied into other dealer management systems as well as other dealer-based back-office systems to streamline parts ordering, inventory and other financial processes, according to the companies..." See: (1) Enigma 3C website; (2) details in the announcement: "Enigma Has Been Selected By Volvo Cars as the Foundation For Its Global, Integrated Aftersales Support Solution. Standards-Based Technology Delivers Integrated Service and Parts Information in Real Time to Dealers and Independent Service Technicians WorldWide."

  • [June 02, 2003] "HP Turns to Jabber for Enterprise IM." By John K. Waters. In Application Development Trends (June 02, 2003). "Instant messaging (IM) is fast emerging as a useful and productivity-enhancing enterprise technology, and many businesses have begun to embrace it in a serious way. In fact, according to the Gartner Group, instant messaging is proving to be a real driver of enterprise communications as companies seek to integrate IM and so-called presence technologies into their enterprise applications. Gartner analyst Maurene Caplan Grey believes that vendor alliances in the IM space are fueling the current drive toward adoption of a common IM and presence protocol... One example of this trend can be seen in the alliance between industry heavyweight Hewlett-Packard (HP) and Jabber, a Denver-based IM and presence technology developer. The two companies announced last week that HP will resell a version of Jabber's enterprise instant-messaging framework that the two companies developed jointly for the HP-UX platform, as well as for Microsoft Windows 2000 and Windows 2003 servers. According to the terms of the agreement, the two companies will sell the Jabber framework jointly on these platforms and on the existing Linux-based platform to HP's worldwide customer base beginning in Q2 of this year. Financial details of the agreement were not available at press time. The Jabber Communications Platform is an enterprise/carrier-grade IM and presence solution. ('Presence' refers to the ability of an application to tell when a user is online and available to receive a message.) The Jabber commercial product and its open-source counterpart, Jabber.org, are based on the eXtensible Messaging and Presence Protocol (XMPP). XMPP is an XML-based data-transport technology that its proponents contend is better suited to handling IM and presence than a signaling technology. XMPP can be extended across disparate applications and systems because of its XML base. 
XMPP is being developed by the Internet Engineering Task Force (IETF)... XMPP's competitor, Session Initiation Protocol for Instant Messaging and Presence Leveraging Extensions (Simple), is also under construction by the IETF. Simple is a set of extensions to the established SIP protocol that initiate, set up and manage a range of media sessions, including voice and video. Simple extensions define SIP signaling methods to handle the transport of data and presence. Microsoft has said that it prefers Simple largely because of its capacity to unify voice, video and data messaging. Jabber recently formed an alliance with another industry heavyweight, Intel. Under the terms of that agreement, the Santa Clara, Calif.-based chipmaker will invest in Jabber and help the company to develop wireless products. Jabber plans to release an SMS Gateway for carriers and the enterprise in Q3 of this year..." See also: (1) the announcement, "Jabber, Inc. Announces Global Agreement with HP. HP Becomes Development Partner and Global Reseller of Open-Architected Jabber, Inc. Commercial Product."; (2) "Jabber XML Protocol"; (3) "Extensible Messaging and Presence Protocol (XMPP)"; (4) "IETF Publishes Internet Drafts for XML Configuration Access Protocol (XCAP)."
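XMPP's XML base, which the article credits for its extensibility, is easy to see in a stanza. The sketch below parses a simplified XMPP-style message with Python's standard library; the addresses and body text are invented, and a real exchange would flow over a long-lived XML stream rather than a standalone document.

```python
import xml.etree.ElementTree as ET

# A simplified message stanza, of the kind carried on an XMPP client stream.
STANZA = """<message xmlns="jabber:client"
                     from="alice@example.com" to="bob@example.com" type="chat">
  <body>Are you online?</body>
</message>"""

msg = ET.fromstring(STANZA)
sender = msg.get("from")                          # routing metadata
text = msg.find("{jabber:client}body").text       # message payload
print(sender, text)
```

Because stanzas are plain namespaced XML, applications can extend them with their own child elements in their own namespaces, which is the extensibility argument XMPP proponents make against signaling-oriented alternatives such as Simple.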

  • [June 02, 2003] "MS Homes in On Jupiter With BizTalk Server Beta. Test Version Seen as First Phase of Jupiter Project." By Joris Evers (IDG News Service). In InfoWorld (June 02, 2003). "Microsoft on Monday announced the availability of the first beta version of BizTalk Server 2004, marking the first phase in its 'Jupiter' project to provide software for integrating business applications. BizTalk Server 2004 will be the foundation for Jupiter, a project announced last year that will unify BizTalk with two of Microsoft's other 'E-Business Server' products, Commerce Server and Content Management Server, said David Wascha, group product manager for E-Business Servers at Microsoft. The BizTalk Server 2004 beta was announced at Microsoft's TechEd conference in Dallas. BizTalk Server 2004, the upgrade to BizTalk Server 2002, is designed to integrate disparate applications in an enterprise. It will compete with products from vendors including IBM, webMethods, Tibco Software, SeeBeyond Technology, and BEA Systems. Research firm Gartner estimates the integration software market was worth around $1.7 billion in license revenue last year and will grow between 6 percent and 8 percent in 2003. Microsoft claims to have about 2,300 customers for BizTalk Server worldwide. One of Microsoft's goals with Jupiter is to reduce the complexity of business integration software, something that both Microsoft and its rivals have been lax at in the past, according to Wascha. Among the enhancements in BizTalk Server 2004 will be support for the BPEL4WS (Business Process Execution Language for Web Services) and a developers' tool that will work inside Microsoft's overarching developer environment, Visual Studio .Net, Wascha said... Also new in BizTalk Server 2004 is integration with InfoPath and Excel, data gathering and spreadsheet products, respectively, that are part of Microsoft's Office System. 
Users will be able to get easier access to back-end data sources from within those applications, according to Wascha. Other features in BizTalk Server 2004 include single sign-on, a workflow engine and a business rules engine, Wascha said. These features will be used by the other pieces of Jupiter that will be delivered later, he said. Microsoft is making great advancements in the integration software market, a market it only entered a few years ago and was late to get to, said David McCoy, a Gartner vice president and fellow... BizTalk Server, Commerce Server and Content Management Server today are sold as separate products. With BizTalk Server 2004 becoming the foundation for the future versions of Commerce Server and Content Management Server, the Jupiter integration project raises the issue of whether users will be forced to buy more software than they require. "How we will ultimately package the products has yet to be decided, but you will be able to buy content management software from Microsoft and you won't have to buy more software than you need," Wascha said. Gartner's McCoy believes Microsoft is aware of the issue and that customers won't be forced to buy unneeded software to get the product they want..." See details in the announcement: "Microsoft Delivers Beta for First Phase of 'Jupiter,' Dubbed BizTalk Server 2004. BizTalk Server 2004 Delivers Jupiter's Core Foundation for E-Business."

  • [June 02, 2003] "Starting a Healthy Dialogue. CDC Lays Groundwork for XML-Based National Health Data Exchange." By Ron Miller. In Federal Computer Week (June 02, 2003). ['Using Extensible Markup Language, Health Level 7 and industry-standard development tools, states can transmit data to CDC from many databases and computer systems in a common messaging format. CDC has developed a base system (general XML schema) for states to use, or they can develop their own systems using grant money supplied by CDC.'] "Long before the Sept. 11, 2001, terrorist attacks, the Centers for Disease Control and Prevention recognized the need for a single, integrated disease reporting system -- for naturally occurring outbreaks or bioterrorism incidents. Bringing together health data nationwide required a common data transmission system that would allow the different entities to share data from a variety of databases and computer systems. That system required a common language, and CDC officials determined early on that XML provided the best avenue to achieve smooth data exchange. Using XML, CDC's integrated system -- known as the National Electronic Disease Surveillance System (NEDSS) -- will provide a way for states to enter and transmit disease data to the federal agency using industry-standard technologies. 'It is a major collaborative initiative involving public and private partners intended to streamline movement of public health data by taking advantage of advances in information technology,' said Daniel Pollock, a medical epidemiologist at CDC. Efforts thus far on NEDSS illustrate the critical groundwork required to adapt the malleable XML for a specific project... With NEDSS, the 50 states, seven municipalities and CDC needed not only to come to an agreement around a common set of data tags, but also to develop a way to allow each computer within the system to understand and interpret those tags. 
Enter the XML schema, a document that defines the set of tags and checks to be sure they were applied correctly in a process known as validation. 'The XML schema expresses the shared vocabulary that allows the machine to carry out rules made by people,' CSC's Kauflin said. 'So, we can use the schema to define the structure, content and semantics within an XML document.' What's more, the project required a standard format to help define its data. In this case, project leaders chose the Health Level 7 (HL7) standard, which has been used in the medical industry for many years to allow systems within the same medical organization to communicate with one another. Basically, HL7 defines the semantics of the message being communicated, said Mead Walker, an independent consultant working with CDC on the implementation of HL7 for NEDSS. In this case, the semantics provide a way to express various types of health care-related data. XML provides the syntax for the message, or the format for how the information will be organized. In the upcoming release of HL7 Version 3, project leaders are moving to expand the format to allow the transfer of information, not only among different systems within the same institution, but across systems in a variety of locations, which is a NEDSS requirement... Although NEDSS is still in development, its core building blocks are complete, and CDC and state health departments continue to work together to refine the process... To date, $37.5 million in federal money has been funded/obligated for state grants to support the National Electronic Disease Surveillance System (NEDSS) initiative. Approximately $6 million in additional federal funding is scheduled to be awarded to states in fiscal 2003 for a total of $43.5 million from fiscal 2000 to 2003..." Related: (1) Health Level Seven XML Patient Record Architecture; (2) Clinical Data Interchange Standards Consortium; (3) Electronic Common Technical Document (eCTD) for Pharmaceuticals.
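The validation role the schema plays in NEDSS can be illustrated in miniature. The toy check below verifies that a disease-report message contains a required set of elements; the element names are invented for illustration, the real NEDSS/HL7 schemas are far richer, and a production system would use a full XML Schema validator rather than a hand-rolled check.

```python
import xml.etree.ElementTree as ET

# Invented required elements for a hypothetical case report.
REQUIRED = {"patientId", "condition", "reportDate"}

def validate(report_xml: str) -> set:
    """Return the set of required elements missing from the report."""
    root = ET.fromstring(report_xml)
    present = {child.tag for child in root}
    return REQUIRED - present

report = """<caseReport>
  <patientId>12345</patientId>
  <condition>West Nile virus</condition>
</caseReport>"""

print(validate(report))  # this report lacks a reportDate element
```

A real schema also constrains datatypes, ordering, and code values -- the "structure, content and semantics" Kauflin describes -- so that every state's system can machine-check a message before transmitting it to CDC.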

  • [June 02, 2003] "Federated Identity Management Addresses E-Business Challenges. Industry Commentary." By John Worrall (RSA Security) and Jason Rouault (Hewlett-Packard); RSA and HP are founding members of the Liberty Alliance Project. In Web Services Journal Volume 3, Issue 6 (June 2003), page 58. "A single organization cannot effectively manage or control an e-business initiative from beginning to end, especially when multiple partners are involved. Even within the enterprise, different business units often manage distinct sets of users and resources. That's why organizations are turning to federated identity management to address their e-business challenges. In a federated environment, a user logs on through his identity provider and then leverages that authentication to easily access resources in external domains. Federated identity standards form an abstraction layer over local identity and security environments of diverse domains. This abstraction layer provides for interoperability between disparate security systems inside and across domains, enabling true federation. Each domain maps to the agreed-upon policies without divulging sensitive user information. This trust is the foundation of any federated environment, and the organizations that work together within a domain are a circle of trust. A circle of trust connotes that both a business relationship and technical infrastructure are in place to assure secure access. The Liberty Alliance is developing and delivering the first open architecture and specifications to enable federated identity management. At its core is the Identity Federation Framework (ID-FF), which facilitates identity federation and management through features such as identity/account linkage, single sign-on, and session management. ID-FF is fundamental to underpinning accountability in business relationships and Web services; providing customization to user experience; protecting privacy; and allowing adherence to regulatory controls. 
The Liberty Alliance is also specifying an Identity Web Services Framework (ID-WSF) that will utilize the ID-FF. This framework introduces a Web services-based identity service infrastructure that enables users to manage the sharing of their personal information across identity and service providers as well as the use of personalized services. For example, a user may authorize a service provider to access their shipping address while processing a transaction. Built on top of the ID-WSF is a collection of interoperable identity services, the Identity Services Interface Specifications (ID-SIS). The ID-SIS might include services such as registration, contact book, calendar, geo-location, presence, or alerts. Through Liberty protocols and a standard set of attribute fields and expected values, organizations will have a common language to speak to each other and offer interoperable services. The services defined in the ID-SIS are designed to be built on top of Web services standards, meaning they are accessible via SOAP over HTTP calls, defined by WSDL descriptions, and use agreed-upon schemas... The Liberty Alliance unites more than 160 firms representing more than 1 billion consumers. Organizations like this will continue to strive to achieve digital identity standards that will facilitate e-business processes around the globe..." General references in "Liberty Alliance Specifications for Federated Network Identification and Authorization." [alt URL]

  • [June 02, 2003] "XML Variant Consolidates Business Reporting." By John S. McCright. In eWEEK (June 02, 2003). "The XBRL data format could help companies more quickly create reports on business conditions and thus comply with the Sarbanes-Oxley Act reporting requirements, say proponents of the format. Extensible Business Reporting Language provides a means for IT departments to aggregate and consolidate financial information that is accurate, timely and reliable and to present that information to the accountable executives. It uses XML metadata tags to describe financial information. Applications that include XBRL tags, specifications and taxonomies are being used in financial statements, general ledger transactions, regulatory filings and other business reports. An XBRL taxonomy, which is akin to an XML schema, describes a standard way to report business information... XBRL eliminates what Willis calls the West Palm Beach effect -- hanging chads and errors in accounting. New York-based XBRL International is scheduled this month to review the draft Version 2.1 of XBRL, which is focused on increased interoperability with non-XBRL-compliant systems. The specification has been adopted by 24 regulators -- the Federal Deposit Insurance Corp., in the United States; Inland Revenue, in the United Kingdom; and Deutsche Bank AG, in Germany, for example -- as their mandatory electronic filing formats. XBRL is making headway in packaged applications as well. Microsoft Corp. plans to release an XBRL plug-in for the Excel 2003 spreadsheet... Other experts said top management does not always understand the difficulties IT faces when called on to aggregate all the financial data to present a unified version of what's going on in a company. Sarbanes-Oxley provides an instance where IT can bend the executives' ears..." See general references in "Extensible Business Reporting Language (XBRL)."
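The mechanism the article describes -- XML metadata tags describing financial information -- amounts to tagging each reported figure with the taxonomy concept it instantiates plus context and unit metadata. The sketch below reads one such fact from a simplified XBRL-style instance; the taxonomy namespace, concept name, and values are placeholders, not the official XBRL 2.1 vocabulary.

```python
import xml.etree.ElementTree as ET

# A simplified XBRL-style instance: one tagged revenue fact, tied to a
# reporting period (contextRef) and a currency (unitRef).
INSTANCE = """<xbrl xmlns="http://www.xbrl.org/2003/instance"
                    xmlns:co="http://example.com/taxonomy">
  <co:Revenues contextRef="FY2002" unitRef="USD" decimals="0">5000000</co:Revenues>
</xbrl>"""

root = ET.fromstring(INSTANCE)
fact = root.find("{http://example.com/taxonomy}Revenues")
print(fact.get("contextRef"), fact.text)
```

Because every fact carries its concept, context, and unit explicitly, a regulator or aggregator can consolidate filings from many companies without the manual re-keying that produces what Willis calls the West Palm Beach effect.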

Earlier XML Articles


Hosted By
OASIS - Organization for the Advancement of Structured Information Standards

Sponsored By

IBM Corporation
ISIS Papyrus
Microsoft Corporation
Oracle Corporation

Primeton

XML Daily Newslink
Receive daily news updates from Managing Editor, Robin Cover.


Document URI: http://xml.coverpages.org/xmlPapers200306.html  —  Legal stuff
Robin Cover, Editor: robin@oasis-open.org