This issue of XML Daily Newslink is sponsored by:
Sun Microsystems, Inc. http://sun.com
Microsoft Submits Two Licenses for OSI Approval
Peter Galli, eWEEK
Microsoft has submitted two of its Shared Source licenses to the Open Source Initiative for review and approval as open-source licenses: the Microsoft Permissive and Microsoft Community licenses. The move comes more than two weeks after Bill Hilf, Microsoft's general manager of platform strategy, used his keynote address at the annual O'Reilly Open Source Conference in Portland, Ore., to announce its licensing plans. Asked in a recent interview if Microsoft had taken a comprehensive look at all of the OSI-approved licenses already in existence to determine if any of them could meet its needs, rather than creating these new ones, Hilf said the company's license attorneys had taken a deep look at all of those, including the Mozilla Public License and the Apache License, "but there were a couple of things in each that made them challenging for us to work with." About six months ago, Hilf and his team decided the time was right for a number of reasons, including that there was a body of work already represented by the licenses, and they started looking at what was needed to submit the licenses to the OSI and what this would mean. Microsoft then contacted the OSI board to let them know it was considering submitting its licenses to them, and was assured that it would be treated objectively and fairly and that the license would be evaluated on its own value and merits, exactly the same as any other license submission. But the issue of license proliferation has been a controversial one in the community, and companies like Sun Microsystems have been criticized for creating new "vanity" licenses rather than using existing OSI-approved ones. 
With regard to criticisms that these licenses, if approved by the OSI, would be yet more "vanity" licenses, Hilf said that was not the case: nearly 600 software projects are under its Shared Source licenses, and only 170 of those are Microsoft projects, including the Dynamic Language Runtime, the AIDS Vaccine Research tool from Microsoft Research, and a video game plug-in. See: (1) Microsoft-Community-License, and (2) Microsoft-Permissive-License.
See also: The Open Source Definition
W3C Working Drafts: Web Services Policy Primer and Guidelines for Authors
A. Vedamuthu, D. Orchard, F. Hirsch (eds), W3C Technical Reports
W3C's Web Services Policy Working Group has released two updated Working Drafts. The Primer introduces the policy language and policy attachment mechanisms. It provides an introductory description of the Web Services Policy language and should be read alongside the formal descriptions contained in the WS-Policy and WS-PolicyAttachment specifications. This document is written: (1) for policy expression authors who need to understand the syntax of the language and understand how to build consistent policy expressions, (2) for policy implementers whose software modules read and write policy expressions, and (3) for policy assertion authors who need to know the features of the language and understand the requirements for describing policy assertions. The "Guidelines for Policy Assertion Authors" is written to provide best practices and patterns to follow, as well as to illustrate the care needed in using WS-Policy to achieve the best possible results for interoperability. It is a complementary guide to using the specifications. The WS-Policy specification defines a policy to be a collection of policy alternatives. Each policy alternative is a collection of policy assertions. The Web Services Policy 1.5 - Framework provides a flexible framework to represent consistent combinations of behaviors from a variety of domains. A policy assertion is a machine-readable metadata expression that identifies behaviors required for Web services interactions. WS-Policy assertions communicate the requirements and capabilities of a Web service by adhering to the WS-Policy Framework specification. To enable interoperability of Web services, different communities need to define different sets of WS-Policy assertions based upon the domain-specific requirements of the Web service.
See also: Guidelines for Policy Assertion Authors
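The structure described in the Primer — a policy as a collection of alternatives, each alternative a collection of assertions — corresponds to WS-Policy's normal form (wsp:Policy containing wsp:ExactlyOne containing wsp:All). A minimal sketch of building such an expression with Python's standard library follows; the wsp:* element names and the WS-Policy 1.5 namespace are from the specification, but the domain assertion (UseTransportSecurity in an example namespace) is purely hypothetical:

```python
import xml.etree.ElementTree as ET

WSP = "http://www.w3.org/ns/ws-policy"  # WS-Policy 1.5 namespace
ET.register_namespace("wsp", WSP)

# A policy is a collection of alternatives (wsp:ExactlyOne);
# each alternative (wsp:All) is a collection of assertions.
policy = ET.Element(f"{{{WSP}}}Policy")
exactly_one = ET.SubElement(policy, f"{{{WSP}}}ExactlyOne")
alternative = ET.SubElement(exactly_one, f"{{{WSP}}}All")

# Hypothetical domain-specific assertion, for illustration only
ET.SubElement(alternative, "{urn:example:assertions}UseTransportSecurity")

print(ET.tostring(policy, encoding="unicode"))
```

A requester that supports everything in at least one wsp:All alternative is compatible with the policy; this is why assertion authors, not the framework, carry the burden of defining domain semantics.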
Grid Computing: Migrating from WSRF to WSRT
David A. Cafaro, IBM developerWorks
With the growth of networking and the idea of distributed computing load, the concept and implementation of grid computing has matured. Though the definition of "grid computing" can mean different things to different groups, it generally refers to the interoperability and communication of heterogeneous resources over a network in a structured or managed system. One way to implement grid infrastructure is through the use of Web-based services. WS-Resource Framework (WSRF) is one specification toward the implementation of Web-based grid services. Coupled with the WS-Notification (WSN) specifications, these provided an excellent starting point for Web services-based grid computing. At around the same time, competing standards, such as WS-Transfer, were also being developed and deployed. Because these competing standards did not interoperate, efforts to network separately developed grid resources were hampered. In March 2005, the developers of WS-Transfer and WSRF announced a plan for a new specification: WS-ResourceTransfer (WS-RT). WS-RT was an expansion and enhancement of WS-Transfer that added the functionality WSRF provided that the original WS-Transfer did not. Specifically, WS-RT added functionality from WS-ResourceProperties and WS-ResourceLifetime—two components of WSRF that had certain detailed control mechanisms lacking in WS-Transfer. The article looks at two possible strategies for dealing with migration from WSRF to WS-RT. These two strategies, XSLT transforms and delegation, provide a method to deal with most possible migration situations. Though neither will solve every situation, by studying your individual needs and future plans, one of these—or a combination of them—should set you on the path to a smooth transition to WS-RT-based Web services.
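The delegation strategy the article mentions can be pictured as a thin facade: the legacy WSRF-style interface is kept for existing clients, and each call is forwarded to a WS-RT-style backend. The sketch below illustrates the shape of that pattern; all class and method names are illustrative stand-ins, not operations defined by either specification:

```python
# Hypothetical sketch of the delegation migration strategy.
# A WSRF-style facade (GetResourceProperty per property) delegates
# to a WS-RT-style backend (a single Get with a selection expression).

class WsrtResource:
    """Stand-in for a WS-RT service: one Get operation with an
    optional expression selecting part of the resource state."""
    def __init__(self, state):
        self._state = dict(state)

    def get(self, expression=None):
        if expression is None:
            return dict(self._state)            # whole representation
        return {expression: self._state[expression]}


class WsrfDelegate:
    """Legacy WSRF-style facade preserved for old clients; each
    per-property request is delegated to the WS-RT backend."""
    def __init__(self, backend):
        self._backend = backend

    def get_resource_property(self, qname):
        return self._backend.get(expression=qname)


sensor = WsrtResource({"Temperature": "21.5", "Status": "OK"})
legacy = WsrfDelegate(sensor)
print(legacy.get_resource_property("Temperature"))  # {'Temperature': '21.5'}
```

The alternative strategy, XSLT transforms, would instead rewrite the WSRF message on the wire into the equivalent WS-RT message; delegation trades that message-level translation for a code-level adapter.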
Office Formats Fail to Communicate
Tiffany Maleshefski, eWEEK
The first question a company asks when presented with a Microsoft Office alternative is how well the software supports Microsoft's de facto standard file formats. Based on eWEEK Labs' testing experience with productivity applications based on the Open Document Format standard (the most prominent of which are Sun's StarOffice and its open-source sibling, OpenOffice), document fidelity consistently falls short of 100 percent, and that's not good enough for most companies and organizations. According to Gary Edwards, founding president of the OpenDocument Foundation, OpenOffice and other Open Document-based applications can do a better job interoperating, if only the vendors that steer the format would allow them to. Edwards' foundation, a nonprofit that funds individuals in pursuit of developing software standards, had been involved with the Massachusetts Open Document Format pilot study, and had participated in the OASIS Open Document Technical Committee from its early days. Sun, however, argues that it's better to rely on the experience and knowledge of ODF vendors to resolve inconsistencies between the Microsoft and Open Document Formats than to fall back on proprietary extensions. "The OpenDocument Foundation has a different understanding of how interoperability should be achieved," said Erwin Tenhumberg, community development and marketing manager for Sun Microsystems. Sun believes the most viable way to continue developing the Open Document Format, according to Tenhumberg, is to keep building from the ground up. Standardizing random proprietary extensions will make the process messy and complicate matters, since most of the applications wouldn't be able to use the information, said Tenhumberg, who co-chairs the OASIS Open Document Format Adoption Committee. The way Tenhumberg sees it, there are two dominating philosophical approaches to adding features to the Open Document Format. 
The first is to seek out supporting vendors, such as IBM, Sun, Novell, Adobe, Corel, among others, to implement the required features fostering interoperability among applications. The second, and the one Edwards subscribes to, is to allow the inclusion of proprietary extensions to implement the needed feature sets. This idea, Tenhumberg said, "might be faster to get through, but by doing so won't achieve interoperability because it's not based on something ODF-specific—but on proprietary extensions."
NIST Endorses Microsoft's Open XML in Upcoming Vote
Martin LaMonica, CNet News Blog
The National Institute of Standards and Technology (NIST) is backing Microsoft's effort to certify Office Open XML as an international standard. The U.S. standards body said on Friday [2007-08-10] that it has voted to conditionally approve Office Open XML (OOXML), pending resolution of some technical concerns, in an upcoming standards approval vote. NIST is part of the committee that will establish the United States' position in a September 3 vote at the International Organization for Standardization (ISO). Microsoft is seeking ISO standardization as a way to appeal to government customers concerned over long-term archiving of digital documents. OpenDocument, or ODF, is another standard document format already approved by ISO. Advocates of ODF argue that a single standard is preferable, while Microsoft executives argue that multiple standards provide more customer choice. "NIST believes that ODF and OOXML can co-exist as international standards," NIST director William Jeffrey said in a statement. "NIST fully supports technology-neutral solutions and will support the standard once our technical concerns are addressed." See the NIST announcement and Additional References.
See also: BetaNews
SCO Goes Down in Flames: Novell Owns Unix
Steven J. Vaughan-Nichols, Linux Watch
The day Linux fans have been waiting for since SCO attacked Linux on May 12, 2003 has finally arrived. U.S. District Court Judge Dale Kimball has ruled that Novell, not SCO, owns Unix's IP (intellectual property) rights. This, in turn, means the end of SCO's cases against IBM. In his 102-page decision, Kimball went on to rule that "SCO is obligated to recognize Novell's waiver of SCO's claims against IBM and Sequent." Thus, not only does Novell own Unix, SCO's cases against IBM have essentially been destroyed. Rubbing salt into SCO's wounds, Kimball also ruled that "The court ... is precluded from granting a constructive trust with respect to the payments SCO received under the 2003 Sun and Microsoft Agreements because there is a question of fact as to the appropriate amount of SVRX Royalties SCO owes to Novell based on the portion of SVRX products contained in each agreement." In short, SCO owes Novell at least some of the funds it received from its Microsoft and Sun Unix licensing deals, which it used to fuel its anti-Linux lawsuits. In the end, what did SCO in was not its ever more shaky IP claims against Linux, but the contract that gave SCO the right to sell and market Unix, but not its IP rights. Since February of 2004, Novell had been insisting that in the original APA (Asset Purchase Agreement) and Amendment No. 2 to the APA, it never sold Unix's IP to SCO.
Selected from the Cover Pages, by Robin Cover
An announcement from The Open Geospatial Consortium (OGC) describes the publication of the "OpenGIS Transducer Markup Language (TML) Implementation Specification" as an approved OpenGIS Publicly Available Standard. Sensor systems "have two basic parts: a sensing element and a transducer that converts energy from one form to another form (usually an electric signal) that can be interpreted. The OGC TML specification defines the conceptual model and XML Schema for describing transducers and supporting real-time streaming of data to and from sensor systems. TML thus defines (a) a set of models describing the response characteristics of a transducer, and (b) an efficient method for transporting sensor data and preparing it for use with other data through spatial and temporal associations." TML response models "are formalized XML descriptions of known hardware behaviors. The models can be used to reverse distorting effects and return artifact values to the phenomena realm. TML provides models for a transducer's latency and integration times, noise figure, spatial and temporal geometries, frequency response, steady-state response and impulse response. Traditional XML wraps each data element in a semantically meaningful tag. The rich semantic capability of XML is in general better suited to data exchange rather than live delivery where variable bandwidth is a factor. TML addresses the live scenario by using a terse XML envelope designed for efficient transport of live sensor data in groupings known as TML clusters. It also provides a mechanism for temporal correlation to other transducer data." The TML Implementation Specification has been produced as part of the OGC Sensor Web Enablement activity. In this effort, OGC members are "specifying interoperability interfaces and metadata encodings that enable real time integration of heterogeneous sensor webs into the information infrastructure. 
Developers will use these specifications in creating applications, platforms, and products involving Web-connected devices such as flood gauges, air pollution monitors, stress gauges on bridges, mobile heart monitors, Webcams, and robots as well as space and airborne earth imaging devices."
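The contrast the announcement draws — semantically tagged XML for exchange versus a terse envelope for live delivery — can be sketched briefly. The snippet below builds a compact cluster-style envelope in which a block of samples is packed into one element with time correlation carried in attributes; the element and attribute names here are illustrative and are not taken from the TML specification:

```python
import xml.etree.ElementTree as ET

# Illustrative sketch of TML's approach to live data: rather than
# wrapping every sample in its own semantic tag, a terse envelope
# carries a packed block of samples, with sensor identity and a
# timestamp for temporal correlation as attributes.
# (Names below are hypothetical, not from the OGC TML schema.)
def make_cluster(sensor_id, timestamp, samples):
    cluster = ET.Element("cluster", {"sensor": sensor_id, "t": timestamp})
    cluster.text = " ".join(str(s) for s in samples)  # packed payload
    return ET.tostring(cluster, encoding="unicode")

envelope = make_cluster("gauge-01", "2007-08-13T12:00:00Z", [3.2, 3.4, 3.1])
print(envelope)
```

A per-sample semantic encoding of the same data would repeat a tag for every value; the terse envelope keeps the stream small, which matters when bandwidth to a live sensor is variable.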
XML Daily Newslink and Cover Pages are sponsored by:
BEA Systems, Inc. http://www.bea.com
Sun Microsystems, Inc. http://sun.com
XML Daily Newslink: http://xml.coverpages.org/newsletter.html
Newsletter Archive: http://xml.coverpages.org/newsletterArchive.html
Newsletter subscribe: firstname.lastname@example.org
Newsletter unsubscribe: email@example.com
Newsletter help: firstname.lastname@example.org
Cover Pages: http://xml.coverpages.org/