This issue of XML Daily Newslink is sponsored by:
BEA Systems, Inc. http://www.bea.com
- Anti-SPIT: A Document Format for Expressing Anti-SPIT Authorization Policies
- W3C Launches Security Specifications Working Group
- Where is XML Going?
- Planning to Upgrade XSLT 1.0 to 2.0, Part 5: Make Your Stylesheets Work With Any Processor Version
- Web Services Policy 1.5: W3C Call for Implementations
- Microsoft Closer on 'Office Open' Blessing
- Assisting Novice Analysts in Developing Quality Conceptual Models with UML
- U.K. Launches Open-Source Policy Group
- RIAA Opposes New Fair Use Bill
Anti-SPIT: A Document Format for Expressing Anti-SPIT Authorization Policies
Hannes Tschofenig, Dan Wing (et al.), IETF Internet Draft
The problem of SPAM for Internet Telephony (SPIT) is an imminent challenge, and only the combination of several techniques can provide a framework for dealing with unwanted communication. The responsibility for filtering or blocking calls can belong to different elements in the call flow and may depend on various factors. This document defines an authorization-based policy language that allows end users to upload anti-SPIT policies to intermediaries, such as SIP proxies. These policies mitigate unwanted SIP communications. The language extends the Common Policy authorization framework with additional conditions and actions. The new conditions match a particular Session Initiation Protocol (SIP) communication pattern based on a number of attributes. The range of attributes includes information provided, for example, by SIP itself, by the SIP identity mechanism, and by information carried within SAML assertions. A SPIT authorization document is an XML document, formatted according to the schema defined in RFC 4745 (Common Policy: A Document Format for Expressing Privacy Preferences). SPIT authorization documents inherit the MIME type of common policy documents, 'application/auth-policy+xml'. Such a document is composed of rules which contain three parts: conditions, actions, and transformations. Each action or transformation, also called a permission, has the property of being a positive grant to the authorization server to perform the resulting action, be it allow, block, etc. As a result, there is a well-defined mechanism for combining actions and transformations obtained from several sources. This mechanism can therefore be used to filter connection attempts, leading to effective SPIT prevention. Policies are XML documents that are stored at a Proxy Server or a dedicated device. The Rule Maker therefore needs a protocol to create, modify, and delete the authorization policies defined in this document.
Such a protocol is available with the Extensible Markup Language (XML) Configuration Access Protocol (XCAP). Referenced I-Ds: (1) SPAM for Internet Telephony (SPIT) Prevention using the Security Assertion Markup Language (SAML); (2) Conveying CPC Using the SAML.
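As a concrete illustration, a rule in such a document might look like the following sketch. The ruleset, rule, conditions, identity, and actions elements come from RFC 4745; the spit:handle-call action and its namespace URN are hypothetical placeholders standing in for the draft's extensions, not element names taken from the draft itself.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Common Policy (RFC 4745) ruleset; the spit:* element and its
     namespace are illustrative placeholders, not the draft's names. -->
<cp:ruleset xmlns:cp="urn:ietf:params:xml:ns:common-policy"
            xmlns:spit="urn:example:anti-spit">
  <cp:rule id="f3g44r1">
    <cp:conditions>
      <!-- match a call attempt from one specific SIP identity -->
      <cp:identity>
        <cp:one id="sip:spammer@example.com"/>
      </cp:identity>
    </cp:conditions>
    <cp:actions>
      <!-- hypothetical anti-SPIT action: block the call attempt -->
      <spit:handle-call>block</spit:handle-call>
    </cp:actions>
    <cp:transformations/>
  </cp:rule>
</cp:ruleset>
```

A Rule Maker would create or modify such a document on the SIP proxy via XCAP, as described above.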
W3C Launches Security Specifications Working Group
Staff, W3C Announcement
W3C has announced the creation of the XML Security Specifications Maintenance Working Group. Frederick Hirsch (Nokia) chairs the group, which is chartered to perform maintenance work on Recommendations from the XML Signature and XML Encryption families of security specifications. Past W3C security work was focused on foundation technologies: The joint IETF-W3C XML Signature Working Group specified mechanisms to digitally sign XML documents and other data, and to encapsulate digital signatures in XML. The W3C XML Encryption Working Group specified mechanisms to encrypt XML documents and other data, and to encapsulate the encrypted material and related meta-information in XML. Canonical XML 1.0 is required for implementations of the XML Signature Syntax and Processing Recommendation. The XML Core Working Group is chartered to revise Canonical XML, and is publishing the result as Canonical XML 1.1. This version of Canonical XML will address incompatibilities between that specification and the xml:id and xml:base Recommendations, and possible future attributes in the 'xml' namespace. It is also known that the Decryption Transform for XML Signature Recommendation includes processing of xml namespace attributes that is analogous to that in Canonical XML 1.0, and leads to similar issues. This Working Group is chartered to update the XML Signature Syntax and Processing Recommendation and the Decryption Transform for XML Signature Recommendation to be compatible with the evolving XML environment. The update will also take known errata into account. These include the XML Signature Errata and the Decryption Transform Errata. The Working Group is also chartered to collect and study additional issues with the XML Encryption and XML Signature suite of specifications, and to propose a draft charter for work to address these issues. 
It is anticipated that the Working Group will collaborate with: (1) Internet Engineering Task Force (IETF Public-Key Infrastructure Working Group—PKIX); (2) OASIS (Security Services TC and Web Services Security TC); (3) Liberty Alliance.
See also: the W3C Security Home
Where is XML Going?
Kurt Cagle, O'Reilly Opinion
What I see in the job listings is the rise of core XML technologies: XSLT being the biggest by far, but even XQuery and XForms are beginning to show up. There's a rising demand for XSLT developers right now, and I suspect that as people begin to incorporate XSLT 2.0 this will increase still further (and it should: XSLT 2.0 is just better). XHTML is becoming 'standard', at least at the corporate level, and this in turn is feeding the rise of the manipulation tools that XHTML opens up. I've even seen a rise in demand for ontologists and RDF specialists, as people begin to harness the ability to create metadata structures. It should be interesting to see what happens as RDFa begins to be more regularly incorporated into XHTML generation tools... Microsoft has greenlighted a version of XSLT 2.0 to be deployed with the next version of Visual Studio. My suspicion here is that if Red Hat hasn't also raised the possibility of pushing libXSLT to the 2.0 level, then it should do so: XSLT 2.0 leaves XSLT 1.0 in the dust in terms of both usability and performance, and I suspect that we'll see more commercial-level XSLT 2.0 transformations appear over the course of the next year. The complexity, verbosity, and need to resort to recursive programming have significantly limited XSLT 1.0 adoption over the years. By largely eliminating the need for unnecessary recursion and cutting down the amount of code by a factor of three or more for most transformations, XSLT 2.0 saves development cycles and more readily integrates with third-party extensions...
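The point about recursion is worth a sketch. A common chore, splitting a delimited string, needs a recursive named template in XSLT 1.0 but collapses to a single instruction with XSLT 2.0's tokenize() function (the template and variable names here are illustrative):

```xml
<!-- XSLT 1.0: splitting a comma-separated list requires recursion -->
<xsl:template name="split">
  <xsl:param name="s"/>
  <xsl:if test="string-length($s)">
    <item><xsl:value-of select="substring-before(concat($s, ','), ',')"/></item>
    <xsl:call-template name="split">
      <xsl:with-param name="s" select="substring-after($s, ',')"/>
    </xsl:call-template>
  </xsl:if>
</xsl:template>

<!-- XSLT 2.0: the same result in one instruction -->
<xsl:for-each select="tokenize($s, ',')">
  <item><xsl:value-of select="."/></item>
</xsl:for-each>
```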
To me, the interesting things being done in XML are increasingly occurring in the application and vertical markets: HL7 in the health care field, GML in the mapping and geographical location space, XBRL or UBL in the business space and so forth represent sophisticated ontologies rather than formal schemas, the understanding being that getting complete agreement on the structure of any field of endeavor may be doomed to failure between differing participants, but you can (for the most part) agree upon the general terminology and language for those core concepts. This approach actually makes a great deal of sense -- a dictionary is a self-descriptive ontology...
Planning to Upgrade XSLT 1.0 to 2.0, Part 5: Make Your Stylesheets Work With Any Processor Version
David Marston and Joanne Tong, IBM developerWorks
XSLT 2.0, the latest specification released by the W3C, is a language for transforming XML documents. It includes numerous new features, with some specifically designed to address shortcomings in XSLT 1.0. This article provides examples of stylesheets that are portable between versions 1.0 and 2.0, with special guidance for those who must run both 1.0 and 2.0 processors for a long transition period. The new 2.0 features may occur in the form of instruction elements, declaration elements, XPath operators, functions, or new attributes or children on elements that existed in 1.0. For each form of enhancement, only certain techniques apply. The article spells out the consequences of retaining a 1.0 processor in production use while you begin to use new 2.0 features in your stylesheets. Your users who are served through an XSLT 2.0 processor should gain better performance or appearance, though they won't realize that the stylesheets are becoming more readable and maintainable. Once all the 1.0 processors are replaced by 2.0 processors, you can rip out the old code and the guarding mechanisms that caused the 1.0 processors to select it. Ripping out old code is especially easy if it was already separated by imported modules, as shown in the example. In this series of articles, you'll get a high-level overview and an in-depth look at XSLT 2.0 from the point of view of an XSLT 1.0 user who wants to fix old problems, learn new techniques, and discover what to look out for. Examples derived from common applications and practical suggestions are provided if you wish to upgrade.
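The guarding idea can be sketched with the standard system-property('xsl:version') check: the stylesheet declares version="2.0", a 1.0 processor runs it in forwards-compatible mode, and a runtime test selects the branch that processor can actually evaluate. The price/avg() example below is illustrative, not taken from the article:

```xml
<!-- One stylesheet serving both processor versions. A 1.0 processor
     sees version="2.0" and enters forwards-compatible mode; the
     runtime guard keeps it out of the branch that uses 2.0 functions. -->
<xsl:stylesheet version="2.0"
                xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:template match="price">
    <xsl:choose>
      <!-- taken only when a 2.0 processor is running the stylesheet -->
      <xsl:when test="number(system-property('xsl:version')) >= 2.0">
        <xsl:value-of select="format-number(avg(../price), '0.00')"/>
      </xsl:when>
      <xsl:otherwise>
        <!-- 1.0 equivalent of avg() -->
        <xsl:value-of
          select="format-number(sum(../price) div count(../price), '0.00')"/>
      </xsl:otherwise>
    </xsl:choose>
  </xsl:template>
</xsl:stylesheet>
```

When the 1.0 branch lives in a separate imported module, as the article suggests, retiring it later is a matter of deleting the module and the guard.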
Web Services Policy 1.5: W3C Call for Implementations
A. Vedamuthu, D. Orchard, F. Hirsch (et al., eds), W3C TRs
W3C has announced the advancement of Web Services Policy 1.5 to the status of Candidate Recommendation, indicating that the document is believed to be stable, and to encourage implementation by the developer community. The W3C Web Services Policy Working Group expects to request that the Director advance this document to Proposed Recommendation once the Working Group has demonstrated four or more interoperable implementations, with the exception of ignorable policy assertions, which shall require two or more implementations. The Working Group does not plan to request to advance to Proposed Recommendation prior to 30-June-2007. The Policy Framework defines a model for expressing the nature of Web services in order to convey conditions for their interaction. A policy is a collection of policy alternatives, where a policy alternative is a collection of policy assertions. A policy assertion represents a requirement, capability, or other property of a behavior. A policy expression is an XML Infoset representation of its policy, either in a normal form or in its equivalent compact form. Some policy assertions specify traditional requirements and capabilities that will manifest themselves in the messages exchanged (e.g., authentication scheme, transport protocol selection). The "Attachment" specification defines how to associate policies, for example within WSDL or UDDI, with subjects to which they apply.
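In the normal form, a policy expression nests its alternatives inside wsp:ExactlyOne and each alternative's assertions inside wsp:All. A minimal sketch (the sp:TransportBinding assertion is an illustrative WS-SecurityPolicy assertion, not something defined by WS-Policy itself):

```xml
<!-- Normal form: Policy / ExactlyOne (alternatives) / All (assertions).
     The sp:* assertion is an illustrative example from WS-SecurityPolicy. -->
<wsp:Policy xmlns:wsp="http://www.w3.org/ns/ws-policy"
            xmlns:sp="http://docs.oasis-open.org/ws-sx/ws-securitypolicy/200702">
  <wsp:ExactlyOne>
    <!-- alternative 1: require transport-level security -->
    <wsp:All>
      <sp:TransportBinding/>
    </wsp:All>
    <!-- alternative 2: no assertions, i.e. no requirements -->
    <wsp:All/>
  </wsp:ExactlyOne>
</wsp:Policy>
```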
See also: W3C Web Services Activity
Microsoft Closer on 'Office Open' Blessing
Michael Hickins, InternetNews.com
Microsoft is one step closer to having its Office Open XML (OOXML) format adopted as an international standard in short order. Ecma, the standards body pushing OOXML on behalf of Microsoft, confirmed that it made today's deadline for responding to comments on its initial application for fast-track approval to the International Organization for Standardization (ISO). The matter now rests with the secretariat of the ISO joint technical committee (JTC1) reviewing Microsoft's application, said Jan van den Beld, secretary general of Ecma. The best-case scenario for Microsoft is that the secretariat allows a full vote on the standard under the organization's fast-track process, which will take five months to complete. Van den Beld noted that votes on standards generally take up to thirty-six (36) months. Opponents to OOXML, which include IBM and the Open Document Foundation, have argued that Microsoft's specifications are unwieldy and that the standard application is redundant with the Open Document Format (ODF), which already exists. Microsoft has countered that the OOXML format is valuable because it is closer to Office 2007 and is backwards-compatible with older versions of Office. "Although both ODF and Open XML are document formats, they are designed to address different needs in the marketplace," the company wrote in an open letter published earlier this month.
Assisting Novice Analysts in Developing Quality Conceptual Models with UML
Narasimha Bolloju and Felix S.K. Leung, ACM Queue (CACM Reprint)
Knowing the kinds of modeling errors they are most likely to produce helps prepare novice analysts for developing quality conceptual models. During the analysis phase of information systems development, systems analysts capture and represent systems requirements using conceptual models (such as entity-relationship diagrams, class diagrams, and use case diagrams). Considering the fact that the reported failures of a significant percentage of developed systems are linked to faulty requirements, it is extremely important for these analysts and critical to the system's ultimate success to ensure the quality of the conceptual models they develop in the early phases of systems development. However, developing good-quality conceptual models is a challenge for many analysts... Though UML is widely used, the UML diagrams are not highly rated by analysts in terms of usability. Here, we present the results of an empirical study we conducted aimed at identifying the most typical set of errors frequently committed by novice systems analysts in four commonly used UML artifacts—use case diagrams, use case descriptions, class diagrams, and sequence diagrams—and discuss how they affect the quality of artifacts developed. Ensuring that artifacts are free of such errors helps novice analysts develop better-quality UML artifacts. Our findings are relevant to instructors of systems analysis courses, software quality-assurance teams, CASE tool developers, and researchers in the field of conceptual modeling, as well as to the analysts themselves.
See also: Conceptual Modeling and Markup Languages
U.K. Launches Open-Source Policy Group
Jeremy Kirk, InfoWorld
A new U.K. think tank will analyze how open-source software can be used in government and the private sector. The National Open Center will be the U.K.'s first organization dedicated to studying open-source software issues. The National Open Center, based in Birmingham, England, will be composed of working groups that will study open-source issues, such as the use of standards and procurement guidelines, according to Ed Downs, of the National Computing Center, a professional IT membership organization. Unlike other European countries, the U.K. lacked an organization dedicated to studying open-source software, he said. Many of the innovations on the Internet would not have been possible without open-source software like the Apache Web server, said Scott Thompson, one of the founders of the National Open Center; companies like Google and Yahoo use open source to drive down their software costs. According to the Web site description: "The National Open Centre will have a schedule of events organised around the key issues that they are addressing, as determined by the Advisory Board. This will involve a call for participation including subject panels, an exchange of ideas, one or more seminars and ultimately the dissemination of the resulting papers. The overall work plan will be guided by the Advisory Board which will identify key OS&S related issues for the UK which can benefit from clarification and the development of strategic thinking. This board will seek to represent a broad range of expertise and interest including, but not limited to, the OS community, vendors and users, those with interests in software, standards and society, and standards, public, private and voluntary sectors, commercial, new media and embedded applications."
See also: the National Open Centre
RIAA Opposes New Fair Use Bill
Grant Gross, InfoWorld
A new bill in the U.S. Congress aimed at protecting the fair use rights for consumers of copyright material would "legalize hacking," the Recording Industry Association of America (RIAA) said. The Freedom and Innovation Revitalizing U.S. Entrepreneurship (FAIR USE) Act, introduced Tuesday by U.S. Representatives Rick Boucher, a Virginia Democrat, and John Doolittle, a California Republican, would allow customers to circumvent digital copy restrictions in six limited areas when copyright owners' business models are not threatened, Boucher said in a press release. So-called fair use doctrine allows customers of copyright works to make limited numbers of copies, particularly for reviews, news reporting, teaching and research. The bill would allow exemptions to the anticircumvention restrictions in the Digital Millennium Copyright Act (DMCA), passed by Congress in 1998. The bill is revamped from similar bills introduced in the last two sessions of Congress. The Boucher bill would limit the availability of statutory damages against individuals and firms who may be found to have engaged in contributory infringement, inducement of infringement, or other indirect infringement. The bill would allow libraries to circumvent digital locks to secure copies of works that have been damaged, lost or stolen. The Consumer Electronics Association applauded the bill, saying it would give protections to consumers, educators, and libraries. Without fair use protections, consumers couldn't use devices such as VCRs and digital TV recorders, the trade group said. Boucher: "The fair use doctrine is threatened today as never before; historically, the nation's copyright laws have reflected a carefully calibrated balance between the rights of copyright owners and the rights of the users of copyrighted material. The Digital Millennium Copyright Act dramatically tilted the copyright balance toward complete copyright protection at the expense of the public's right to fair use."
See also: XML and DRM
XML Daily Newslink and Cover Pages are sponsored by:
BEA Systems, Inc.       http://www.bea.com
Sun Microsystems, Inc.  http://sun.com
XML Daily Newslink: http://xml.coverpages.org/newsletter.html
Newsletter Archive: http://xml.coverpages.org/newsletterArchive.html
Newsletter subscribe: email@example.com
Newsletter unsubscribe: firstname.lastname@example.org
Newsletter help: email@example.com
Cover Pages: http://xml.coverpages.org/