This issue of XML Daily Newslink is sponsored by:
Sun Microsystems, Inc. http://sun.com
- A Method for Writing Testable Conformance Requirements
- Public Review: Content Management Interoperability Services (CMIS) v1.0
- Nation's Toughest Personal Info Law About to Take Effect
- W3C Report: Workshop on Access Control Application Scenarios
- A Step Toward a Secure Multi-Tenant Cloud
- Apache UIMA (Unstructured Information Management Architecture) v2.3.0
- New Initiative to Provide Security Standard for Smart Card Applications
- When Standards Bodies Are the Cyber Threat
A Method for Writing Testable Conformance Requirements
Dominique Hazaël-Massieux and Marcos Cáceres (eds), W3C Technical Report
Members of the W3C Mobile Web Initiative Test Suites Working Group have published a First Public Working Group Note of "A Method for Writing Testable Conformance Requirements." This publication results from the collaboration between the Mobile Web Initiative Test Suites Working Group and the Web Applications Working Group on the development of test suites for the Widgets family of specifications; the collaboration aimed to improve the written quality and testability of various specifications. The document presents a method for writing, marking up, and analyzing conformance requirements in technical specifications, and describes the method's applications and limitations, as well as possible directions for future work that could refine it. The editors argue that the method yields specifications whose conformance requirements are testable: that is, upon applying the method, parts of what is written in the specification can be converted into a test suite without requiring the use of a formal language...
When working on a specification, there are common mistakes an editor can make when writing conformance requirements that make them difficult, if not impossible, to test. For technical specifications, the testability of a conformance requirement is imperative: conformance requirements eventually become the test cases that implementations rely on to claim conformance to a specification. If no implementation can claim conformance, or if aspects of the specification are not testable, then the probability of a specification becoming a ratified standard, and, more importantly, achieving interoperability among implementations, is significantly reduced.
Because conformance requirements are intertwined with the text of a specification (as sentences, paragraphs, dot points, etc.), it can be difficult to detect the various common mistakes. For this reason, the first step in our method is to identify and mark up (using HTML) the various structural components that constitute a conformance requirement. Understanding these structural components is important, because it is that structure that determines the testability of a conformance requirement. We discuss the structure of conformance requirements... Once conformance requirements have been marked up into their component parts, they can be extracted and analyzed outside the context of the specification. Seeing a conformance requirement out of context can often expose inconsistencies and redundancies that may otherwise have been difficult for the editor, or an independent reviewer, to identify. The ability to extract conformance requirements from a specification also allows them to be used in other contexts, such as in the creation of a test suite...
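The extraction step described above can be sketched in a few lines. This is a minimal illustration, not the Working Group's tooling: it assumes requirements are marked up with a hypothetical class="conformance" (the Note defines its own, richer markup vocabulary) and pulls each one out of the surrounding prose for standalone review.

```python
# Sketch: extract HTML-marked-up conformance requirements from a
# specification so they can be reviewed out of context.
# Assumption: each requirement is a non-nested element carrying
# class="conformance" (an illustrative class name, not the Note's actual one).
from html.parser import HTMLParser

class RequirementExtractor(HTMLParser):
    """Collects the text content of every element marked class="conformance"."""
    def __init__(self):
        super().__init__()
        self.in_req = False
        self.requirements = []

    def handle_starttag(self, tag, attrs):
        classes = (dict(attrs).get("class") or "").split()
        if "conformance" in classes:
            self.in_req = True
            self.requirements.append("")

    def handle_endtag(self, tag):
        self.in_req = False  # assumes requirement elements are not nested

    def handle_data(self, data):
        if self.in_req:
            self.requirements[-1] += data

spec = """
<p class="conformance">The user agent MUST ignore unknown attributes.</p>
<p>Some narrative text that is not a requirement.</p>
<p class="conformance">A widget package MUST be a Zip archive.</p>
"""
extractor = RequirementExtractor()
extractor.feed(spec)
for req in extractor.requirements:
    print(req.strip())
```

Listing the extracted statements side by side is what exposes the redundancies and inconsistencies the Note describes.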
As the Web Applications Working Group learned, it can be problematic to enter the W3C's Candidate Recommendation phase without having a complete and thoroughly verified test suite: because this method was mostly applied during Candidate Recommendation, so many redundancies and issues were found that the specification had to drop back to Working Draft. This demonstrated that the method is effective, but that it needs to be applied as early as possible in the specification-writing process...
A standards organization [...] plays a significant role in relation to the method: the standards organization provides access to a community of experts, as well as the tools that facilitate the interaction and communication between actors and the deliverables that are the outputs of a working group. Deliverables include the specification, testable assertions, and test cases that constitute the test suite. Actors include editors, test creators, QA engineers, implementers, and specification reviewers. Actors, which in many cases will be the same person in multiple roles, literally provide the intelligence that improves the quality of deliverables..."
Public Review: Content Management Interoperability Services (CMIS) v1.0
Staff, OASIS Announcement
An approved 'Committee Draft 06' of the OASIS Content Management Interoperability Services (CMIS) Version 1.0 specification was released for public review through February 12, 2010. This CD-06 version incorporates changes made to the specification in light of comments on the 'Committee Draft 04' public review release, which ended December 22, 2009. Changes are diff-marked in red.
The Content Management Interoperability Services (CMIS) standard "defines a domain model and Web Services and RESTful AtomPub bindings that can be used by applications to work with one or more Content Management repositories/systems. The CMIS interface is designed to be layered on top of existing Content Management systems and their existing programmatic interfaces. It is not intended to prescribe how specific features should be implemented within those CM systems, nor to exhaustively expose all of the CM system's capabilities through the CMIS interfaces. Rather, it is intended to define a generic/universal set of capabilities provided by a CM system and a set of services for working with those capabilities."
CMIS has already received broad acceptance by commercial and open-source software developers, and has many other statements of support from the CMS community.
Nation's Toughest Personal Info Law About to Take Effect
William Jackson, Government Computer News
"Businesses that hold personally identifiable information on Massachusetts residents have one month to comply with what security experts are calling the toughest data security requirements in the nation. The Massachusetts Data Breach Law, passed in 2007, goes into effect March 1, 2010 and requires personal information in networked systems to be protected with strong encryption, firewalls, antivirus, and access controls.
The law was written in response to the theft of information on more than 45 million credit card accounts from TJX Companies in 2007. Hacker Albert Gonzalez pleaded guilty to the theft in August 2009. The law is designed to ensure the security and confidentiality of customer information, based on current industry standards, focusing on threats that can or should be anticipated. The regulations take into account the size of a business, the amount of resources available to it, the amount of personal data held and the sensitivity of the data. It covers paper and electronic records and requires physical and IT security...
Regulations also require encryption of personal data transmitted via public or wireless networks and stored on laptops or other portable devices. The law defines encryption as using at least 128-bit keys. Written security plans must cover physical and IT security, include a designated security manager, and cover everything from system monitoring to employee training..."
W3C Report: Workshop on Access Control Application Scenarios
Hal Lockhart and Rigo Wenning (eds), W3C Report
W3C has announced publication of a report and full minutes of the 'Workshop on Access Control Application Scenarios', held in Luxembourg in November 2009. Participants from seventeen (17) organizations examined the current limitations of access control, privacy enhancement, distributed handling of access control, and other challenging use cases. Extensible Access Control Markup Language (XACML) was a focus of the Workshop, though not the exclusive topic of conversation. The report summarizes the major "takeaways" from the Workshop, related to XACML semantics, "sticky" policies, and credentials-based access control. The OASIS XACML TC is expected to take up these topics. W3C's Policy Languages Interest Group (PLING) is expected to discuss data handling policies and the matching and triggering of events in the privacy context.
The workshop gathered many people with solutions for very specific issues. As already mentioned in the call for participation, privacy was one of them. Privacy has a special relation to access control, so privacy-friendly access control scenarios were presented. Mostly, they used XACML out of the box, but added the semantics needed. XACML creates interoperability by allowing unified access control over a heterogeneous IT landscape. But to expand to inter-enterprise interoperability, or to use it even more widely on the Internet, XACML needs semantics filling out its own framework that make access control conditions predictable and interoperable even where there was no prior agreement on the semantics of the access control conditions.
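The semantics gap can be made concrete with an abbreviated XACML 2.0 policy sketch (namespace declarations and several required elements omitted; the attribute names are illustrative and not taken from the report). The policy permits the "read" action for subjects whose role attribute equals "auditor", but XACML itself says nothing about what "urn:example:role" means, so a second organization cannot evaluate the policy as intended without prior agreement on that vocabulary:

```xml
<!-- Abbreviated, illustrative XACML 2.0 policy; not schema-complete. -->
<Policy PolicyId="urn:example:policy:read-for-auditors"
        RuleCombiningAlgId="urn:oasis:names:tc:xacml:1.0:rule-combining-algorithm:deny-overrides">
  <Target/>
  <Rule RuleId="permit-read" Effect="Permit">
    <Target>
      <Subjects>
        <Subject>
          <SubjectMatch MatchId="urn:oasis:names:tc:xacml:1.0:function:string-equal">
            <AttributeValue DataType="http://www.w3.org/2001/XMLSchema#string">auditor</AttributeValue>
            <!-- "urn:example:role" is a locally defined attribute: its meaning
                 is exactly the kind of semantics XACML leaves unspecified. -->
            <SubjectAttributeDesignator AttributeId="urn:example:role"
                DataType="http://www.w3.org/2001/XMLSchema#string"/>
          </SubjectMatch>
        </Subject>
      </Subjects>
      <Actions>
        <Action>
          <ActionMatch MatchId="urn:oasis:names:tc:xacml:1.0:function:string-equal">
            <AttributeValue DataType="http://www.w3.org/2001/XMLSchema#string">read</AttributeValue>
            <ActionAttributeDesignator
                AttributeId="urn:oasis:names:tc:xacml:1.0:action:action-id"
                DataType="http://www.w3.org/2001/XMLSchema#string"/>
          </ActionMatch>
        </Action>
      </Actions>
    </Target>
  </Rule>
</Policy>
```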
An important new application area called data handling was represented in a number of workshop papers. Data handling refers to the distribution and storage of information relating to individuals. The primary requirement here is privacy protection. In order for privacy to be protected at all times, the privacy policies must travel with the data, and every party that receives and distributes the data must enforce them. This capability is referred to as sticky policies.
Credential-based access control would allow for a more privacy-friendly access control system that would also be more widely usable on the Web. The aim is to prove only the selected attributes needed for the task at hand. There is already a large body of literature on capabilities, but XACML currently has no ability to identify the type of credential used, nor to specify which credential is needed to get access to a certain resource. This is more or less a special case of the attributes topic, with additional protocol issues. One way to convey the credential would be to use SAML, but SAML only allows XML Signature as a proof token..."
A Step Toward a Secure Multi-Tenant Cloud
Ted Ritter, Network World
This week Cisco, NetApp, and VMware announced an integration model for a multi-tenant virtual infrastructure that stresses isolation at the virtual, CPU, network and storage levels... Concern over isolation failure is a major cloud security stumbling block. After reading through the 82-page document Designing Secure Multi-Tenancy into Virtualized Data Centers, I see this as a great step in the right direction... From the outset, the triad is clear that this is an integration of off-the-shelf products. There is no secret sauce cooked up here...
To achieve tighter integration we need the following: More prescriptive guidance on making LDAP a central authorization and authentication policy repository; leveraging standards like Extensible Access Control Markup Language (XACML) and Security Assertion Markup Language (SAML) for authentication and authorization policy management; and extending vCenter Orchestrator to support Cisco and NetApp. These moves turn a big step into a leap toward a secure multi-tenant cloud..."
According to the announcement: "The three companies have introduced an end-to-end Secure Multi-tenancy Design Architecture that provides enhanced security in cloud environments by isolating the information technology (IT) resources and applications of different clients, business units or departments that share a common IT infrastructure. As part of their collaboration, Cisco, NetApp and VMware will also offer a cooperative support model for these pretested and validated design architectures to help customers quickly build a unified, virtualized infrastructure.
Secure Multi-tenancy Design Architecture is an end-to-end, validated design architecture that isolates IT resources for enhanced security in shared virtual and enterprise cloud environments. The design architecture helps enterprise customers, systems integrators and service providers develop internal and external cloud services that isolate clients, business units, departments or security zones for enhanced security across the computing, networking, storage and management layers of a unified infrastructure. The Secure Multi-tenancy Design Architecture provides details about implementing and configuring the architecture, as well as best practices for building and managing best-in-class solutions from Cisco, NetApp and VMware. This validated design architecture significantly increases business agility by helping IT administrators to establish the appropriate quality of service for each resource layer and to deliver consistent service performance levels for the applications in each layer..."
See also: the Guide
Apache UIMA (Unstructured Information Management Architecture) v2.3.0
Staff, Apache UIMA Development Community Announcement
Apache UIMA is a framework that supports combining and reusing components which annotate unstructured information content such as text, audio, and video. Members of the Apache UIMA Incubator Project have announced the version 2.3.0 release, which consists of four packages: (1) UIMA Java SDK — the base framework, with development tools and examples; (2) UIMA-AS — Asynchronous Scaleout capability; (3) UIMACPP — the C++ support framework, for components written in C++ and other languages; (4) UIMA Addons — a growing set of annotators and other tools.
The add-ons package contains many new components and annotators, including: Bean Scripting Framework supporting annotators written in popular scripting languages, Lucas (an interface to using UIMA with Apache Lucene), and TikaAnnotator (an annotator using the Apache Tika project text extractors). The UIMA-AS (Asynchronous Scaleout) framework is extensively enhanced with much more support for error/failure recovery, driven by feedback from actual use in several large-scale deployments (thousands of nodes). The base framework now supports Java 5 generics, and is enhanced to make it even more light-weight and efficient; for example, it now supports a new network serialization format for communicating with remote annotators using a "delta-CAS" -- limiting the response sent to just those items which have changed.
Incubator summary page: "Apache UIMA is an Apache-licensed open source implementation of the UIMA specification that is, in turn, being developed concurrently by a technical committee within OASIS... Unstructured Information Management applications are software systems that analyze large volumes of unstructured information in order to discover knowledge that is relevant to an end user. An example UIMA application might ingest plain text and identify entities, such as persons, places, organizations; or relations, such as works-for or located-at.
UIMA enables applications to be decomposed into components, for example 'language identification' [to] 'language specific segmentation' [to] 'sentence boundary detection' [to] 'entity detection (person/place names etc.)'. Each component implements interfaces defined by the framework and provides self-describing metadata via XML descriptor files. The framework manages these components and the data flow between them. Components are written in Java or C++; the data that flows between components is designed for efficient mapping between these languages. UIMA additionally provides capabilities to wrap components as network services, and can scale to very large volumes by replicating processing pipelines over a cluster of networked nodes..."
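The self-describing metadata mentioned above lives in XML descriptor files. A heavily abbreviated sketch of a primitive analysis-engine descriptor looks roughly like the following; the annotator class and type name are hypothetical, and the real schema requires further elements (type-system imports, configuration parameters, and so on):

```xml
<!-- Abbreviated, illustrative UIMA analysis engine descriptor. -->
<analysisEngineDescription xmlns="http://uima.apache.org/resourceSpecifier">
  <frameworkImplementation>org.apache.uima.java</frameworkImplementation>
  <primitive>true</primitive>
  <!-- Hypothetical annotator class implementing the framework's interface -->
  <annotatorImplementationName>org.example.PersonAnnotator</annotatorImplementationName>
  <analysisEngineMetaData>
    <name>Person Annotator</name>
    <description>Detects person names in plain text.</description>
    <capabilities>
      <capability>
        <inputs/>
        <outputs>
          <!-- Hypothetical annotation type produced by this component -->
          <type>org.example.Person</type>
        </outputs>
      </capability>
    </capabilities>
  </analysisEngineMetaData>
</analysisEngineDescription>
```

Because each component declares its inputs and outputs this way, the framework can wire components into a pipeline and manage the data flow between them without hand-written glue code.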
See also: the Apache UIMA Incubator Project
New Initiative to Provide Security Standard for Smart Card Applications
Staff, Joint Company Announcement
Infineon Technologies, Inside Contactless, Oberthur, and Giesecke & Devrient have issued a joint announcement reporting on a new industry initiative to provide a security solution for next-generation smart-card-based public transport applications. The solution will build on an open standard now being implemented by the four partner companies, which will eventually be governed by an independent body. Companies active in the smart card arena — providers of chips, smart cards, application-specific operating software, reader devices and transportation systems — are now invited to join the initiative for the advancement of more secure public transportation applications.
The industry initiative is based on groundwork performed by Infineon, the world's number one chip card IC (integrated circuits) provider. Infineon has developed a hardware-based security system specifically suited for public transportation smart card applications. It comprises a specific authentication scheme using the open and well-accepted Advanced Encryption Standard (AES) with a 128-bit key length, plus file types and command sets based on the ISO/IEC 7816 standard. Employing AES, an encryption algorithm also used for commercial transactions, will significantly increase security over less-robust security schemes widely used in current public transportation systems. Using the encryption and secure messaging scheme for authentication, data encryption and Message Authentication Coding (MACing) allows high flexibility and fast adoption for different applications...
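The announcement does not publish the actual protocol, but the general challenge-response pattern behind MAC-based card authentication can be sketched as follows. HMAC-SHA256 stands in for the AES-based MAC only because Python's standard library has no AES; the key length matches the 128-bit key described above, and all names are illustrative:

```python
# Generic challenge-response authentication sketch (illustrative only,
# not the initiative's protocol). The card proves possession of a shared
# 128-bit key by MACing a fresh challenge from the reader.
import hmac, hashlib, os

SHARED_KEY = os.urandom(16)  # 128-bit symmetric key provisioned to card and reader

def card_response(key: bytes, challenge: bytes) -> bytes:
    """Card side: MAC the reader's challenge to prove key possession."""
    return hmac.new(key, challenge, hashlib.sha256).digest()

def reader_verify(key: bytes, challenge: bytes, response: bytes) -> bool:
    """Reader side: recompute the MAC and compare in constant time."""
    expected = hmac.new(key, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

challenge = os.urandom(16)  # fresh nonce per transaction prevents replay
response = card_response(SHARED_KEY, challenge)
ok = reader_verify(SHARED_KEY, challenge, response)
```

A real deployment would use an AES-based MAC, mutual (card-and-reader) authentication, and per-card diversified keys rather than one shared secret.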
The new standard promises to bring a number of key benefits to both public transport agencies and smart card industry players, including higher performance and advanced system security for public transport applications, as well as the availability of multiple sources for chip products. Through independent testing, the open standard will also provide optimized interoperability to enable simple and fast integration into public transport schemes. The first emulation chips and transportation smart cards using this standard are scheduled to be available by the end of 2010..."
When Standards Bodies Are the Cyber Threat
A. M. Rutkowski, Network World
"[...] Crafting security standards involves multiple steps. First, experts agree on specifications intended to enhance cyber security. Then those specifications are made available to a community of implementers and the specifications are updated as flaws are discovered and evolutions become necessary. Next a responsible secretariat registers specific implementer technical parameters or schemas which are created by the standard, and finally that secretariat makes this information discoverable and readily available to all implementers...
Standards body cyber threats arise from three sources. The first stems from the fact that cyber security bodies typically exist within larger organizations that need revenue. Those organizations can hijack a specification and the so-called "registered parameter" availability processes, and charge often-substantial sums of money even to view a specification or parameters. A second threat is that many bodies do not use readily available high-trust (Extended Validation Certificate) Web platforms that ensure the integrity and security of the standard or registered parameters. The third threat is the failure of standards parameter registration authorities to implement sufficient identity proofing...
Standards bodies are part of the security food chain, and their practices must be part of an assessment process that holds them accountable. Government and industry should make a deliberate decision simply not to use those standards bodies that cannot meet today's needs and represent a threat..."
XML Daily Newslink and Cover Pages sponsored by:
Sun Microsystems, Inc. http://sun.com
XML Daily Newslink: http://xml.coverpages.org/newsletter.html
Newsletter Archive: http://xml.coverpages.org/newsletterArchive.html
Newsletter subscribe: firstname.lastname@example.org
Newsletter unsubscribe: email@example.com
Newsletter help: firstname.lastname@example.org
Cover Pages: http://xml.coverpages.org/