This issue of XML Daily Newslink is sponsored by:
- New Public Review Issues for Unicode Standard Annexes (UAX)
- Commentary on Unicode Normalization Form C (NFC)
- Proposal for an OpenID User Interface Work Group
- "ODF-Next" Call for Proposals: Open Document Format (ODF) Post v1.2
- Public Comment Invited for Open Grid Forum Proposal on Social Networks
- OASIS Interoperable Collaboration Services TC Inaugural Meeting
- Security Assessment of the Transmission Control Protocol (TCP)
- OASIS eGov Member Section Announces Steering Committee Election Results
- Emergency Information Interoperability Frameworks
New Public Review Issues for Unicode Standard Annexes (UAX)
Rick McGowan, Unicode Consortium Announcement
Members of the Unicode Consortium announced the availability of several Unicode Version 5.2 Standard Annexes (UAX). A Unicode Standard Annex (UAX), one type of Unicode Technical Report, "forms an integral part of the Unicode Standard, but is published online as a separate document. The Unicode Standard may require conformance to normative content in a Unicode Standard Annex, if so specified in the Conformance chapter of that version of the Unicode Standard. The version number of a UAX document corresponds to the version of the Unicode Standard of which it forms a part." From time to time the Unicode Consortium seeks wide public review and feedback for certain proposed actions. The purpose of the review is to elicit better information on the practical impact of such proposals on users or implementers, as well as to broaden the review of technical details. Any feedback on Public Review Issues will be used in the deliberations of the relevant Unicode Consortium technical committee. New proposed updates for Unicode Standard Annexes (UAX) are now open for review for Unicode version 5.2. Public Review Issues have been posted for all of those currently available, but the review period is just starting and most have boilerplate updates only. Proposed updates for the UAXes can be found on a new web page. The UAXes currently available for review are: UAX 9 (Unicode Bidirectional Algorithm); UAX 11 (East Asian Width); UAX 14 (Unicode Line Breaking Algorithm); UAX 15 (Unicode Normalization Forms); UAX 24 (Unicode Script Property); UAX 29 (Unicode Text Segmentation); UAX 31 (Unicode Identifier and Pattern Syntax); UAX 34 (Unicode Named Character Sequences); UAX 38 (Unicode Han Database (Unihan)); UAX 41 (Common References for Unicode Standard Annexes); UAX 42 (Unicode Character Database in XML); UAX 44 (Unicode Character Database).
See also: Unicode Public Review Issues
Commentary on Unicode Normalization Form C (NFC)
Mark Davis, Unicode Consortium Announcement
Mark Davis, Chair of the Unicode Security Subcommittee and Bidi Subcommittee, announced the publication of a resource on Unicode NFC Normalization (Normalization Form C, where the 'C' is for composition). His note: "In response to questions from some people in the W3C, I put together an FAQ on NFC normalization... I have some figures on performance and footprint in there as examples; if anyone else has figures from other implementations, I'd appreciate them..." For various reasons, Unicode sometimes has multiple representations of the same character. These sequences are called canonically equivalent. Normalizing to NFC is subject to stringent stability requirements to maintain backwards compatibility. For round-trip compatibility with existing standards, Unicode has encoded many entities that are really variants of the same abstract character. The Unicode Standard defines two equivalences between characters: canonical equivalence and compatibility equivalence. Canonical equivalence is a fundamental equivalency between characters or sequences of characters that represent the same abstract character, and when correctly displayed should always have the same visual appearance and behavior. Normalizing to NFC is not lossy. Unicode considers certain characters to be fundamentally (canonically) equivalent—the fact that there are multiple representations is an artifact of encoding. Normalizing to NFC maintains that canonical equivalence. Even in the case of CJK compatibility characters, they are also variants of the corresponding 'ordinary' character in that either character could appear in either form. As a matter of fact, the glyphic shape of the sources (e.g., JIS) has changed over time. The Unicode Consortium does recognize that particular glyphic shapes are sometimes important, and has developed a much more comprehensive mechanism to deal with it... What is the difference between W3C normalization and Unicode normalization?
Unicode normalization comes in four flavors: C, D, KC, KD. It is C that is relevant for W3C normalization. W3C normalization also treats character references (&#nnnn;) as equivalent to characters. For example, the text string "a&#x0301;" is Unicode-normalized, since it consists only of ASCII characters, but it is not W3C-normalized, since it contains a representation of a combining acute accent following "a"; in normalization form C, that sequence should have been normalized to U+00E1.
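The canonical-equivalence behavior described above can be observed directly with Python's standard `unicodedata` module, which implements the Unicode normalization forms; a minimal sketch:

```python
import unicodedata

# "a" (U+0061) followed by U+0301 COMBINING ACUTE ACCENT: two code points
# that are canonically equivalent to the single code point U+00E1.
decomposed = "a\u0301"
composed = unicodedata.normalize("NFC", decomposed)

assert composed == "\u00e1"                      # NFC composes the pair
assert len(decomposed) == 2 and len(composed) == 1
# Both representations reach the same NFC form, so canonical
# equivalence is preserved (NFC is not lossy):
assert unicodedata.normalize("NFC", "\u00e1") == composed
```

Note that this demonstrates Unicode normalization only; W3C normalization additionally requires resolving character references such as `&#x0301;` before comparing, which an XML or HTML processor performs as a separate step.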
See also: XML and Unicode
Proposal for an OpenID User Interface Work Group
Allen Tom, Posting to the OpenID Specifications Discussions List
An "OpenID User Interface Work Group Proposal" has been published, together with an invitation for public feedback. The proposers include representatives from Yahoo!, Janrain, Six Apart, Vidoop/DiSo Project, Google, and Facebook. "This workgroup intends to produce a very brief OpenID extension to enable the OpenID Authentication User Interface to be invoked in a standalone popup window, and to allow the Relying Party to request that the user interface be displayed in a particular language... OpenID traditionally requires the Relying Party to redirect the entire browser window to the OpenID Provider for the user to authenticate before redirecting the browser back to the Relying Party. It is believed that the User Experience (UX) could be significantly improved if the authentication flow occurred within a smaller popup window, making the experience less disruptive to the user. Although it is possible for Relying Parties to open a popup window for the user to authenticate at the OpenID Provider using the Provider's default user interface, the overall user experience can be optimized if the OP were aware that its UI was running within a popup. For instance, an OP may want to resize the popup browser window when using the popup interface, but would probably not want to resize the full browser window when using the default redirect interface. Another optimization is that the OP can close the popup, rather than return a negative assertion, if the user chooses to cancel the authentication request. Users who begin the OpenID sign in process on a Relying Party in one language and then transition to their OpenID Provider's site in a different language may find the overall experience to be very disruptive. In many cases, the Relying Party may want to pass a language hint to the OpenID Provider to use to display the User Interface to the user, especially if the user is not already authenticated at the OP."
See also: the OpenID announcement
"ODF-Next" Call for Proposals: Open Document Format (ODF) Post v1.2
Staff, OASIS Announcement
OASIS members and all interested participants are invited to help define the feature set of the next revision of OASIS Open Document Format (ODF) to follow ODF 1.2. "The OASIS Open Document Format for Office Applications (ODF) TC has recently finalized the approval of technical proposals which will be included in ODF 1.2. After these proposals are integrated into the ODF 1.2 draft and remaining editorial tasks are completed, the draft specification will be sent on for Public Review (hopefully before the end of April 2009) and then finally on to a ballot for approval as an OASIS Standard. Concurrent with these final stages of ODF 1.2 work—which may take several months—the ODF TC wishes to start a public conversation related to the technical contents of 'ODF-Next', the provisional name for the next major version of ODF. The ODF TC has created the ODF Requirements Subcommittee, chaired by Bob Jolliffe, to gather, categorize and prioritize proposals for 'ODF-Next', and to report back to the ODF TC with recommendations. The target for this report is May 01, 2009. We wish to start gathering proposals now, from TC members, from the OASIS community, from ODF implementers, government, academia, and from the public at large. What features should be added? What capabilities do we need? Where do we want to take ODF next? We want you to think broadly and boldly on where we can take this standard in the next major revision..."
See also: the OpenDocument XML.org Focus Area
Public Comment Invited for Open Grid Forum Proposal on Social Networks
Gregory Newby, OGF Announcement
On behalf of the Open Grid Forum (OGF) Grid Information Retrieval Research Group, Gregory Newby announced the public availability of an Informational Document "A Framework of Online Community based Expertise Information Retrieval on Grid." The authors are Eui-Nam Huh, Pil-Woo Lee, and Greg Newby. Public comment is invited through March 18, 2009. From the document abstract: "Web-based online communities such as blogs, forums and scientific communities have become important places for people to seek and share expertise. Search engines such as Google, Yahoo!, Live etc. are not yet capable of addressing queries that require deep semantic understanding of the query or the document... There is no universal standard data structure for the outline of user participation in these communities. Also, as these communities rarely interoperate, each typically only has access to its own social data and cannot benefit from other communities' data. Extracting, aggregating and analyzing data from these communities for finding experts on a single framework is a challenging task. In this document, we present a Grid-enabled framework of expertise search (GREFES) engine, which utilizes online communities as sources for experts on various topics. We suggest an open data structure called SNML (Social Network Markup Language) to outline user participation in online communities. The architecture addresses major challenges in crawling of community data and query processing by utilizing the computational power and high bandwidth inherently available in the Grid... This new framework utilizes online web communities as sources of experts on various topics. The framework specifies an open data structure called SNML for sharing community data efficiently and effectively. The architecture addresses major challenges in crawling online community data and query processing by utilizing the computational power and high bandwidth available in the Grid.
Several open APIs are described so that people can build new solutions utilizing the framework... Social Network Markup Language (SNML) is an XML-based open data structure to outline user participation in online communities. Crawlers collect SNML documents, which describe community data in a universal format. SNML documents contain community information and users' participation information, such as experiences, expertise areas, activities, relations, etc. SNML consists of four sections: (1) Community: The community a user belongs to. It consists of community type and community profile. Community type describes the type of medium, such as blog, wikis, forums, etc., and community profile describes the community activities or services. (2) User Profile: Consists of a basic profile and an extensible profile. The basic profile includes personal information such as name, age, expertise area, phone, address, etc. The extensible profile depends on the particular social network or community; for example, Facebook's profile data includes information beyond the basic profile. (3) Relation: Consists of user profiles connected to the current user and their depth. It may include the number of friends connected to the user, their profiles, etc. (4) Activities: Consists of an activity name and media URI (Universal Resource Identifier). It includes user comments, posts, etc. Besides SNML, this architecture supports REST-like open APIs. Our system, as well as third party providers or developers, can use these APIs to develop new solutions..." [Note: Also with comment period ending March 10, 2009: "OGSA-DMI Plain Web Service Rendering Specification 1.0"]
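The four-section shape described above can be sketched as an XML document built with Python's standard `xml.etree.ElementTree`. The OGF document names the sections but gives no schema here, so every element and attribute name below is a hypothetical illustration, not the actual SNML vocabulary:

```python
import xml.etree.ElementTree as ET

# Hypothetical SNML-like document: Community, User Profile, Relation,
# and Activities sections, with invented element/attribute names.
snml = ET.Element("snml")

# (1) Community: medium type plus a profile of its activities/services.
community = ET.SubElement(snml, "community", type="forum")
ET.SubElement(community, "profile").text = "Grid computing discussion forum"

# (2) User Profile: a basic part and a community-specific extensible part.
profile = ET.SubElement(snml, "userProfile")
basic = ET.SubElement(profile, "basic")
ET.SubElement(basic, "name").text = "A. User"
ET.SubElement(basic, "expertiseArea").text = "grid scheduling"
ET.SubElement(profile, "extensible")  # network-specific fields would go here

# (3) Relation: connected users and the depth of the connection.
relation = ET.SubElement(snml, "relation", depth="1")
ET.SubElement(relation, "friendCount").text = "42"

# (4) Activities: activity name plus the media URI it refers to.
activities = ET.SubElement(snml, "activities")
ET.SubElement(activities, "activity", name="post",
              uri="http://forum.example.org/thread/123")

doc = ET.tostring(snml, encoding="unicode")
```

A crawler emitting documents of this shape would give the Grid-side indexers one uniform format to aggregate, which is the interoperability problem SNML is meant to solve.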
See also: OGSA-DMI
OASIS Interoperable Collaboration Services TC Inaugural Meeting
Staff, OASIS Announcement
Prospective members of the new OASIS Integrated Collaboration Object Model for Interoperable Collaboration Services (ICOM) Technical Committee will hold an initial TC meeting on March 03, 2009. The telephone meeting is sponsored by Oracle. Interested parties are reminded that the last opportunity to register for initial membership in this TC is Tuesday, February 24, 2009 (11:59 PM EST). This new OASIS TC has been chartered to define an integrated collaboration object model supporting a complete range of enterprise collaboration activities. The new standard model, interface, and protocol will support contextual collaboration within business processes for an integrated collaboration environment which includes communication artifacts (e.g., email, instant message, telephony, RSS), teamwork artifacts (such as project and meeting workspaces, discussion forums, real-time conferences, presence, activities, subscriptions, wikis, and blogs), content artifacts (e.g., text and multi-media contents, contextual connections, taxonomies, folksonomies, tags, recommendations, social bookmarking, saved searches), and coordination artifacts (such as address books, calendars, and tasks). Industry commentary supports the conviction that convergence between online collaboration, communication, and content is now a strong desideratum. The ICOM domain model is intended to support that convergence.
See also: the ICOM TC 'join' page
Security Assessment of the Transmission Control Protocol (TCP)
Fernando Gont (ed), IETF Internet Draft
"The TCP/IP protocol suite was conceived in an environment that was quite different from the hostile environment in which it currently operates. However, the effectiveness of the protocols led to their early adoption in production environments, to the point that, to some extent, the current world's economy depends on them... While Internet technology has evolved since its early inception, the Internet's building blocks are basically the same core protocols adopted by the ARPANET more than two decades ago. During the last twenty years, many vulnerabilities have been identified in the TCP/IP stacks of a number of systems. Some of them were based on flaws in particular protocol implementations, affecting only a reduced number of systems, while others were based on flaws in the protocols themselves, affecting virtually every existing implementation... The discovery of vulnerabilities in the TCP/IP protocol suite usually led to reports being published by a number of CSIRTs (Computer Security Incident Response Teams) and vendors, which helped to raise awareness about the threats and the best mitigations known at the time the reports were published. Unfortunately, this also led to the documentation of the discovered protocol vulnerabilities being spread among a large number of documents, which are sometimes difficult to identify... There is a clear need for a companion document to the IETF specifications that discusses the security aspects and implications of the protocols, identifies the existing vulnerabilities, discusses the possible countermeasures, and analyzes their respective effectiveness. This document is the result of an assessment of the IETF specifications of the Transmission Control Protocol (TCP) from a security point of view. Possible threats are identified and, where possible, countermeasures are proposed.
Additionally, many implementation flaws that have led to security vulnerabilities have been referenced in the hope that future implementations will not incur the same problems. This document does not aim to be the final word on the security aspects of TCP. On the contrary, it aims to raise awareness about a number of TCP vulnerabilities that have been faced in the past, those that are currently being faced, and some of those that we may still have to deal with in the future." The Technical Note 'TN0309' (130 pages) "Security Assessment of the Transmission Control Protocol (TCP)" provides an expanded discussion on this topic.
See also: the CPNI Technical Note
OASIS eGov Member Section Announces Steering Committee Election Results
Staff, OASIS Announcement
"The OASIS eGovernment Member Section (MS) is pleased to announce six newly elected Steering Committee members. Mr. Sunday, Government of Canada and Mr. Wallis, Government of New Zealand will each serve two-year terms as representatives of the OASIS AGA members. Mr. Borras, Pensive SA and Mr. Barnhill, Booz Allen Hamilton will each serve two-year terms as representatives of the OASIS business enterprise members. Mr. Swinnen, Deloitte and Mr. Schultz, ARS APERTA will each serve a one-year term as representatives of the OASIS business enterprise members. The OASIS eGovernment Member Section (eGov MS) serves as a focal point for discussions of governmental and public administration requirements for e-business standardization. Bringing together representatives from global, regional, national and local government agencies, the eGov MS provides a platform for those who share a common interest in directing and understanding the impact of open standards on the public sector. Congratulations to the new members of the Steering Committee and sincere gratitude to all the MS members who cast their votes in this election process. OASIS would like to thank Arnaud Martens, Peter F. Brown, Steve Jones, Sri Gopalan, and Honbo Zhou for their devotion and service to the eGov Member Section Steering Committee."
See also: the OASIS eGov MS web site
Emergency Information Interoperability Frameworks
Renato Iannella, Gary Berg-Cross (et al), W3C Incubator Group Report
This report presents an interoperability information framework for emergency management. This provides a reference model for information interoperability across the stakeholder functions in emergency management. The report looks at the issues facing the wider emergency management community and outlines some potential paths forward via a number of informal and formal information models, scenarios, use cases, and ontology directions. The initial Conceptual Mind Map shows 21 primary entities (each with many properties) with some explicit relationships between them. This is far from complete but shows the intricate inter-relationships that exist in emergency management information. The Mind Map shows common entities (such as People, Organisations, and Resources) as well as the more esoteric (such as Animals and Policy). All of these are important in different contexts to different stakeholders in emergency management. Background: "The management of emergencies is an endeavour that is characterised by involvement from a multitude of stakeholders, including numerous government agencies, military groups, non-government and charitable organisations, private enterprise and community groups. Some jurisdictions have attempted to integrate government response under a single emergency response agency, but although this can help to manage the logistics of interagency communication, the problem remains, particularly for non-governmental participants. Of the four commonly identified phases of emergency management—prevention/mitigation, preparation, response and recovery—response poses the clearest immediate need for efficient communication between agencies. However, each phase offers opportunities for improved communications, and indeed, the languages used and the problems faced have significant commonalities across all phases.
The proliferation of participants poses challenges when trying to build information technology solutions to support the management of emergency operations. Without agreement on how stakeholders' information technology solutions can intercommunicate, the use of IT threatens to complicate rather than simplify the processes. The general consensus in IT is that the co-operation of disparate systems is best addressed through the use of standards, agreed-upon interfaces and protocols of communication that, when adhered to, should guarantee successful interaction with other systems. Although standardised structures have made great strides in garnering agreement on the structure of the information being exchanged, the data values themselves, that is, the vocabularies used by the different agencies, present a much greater challenge. There are numerous reasons that different stakeholders use different vocabularies. Different spoken languages, different universes of discourse, and different concerns can each lead to differing terminologies that make it very difficult for stakeholders to exchange information efficiently."
See also: XML and Emergency Management
XML Daily Newslink and Cover Pages sponsored by:
Sun Microsystems, Inc. (http://sun.com)
XML Daily Newslink: http://xml.coverpages.org/newsletter.html
Newsletter Archive: http://xml.coverpages.org/newsletterArchive.html
Newsletter subscribe: firstname.lastname@example.org
Newsletter unsubscribe: email@example.com
Newsletter help: firstname.lastname@example.org
Cover Pages: http://xml.coverpages.org/