The OASIS Cover Pages: The Online Resource for Markup Language Technologies

XML Daily Newslink. Thursday, 21 May 2009

A Cover Pages Publication http://xml.coverpages.org/
Provided by OASIS and Sponsor Members
Edited by Robin Cover


This issue of XML Daily Newslink is sponsored by:
Microsoft Corporation http://www.microsoft.com



First Round of Interoperability Standards for Smart Grid Announced
William Jackson, Government Computer News

An initial set of sixteen (16) existing technical standards for control system interoperability and security has been identified for use in Smart Grid development. The list ("Recognized Standards for Inclusion in the Smart Grid Interoperability Standards Framework, Release 1.0") represents a consensus among government and industry stakeholders on what is needed to create a Smart Grid, and is the first step in an aggressive three-phase program by the National Institute of Standards and Technology (NIST) to develop key technical standards for an intelligent power distribution grid by the end of the year. NIST said the list is a work in progress and that other existing standards could be added; additional standards will be created as needed. The Smart Grid program was established in the Energy Independence and Security Act of 2007, and has been identified as an important element of the Obama administration's economic recovery program, with the promise of creating jobs, contributing to energy independence, and curbing greenhouse gas emissions. The grid would use intelligent networking and automation to better control the flow and delivery of electricity to consumers. According to the Energy Department's "National Vision for Electricity's Next 100 Years," it is a fully automated power delivery network that monitors and controls every customer and node, ensuring a two-way flow of electricity and information between the power plant and the appliance, and all points in between. Its distributed intelligence, coupled with broadband communications and automated control systems, enables real-time market transactions and seamless interfaces among people, buildings, industrial plants, generation facilities, and the electric network...

NIST has outlined a three-phase approach to standards development: (1) Develop a consensus among utilities, equipment suppliers, consumers, standards developers, and other stakeholders on needed standards, and produce a Smart Grid architecture, an initial set of standards to support implementation, and plans for developing the remaining standards by early fall 2009; (2) Launch formal partnerships to develop the remaining needed standards; (3) Develop a program for testing and certification to ensure that Smart Grid equipment and systems comply with the standards...

The Recognized Standards include: (1) "AMI-SEC System Security Requirements" for Advanced metering infrastructure (AMI) and Smart Grid end-to-end security; (2) "ANSI C12.19/MC1219" for Revenue metering information model; (3) "BACnet ANSI ASHRAE 135-2008/ISO 16484-5" for Building automation; (4) "DNP3 (Distributed Network Protocol)" for Substation and feeder device automation; (5) "IEC 60870-6 / TASE.2" for Inter-control center communications; (6) "IEC 61850" for Substation automation and protection; (7) "IEC 61968/61970" for Application level energy management system interfaces; (8) "IEC 62351 Parts 1-8" for Information security for power system control operations; (9) "IEEE C37.118" for Phasor measurement unit (PMU) communications; (10) "IEEE 1547" for Physical and electrical interconnections between utility and distributed generation (DG); (11) "IEEE 1686-2007" for Security for intelligent electronic devices (IEDs); (12) "NERC CIP 002-009" for Cyber security standards for the bulk power system; (13) "NIST Special Publication (SP) 800-53, NIST SP 800-82" for Cyber security standards and guidelines for federal information systems, including those for the bulk power system; (14) "Open Automated Demand Response (Open ADR)" for Price responsive and direct load control; (15) "OpenHAN" for Home Area Network device communication, measurement, and control; (16) "ZigBee/HomePlug Smart Energy Profile" for Home Area Network (HAN) Device Communications and Information Model.

See also: NIST's Recognized Standards List


Extensible Access Control Markup Language (XACML) Version 3.0
Erik Rissanen (ed), OASIS Public Review Draft

Members of the OASIS Extensible Access Control Markup Language (XACML) Technical Committee have approved the Committee Draft specification documents for XACML 3.0 and released them for public review. The 60-day public comment period ends July 20, 2009. The distribution package is made up of a Core prose specification document ("Extensible Access Control Markup Language (XACML) Version 3.0") and seven profiles. Named contributors to the Core document include: Anthony Nadalin, Bill Parducci, Daniel Engovatov, Erik Rissanen, Hal Lockhart, John Tolbert, Michiharu Kudo, Michael McIntosh, Ron Jacobson, Seth Proctor, Steve Anderson, and Tim Moses.

The motivation behind XACML is to express certain well-established ideas in the field of access control policy using an extension language of XML. Specifically, XACML provides: (1) a method for combining individual rules and policies into a single policy set that applies to a particular decision request; (2) a method for flexible definition of the procedure by which rules and policies are combined; (3) a method for dealing with multiple subjects acting in different capacities; (4) a method for basing an authorization decision on attributes of the subject and resource; (5) a method for dealing with multi-valued attributes; (6) a method for basing an authorization decision on the contents of an information resource; (7) a set of logical and mathematical operators on attributes of the subject, resource and environment; (8) a method for handling a distributed set of policy components, while abstracting the method for locating, retrieving and authenticating the policy components; (9) a method for rapidly identifying the policy that applies to a given action, based upon the values of attributes of the subjects, resource and action; (10) an abstraction-layer that insulates the policy-writer from the details of the application environment; (11) a method for specifying a set of actions that must be performed in conjunction with policy enforcement.
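
Item (2) in this list, the flexible combination of rules and policies, can be made concrete with a small sketch. The following Python fragment is illustrative only (it is not XACML syntax or any XACML engine's API); it models the spirit of the standard deny-overrides rule-combining algorithm, with Indeterminate results omitted for brevity:

```python
# Illustrative model of XACML-style rule combination (not the XACML
# schema or a real policy engine). Each rule maps a request's
# attributes to a decision; deny-overrides combines the results.

def rule_role_permits(request):
    # Permit if the subject acts in the "manager" capacity.
    if "manager" in request["subject"]["roles"]:
        return "Permit"
    return "NotApplicable"

def rule_resource_denies(request):
    # Deny access to resources classified as "restricted".
    if request["resource"]["classification"] == "restricted":
        return "Deny"
    return "NotApplicable"

def deny_overrides(rules, request):
    """Combine rule decisions: any Deny wins, otherwise Permit if any
    rule permits, otherwise NotApplicable. This mirrors the idea of
    XACML's deny-overrides algorithm (Indeterminate handling omitted)."""
    decisions = [rule(request) for rule in rules]
    if "Deny" in decisions:
        return "Deny"
    if "Permit" in decisions:
        return "Permit"
    return "NotApplicable"

request = {
    "subject": {"roles": ["manager"]},
    "resource": {"classification": "restricted"},
}
print(deny_overrides([rule_role_permits, rule_resource_denies], request))
# -> "Deny": the restricted-resource rule overrides the role permit.
```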

According to the specification's background statement: "The security policy of a large enterprise has many elements and many points of enforcement. Elements of policy may be managed by the Information Systems department, by Human Resources, by the Legal department and by the Finance department. And the policy may be enforced by the extranet, mail, WAN, and remote-access systems; platforms which inherently implement a permissive security policy. The current practice is to manage the configuration of each point of enforcement independently in order to implement the security policy as accurately as possible. Consequently, it is an expensive and unreliable proposition to modify the security policy. Moreover, it is virtually impossible to obtain a consolidated view of the safeguards in effect throughout the enterprise to enforce the policy. At the same time, there is increasing pressure on corporate and government executives from consumers, shareholders, and regulators to demonstrate 'best practice' in the protection of the information assets of the enterprise and its customers. For these reasons, there is a pressing need for a common language for expressing security policy. If implemented throughout an enterprise, a common policy language allows the enterprise to manage the enforcement of all the elements of its security policy in all the components of its information systems. Managing security policy may include some or all of the following steps: writing, reviewing, testing, approving, issuing, combining, analyzing, modifying, withdrawing, retrieving, and enforcing policy."

Seven profile documents accompany the Core specification in this public review; see the public review file listing referenced below for the complete set.

See also: the public review file listing


Internet X.509 Public Key Infrastructure: Certificate Image
Stefan Santesson, Russell Housley (et al, eds), IETF Internet Draft

Members of the IETF Public-Key Infrastructure (X.509) (PKIX) Working Group have published an initial level -00 Internet Draft specification for Internet X.509 Public Key Infrastructure: Certificate Image. The specification defines a method to bind a visual representation of a certificate, in the form of a certificate image, to an RFC 5280 public key certificate ("Internet X.509 Public Key Infrastructure Certificate and Certificate Revocation List (CRL) Profile") by defining a new otherLogos image type according to RFC 3709 ("Internet X.509 Public Key Infrastructure: Logotypes in X.509 Certificates"). From the document's Introduction: "This standard specifies a Certificate Image that may be signed and referenced by a certificate as a visual representation of that certificate to humans. This standard makes use of the certificate Logotype extension defined in RFC 3709 and specifies the Certificate Image as a new otherLogos type. The purpose of the Certificate Image is to enable a meaningful informational and visual experience for a human user in situations where a Graphical User Interface (GUI) of an application needs to show a certificate to a user. Typical situations when an application may want to show a certificate to a human are: (1) A person establishes contact with an authenticated entity, such as a commercial web site or government service. The person wants to see the authenticated identity of the service provider. (2) A person consumes signed information such as a signed e-mail, a signed document, or a signed contract. The person wants to see the authenticated identity of the signer. (3) A person is requested to authenticate to a service, or to sign some information, and is requested to select an appropriate certificate for the purpose. The person needs to see the available certificates to understand what type of personal data they contain and for what purpose they are intended. Unless an application recognizes the certificate type and has some predefined display logic for the certificate it displays, it will be extremely hard for the application to provide meaningful information about the certificate to humans..."

This draft recommends that certificate images be stored in a scalable format, and specifically defines how to include images in PDF/A (ISO 19005) and SVG Tiny format (W3C Scalable Vector Graphics (SVG) Tiny 1.2 Specification). A third popular scalable vector graphics format, VML (Vector Markup Language), is not a public standard; nevertheless, some implementers have chosen to support VML instead of SVG, so it might be useful to specify how to reference a VML-formatted certificate image in an informational annex. A Certificate Image may be provided in the form of a Scalable Vector Graphics (SVG) image, which must follow the SVG Tiny profile. The MIME media type 'image/svg+xml', defined in Appendix M of the W3C Recommendation, must be used as the mediaType in LogotypeDetails for SVG images. The SVG image file must not incorporate any external image data by reference to an external SVG document, or by reference to an external media source other than SVG; doing so would incorporate image data that is not covered by the logotypeHash value of the image. Certificate-using applications must reject any image that violates this rule. The XML structure in the SVG file MUST use the character 'LF' (linefeed, 0x0A) as the end-of-line (EOL) character when calculating the hash over the SVG file. The referenced SVG file may be provided in compressed form, for example as SVG.GZ or SVGZ. It is outside the scope of this specification to specify any such compression algorithm; however, after decompression the EOL characters of the SVG file must be normalized as described above before computing the hash of the SVG file. If a certificate image is provided as a bit-mapped image, the PNG (ISO 15948) format should be used.
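
The draft's EOL-normalization rule is easy to get wrong, so a brief sketch may help. The following Python fragment is an illustration, not code from the draft: it decompresses an SVGZ payload if necessary, normalizes all line endings to LF as the draft requires, and then hashes the result. The choice of SHA-256 is an assumption here; in practice the hash algorithm is whichever one the certificate's logotypeHash field identifies.

```python
# Sketch of the draft's EOL-normalization rule for hashing an SVG
# certificate image. SHA-256 is an assumption; the real algorithm is
# the one named by the certificate's logotypeHash field.
import gzip
import hashlib

def normalized_svg_hash(svg_bytes, compressed=False):
    """Decompress if needed (SVGZ/SVG.GZ), normalize CRLF and bare CR
    to LF (0x0A), then hash the normalized bytes."""
    if compressed:
        svg_bytes = gzip.decompress(svg_bytes)
    # Normalize all end-of-line conventions to a single LF.
    normalized = svg_bytes.replace(b"\r\n", b"\n").replace(b"\r", b"\n")
    return hashlib.sha256(normalized).hexdigest()

svg = b'<svg xmlns="http://www.w3.org/2000/svg">\r\n</svg>\r\n'
print(normalized_svg_hash(svg))  # same digest as the LF-only form
```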

See also: the IETF Public-Key Infrastructure (X.509) (PKIX) Working Group


Updated Working Draft: Authoring Tool Accessibility Guidelines (ATAG) 2.0
Jan Richards, Jeanne Spellman, Jutta Treviranus (eds), W3C Technical Report

Members of the W3C Authoring Tool Accessibility Guidelines Working Group have published an updated Working Draft of "Authoring Tool Accessibility Guidelines (ATAG) 2.0" in preparation for the Last Call Working Draft. ATAG provides guidelines for designing Web content authoring tools that are more accessible to people with disabilities. An authoring tool that conforms to these guidelines will promote accessibility by providing an accessible user interface to authors with disabilities, as well as by enabling, supporting, and promoting the production of accessible Web content by all authors. ATAG 2.0 defines an "authoring tool" as "any software application, part of an application, or collection of applications that authors interact with to create, modify or assemble Web content to be used by other people". The individuals and organizations that use ATAG 2.0 vary widely and include authoring tool developers, authoring tool users (authors), authoring tool purchasers, and policy makers. To meet the varying needs of this audience, several layers of guidance are provided: two parts, overall principles, general guidelines, testable success criteria, and a collection of sufficient techniques and advisory techniques with examples and resource links.

See also: the W3C Authoring Tool Accessibility Guidelines Working Group (AUWG)


A Generalized Framework for Kerberos Pre-Authentication
Sam Hartman and Larry Zhu (eds), IETF Internet Draft

Members of the IETF Kerberos Working Group have published a revised Internet Draft for A Generalized Framework for Kerberos Pre-Authentication. This Standards Track specification, if approved, will update IETF RFC 4120 ("The Kerberos Network Authentication Service V5"). Kerberos is a protocol for verifying the identity of principals (e.g., a workstation user or a network server) on an open network. The Kerberos protocol provides a mechanism called pre-authentication for proving the identity of a principal and for better protecting the long-term secrets of the principal. The core Kerberos specification of RFC 4120 treats pre-authentication data as an opaque typed hole in the messages to the KDC that may influence the reply key used to encrypt the KDC reply. This generality has been useful: pre-authentication data is used for a variety of extensions to the protocol, many outside the expectations of the initial designers. However, this generality makes designing more common types of pre-authentication mechanisms difficult. Each mechanism needs to specify how it interacts with other mechanisms. Also, problems like combining a key with the long-term secrets or proving the identity of the user are common to multiple mechanisms. Where there are generally well-accepted solutions to these problems, it is desirable to standardize one of these solutions so mechanisms can avoid duplication of work. In other cases, a modular approach to these problems is appropriate. The modular approach will allow new and better solutions to common pre-authentication problems to be used by existing mechanisms as they are developed...

This document specifies a framework for Kerberos pre-authentication mechanisms. It defines the common set of functions that pre-authentication mechanisms perform as well as how these functions affect the state of the request and reply. In addition, several common tools needed by pre-authentication mechanisms are provided. Unlike RFC 3961, this framework is not complete: it does not describe all the inputs and outputs for the pre-authentication mechanisms. Pre-authentication mechanism designers should try to be consistent with this framework because doing so will make their mechanisms easier to implement. Kerberos implementations are likely to have plugin architectures for pre-authentication; such architectures are likely to support mechanisms that follow this framework plus commonly used extensions. This framework also facilitates combining multiple pre-authentication mechanisms, each of which may represent an authentication factor, into a single multi-factor pre-authentication mechanism. One of these common tools is the flexible authentication secure tunneling (FAST) padata type. FAST provides a protected channel between the client and the KDC, and it can optionally deliver a reply key within the protected channel. Based on FAST, pre-authentication mechanisms can extend Kerberos with ease to support, for example, password authenticated key exchange (PAKE) protocols with zero knowledge password proof (ZKPP). Any pre-authentication mechanism can be encapsulated in the FAST messages as defined in Section 6.5 of the draft. A pre-authentication type carried within FAST is called a FAST factor. Creating a FAST factor is the easiest path to create a new pre-authentication mechanism. FAST factors are significantly easier to analyze from a security standpoint than other pre-authentication mechanisms. Mechanism designers should design FAST factors instead of new pre-authentication mechanisms outside of FAST.
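
As a rough conceptual illustration of the FAST encapsulation idea, the Python sketch below models an inner pre-authentication factor being carried inside an authenticated, encrypted envelope under an armor key. Everything here is a stand-in: real FAST uses ASN.1-encoded Kerberos messages and the RFC 3961 encryption types, not JSON or the toy cipher shown.

```python
# Conceptual model of FAST only: real Kerberos uses ASN.1 message
# formats and RFC 3961 enctypes, not this toy HMAC-based cipher.
import hashlib
import hmac
import json
import os

def toy_encrypt(key, plaintext):
    """Toy authenticated encryption standing in for a real Kerberos
    enctype: an HMAC-derived keystream plus an HMAC integrity tag."""
    nonce = os.urandom(16)
    stream = b""
    counter = 0
    while len(stream) < len(plaintext):
        block = hmac.new(key, nonce + counter.to_bytes(4, "big"),
                         hashlib.sha256).digest()
        stream += block
        counter += 1
    ciphertext = bytes(p ^ s for p, s in zip(plaintext, stream))
    tag = hmac.new(key, nonce + ciphertext, hashlib.sha256).digest()
    return nonce + ciphertext + tag

def wrap_fast_factor(armor_key, factor_type, factor_data):
    """Encapsulate an inner pre-authentication factor (e.g. an
    encrypted timestamp or a PAKE message) in the protected channel,
    so the factor itself never crosses the wire in the clear."""
    inner_padata = json.dumps(
        {"padata-type": factor_type, "padata-value": factor_data}
    ).encode()
    return toy_encrypt(armor_key, inner_padata)

armor_key = os.urandom(32)  # in FAST, derived from armor-ticket keys
fast_request = wrap_fast_factor(armor_key, "ENCRYPTED-CHALLENGE", "demo")
print(len(fast_request), "bytes of armored pre-authentication data")
```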

See also: the IETF Kerberos Working Group Status Pages


Data.gov Web Site Launched by U.S. Federal Government
Matt Williams, Government Technology

U.S. Federal CIO Vivek Kundra has made good on his promise to launch Data.gov, which makes data generated by the U.S. federal government publicly available. It is apparent that Data.gov is only in the initial stage of development. Day one featured an eclectic assortment of data sets, such as the locations of the world's copper smelters, National Weather Service advisories, and weekly reports of earthquakes. Web site widgets for the FBI's 10 Most Wanted and the H1N1 swine flu virus are also featured. Kundra championed a similar Web site in his former position as the CIO of Washington, D.C.: the District of Columbia Data Catalog contains more than 275 data sets, everything from the locations of road-kill pick-ups to crime incident reports. Like the D.C. Data Catalog, datasets available on Data.gov are retrievable in different formats such as XML and ESRI Shapefile. Kundra has said that the aim of Data.gov is to improve government transparency by releasing these data sets so that citizens are able to analyze them and build mash-up applications. Not all government data will be released on Data.gov, though: according to a policy statement posted on the Web site, all information must comply with privacy regulations, and national security information will be unavailable...

Details from the FAQ document: "'Raw' Data Catalog: Data.gov features a catalog with instant view/download of platform-independent, machine-readable data (e.g., XML, CSV, KMZ/KML, or shapefile formats), as well as a link to a metadata page specific to the respective dataset. The metadata page will have additional links to authoritative source information from the sponsoring agency's website, including any pertinent agency technical documentation regarding the dataset... Data.gov includes searchable catalogs that provide access to "raw" datasets and various tools. In the "raw" data catalog, you may access data in XML, Text/CSV, KML/KMZ, Feeds, XLS, or ESRI Shapefile formats. The catalog of tools links you to sites that include data mining and extraction tools and widgets. Datasets and tools available on Data.gov are searchable by category, agency, keyword, and/or data format. Once in the catalog, click on the "name" (i.e., the name of the dataset or tool of interest) and you will be taken to a page with more details and metadata on that specific dataset or tool... The geospatial datasets available on Data.gov are provided in up to three open file formats: Keyhole Markup Language (KML), Compressed Keyhole Markup Language (KMZ), and ESRI Shapefile. These datasets are all viewable in many commercial and freely available applications. More information about Geographic Information System (GIS) software can be found by doing a web search... A primary goal of Data.gov is to improve access to Federal data and expand creative use of those data beyond the walls of government by encouraging innovative ideas (e.g., web applications). Data.gov strives to make government more transparent and is committed to creating an unprecedented level of openness in Government. The openness derived from Data.gov will strengthen our Nation's democracy and promote efficiency and effectiveness in Government..."
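
For readers who want to try the "raw" catalog, the following Python sketch fetches a CSV-format dataset and summarizes it. The dataset URL is purely hypothetical; actual download links appear on each dataset's Data.gov metadata page.

```python
# Sketch of pulling a CSV-format dataset from the "raw" data catalog.
# The URL below is a hypothetical placeholder; real download links are
# listed on each dataset's Data.gov metadata page.
import csv
import io
import urllib.request

DATASET_URL = "http://www.data.gov/download/example-dataset.csv"  # hypothetical

def summarize_csv(url):
    """Fetch a CSV dataset and report its column names and row count."""
    with urllib.request.urlopen(url) as response:
        text = response.read().decode("utf-8")
    rows = list(csv.DictReader(io.StringIO(text)))
    print("columns:", ", ".join(rows[0].keys()) if rows else "(empty)")
    print("rows:", len(rows))

summarize_csv(DATASET_URL)
```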

See also: the Data.gov web site


Identity Developer Training Kit Based On Microsoft 'Geneva' Released
Dilip Krishnan, InfoQueue

Microsoft has released an identity developer training kit, following closely on the heels of the release of Geneva Beta 2 at TechEd. The training kit is a set of hands-on labs and resources designed to help developers take advantage of Microsoft's identity products and services. The Geneva Framework is the basis of the training kit, which also gives guidance on using Geneva Server, Windows Live ID, the Microsoft Federation Gateway, and the .NET Access Control Service. 'Geneva' is the code name for Microsoft's claims-based access (CBA) platform strategy; it includes the "Geneva" Framework, "Geneva" Server, and Windows CardSpace "Geneva". The Geneva Framework provides developers with tools to build claims-based applications and services that involve tokens issued by a Security Token Service (STS), as well as tools for building a custom STS and for building Windows CardSpace-enabled applications. Vittorio Bertocci, an Architect Evangelist at Microsoft, writes about the goals of the training kit: "We took special care to follow a progressive approach, in which we introduce concepts and ideas gradually: however we made sure that every single step is useful for solving a real-life problem... The idea was to cover many of the scenarios that we are often asked about in forums and customer discussions, but also to present things in the right order so that application developers can learn to use the Geneva Framework without necessarily having to understand the entire stack. The kit tried to be respectful of that, but also kept into account the needs of the ones that want to know what really happens in the kitchen..." The training materials contain Hands-on Labs (HOL) that are categorized by practical usage scenarios of the Geneva Framework. The training material is naturally geared toward solutions on the Microsoft stack. Vittorio provides details of what is included in the training kit and mentions the various scenarios and topics covered in the HOL.
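
The claims-based pattern the kit teaches can be reduced to a small sketch. The Python fragment below is a conceptual stand-in rather than the Geneva Framework API (which is .NET-based and works with standard token formats such as SAML): a Security Token Service signs a set of claims, and the relying application verifies the STS's signature and consumes the claims instead of authenticating the user itself.

```python
# Conceptual sketch of claims-based access: an STS issues a signed set
# of claims, and the application trusts the STS's signature instead of
# authenticating users directly. Not the Geneva Framework API; real
# deployments use standard token formats such as SAML assertions.
import hashlib
import hmac
import json

STS_KEY = b"shared-secret-between-sts-and-app"  # hypothetical trust key

def sts_issue_token(claims):
    """Security Token Service: sign the claims to produce a token."""
    body = json.dumps(claims, sort_keys=True).encode()
    tag = hmac.new(STS_KEY, body, hashlib.sha256).hexdigest()
    return {"claims": claims, "signature": tag}

def app_validate_token(token):
    """Relying application: verify the STS signature, then use claims."""
    body = json.dumps(token["claims"], sort_keys=True).encode()
    expected = hmac.new(STS_KEY, body, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, token["signature"]):
        raise ValueError("token not issued by a trusted STS")
    return token["claims"]

token = sts_issue_token({"name": "alice", "role": "purchaser"})
print(app_validate_token(token)["role"])  # -> "purchaser"
```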

See also: Vittorio's blog


Sponsors

XML Daily Newslink and Cover Pages sponsored by:

IBM Corporation  http://www.ibm.com
Microsoft Corporation  http://www.microsoft.com
Oracle Corporation  http://www.oracle.com
Primeton  http://www.primeton.com
Sun Microsystems, Inc.  http://sun.com

XML Daily Newslink: http://xml.coverpages.org/newsletter.html
Newsletter Archive: http://xml.coverpages.org/newsletterArchive.html
Newsletter subscribe: newsletter-subscribe@xml.coverpages.org
Newsletter unsubscribe: newsletter-unsubscribe@xml.coverpages.org
Newsletter help: newsletter-help@xml.coverpages.org
Cover Pages: http://xml.coverpages.org/




Document URI: http://xml.coverpages.org/newsletter/news2009-05-21.html
Robin Cover, Editor: robin@oasis-open.org