XML Daily Newslink. Monday, 26 April 2010

A Cover Pages Publication http://xml.coverpages.org/
Provided by OASIS and Sponsor Members
Edited by Robin Cover


This issue of XML Daily Newslink is sponsored by:
Oracle Corporation http://www.oracle.com



NIST Releases Draft SCAP Version 1.0 Security Spec for Public Review
Stephen Quinn, John Banghart (et al., eds); NIST Interagency Report

The U.S. National Institute of Standards and Technology (NIST) announced the public review release of NIST IR 7511 Revision 2, with comments accepted through May 20, 2010. The document, Security Content Automation Protocol (SCAP) Version 1.0 Validation Program Test Requirements, describes the requirements that products must meet to achieve SCAP validation. This update to Draft NIST Interagency Report (IR) 7511 Revision 2 changes the Internet Connectivity requirements and adds clarifying language to several other requirements and test procedures.

The Security Content Automation Protocol (SCAP) is a synthesis of interoperable specifications derived from community ideas, and these specifications have both intrinsic and synergistic value. A specification has intrinsic value when it demonstrates value on its own merits. For example, XCCDF is a standard way of expressing checklist content. XCCDF also has synergistic value when combined with other specifications such as CPE, CCE, and OVAL to create an SCAP-expressed checklist that can be processed by SCAP-validated products. Likewise, CVE is useful simply as a consistent way to enumerate vulnerabilities for tracking purposes; combined with CPE and OVAL, however, CVE supports a greater use case: automated vulnerability checks that can be processed by SCAP-validated products. It is important to recognize that specifications can and should demonstrate value in their own right without being SCAP specifications.
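
To make the idea of machine-processable checklist content concrete, the following minimal Python sketch emits an XCCDF-style rule that points at an OVAL check. The element names follow XCCDF 1.1 conventions, but the benchmark ID, rule ID, and OVAL reference are hypothetical placeholders, and the fragment is illustrative rather than a schema-valid SCAP document.

    # Sketch: emit an XCCDF-style checklist fragment with ElementTree.
    # Identifiers and file names below are hypothetical placeholders.
    import xml.etree.ElementTree as ET

    NS = "http://checklists.nist.gov/xccdf/1.1"  # XCCDF 1.1 namespace
    ET.register_namespace("", NS)

    benchmark = ET.Element(f"{{{NS}}}Benchmark", id="example-benchmark")
    rule = ET.SubElement(benchmark, f"{{{NS}}}Rule",
                         id="rule-password-min-length", severity="medium")
    ET.SubElement(rule, f"{{{NS}}}title").text = "Minimum password length is 12"
    # A Rule typically references automated check content, e.g. an OVAL definition:
    check = ET.SubElement(rule, f"{{{NS}}}check",
                          system="http://oval.mitre.org/XMLSchema/oval-definitions-5")
    ET.SubElement(check, f"{{{NS}}}check-content-ref",
                  href="example-oval.xml", name="oval:example:def:1")

    print(ET.tostring(benchmark, encoding="unicode"))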

In SCAP Version 1.0, validation is awarded by independent laboratories, accredited for SCAP testing under the NIST National Voluntary Laboratory Accreditation Program, based on a defined set of SCAP capabilities and/or individual SCAP components. Draft NISTIR 7511 Revision 2 has been written primarily for accredited laboratories and for vendors interested in receiving SCAP validation for their products.

The SCAP specification defines what SCAP's components (XCCDF, OVAL, CCE, CVE, CPE, and CVSS) are and how they relate to each other within the context of SCAP. However, the SCAP specification does not define the SCAP components themselves; each component has its own standalone specification or reference...

See also: Application Security references


Facebook's Graph API: The Future Of Semantic Web?
Dilip Krishnan, InfoQ

At the F8 conference in San Francisco, Facebook introduced the Open Graph Protocol and the Graph API as the next evolution of the Facebook platform. The Open Graph Protocol was originally created at Facebook and is inspired by Dublin Core, link-rel canonical, Microformats, and RDFa. Discussion of the Open Graph Protocol takes place on the developer mailing list. It is currently consumed by Facebook and is published by IMDb, Microsoft, NHL, Posterous, Rotten Tomatoes, TIME, Yelp, and others.

"On Facebook, users build their profiles through connections to what they care about—be it their friends or their favorite sports teams, bottles of wine, or celebrities. The Open Graph protocol opens up the social graph and lets your pages become objects that users can add to their profiles. When a user establishes this connection by clicking Like on one of your Open Graph-enabled pages, you gain the lasting capabilities of Facebook Pages: a link from the user's profile, ability to publish to the user's News Feed, inclusion in search on Facebook, and analytics through our revamped Insights product...

Facebook introduced three new components of the Facebook Platform, two of which are the Open Graph protocol and the Graph API. The API provides access to Facebook objects (people, photos, events, etc.) and the connections between them (friends, tags, shared content, etc.) via uniform and consistent URIs. Every object can be accessed using the URL 'https://graph.facebook.com/ID', where ID is the object's unique ID in the social graph. Every connection (CONNECTION_TYPE) that a Facebook object supports can be examined using the [designated] URL...
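
As a rough illustration of that URI scheme, the sketch below retrieves one object and one of its connections using Python's standard library. The object ID is hypothetical, and today's Graph API requires versioned paths and access tokens, so this reflects the 2010-era pattern described in the article rather than current production usage.

    # Sketch of the Graph API URI pattern: /ID for an object,
    # /ID/CONNECTION_TYPE for its connections. The ID is hypothetical.
    import json
    import urllib.request

    object_id = "1234567890"  # hypothetical object ID in the social graph
    obj = json.load(urllib.request.urlopen(
        f"https://graph.facebook.com/{object_id}"))
    print(obj.get("name"))

    # Connections hang off the same URI, e.g. the object's photos:
    photos = json.load(urllib.request.urlopen(
        f"https://graph.facebook.com/{object_id}/photos"))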

All of the objects in the Facebook social graph are connected to each other via relationships. The URIs also support a special identifier, 'me', which refers to the current user. The Graph API uses OAuth 2.0 for authorization; the authentication guide has details of Facebook's OAuth 2.0 implementation. OAuth 2.0 is a simpler version of OAuth that leverages SSL for API communication instead of relying on complex URL signature schemes and token exchanges. At a high level, using OAuth 2.0 entails getting an access token for a Facebook user via a redirect to Facebook. After you obtain the access token for a user, you can perform authorized requests on behalf of that user by including the access token in your Graph API requests...
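
Sketched below is that token-bearing request pattern, assuming the access_token query-parameter convention that the early Graph API accepted; the token value is a placeholder that would come from the redirect-based OAuth 2.0 flow.

    # Sketch: call the Graph API on behalf of a user by appending the
    # access token obtained from the OAuth 2.0 redirect flow.
    import urllib.parse
    import urllib.request

    ACCESS_TOKEN = "PLACEHOLDER_TOKEN"  # obtained via redirect to Facebook
    query = urllib.parse.urlencode({"access_token": ACCESS_TOKEN})
    # 'me' is the special identifier for the current (authorizing) user.
    with urllib.request.urlopen(f"https://graph.facebook.com/me?{query}") as resp:
        print(resp.read().decode())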

See also: the Open Graph Protocol specification web site


IETF First Public Working Draft: The OAuth 2.0 Protocol
Eran Hammer-Lahav, David Recordon, Dick Hardt (eds); IETF Internet Draft

IETF has published an initial level -00 Standards Track Internet Draft of the specification The OAuth 2.0 Protocol. OAuth provides a method for making authenticated HTTP requests using a token—an identifier used to denote an access grant with specific scope, duration, and other attributes. Tokens are issued to third-party clients by an authorization server with the approval of the resource owner. OAuth defines multiple flows for obtaining a token in order to support a wide range of client types and user experiences. The use of OAuth with any transport protocol other than HTTP, or HTTP over TLS 1.0 as defined by RFC 2818, is undefined.

From the document Introduction: "With the increasing use of distributed web services and cloud computing, third-party applications require access to server-hosted resources. These resources are usually protected and require authentication using the resource owner's credentials (typically a username and password). In the traditional client-server authentication model, a client accessing a protected resource on a server presents the resource owner's credentials in order to authenticate and gain access.

Resource owners should not be required to share their credentials when granting third-party applications access to their protected resources. They should also have the ability to restrict access to a limited subset of the resources they control, to limit access duration, or to limit access to the methods supported by these resources.... Instead of sharing their credentials with the client, resource owners grant access by authenticating directly with the authorization server which in turn issues a token to the client. The client uses the token (and optional secret) to authenticate with the resource server and gain access.

For example, a web user (resource owner) can grant a printing service (client) access to her protected photos stored at a photo sharing service (resource server), without sharing her username and password with the printing service. Instead, she authenticates directly with the photo sharing service (authorization server) which issues the printing service delegation-specific credentials (token)..."
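
In code, the delegation described above looks roughly like the following sketch. All endpoints are hypothetical, and the Authorization header follows the bearer-token style of later OAuth 2.0 revisions; the -00 draft defines several flows and wire formats that this simplification omits.

    # Sketch of OAuth 2.0 delegation with hypothetical endpoints: the client
    # trades an authorization grant for a token, then presents the token
    # (never the resource owner's password) to the resource server.
    import json
    import urllib.parse
    import urllib.request

    # Step 1: exchange the grant for an access token at the authorization server.
    token_resp = urllib.request.urlopen(urllib.request.Request(
        "https://photos.example.com/oauth/token",
        data=urllib.parse.urlencode({
            "grant_type": "authorization_code",
            "code": "GRANT_CODE_FROM_REDIRECT",  # placeholder grant
        }).encode()))
    access_token = json.load(token_resp)["access_token"]

    # Step 2: the printing service accesses the protected photos with the token.
    photos = urllib.request.urlopen(urllib.request.Request(
        "https://photos.example.com/api/albums",
        headers={"Authorization": f"Bearer {access_token}"}))
    print(photos.read().decode())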

See also: the IETF Open Authentication Protocol (OAuth) Working Group


W3C Workshop: Community Invited to Discuss Augmented Reality on the Web
Staff, W3C Announcement

A "W3C Workshop: Augmented Reality on the Web" will be held June 15-16, 2010 in Barcelona, Spain. This workshop, Co-located with Mobile 2.0, is closely associated with the MobEA series, now in its eighth year. "Augmented reality (AR) is a term for a live direct or indirect view of a real-world environment whose elements are supplemented with, or augmented by computer generated imagery. The augmentation is conventionally achieved in real time and in a meaningful context with environmental elements.

Augmented reality is a long-standing topic in its own right, but it has not been developed on the Web platform. Because mobile devices are becoming more powerful and feature-rich, the workshop will explore the possible convergence of augmented reality and the Web.

The objective of this W3C workshop on Augmented Reality on the Web is to provide a single forum for researchers and technologists to discuss the state of the art for AR on the Web, particularly on the mobile platform, and what role standardization should play for Open Augmented Reality.

To ensure productive discussions, the workshop is limited to 80 attendees. Participation is open to non-W3C members. Each organization can send at most two attendees. Position papers are required in order to participate in this workshop: each organization or individual wishing to participate must submit a position paper explaining their interest in the workshop no later than Saturday, May 29, 2010..."

See also: W3C Workshops


White House to Agencies: Use One Framework To Exchange Information
Aliya Sternstein, Nextgov

The U.S. White House is requiring federal agencies to consider using a standard configuration developed by the Justice and Homeland Security departments to share information across the public and private sectors. More than a month ago, the Office of Management and Budget issued guidance to agencies on the website of the National Information Exchange Model (NIEM), a joint DOJ-DHS program. The OMB document, which is not posted on OMB's own website, includes instructions for assessing the framework's merits by May 1, 2010.

OMB did not make the public aware of such plans to overhaul federal information exchange on its website, raising questions about a lack of transparency, as well as the security of the model, according to privacy advocates. OMB officials noted that the NIEM website is public and pointed out that other OMB requirements such as information security standards for the federal government also are posted on other agency sites.

Some privacy groups still have to review the specifications and therefore could not comment, while others urged the Obama administration to fully disclose security procedures if agencies proceed with NIEM. Security experts familiar with the information technology setup at Justice and DHS praised the integrity of the framework and the idea of rolling it out governmentwide... NIEM launched in 2005 with the goal of linking jurisdictions throughout the country to better respond to crises, including terrorist attacks, natural disasters, large-scale crime and other emergencies handled by Justice and Homeland Security. The standards are intended to expedite the secure exchange of accurate information.

This winter, the Health and Human Services Department announced it will use NIEM as the foundation of a nationwide network for medical professionals to exchange patient data. Some in the health IT community expressed fears that if other agencies are using the same framework as doctors, the government could access private health information. HHS officials have emphasized that harmonizing standards for information exchange will not facilitate the transmission of medical records to law enforcement or intelligence agencies...

See also: the NIEM technical specifications


Why Cloud Computing Will Never Be Free
Dave Durkee, ACM Queue Distributed Computing

"The competition among cloud providers may drive prices downward, but at what cost? The last time the IT industry delivered outsourced shared-resource computing to the enterprise was with timesharing in the 1980s, when it evolved to a high art, delivering the reliability, performance, and service the enterprise demanded. Today, cloud computing is poised to address the needs of the same market, based on a revolution of new technologies, significant unused computing capacity in corporate data centers, and the development of a highly capable Internet data communications infrastructure...

The difference between expectation and what the industry can deliver at today's near-zero price points represents a challenge, both technical and organizational, that will have to be overcome to ensure large-scale adoption of cloud computing by the enterprise... This is where we come full circle and timesharing is reborn. The same forces are at work that made timesharing a viable option 30 years ago: the high cost of computing (far exceeding the cost of the physical systems) and the highly specialized labor needed to keep it running well. The essential characteristics of cloud computing that address these needs are: (1) On-demand access; (2) Elasticity; (3) Pay per use; (4) Connectivity; (5) Resource pooling; (6) Abstracted infrastructure; (7) Little or no commitment...

By offering value beyond simply providing CPU cycles, the cloud provider is becoming a part of the end customers' business. This requires a level of trust that is commensurate with hiring an employee or outsourcing your operations. Do you know whom you are hiring? This vendor-partner must understand what the enterprise holds important and must be able to operate in a way that will support the cloud end customer's business. By taking on the role of operations services provider to the enterprise, the vendor enables the end customer to gain all of the benefits of cloud computing without the specialized skills needed to run a production data center. It is unrealistic, however, to expect outsourced IT that eliminates the need for in-house staffing to be delivered at today's cloud-computing prices.

For the Cloud 2.0 revolution to take hold, two transformations must occur, which we are already seeing in our sales and marketing activities: cloud vendors must prepare themselves to provide value to the enterprise that entices them out of their purpose-built data centers and proprietary IT departments; and customers must perceive and demand from cloud vendors the combination of fast and reliable cloud computing with operations services that their end users require..."


Enterprise Architecture: Reality over Rhetoric
Scott W. Ambler, DDJ

"For several decades we've heard that effective enterprise architecture programs are a critical success factor for medium-to-large size IT organizations. I have been a promoter of enterprise architecture, both in my writings and working with organizations around the world, yet after all these years it seems that the reality of enterprise architecture is nowhere close to fulfilling some of the rhetoric around it. So I decided to find out what's actually working in practice, and what's not working for that matter, in my January 2010 State of the IT Union Survey.

The survey explored the potential pitfalls leading to the failure of enterprise architecture programs, with business issues and people-oriented issues being common culprits. The top five pitfalls, in order, were: insufficient time for the enterprise architecture program to succeed; project teams not taking advantage of the enterprise architecture; difficulty measuring the program's benefits; enterprise architects perceived as 'ivory tower'; and development teams unable to wait for their enterprise architects. Enterprise architecture programs are a long-term investment; granted, if you're smart you'll show some tangible results on a regular basis, but the main benefits can take years to materialize.

Technology platforms and strategies are the most hyped topics in enterprise architecture. One of the questions presented a list of platforms and strategies and asked respondents whether their enterprise architecture included them. In order, respondents indicated: (1) Service Oriented Architecture (SOA) 65%; (2) Common Frameworks 55%; (3) Business Process Management (BPM) 52%; (4) Components 43%; (5) Software as a Service (SaaS) 37%; (6) Product Line Architecture 31%; (7) Cloud Computing 22%; (8) Semantic Architecture 14% [...] I suspect that several of the platforms—particularly SOA, frameworks, and components—rated highly because they are mature and proven technologies. Cloud computing did surprisingly well considering that it's a relatively new strategy, and I suspect its adoption will grow over the next few years. I was surprised that product line architectures rated as highly as they did, considering that they require a fair bit of sophistication to implement effectively.

In conclusion, the survey appeared to show that a large percentage of organizations, particularly larger ones, are trying their hand at enterprise architecture. Many of these efforts are doing well, although on average enterprise architecture programs in practice don't seem to be living up to their promises. I hope that this survey has helped to shed some light on the current status of enterprise architecture, and better yet provide some insights for improving your approach..."

See also: the survey summary


Sponsors

XML Daily Newslink and Cover Pages sponsored by:

IBM Corporation http://www.ibm.com
ISIS Papyrus http://www.isis-papyrus.com
Microsoft Corporation http://www.microsoft.com
Oracle Corporation http://www.oracle.com
Primeton http://www.primeton.com

XML Daily Newslink: http://xml.coverpages.org/newsletter.html
Newsletter Archive: http://xml.coverpages.org/newsletterArchive.html
Newsletter subscribe: newsletter-subscribe@xml.coverpages.org
Newsletter unsubscribe: newsletter-unsubscribe@xml.coverpages.org
Newsletter help: newsletter-help@xml.coverpages.org
Cover Pages: http://xml.coverpages.org/



Document URI: http://xml.coverpages.org/newsletter/news2010-04-26.html
Robin Cover, Editor: robin@oasis-open.org