XML Daily Newslink. Friday, 07 September 2007

A Cover Pages Publication http://xml.coverpages.org/
Provided by OASIS and Sponsor Members
Edited by Robin Cover


This issue of XML Daily Newslink is sponsored by:
Primeton http://www.primeton.com



National Information Exchange Model (NIEM) Naming and Design Rules
Webb Roberts and GTRI Staff, Working Draft

On August 7, 2007, members of the U.S. National Information Exchange Model (NIEM) Program Management Office (PMO) released Draft Version 1.2 of the "National Information Exchange Model Naming and Design Rules." The document defines the data model, XML components, and XML data for use with the National Information Exchange Model (NIEM) version 2.0. The NDR guide documents the criteria used in creating NIEM and helps users develop NIEM-conformant exchanges. This is particularly valuable as users develop extension schemas to address needs beyond the scope of NIEM and its various representative domains. NIEM provides a concrete semantic model, leveraging concepts from W3C XML Schema, RDF, and ISO/IEC 11179 (Metadata Registries). This semantic model underlies all NIEM-conformant schemas, as well as NIEM-conformant instance data: XML data that follows the rules of NIEM carries specific meaning. NIEM specifies a set of reusable information components for defining standard information exchange messages, transactions, and documents on a large scale: across multiple communities of interest and lines of business. These reusable components are rendered in XML schemas as type, element, and attribute definitions that comply with the W3C XML Schema specification. The W3C XML Schema standard enables information interoperability and sharing by providing a common language for describing data precisely. The constructs it defines are basic metadata building blocks: baseline data types and structural components. Users employ these building blocks to describe their own domain-oriented data semantics and structures. Rules that profile the allowable XML Schema constructs and describe how to use them help ensure that those components are consistent and reusable... The U.S. National Information Exchange Model (NIEM) is a U.S. Federal, State, Local, and Tribal interagency initiative providing a foundation for seamless information exchange. NIEM was launched on February 28, 2005, through a partnership agreement between the U.S. Department of Justice (DOJ) and the U.S. Department of Homeland Security (DHS), signed by their Chief Information Officers. Collaborative work is now being done under the leadership of UN/CEFACT ATG2 to harmonize several of the NDR specifications under a common model based upon Version 3 of the UN/CEFACT XML Naming and Design Rules Technical Specification.
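
To make the schema-building-block idea concrete, here is a minimal sketch of validating an instance document against a small W3C XML Schema, assuming Python with the lxml package; the PersonName type and its elements are hypothetical illustrations, not actual NIEM 2.0 components:

    from lxml import etree

    # A small W3C XML Schema in the style of the reusable type/element
    # definitions the NDR describes. "PersonName" is a hypothetical
    # illustration, not an actual NIEM 2.0 component.
    SCHEMA = etree.XML("""
    <xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
      <xs:element name="PersonName" type="PersonNameType"/>
      <xs:complexType name="PersonNameType">
        <xs:sequence>
          <xs:element name="GivenName" type="xs:string"/>
          <xs:element name="SurName" type="xs:string"/>
        </xs:sequence>
      </xs:complexType>
    </xs:schema>
    """)
    schema = etree.XMLSchema(SCHEMA)

    # An instance document that follows the schema's rules
    instance = etree.XML("""
    <PersonName>
      <GivenName>Webb</GivenName>
      <SurName>Roberts</SurName>
    </PersonName>
    """)

    print(schema.validate(instance))  # True: the instance is conformant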

See also: UN/CEFACT XML Naming and Design Rules Technical Specification


SOA for Identity Management is Concordia Goal
Rich Seeley, SearchWebServices

Can a service-oriented architecture (SOA) approach to enterprise identity management provide interoperability between the Liberty Alliance's SAML 2.0, Microsoft's Windows CardSpace, and OpenID? Roger Sullivan, president of the Liberty Alliance Management Board and vice president of identity management at Oracle Corp., believes the flexibility of the SOA approach will be the key to achieving goals such as single sign-on (SSO) across multiple systems. Under the banner of the Concordia Project, supporters and users of the SAML 2.0, CardSpace, and OpenID identity specifications came together in June 2007 to listen to what enterprise customers need for interoperability. This week, Concordia announced plans for a second fact-finding meeting on September 26, 2007 at the Digital Identity World (DIDW) conference in San Francisco. Following that meeting, Concordia is expected to have use cases and input from major enterprises including Boeing Corp., General Motors Corp., and Chevron Corp., as well as government agencies. The next step will be the development of prototypes based on an SOA approach to identity management and interoperability. Sullivan: "These enterprises are dealing with multiple silos of identity information. They have myriad applications they need to grant access to, and users who need access to the information. This includes internal employees, external partners, customers, consumers, customer advocacy groups, and government agencies. The only way that is going to work going forward is if the industry advocates and builds a service-oriented architecture approach to identity information. In a business world where mergers and acquisitions are common, companies constantly find that a newly acquired division has legacy systems with identity management that doesn't match the rest of the company. In the died-and-gone-to-heaven environment there would be one single sign-on methodology, one protocol that everyone would use, but that's simply not reality. So you need to figure out a way to be flexible and to allow access through an SOA infrastructure; otherwise you're frankly doomed to failure. You're prevented from growing your organization because you are bound and restricted by legacy apps."

See also: the announcement


OWS-5 Initiative Agile Geography: Federated Geo-synchronization Services
Staff, OGC Announcement

The Open Geospatial Consortium (OGC) has published an update to the Request for Quotes/Call for Participation (RFQ/CFP) for the OGC Web Services, Phase 5 (OWS-5) Interoperability Initiative. OWS-5 is a testbed to advance OGC's open geospatial interoperability framework; it is intended to advance standards for geospatial workflow, sensor webs, geospatial digital rights management, GML information communities, and KML. The OWS-5 Agile Geography testbed focuses on process integration and "right-sizing" of services to demonstrate the power of interoperability and service-oriented architectures using OGC Web Services. The Agile Geography thread explores this goal through two distinct activities. The first explores the future of KML, OWS Context, and lightweight payloads of geospatial information on the Web in general, applying the concepts of links, bookmarks, and Web pages to digital cartography and geospatial information management. The second activity, "GeoSynchronization and Sharing," extends the WFS-Transactional architecture to target a federated environment composed of loosely affiliated parties who wish to collaborate, in full or in part, on the maintenance of a shared geospatial data set. As described in the addendum, in response to sponsor requirements, the OWS-5 initiative's Agile Geography thread has been expanded to include two new work items, titled "Federated Geo-synchronization Services" and "OWS Core + Extensions Experiment". In the Federated Geo-synchronization work, participants will help develop standard approaches to using GML application schemas such as GeoRSS GML and GML Simple Features Level 0 with Atom and the Atom Publishing Protocol; Atom addresses the syndication of Web content such as weblogs and news headlines to Web sites as well as directly to user agents. The "OWS Core + Extensions Experiment" work item involves participants implementing a more formal and modular approach to structuring OGC standards. OWS testbeds are part of OGC's Interoperability Program, a global, hands-on, collaborative prototyping program designed to rapidly develop, test, and deliver proven candidate specifications into OGC's Specification Program, where they are formalized for public release. In OGC's Interoperability Initiatives, international teams of technology providers work together to solve specific geoprocessing interoperability problems posed by the initiative's sponsoring organizations.
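
For a flavor of the kind of payload the Federated Geo-synchronization work item contemplates, the sketch below builds a minimal Atom entry carrying a GeoRSS GML point, using only the Python standard library; the feature, its identifier, and its coordinates are made-up illustrations, not OWS-5 deliverables:

    import xml.etree.ElementTree as ET

    ATOM = "http://www.w3.org/2005/Atom"
    GEORSS = "http://www.georss.org/georss"
    GML = "http://www.opengis.net/gml"

    # Register prefixes so the serialized entry is readable
    ET.register_namespace("", ATOM)
    ET.register_namespace("georss", GEORSS)
    ET.register_namespace("gml", GML)

    # A minimal Atom entry describing one changed feature
    entry = ET.Element("{%s}entry" % ATOM)
    ET.SubElement(entry, "{%s}id" % ATOM).text = "urn:uuid:example-feature-1"
    ET.SubElement(entry, "{%s}title" % ATOM).text = "Updated road segment"
    ET.SubElement(entry, "{%s}updated" % ATOM).text = "2007-09-07T00:00:00Z"

    # Attach the location as a GeoRSS GML point; GML uses "lat lon"
    # ordering for EPSG:4326 positions
    where = ET.SubElement(entry, "{%s}where" % GEORSS)
    point = ET.SubElement(where, "{%s}Point" % GML)
    ET.SubElement(point, "{%s}pos" % GML).text = "45.256 -71.92"

    print(ET.tostring(entry).decode())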

See also: OpenGIS Specifications


Service-Oriented Architecture and Enterprise Architecture, Part 3: How Do They Work Together?
Mamdouh Ibrahim and Gil Long, IBM developerWorks

Implementing an SOA in an enterprise where an EA has been established, or is concurrently being developed, is tricky. Potential architecture- and governance-related problems lurk as a result of the overlap in their scopes of influence, governance organizations, processes, and architectures. This article, Part 3, presents a case study based on experience with a large account for which EA and SOA were developed concurrently. Dealing with issues encountered during this client engagement taught us some valuable lessons about adopting SOA and developing an EA at the same time. The client, a Fortune 500 company, is engaged with IBM in a 10-year contract for business and IT outsourcing services. As part of this engagement, IBM is providing a broad range of business transformation and IT outsourcing services and managing all of the client's IT operations: mainframe, midrange, desktop, help desk, voice and data network, and application development and maintenance. Key IT transformational activities include the following: (1) establishment of a framework and center of excellence (CoE) for SOA; (2) attainment of Level 3 Software Engineering Institute (SEI) process maturity for application development and maintenance; (3) application portfolio management; (4) development of several new transformation projects as well as data center and server consolidation projects. Part 1 of the series provided definitions and scope for both SOA and EA to establish a framework in which proper comparisons and contrasts between the two can be meaningful. It also explained that SOA and EA are more than just architecture: each comprises architecture, governance, and a roadmap. It presented a breakdown of the different domains of each and the governance framework for both. Part 2 focused on identifying the similarities and differences between SOA and EA, considering the architecture in both and identifying the overlap between their corresponding domains. It highlighted the potential problems that may arise when an enterprise has developed (or is developing) an EA and is now embarking on establishing an SOA.


What Is OpenID For?
Bob Blakley, Blog

The Burton Group's Security and Risk Management Strategies blog featured a story on OpenID, "an open, decentralized, free framework for user-centric digital identity." Bob Blakley writes: "There's been a bit of a dust-up over OpenID recently in the blogosphere. First, Eugene Tsyrklevich and Vlad Tsyrklevich published a paper at BlackHat 2007 outlining a bunch of weaknesses in OpenID. What I'd really like to see, as a security guy, is a problem statement and a risk analysis. Specifically, before we start arguing about whether OpenID 2.0 is the answer, I'd like to know the answers to several questions: (1) What are the assets to be protected? Blog comment lists? Blog entries? Persistent consumer accounts on commercial servers? Persistent employee accounts on corporate servers? (2) What are the services to be offered? Authentication of users as the legitimate possessors of OpenID URLs? Linkage of OpenID URLs to user accounts on web-facing systems? Linkage of OpenID URLs to user attribute information? (3) What quality of protection is claimed for these services? Is the OpenID protocol intended to protect against phishing? Is it intended to protect against man-in-the-middle attacks? Is it intended to protect against attempts by one OpenID party to induce another party to execute malicious code? (4) What is the threat model? Accidental failures at a participating party? Malicious behavior by users? Malicious behavior by relying parties? Malicious behavior by OpenID providers? Wiretappers? Hackers attempting to penetrate a relying party? (5) What is the trust model? Does the user trust the OpenID provider to actually check his password? Does the provider trust the relying party not to send maliciously constructed OpenID URL strings? Does the relying party trust the provider not to reissue OpenID URLs to different parties at different times? Does the relying party trust any particular OpenID provider to issue OpenID URL strings in a particular part of the namespace? All the arguments about OpenID are entertaining, but the claims and counterclaims are very difficult to evaluate in the absence of a coherent problem statement that includes answers to questions like these. The OpenID 2.0 specification signally fails to address these issues; in this sense it's a solution looking for a problem."

See also: OpenID Authentication 2.0


Web, AJAX Slammed for Deficiencies
Paul Krill, InfoWorld

The Web and AJAX have many deficiencies, including security holes, and much more needs to be done to iron out these problems, according to a keynote speaker at The Rich Web Experience conference in San Jose, California. Douglas Crockford, an architect at Yahoo and creator of JSON (JavaScript Object Notation), gave a mostly gloomy presentation, titled "The State of AJAX," on AJAX (Asynchronous JavaScript and XML) and the Web. Crockford: "The sad thing was that the Web was a step backward in terms of interactivity when it debuted... It looked like Java would fix the problem with applets. Unfortunately, Java was a huge failure. It completely collapsed. It didn't meet any of its goals... Java's write-once, run-everywhere promise was not kept; it had an unworkable security model and a tedious UI model. Java did, however, become very successful on the server... HTML raises questions about whether it is a document format or an application delivery format; it has low graphical ability and is missing a compositing model. With AJAX, HTML needs to be an application delivery format; XHTML was supposed to replace HTML, but it died because it was too brittle... This left JavaScript and then XMLHttpRequest for communicating from the browser to the data server... CSS (Cascading Style Sheets) presents a styling layer in the browser, but it is slow, complex, and incredibly fragile. It surprises me that there is not a greater call for its replacement... XML is complicated and inefficient. Fortunately, XML has been replaced by JSON... This gives me some confidence that we can fix the standards in the Web..."
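
Crockford's claim that JSON has displaced XML for browser-server data interchange rests on JSON mapping directly onto native data structures. A minimal sketch of the same record in both notations, parsed with the Python standard library (the record itself is a made-up illustration):

    import json
    import xml.etree.ElementTree as ET

    xml_text = "<person><name>Douglas</name><org>Yahoo</org></person>"
    json_text = '{"person": {"name": "Douglas", "org": "Yahoo"}}'

    # XML parsing yields a tree that still has to be walked element by element
    root = ET.fromstring(xml_text)
    person_from_xml = {child.tag: child.text for child in root}

    # JSON parsing yields ready-to-use dictionaries in one step
    person_from_json = json.loads(json_text)["person"]

    assert person_from_xml == person_from_json == {"name": "Douglas", "org": "Yahoo"}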

See also: James Clark on XML and JSON


Geotagging Links Photos to Locales
Stephen Shankland, CNET News.com

An array of maturing technologies is poised to add a new dimension, geography, to the digital photography revolution. Today, people can retrieve digital photos based on the time they were taken. A nascent technology called geotagging, though, enables people to organize photos by where they were taken, not just when. Today, geotagging is not for the faint of heart: it requires a Global Positioning System (GPS) receiver and either software that adds GPS data to photo files or an expensive camera that communicates directly with the GPS device. But as the technology takes off and sites such as Yahoo's Flickr or Google's Panoramio show off the possibilities, the elements of geotagging are starting to come together. Geotagging doesn't just mean a new way of sifting through a hard drive for a shot you know you took somewhere in the Alps. It also opens up possibilities for virtual tourism and for slide show narration. One sticky point in the geotagging process is the unification of the GPS data with the image files. GPS track logs, which come in a variety of formats, must be transferred from the receiver to the computer. Then software stamps the photos with the GPS location from the time each picture was taken, a process that can be complicated if the camera's clock isn't set precisely. Of course, geotagging would be a lot easier if the location data were written into the image file as a photo was taken. Interest in geotagging today is small, but it will explode once the process is automatic instead of laborious and manual... Camera support today is only at the early stages. Ricoh has developed two GPS-enabled models, the 500SE and Pro G3, both with detachable GPS modules. Nikon's new higher-end D300 and top-end D3 digital SLRs, like their predecessors, can be connected to a GPS receiver with a special cable. And Canon's new higher-end EOS 40D and top-end EOS-1Ds Mark III both have optional wireless transmitters that can be connected to a GPS receiver.
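
The timestamp-matching step described above is straightforward to sketch: find the track-log fix nearest each photo's capture time, applying an offset when the camera's clock wasn't set precisely. A minimal illustration in Python, where the track points, timestamps, and offset handling are hypothetical rather than any particular product's algorithm:

    from datetime import datetime, timedelta

    # (timestamp, latitude, longitude) fixes from a GPS track log
    track = [
        (datetime(2007, 9, 7, 10, 0, 0), 46.5587, 8.5614),
        (datetime(2007, 9, 7, 10, 5, 0), 46.5602, 8.5660),
        (datetime(2007, 9, 7, 10, 10, 0), 46.5631, 8.5701),
    ]

    def geotag(photo_time, track, camera_offset=timedelta(0),
               tolerance=timedelta(minutes=5)):
        """Return the (lat, lon) of the track fix closest to the photo's
        capture time, corrected by the camera's clock offset, or None if
        no fix falls within the tolerance window."""
        corrected = photo_time + camera_offset
        ts, lat, lon = min(track, key=lambda fix: abs(fix[0] - corrected))
        return (lat, lon) if abs(ts - corrected) <= tolerance else None

    # The camera clock ran two minutes fast, so subtract two minutes
    print(geotag(datetime(2007, 9, 7, 10, 6, 30), track,
                 camera_offset=timedelta(minutes=-2)))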

See also: W3C Basic Geo Vocabulary


Sponsors

XML Daily Newslink and Cover Pages are sponsored by:

BEA Systems, Inc. http://www.bea.com
EDS http://www.eds.com
IBM Corporation http://www.ibm.com
Primeton http://www.primeton.com
SAP AG http://www.sap.com
Sun Microsystems, Inc. http://sun.com

XML Daily Newslink: http://xml.coverpages.org/newsletter.html
Newsletter Archive: http://xml.coverpages.org/newsletterArchive.html
Newsletter subscribe: newsletter-subscribe@xml.coverpages.org
Newsletter unsubscribe: newsletter-unsubscribe@xml.coverpages.org
Newsletter help: newsletter-help@xml.coverpages.org
Cover Pages: http://xml.coverpages.org/


