The OASIS Cover Pages: The Online Resource for Markup Language Technologies
Last modified: August 24, 2010
XML Daily Newslink. Tuesday, 24 August 2010

A Cover Pages Publication
Provided by OASIS and Sponsor Members
Edited by Robin Cover

This issue of XML Daily Newslink is sponsored by:
ISIS Papyrus

Open Text Launches Semantic Navigation
Staff, Open Text Corporation Announcement

"Open Text Corporation, a preeminent provider of enterprise content management (ECM) software, has announced Open Text Semantic Navigation, an innovative tool that helps audiences naturally navigate through volumes of information based on the inherent meaning of the content, while increasing Web marketing and online search effectiveness. Available as a cloud offering or on premises, Semantic Navigation gives organizations an easy way to improve engagement with online audiences.

At the core of the offering is the Open Text Content Analytics engine that intelligently extracts meaning, sentiment and context from content, and in turn marries that content to what a customer or prospect is looking for on a website. The result is that audiences more consistently and quickly find helpful, valuable information with much less effort.

A complete solution, Open Text Semantic Navigation is designed to complement any existing Web site, independent of the Web content management system used, either installed on local servers or as an online service provided by Open Text. With the cloud-based offering (currently in beta), organizations can rapidly and inexpensively upgrade their site's user experience using a free, fully functional 30-day trial.

Once the trial is activated, Semantic Navigation first collects content through a crawling process. Then the content is automatically analyzed and tagged with relevant and insightful entities, topics, summaries, and sentiments, the key to providing an engaging online experience. Next, content is served to users through intuitive navigation widgets that encourage audiences to discover the depth of available information or share it on social networks, such as Facebook and Twitter. From there, Semantic Navigation supports placement of product and service offerings or advertising to convert page views into sales...."

See also: the Open Text Semantic Navigation Cloud

Open Grid Forum Public Review: Network Description Language Translation
Jeroen van der Ham (ed), Informational OGF Technical Report

The Open Grid Forum announced publication of a document Translating from DCN to Network Description Language (NDL) and Back Again, available for public review through September 24, 2010. This document was produced by members of the Open Grid Forum Network Markup Language Working Group (NML-WG). The purpose of the OGF Network Markup Language Working Group is to combine efforts of multiple projects to describe network topologies, so that the outcome is a standardised network description ontology and schema, facilitating interoperability between different projects.

Hybrid networks offer end users a mix of traditional connections and new optical services in the form of dedicated lightpaths. These must be requested in advance and are currently configured on demand by the operators. Because lightpaths are circuit switched, the user must be aware of the topology and of the techniques involved in the provisioning. Once connected, they offer a high-speed, low-level connection to the requested destination.

The working group is creating an extensible schema to describe computer networks. This schema should provide an abstraction layer for networks, specifically hybrid networks. Such a schema can be used to create inter-domain network graphs at various abstraction levels, to provide an information model for service discovery, and to facilitate lightpath provisioning.

The report examines the translation process from the topology descriptions used by the Dynamic Circuit Network software (DCN) created by the DICE collaboration, and the Network Description Language (NDL) created by the University of Amsterdam. Topology descriptions are a necessary component of inter-domain provisioning in circuit oriented networks. In the past few years several different projects have created provisioning software, each with their own way of describing network topologies. In 2007 the Network Markup Language working group (NML) was started in the Open Grid Forum (OGF) to build on all these efforts and create a standard for topology descriptions. Building a standard is a long and complicated process, especially with several different initiatives involved. The NML schema is progressing, but there are still some open issues that must be discussed...

The topology descriptions used by the DCN have been created in the Network Measurements working group (NMWG) of the OGF, called the NMWG Control Plane Schema. This schema defines a model for describing networking topologies, initially aimed at describing network measurements. They have also defined an XML schema to describe network topologies in XML. In this model a domain topology contains node elements, which contain port elements, which can contain one or more link elements. The link elements contain references to identifiers of other links to describe the connection...
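The containment model described above (domain contains nodes, nodes contain ports, ports contain links that reference remote links) can be sketched in Python's standard library. Note that the element names, URN layout, and the `remoteLinkId` reference below are invented for illustration; they are not the actual NMWG Control Plane Schema vocabulary.

```python
import xml.etree.ElementTree as ET

# Hypothetical DCN-style topology: a domain containing a node,
# which contains a port, which contains a link referencing a remote link.
# All names here are illustrative, not the real NMWG schema terms.
TOPOLOGY = """\
<domain id="urn:ogf:network:domain=example.net">
  <node id="urn:ogf:network:domain=example.net:node=rtr1">
    <port id="urn:ogf:network:domain=example.net:node=rtr1:port=eth0">
      <link id="urn:ogf:network:domain=example.net:node=rtr1:port=eth0:link=1">
        <remoteLinkId>urn:ogf:network:domain=peer.net:node=rtr9:port=ge0:link=1</remoteLinkId>
      </link>
    </port>
  </node>
</domain>
"""

def connections(xml_text):
    """Return (local link id, remote link id) pairs found in a topology."""
    root = ET.fromstring(xml_text)
    pairs = []
    for link in root.iter("link"):
        remote = link.findtext("remoteLinkId")
        if remote:
            pairs.append((link.get("id"), remote))
    return pairs

for local, remote in connections(TOPOLOGY):
    print(local, "->", remote)
```

Walking the link-to-link references in this way is what lets software reconstruct inter-domain connectivity from per-domain descriptions.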

The Network Description Language describes an ontology for describing network topologies in RDF. On the surface the identifiers in both schemes are very similar: DCN uses structured Uniform Resource Names (URNs), while NDL follows the RDF approach and uses Uniform Resource Identifiers (URIs)... NDL's only requirement on identifiers, following RDF, is that they be globally unique. A common shorthand in RDF is to use local identifiers starting with a pound sign (#); any RDF software reading those identifiers will automatically prefix them with the prefix defined in the document, or with the location of the file. It should be noted, however, that in RDF identifiers are only identifiers; no information is embedded in them. The translation procedure, barring the limitations described above, has been successfully implemented in Python based on the Python NDL Toolkit. It can take an XML DCN topology and create an RDF NDL translation of it. A translation from NDL to DCN also works for a subset of NDL: all the interfaces of the topology must be on the Ethernet layer, and they should have the capacity properties defined..."
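The pound-sign shorthand is ordinary URI fragment resolution, which Python's standard library can demonstrate directly; the base URI below is hypothetical.

```python
from urllib.parse import urljoin

# An RDF document may use local identifiers beginning with "#".
# Resolving them against the document's base URI (or file location)
# yields the globally unique identifier that RDF software works with.
BASE = "http://example.org/ndl/topology.rdf"  # hypothetical document location

def resolve(identifier, base_uri=BASE):
    return urljoin(base_uri, identifier)

print(resolve("#interface1"))
```

This prints the fully prefixed form `http://example.org/ndl/topology.rdf#interface1`, mirroring what an RDF parser does when it expands local identifiers.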

See also: the DCN Translation Project

IETF Internet Draft: The Network Trouble Ticket Data Model
Dimitris Zisiadis, Spyros Kopsidas (et al, eds.), IETF Internet Draft

IETF has published an updated level -04 specification for the Experimental Track Network Trouble Ticket Data Model, which provides an XML representation for conveying incident information across administrative domains between parties that have an operational responsibility of remediation or a watch-and-warning over a defined constituency. The data model encodes information about hosts, networks, and the services running on these systems; attack methodology and associated forensic evidence; impact of the activity; and limited approaches for documenting workflow.

Details: "The Network Trouble Ticket Data Model (NTTDM) aims to simplify TT exchange within the boundaries of a Grid and to enhance the functional cooperation of every Network Operation Centre (NOC) and the Grid Operation Centre (GOC). Community adoption of the NTTDM enhances trouble resolution within the grid framework and imparts network status cognisance by modelling collaboration and information exchange among the operators.

Handling multiple sets of network trouble tickets (TTs) originating from different participants' interconnected network environments poses a series of challenges for the involved institutions; a Grid is a good example of such a multi-domain project. Each of the participants follows different procedures for handling trouble in its domain, according to the local technical and linguistic profile. The TT systems of the participants collect, represent and disseminate TT information in different formats. As a result, management of the daily workload by a central Network Operations Centre (NOC) is a challenge on its own. Normalization of TTs to a common format for presentation and storage at the central NOC is mandatory...

The Network Trouble Ticket Data Model is specified in two ways: as an abstract data model and as an XML Schema. Document Section 3 provides a Unified Modeling Language (UML) model describing the individual classes and their relationship with each other. The semantics of each class are discussed and their attributes explained. In Section 6, this UML model is provided through representation as an XML Schema, together with specific XML namespace..."
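The normalization idea, exchanging tickets in one common XML format that every NOC can parse, can be sketched as follows. The element names and the namespace below are made up to show the general shape of such an exchange; they are not the actual NTTDM schema from the Internet Draft.

```python
import xml.etree.ElementTree as ET

# Illustrative only: "urn:example:nttdm" and the field names are
# hypothetical stand-ins for the real NTTDM namespace and classes.
NS = "urn:example:nttdm"
TICKET = f"""\
<tt:ticket xmlns:tt="{NS}">
  <tt:originator>NOC-A</tt:originator>
  <tt:status>open</tt:status>
  <tt:description>Fiber cut on backbone link</tt:description>
</tt:ticket>
"""

# A receiving NOC parses the normalized ticket with any XML toolkit,
# regardless of what format the originating TT system uses internally.
root = ET.fromstring(TICKET)
status = root.findtext(f"{{{NS}}}status")
print(status)
```

Because every field is namespace-qualified, tickets from different domains can be merged and queried uniformly at the central NOC.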

See also: the earlier project Enabling Grids for E-science

With Information Sharing, Context Is As Important As Content
Michael Daconta, Government Computer News

"Context, which is the environment or situation surrounding a particular target, is a critical component of federal data architectures that needs to be planned and implemented before an incident occurs in which it is needed. This article examines three information management projects in which context plays a key role in the solution...

Given that modern development platforms can automatically generate code to process XML documents, a narrow perspective can affect the exchange and any code that processes that exchange. The new approach being spearheaded by forward-thinking elements of the Army and Air Force is to create the semantics first, via a high-fidelity data model called an ontology, and then generate the XML schemas from that model.

Although not based on the Web Ontology Language, the National Information Exchange Model (NIEM) takes a similar approach, in which the XML schemas are generated from a database-backed data model. The contextual nature of this approach is that the ontology uses a more top-down, enterprise perspective to guide the inclusion of bottom-up exchanges. The heightened awareness and use of context were mirrored on the commercial front by Google's purchase of Metaweb and the company's Freebase entity graph.

The elevation of context in our information management activities is a sign of a more aggressive attitude toward actively managing our data so that we can take advantage of its potential. The key to mastering context is to understand the role of metadata in your organization and how to effectively design it... metadata captures context, whereas your data is content..."

See also: the National Information Exchange Model

IETF Last Call Review for Mapping YANG to DSDL to Validate NETCONF Data
Ladislav Lhotka, Rohan Mahy, Sharon Chisholm (eds), IETF Internet Draft

The Internet Engineering Steering Group (IESG) has announced a Last Call review of the specification Mapping YANG to Document Schema Definition Languages and Validating NETCONF Content. IESG has received a request from the IETF NETCONF Data Modeling Language (NETMOD) Working Group to consider approval of this document as an IETF Proposed Standard. Please send substantive comments to the IETF mailing lists by 2010-09-07. No IPR declarations were found that appear related to this I-D.

This document "specifies the mapping rules for translating YANG data models into Document Schema Definition Languages (DSDL), a coordinated set of XML schema languages standardized as ISO/IEC 19757. The following DSDL schema languages are addressed by the mapping: RELAX NG, Schematron and Document Semantics Renaming Language (DSRL). The mapping takes one or more YANG modules and produces a set of DSDL schemas for a selected target document type—datastore content, NETCONF message etc. Procedures for schema-based validation of such documents are also discussed..."
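As a rough illustration of the kind of output such a mapping produces, a YANG leaf such as `leaf host-name { type string; }` corresponds, in simplified form, to a RELAX NG pattern along these lines (this is a hedged sketch; the actual mapping rules also handle module namespaces, prefixes, and annotations):

```xml
<element name="host-name"
         xmlns="http://relaxng.org/ns/structure/1.0"
         datatypeLibrary="http://www.w3.org/2001/XMLSchema-datatypes">
  <data type="string"/>
</element>
```

Grammar constraints like this land in RELAX NG, while semantic rules that a grammar cannot express are carried by Schematron, which is why the mapping targets a coordinated set of schema languages rather than a single one.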

Background: "The IETF NETCONF Working Group has completed a base protocol used for configuration management. This base specification defines protocol bindings and an XML container syntax for configuration and management operations, but does not include a data modeling language or accompanying rules for how to model configuration and state information carried by NETCONF. The IETF Operations Area has a long tradition of defining data for SNMP Management Information Bases (MIB) modules using the Structure of Management Information (SMI) language to model its data. While this specific modeling approach has a number of well-understood problems, most of the data modeling features provided by SMI are still considered extremely important. Simply modeling the valid syntax without the additional semantic relationships has caused significant interoperability problems in the past.

Since NETCONF uses XML for encoding its messages, it is natural to express the constraints on NETCONF content using standard XML schema languages. For this purpose, the NETMOD WG selected the Document Schema Definition Languages (DSDL) framework, which is being standardized as ISO/IEC 19757. The DSDL framework comprises a set of XML schema languages that address grammar rules, semantic constraints and other data modeling aspects, but also, and more importantly, do it in a coordinated and consistent way. While it is true that some DSDL parts have not been standardized yet and are still work in progress, the three parts that the YANG-to-DSDL mapping relies upon—RELAX NG, Schematron and Document Semantics Renaming Language (DSRL)—already have the status of an ISO/IEC International Standard and are supported in a number of software tools..."

See also: DSDL references

Black Duck: Open-Source Projects Drive Cloud Computing
Darryl K. Taft, eWEEK

"Open-source projects are helping to drive adoption of cloud computing, according to a recent study by Black Duck Software, a provider of products and services for automating the management, governance and secure use of open-source software.

Indeed, open-source projects aimed at enabling enterprise IT application development for cloud computing are proliferating, Black Duck said in a press release describing the company's analysis. Cloud computing frameworks and platforms designed to support integration with cloud services, scalability in private and public clouds, and the management and storage of cloud data are growing rapidly, and include well-known open-source projects such as Hadoop, Eucalyptus, Hyperic, deltaCloud, OpenStack, and OpenECP.

Black Duck analyzed its proprietary KnowledgeBase of open-source project information to uncover trends in open-source projects for cloud computing. And, although many open-source projects useful for cloud development and deployment don't specifically reference cloud, Black Duck found almost 400 projects that did.

The Black Duck analysis shows a 70 percent growth from 2008 to 2009 in projects specifically associated with cloud computing. These cloud-specific, open-source projects account for nearly 50 million lines of code..."

See also: the Open Source Project Data

XBRL US Labs Launches Brix Project with Release of Brix iPhone App
Staff, XBRL US Announcement

XBRL US Labs, the research and development arm of XBRL US, the national consortium for XML business reporting standards, released the world's first publicly available, free XBRL iPhone app, named Brix. Brix delivers data from, and about, corporate financial statements moments after they are filed, to demonstrate the power and benefits of XBRL and to generate interest in the Brix Project, an initiative to improve the usability of XBRL through crowdsourcing techniques that leverage the collective expertise of a diverse group of business, information architecture, and technology practitioners.

The Brix app uses the power of XBRL to deliver tagged corporate financial reports filed by all publicly-traded companies with the SEC quarterly and annually (e.g. 10-Ks and 10-Qs) to your iPhone, moments after they are filed with the SEC. This is possible because the reports are tagged by the issuer when they are submitted—because they are structured and computer-readable, the data can be extracted and manipulated.

The iPhone app allows you to: (1) Search for a company, or for words in the label of any of 17,000 XBRL tags. (2) Build a report of corporate filings, starting with the most recent; select a document (e.g. a balance sheet) and view its contents as individually tagged facts, by presentation, value, or date; and email the report as a Microsoft Excel spreadsheet. (3) Find the frequency of tag usage, along with the definition and a list of companies using a given tag. (4) Get alerts for a specific company or tag... The ongoing development goal is to recruit a multi-disciplinary crowd to advance the usability of XBRL, from document architecture, to data models, to interface design and visualization..."
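The tagging that makes all of this possible can be sketched with a heavily simplified example: each reported number is a namespace-qualified fact carrying its context and unit, so software can extract it mechanically. The fragment below is hypothetical (real filings include full context and unit definitions, and the us-gaap namespace version shown is illustrative); only the general shape of an XBRL fact is preserved.

```python
import xml.etree.ElementTree as ET

# Hypothetical, heavily simplified XBRL instance fragment.
# The us-gaap namespace URI and the context/unit ids are illustrative.
GAAP = "http://fasb.org/us-gaap/2009-01-31"
INSTANCE = f"""\
<xbrl xmlns="http://www.xbrl.org/2003/instance"
      xmlns:us-gaap="{GAAP}">
  <us-gaap:Assets contextRef="FY2009" unitRef="USD" decimals="-6">125000000</us-gaap:Assets>
</xbrl>
"""

# Because the fact is tagged with a standard concept name, a consumer
# can locate it by qualified name instead of scraping a rendered report.
root = ET.fromstring(INSTANCE)
fact = root.find(f"{{{GAAP}}}Assets")
print(fact.text, fact.get("unitRef"))
```

This is the property the announcement refers to: since filings are structured and computer-readable, the same extraction works for any company's documents moments after they are submitted.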

XBRL US is "the non-profit consortium for XML business reporting standards in the U.S. and it represents the business information supply chain. Its mission is to support the implementation of XML business reporting standards through the development of taxonomies for use by U.S. public and private sectors, with a goal of interoperability between sectors, and by promoting XBRL adoption through marketplace collaboration. XBRL US has developed taxonomies for U.S. GAAP, credit rating and mutual fund reporting under contract with the U.S. Securities and Exchange Commission."

See also: the Brix XBRL iPhone app

Federal CIOs Issue Cloud Computing Privacy Framework
J. Nicholas Hoover, InformationWeek

"Although cloud computing represents a possible solution to the government's rapidly increasing on-premises storage needs, federal agencies need to be aware of 'significant privacy concerns' associated with storing personally identifiable information in the cloud, the U.S. Federal CIO Council says in a new document outlining a proposed policy framework on privacy and the cloud.

Federal privacy regulations control how and where federal agencies hold and process personally identifiable information, and the CIO Council warns that, without consulting their legal and privacy teams and putting a plan into place, federal agencies may run afoul of those regulations..."

The document Privacy Recommendations for the Use of Cloud Computing by Federal Departments and Agencies was produced by the CIO Council Privacy Committee, Web 2.0/Cloud Computing Subcommittee. Excerpt: "Federal agencies need to be aware of the significant privacy concerns associated with the cloud computing environment where PII will be stored on a server that is not owned or controlled by the Federal government. That solution may result in holding or processing data without complying with Federal privacy requirements in a multi-jurisdictional environment. The framework below provides guidance on the privacy considerations posed by moving computer systems that contain PII to a Cloud Computing Provider (CCP)...

The purpose of this paper, and of privacy interests in general, is not to discourage agencies from using cloud computing; indeed a thoughtfully considered cloud computing solution can enhance privacy and security. Instead, the purpose is to ensure that Federal agencies recognize and consider the privacy rights of individuals, and that agencies identify and address the potential risks when using cloud computing." The 'Summary of the Privacy Risks Posed by Some Cloud Computing Platforms' highlights nine kinds of risks that come into play with a third party—who is not necessarily bound by the same laws and regulations..."

See also: the Privacy Framework document


XML Daily Newslink and Cover Pages sponsored by:

IBM Corporation
ISIS Papyrus
Microsoft Corporation
Oracle Corporation
