Last modified: October 20, 2010
XML Daily Newslink. Wednesday, 20 October 2010

A Cover Pages Publication http://xml.coverpages.org/
Provided by OASIS and Sponsor Members
Edited by Robin Cover


This issue of XML Daily Newslink is sponsored by:
ISIS Papyrus http://www.isis-papyrus.com



Privacy Considerations for Internet Protocols
Bernard Aboba, John Morris, Jon Peterson, Hannes Tschofenig; IETF Internet Draft

An initial version -00 IETF Internet Draft has been published for Privacy Considerations for Internet Protocols. This document "aims to make protocol designers aware of the privacy-related design choices and offers guidance for writing privacy considerations in IETF documents. Similar to other design aspects, the IETF's influence on actual deployment is limited. We discuss these limitations but are convinced that protocol architects do indeed have a role to play in privacy-friendly design by making more conscious decisions, and by documenting them."

Details: "The IETF at large produces specifications that typically fall into the following categories: (1) Process specifications, e.g., WG shepherding guidelines, which aim to document and improve the work style within the IETF; (2) Building blocks, e.g., cryptographic algorithms or MIME type registrations, typically meant to be used with other protocols in one or several communication paradigms; (3) Architectural descriptions; (4) Best current practices; (5) Policy statements. Often, the architectural description is compiled some time after deployment has long been ongoing, and therefore those who implement and those who deploy have to make their own determination of which protocols they would like to glue together into a complete system. This work style has the advantage that protocol designers are encouraged to write their specifications in a flexible way so that they can be used in multiple contexts with different deployment scenarios without a huge amount of interdependency between the components...

This work style has an important consequence for the scope of privacy work in the IETF: the standardization work focuses on those parts where interoperability is really essential, rather than describing a specific instantiation of an architecture, and therefore leaves a lot of choices to deployments. Application-internal functionality, such as APIs, and details about databases are outside the scope of the IETF, and regulatory requirements of different jurisdictions are not part of the IETF work either...

[So] while HTTP, XMPP, email, and OAuth are IETF specifications, they only define what the protocol behavior on the wire looks like. They certainly have an architectural spirit that has enormous impact on the protocol mechanisms and the set of specifications that are required. However, IETF specifications would not go into details of how the user has to register, what type of data he has to provide to a social networking site, how long transaction data is kept, how requirements for lawful intercept are met, how authorization policies are designed to let users know more about data they share with other Internet services, how the user's data is secured against unauthorized access, whether the HTTP communication exchange between the browser and the social networking site uses TLS or not, what data is uploaded by the user, how the privacy policy of the social networking site should look, etc..."


Apps.gov Site to Add Storage, Virtualization and Web Hosting Services
Rutrell Yasin, Application Development Trends

U.S. government agencies "will soon have access to storage, virtualization and Web hosting applications via Apps.gov, the General Services Administration's cloud-based storefront... Vendors that are awarded an infrastructure-as-a-service contract will provide federal, state, local and tribal agencies with cloud storage, virtual machines and Web hosting to support an ongoing expansion of cloud computing capabilities across government."

According to the GSA announcement: "Federal Chief Information Officer Vivek Kundra [said] 'offering IaaS on Apps.gov makes sense for the federal government and for the American people. Cloud computing services help to deliver on this Administration's commitment to provide better value for the American taxpayer by making government more efficient. Cloud solutions not only help to lower the cost of government operations, they also drive innovation across government'...

Each year, the government spends tens of billions of dollars on IT products and services, with a heavy focus on maintaining current infrastructure needs and demands. A major element of every federal agency's IT infrastructure includes storage, computing power, and website hosting. These new Cloud Infrastructure offerings can be a way for agencies to realize cost savings, efficiencies, and modernization without having to expend capital resources expanding their existing infrastructure.

On Apps.gov, IaaS offerings will include on-demand self-service that allows government entities to utilize and discontinue use of products when and as needed. Resource pooling for practically unlimited storage and automatic monitoring of resource utilization are also features. IaaS offerings will also be provided with rapid elasticity for real-time, customizable scaling of service and automatic provisioning of virtual machines, storage, and bandwidth, and visibility into service usage and order management through measured services... Awarded vendors have assembled skilled teams that will support the development of quality services for government agencies. Awarded vendors and their associated teams include [list follows].."

See also: the GSA announcement text


How to Inventory Sensitive Data Within Databases and Spreadsheets
Shahid Shah, IBM developerWorks

"This article describes some methods to automatically identify and inventory PII, PHI, and other sensitive data within databases and spreadsheets using Java technology and the Apache Ant build tool. Using the techniques outlined in this article, you can see that it is relatively simple to create working code that can inventory databases and spreadsheets so you are aware of which databases need privacy protection and attention.

Identity theft is already a well-known problem in the finance industry, and companies are trying to take steps to counter it as best they can. Medical-related fraud is nascent at this time (currently less than 5 percent of reported breaches occur within healthcare), but growth in healthcare fraud will continue to accelerate as government stimulus funds expand the use of electronic health records.

When breaches for identity and medical theft occur, they do so because most information systems that have value to organizations contain some level of personally identifiable information (PII) or protected health information (PHI). PII and PHI are the most private kinds of data stored about people and, if they are breached or stolen, they cause adverse events like identity theft or medical fraud. For example, the Department of Defense, the Department of Veterans Affairs, handlers of PII and PHI (such as an insurance or personal investment company), and employers (such as a hotel chain) have all reported significant losses of PII and PHI data, in some cases up to 25 million records...

Protecting large enterprise databases is straightforward, but you must know where the data is. The difficult task is handling the dozens, perhaps hundreds, of Lotus Notes documents and Microsoft Word, Access, and Excel files. While one could treat these documents as simple file-management problems, they are more than that: they are real applications and real databases. Any reasonable analysis of HIPAA privacy concerns related to patient information in a sizable organization could uncover thousands of Microsoft Word, Access, and Excel files with health data needing protection..."
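The article implements its inventory in Java with Ant; a minimal sketch of the same idea in Python is shown below. The regex patterns and the sample spreadsheet row are illustrative assumptions, not the article's actual rule set, which is far broader.

```python
import csv
import io
import re

# Hypothetical PII patterns for illustration only; a real inventory
# would use a much larger, validated rule set.
PII_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def inventory_pii(rows):
    """Return {pattern_name: [(row, col), ...]} for cells matching a PII pattern."""
    hits = {name: [] for name in PII_PATTERNS}
    for r, row in enumerate(rows):
        for c, cell in enumerate(row):
            for name, pattern in PII_PATTERNS.items():
                if pattern.search(cell):
                    hits[name].append((r, c))
    return hits

# A toy CSV standing in for an exported spreadsheet.
sample = io.StringIO("name,ssn,contact\nAlice,123-45-6789,alice@example.org\n")
result = inventory_pii(csv.reader(sample))
```

The same scan loop could be pointed at database query results or at text extracted from Word and Excel files; only the cell iterator changes.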


Policy Considerations for Emergency Calling Using Voice Over IP
Richard Barnes, Bernard Aboba (et al, eds), IETF Internet Draft

A first public IETF Internet Draft has been published for Policy Considerations for Emergency Calling Using Voice Over IP. Abstract: "The provision of emergency calling services (e.g., 911, 112) has been a critical component in the regulation of telecommunications networks. The technical architectures used by modern Voice-over-IP (VoIP) systems mean that if telecommunications regulators wish to extend emergency calling requirements to VoIP, it will likely be necessary to reconsider the ways in which such requirements are applied, both in terms of what specific mandates are imposed and which entities are subject to them. This document discusses the fundamental technical requirements for emergency services, how these requirements can be met within the framework of VoIP, and how these solution approaches create possibilities and limitations for regulatory involvement.

The IP-based emergency services access architectures differentiate a few components that have different responsibilities for offering the complete end-to-end functionality. These roles are: (1) Internet Service Provider (ISP); (2) Voice Service Provider (VSP), or in a more generic form, Application Service Provider (ASP); (3) Emergency Service Provider (which operates a PSAP), and vendors of equipment for those; (4) End Device... At a high level, the emergency call interaction takes place as follows: the user enters an emergency services number (or potentially an emergency dial string). The end device recognises the entered sequence of digits as an emergency call attempt and determines whether location information is available locally (from a GPS module, for example). If no location information is available locally, various protocol extensions have been defined that allow the end host to obtain location information from a Location Information Server in the ISP's network. Then the call setup procedure using SIP is started towards the user's VSP/ASP. The VSP/ASP needs to make a routing decision to determine where the call should be forwarded. Often, this decision is based on a combination of the caller's location information and other policy aspects (such as time-of-day or the workload of a specific PSAP)...
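The call-flow steps above can be sketched as follows. All names here (the routing table, region codes, PSAP identifiers) are hypothetical illustrations, not part of any IETF specification, and the real routing decision also weighs policy aspects such as time-of-day and PSAP workload.

```python
# Dial strings treated as emergency calls (illustrative subset).
EMERGENCY_NUMBERS = {"911", "112"}

# Toy PSAP routing table keyed by a made-up region code.
PSAP_BY_REGION = {"us-tx-travis": "psap-austin", "de-by-muc": "psap-munich"}

def place_call(dial_string, local_location=None, lis_lookup=None):
    """Return the PSAP a call should route to, or None for a non-emergency call."""
    if dial_string not in EMERGENCY_NUMBERS:
        return None  # ordinary call: normal VSP/ASP routing applies
    # Prefer location the end device already knows (e.g. from a GPS module)...
    location = local_location
    # ...otherwise query the ISP's Location Information Server.
    if location is None and lis_lookup is not None:
        location = lis_lookup()
    # The VSP/ASP maps the caller's location to a PSAP.
    return PSAP_BY_REGION.get(location, "psap-default")
```

Calling `place_call("112", lis_lookup=lambda: "de-by-muc")` models the nomadic case where the device has no local location and must ask the ISP's LIS before routing.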

In most countries, national and sometimes regional telecommunications regulators have a strong influence on how emergency services are provided, such as who pays for them and what obligations the various parties have. Regulation is, however, still at an early stage: in most countries current requirements only demand manual update of location information by the VoIP user. The ability to obtain location information automatically is, however, crucial for reliable emergency service operation, and required for nomadic and mobile devices.

The separation between Internet access and application providers is one of the most important differences from existing circuit-switched telephony networks. A side effect of this separation is the increased speed of innovation at the application layer, and the number of new communication mechanisms is steadily increasing. Many emergency service organizations have recognized this trend and advocated for the use of new communication mechanisms, including video, real-time text, and instant messaging, to offer improved emergency calling support for citizens. Again, this requires regulators to re-think the distribution of responsibilities, funding and liability..."

See also: the IETF Emergency Context Resolution with Internet Technologies (ECRIT) Working Group


Accelerating the Media Business with MPEG Extensible Middleware
Christian Timmerer, Filippo Chiariglione (et al), IEEE MultiMedia

The development of successful multimedia applications is becoming a challenging task due to short deployment cycles and the huge number of applications flooding the market. One major problem that the multimedia industry is facing in this area is the heterogeneity of the content-delivery chain. The classic model, in which all the components, such as authoring, distribution, and playback, were known in advance, has now evolved into an interconnected network of programmable processing units that are no longer dedicated and built for one purpose. While there were many advantages to the classic model, there are new challenges with the introduction of a heterogeneous content chain, including the proliferation of data formats for different kinds of media delivered over the Internet...

A key issue when defining a platform is the portability to other platforms, a concept that calls for middleware to be used in a platform-independent way. One example of such middleware is MXM or MPEG-M, a standard designed to promote the extended use of digital-media content through increased interoperability and accelerated development of components, solutions, and applications. MXM introduces the notion of MXM devices, applications, and engines.

The MXM architecture comprises a set of MXM engines for which APIs are defined and on top of which applications can be developed. The current list of MXM engines includes functionalities for content creation, search, adaptation, streaming and delivery, domain management, intellectual property management and protection (IPMP), rights expression, licensing, metadata, event reporting, security, and so on. The orchestrator engine has a special role in providing access to higher-level functionalities by implementing common scenarios using various MXM engines in a predefined way (for example, adaptation of content according to the usage context). An MXM application can make use of as many or as few engines as required to support the intended functionality. To enable interoperable communication between MXM devices, the standard defines the MXM protocols; only the payload format, which is XML-based, is specified here. This format can be delivered using any suitable transport protocol, such as HTTP or SOAP...
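The payload-plus-any-transport design can be sketched as below: an XML message is built and serialized independently of how it travels. The element and attribute names here are hypothetical illustrations, not taken from the MPEG-M specification.

```python
import xml.etree.ElementTree as ET

def build_message(operation, content_id):
    """Serialize a toy MXM-style request as XML.

    Because only the payload is fixed, the resulting string could be
    carried by any transport (HTTP POST body, SOAP envelope, XMPP stanza).
    """
    msg = ET.Element("MXMMessage", attrib={"operation": operation})
    ET.SubElement(msg, "ContentID").text = content_id
    return ET.tostring(msg, encoding="unicode")

payload = build_message("LicenseRequest", "urn:example:track:42")
```

Keeping the payload format transport-neutral is what lets the SmartRM example below move the same messages over both SOAP and XMPP.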

SmartRM, a viral service based on social networks, enables people to share confidential content in a protected way. The SmartRM software lets users convert confidential files (such as PDFs, videos, or audio tracks) into encrypted MPEG-21 files... Most of the communication between the SmartRM client and the SmartRM server is done by exchanging MXM-protocol messages over SOAP and XMPP. Both the client and the server are then relieved from the complexity of generating, dispatching, and interpreting such messages, as MXM engines can do these operations..."

See also: MPEG Extensible Middleware


Nokia Focusing on Qt and HTML5 for Application Development
Lance Whitney, CNET News.com

"Nokia is making Qt its sole application development framework and is supporting HTML5 -- both of which the company says are designed to benefit mobile app developers and customers. The mobile phone maker reported that its Qt decision means that mobile apps will be compatible with future versions of Symbian and MeeGo. Previously, developers could write directly for Symbian or MeeGo, each with its own specific development tools and environment."

From the announcement: "Nokia is focusing on Qt as a robust, tried and tested framework that unlocks the hardware, software and service capabilities of the existing Nokia smartphone range as well as creating huge opportunities for future Symbian and MeeGo products. Nokia's introduction of Qt Quick into the Qt framework enables the more rapid creation of rich user interfaces and the most visually engaging applications. In addition, Qt's in-built support for HTML5 complements Nokia's intent to support HTML5 in Web browsers.

The decision to focus on Qt as the sole application development framework will ensure that applications will continue to be compatible with future evolutions of Symbian as well as upcoming MeeGo products. In addition, Nokia announces its intent to support HTML5 for development of Web content and applications for both Symbian and MeeGo platforms. To demonstrate its commitment to the new offering, Nokia will develop its own future applications using Qt for a more consistent experience and better integration of applications and services.

The Qt WebKit Integration is an integration of WebKit with Qt. It provides an HTML browser engine that makes it easy to run web apps or embed web content into Qt applications. It supports modern web standards for excellent web compliance and compelling user experiences..."

See also: the Nokia announcement


Google to Post Ancient Dead Sea Scrolls Online
Sharon Gaudin, ComputerWorld

"Google is working with the Israel Antiquities Authority (IAA) to digitize the Dead Sea Scrolls and make them available online. The collection of documents consists of about 30,000 fragments of the scrolls, which are approximately 2,000 years old. Google's job is to take the fragile Dead Sea Scroll documents, some of which were made of dried animal skins and stretch as much as 30 feet, and make them accessible to anyone in the world with online access.

This is the first time that the collection has been photographed in its entirety since the 1950s, according to the Israel Antiquities Authority, an Israeli governmental authority tasked with regulating excavation and promoting research.

Many experts consider the unearthing of the Dead Sea Scrolls to be the most significant archaeological find of the 20th century. Discovered in a series of caves near the Dead Sea in the 1940s and 1950s, the documents are important to many in both religious and historical terms. The scrolls, which include the earliest known copies of the Hebrew Bible, were written between the second century B.C. and the first century A.D. Religious scholars say they provide significant takes on the history of Judaism and early Christianity...

To digitize the scrolls, the IAA will use imaging technology developed by MegaVision, a U.S. based company. The imaging technology uses various wavelengths and infrared light to not only image but help preserve the fragile scrolls. The imaging technology is expected to be installed in the IAA labs early in 2011..."

See also: National Geographic


Sponsors

XML Daily Newslink and Cover Pages sponsored by:

IBM Corporation http://www.ibm.com
ISIS Papyrus http://www.isis-papyrus.com
Microsoft Corporation http://www.microsoft.com
Oracle Corporation http://www.oracle.com
Primeton http://www.primeton.com

XML Daily Newslink: http://xml.coverpages.org/newsletter.html
Newsletter Archive: http://xml.coverpages.org/newsletterArchive.html
Newsletter subscribe: newsletter-subscribe@xml.coverpages.org
Newsletter unsubscribe: newsletter-unsubscribe@xml.coverpages.org
Newsletter help: newsletter-help@xml.coverpages.org
Cover Pages: http://xml.coverpages.org/

Document URI: http://xml.coverpages.org/newsletter/news2010-10-20.html
Robin Cover, Editor: robin@oasis-open.org