A Cover Pages Publication http://xml.coverpages.org/
Provided by OASIS and Sponsor Members
Edited by Robin Cover
This issue of XML Daily Newslink is sponsored by:
Oracle Corporation http://www.oracle.com
Headlines
- OASIS Members Form Privacy Management Reference Model (PMRM) Committee
- Cloud Identity Summit 2010: Report for Day 2
- The Apache Software Foundation Announces Apache FOP Version 1.0
- Using XML and JSON with Android
- A Technology Race to Curb Peak Energy Demand
- Augmented Reality: A Point of Interest for the Web
- Veracode Upgrades Cloud-Based Application Security Testing
- Extensions to the IODEF-Document Class for Reporting Phishing
OASIS Members Form Privacy Management Reference Model (PMRM) Committee
Staff, OASIS Announcement
OASIS has announced the creation of a new Privacy Management Reference Model (PMRM) Technical Committee. Its principal objective is to "develop and articulate a Privacy Management Reference Model that describes a set of broadly-applicable data privacy and security requirements and a set of implementable Services and interactions for fulfilling those requirements." The first meeting of the PMRM TC will be held as a teleconference on September 8, 2010. Organizations approving the TC charter include CA, ISTPA, NIST, the American Bar Association, WidePoint Corporation, and the Information Card Foundation. The PMRM TC is expected to be of interest to privacy policy makers, privacy and security consultants, auditors, IT systems architects, and designers of systems that collect, store, process, use, share, transport across borders, exchange, secure, retain, or destroy Personal Information.
The TC will accept as input the existing ISTPA Privacy Management Reference Model v2.0 — a structure for resolving privacy policy requirements into operational controls and implementations — developed by the International Security, Trust and Privacy Alliance (ISTPA). It is anticipated that this document will be contributed to the TC for further elaboration and standardization at OASIS.
Specific goals of the TC are to: (1) Define a set of operationally-focused privacy requirements which can serve as a reference for evaluating options for designing and implementing operational privacy controls. These requirements will constitute a useful working set of 'privacy guidelines', which can serve both as general guidance and as a feature set against which the PMRM and any implementation can be tested. (2) Define a structured format for describing privacy management Services, and identify categories of functions that may be used in defining and executing the Services. (3) Define a set of privacy management Services to support and implement the privacy requirements at a functional level. These Services will include some capabilities that are typically implicit in privacy practices or principles (such as policy management or interaction), but that are necessary if information systems and processes are to be made privacy configurable and compliant. (4) Establish an explicit relationship between security requirements and supporting security services (such as confidentiality, integrity, and availability services) and the privacy management Services.
Motivation: "Today, increased cross-border and cross-policy domain data flows, networked information processing, federated systems, application outsourcing, social networks, ubiquitous devices and cloud computing bring ever more significant challenges, risk, and management complexity to privacy management... Typical policy expressions provide little insight into how to actually implement such policies, presenting frustration for policymakers (who expect business systems to manage privacy and security rules) and design challenges for IT architects and solution developers (who have few models to guide their work). An effective solution to privacy and security management and compliance obligations in today's IT-centric, networked systems, services and applications environment would be a collection of privacy and security policy-configurable, IT-based, systematic behaviors that faithfully satisfy the requirements of privacy and security policies within a wide variety of contexts and implementation use-case scenarios... [Therefore] the purpose of the OASIS Privacy Management Reference Model is to aid in the design and implementation of operational privacy and security management systems..."
See also: the OASIS Privacy Management Reference Model (PMRM) TC public web page
Cloud Identity Summit 2010: Report for Day 2
Travis Spencer, Blog
"Day 2 of the Cloud Identity Summit 2010 kicked off w/ Ping Identity's CEO, Andre Durand, discussing the importance of identity and the need for us to come together as a community to discuss it in the context of cloud computing — similarly to what other thought leaders said at RSA. He handed it off to Gunnar Peterson who said that there are four fundamental technologies necessary to enable broad adoption of cloud computing: (1) Security Token Services, i.e., STSs; (2) Policy Enforcement Points (PEPs) and Policy Decision Points (PDPs); (3) Gateways; (4) Monitoring...
I totally agree that STSs are a core component of this new architecture. They will one day be on par w/ DNS, DHCP and other infrastructure services that enterprises need to operate. While this technology helps answer the first fundamental question of who you are, it doesn't address the question that we're actually interested in knowing the answer to: what are you allowed to do? This is where the PEPs and PDPs come in, and I completely agree that these are critical to the adoption of cloud computing.
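To make that division of labor concrete, here is a purely conceptual sketch; the interfaces are invented for illustration and do not correspond to any particular product or standard API (real deployments would more likely use XACML or a vendor's PEP/PDP components). The PEP sits in front of the protected resource and simply enforces whatever the PDP decides.

    // Conceptual only: these types are illustrative, not any product's API.
    interface PolicyDecisionPoint {
        // Evaluates policy: may this subject perform this action on this resource?
        boolean permits(String subject, String action, String resource);
    }

    class PolicyEnforcementPoint {
        private final PolicyDecisionPoint pdp;

        PolicyEnforcementPoint(PolicyDecisionPoint pdp) {
            this.pdp = pdp;
        }

        // The PEP never evaluates policy itself; it only enforces the PDP's decision.
        String handle(String subject, String action, String resource) {
            if (!pdp.permits(subject, action, resource)) {
                throw new SecurityException("Denied by PDP: " + subject
                        + " may not " + action + " " + resource);
            }
            return callProtectedService(resource);
        }

        private String callProtectedService(String resource) {
            return "response from " + resource;   // placeholder for the real back-end call
        }
    }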
Eve Maler picked up on this theme in her talk on User Managed Access (UMA), a protocol for authorization that's being incubated by Kantara. In addition to birthing new standards, this organization, Pamela Dingle explained after Maler's talk, is also a Trust Framework Provider (TFP). This and similar organizations are essentially abstractions around IdPs. The US Government is defining profiles of certain protocols (e.g., Info Card, OpenID, etc.), and stipulating that TFPs must ensure that all IdPs that they vouch for conform to these profiles... The output of these TFPs is metadata, which is analogous to a Certificate Revocation List (CRL) in PKI. Because the "CRL" can be traced from the TFPs back up to the US Government, RPs can pick and choose IdPs willy-nilly knowing that they are all reputable and capable of asserting someone's identity.
This abstraction would have come in handy during Lee Hammond's talk that he did w/ Brian Kissel. In it, Hammond spoke about how his record label is using Janrain's Engage product (formerly RPX) to shield his Web apps from the assortment of protocols supported by the IdPs he relies on. Using Janrain's identity protocol mediation service, music fans are able to seamlessly log in once to the Web sites of multiple musicians on his label..."
See also: the Cloud Identity Summit 2010 Program
The Apache Software Foundation Announces Apache FOP Version 1.0
Staff, ASF Announcement
"Apache FOP (Formatting Objects Processor) Version 1.0, the Open Source XSL Formatting Objects Processor, is a redesigned, stable version of the pioneering XSL formatting objects processor that rounds out the Apache XML software stack. The Apache Software Foundation (ASF) — the all-volunteer developers, stewards, and incubators of nearly 150 Open Source projects and initiatives — has announced the immediate availability of the Version 1.0 release of Apache FOP.
An Apache project since 1999, FOP (Formatting Objects Processor) is one of the industry's first print formatters driven by W3C-standard XSL Formatting Objects, created to display, convert, and print to formats such as PDF, PostScript, SVG, RTF, and XML. In addition, FOP is among the most commonly-used output-independent formatters. Jeremias Märki, member of the Apache XML Graphics Project Management Committee, notes: 'FOP v.1.0 provides a good subset of the W3C XSL-FO 1.0/1.1 specification. Its stable 1.0 designation provides added recognition as the productive tool it has been for years; the redesign and improved features in the layout engine make it an even better experience for the many developers and users who produce millions of pages each year'...
Apache FOP is in use at Accenture, Airbus, Australia Post, BNP Paribas, Capgemini, Credit Suisse, CSC, Denic, European Patent Office, FedEx, Ford, HP, IBM, IntelliData, Marriott International, Morgan Stanley, Polaris, Siemens, Swiss Federal Institute of Intellectual Property, Tecra, US Army, US House of Representatives, and Wyona, among many others. In addition, FOP is the default implementation bundled in XML editors such as XSLfast, Oxygen, and XMLSpy.
The release of FOP v.1.0 completes a free XML software stack comprising Apache Xerces, Apache Xalan, and Apache FOP. The ability to insert graphics into one's print output is provided by Apache Batik. The Apache XML stack makes transforming and formatting XML data (for example, DocBook XML) a viable option for individual and start-up users without business cash flow..."
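For developers embedding FOP in their own applications, the basic pipeline is a JAXP transform whose SAX output is fed straight into FOP's default handler. The following is a minimal sketch along the lines of the embedding examples that ship with FOP; the input, stylesheet, and output file names are placeholders, and configuration and error handling are omitted.

    import java.io.BufferedOutputStream;
    import java.io.File;
    import java.io.FileOutputStream;
    import java.io.OutputStream;

    import javax.xml.transform.Result;
    import javax.xml.transform.Source;
    import javax.xml.transform.Transformer;
    import javax.xml.transform.TransformerFactory;
    import javax.xml.transform.sax.SAXResult;
    import javax.xml.transform.stream.StreamSource;

    import org.apache.fop.apps.Fop;
    import org.apache.fop.apps.FopFactory;
    import org.apache.fop.apps.MimeConstants;

    public class Xml2Pdf {
        public static void main(String[] args) throws Exception {
            // Placeholder inputs: an XML document and an XSLT stylesheet that
            // produces XSL-FO (for example, the DocBook XSL-FO stylesheets).
            File xml = new File("input.xml");
            File xslt = new File("xml2fo.xsl");

            FopFactory fopFactory = FopFactory.newInstance();
            OutputStream out = new BufferedOutputStream(
                    new FileOutputStream(new File("output.pdf")));
            try {
                // A Fop instance that renders the formatting-object tree to PDF.
                Fop fop = fopFactory.newFop(MimeConstants.MIME_PDF, out);

                // Apply the stylesheet with JAXP (Xalan in the stack described
                // above) and stream the result into FOP's SAX handler.
                Transformer transformer = TransformerFactory.newInstance()
                        .newTransformer(new StreamSource(xslt));
                Source src = new StreamSource(xml);
                Result res = new SAXResult(fop.getDefaultHandler());
                transformer.transform(src, res);
            } finally {
                out.close();
            }
        }
    }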
See also: the Apache FOP Project web site
Using XML and JSON with Android
Frank Ableson, IBM developerWorks
Mobile devices and platforms boast more features and functionality with each new release, and often mere months separate significant announcements from the leading mobile vendors. The headlines are mostly about UI features and hardware enhancements such as processor speed and storage capacity. But the crucial fact remains that content is king.
Content (or, more generally, data) is exchanged constantly among applications, servers, mobile devices, and users. Without being able to work with it, smartphones such as Apple's iPhone and Google's Android devices simply become overpriced, underperforming cell phones.
Consider the phenomenal success of social-networking platforms such as Facebook, LinkedIn, and Twitter. From a pure feature-and-function perspective, these platforms are largely pedestrian. They are popular because members and site visitors derive value from the content published there. And that content is accessed increasingly by mobile devices.
This article demonstrates the use of the XML and JSON data-interchange formats on the Android platform. The source of the data for the example application is a status-update feed for a Twitter account. The feed data is available from Twitter in both XML and JSON formats. As you'll see, the programming approach to manipulating the data varies significantly between the two formats... Compared to the JSON approach, the XML approach is somewhat faster and less memory-constrained—at the expense of additional complexity. In Part 2, I'll introduce some advanced techniques combining JSON data, WebKit-based WebView widgets, and custom dynamic application logic for Android applications..."
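As a taste of the contrast the article describes, the sketch below parses a list of status updates twice on Android: once with the org.json classes bundled in the platform, and once with the platform's streaming pull parser. The "text" field/element name is an illustrative assumption standing in for whichever parts of the Twitter payload an application actually needs.

    import java.io.IOException;
    import java.io.StringReader;

    import org.json.JSONArray;
    import org.json.JSONException;
    import org.json.JSONObject;
    import org.xmlpull.v1.XmlPullParser;
    import org.xmlpull.v1.XmlPullParserException;

    import android.util.Log;
    import android.util.Xml;

    public class StatusFeedParser {
        private static final String TAG = "StatusFeedParser";

        // JSON variant: the whole payload is materialized as a JSONArray in memory.
        public static void logStatusesFromJson(String json) throws JSONException {
            JSONArray statuses = new JSONArray(json);
            for (int i = 0; i < statuses.length(); i++) {
                JSONObject status = statuses.getJSONObject(i);
                Log.i(TAG, status.getString("text"));   // "text" is an assumed field name
            }
        }

        // XML variant: a streaming pull parser visits the document tag by tag,
        // so only the element currently being read needs to be held in memory.
        public static void logStatusesFromXml(String xml)
                throws XmlPullParserException, IOException {
            XmlPullParser parser = Xml.newPullParser();
            parser.setInput(new StringReader(xml));
            for (int event = parser.getEventType();
                    event != XmlPullParser.END_DOCUMENT;
                    event = parser.next()) {
                if (event == XmlPullParser.START_TAG && "text".equals(parser.getName())) {
                    Log.i(TAG, parser.nextText());      // "text" is an assumed element name
                }
            }
        }
    }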
A Technology Race to Curb Peak Energy Demand
Martin LaMonica, CNET News.com
"Summertime ... high heat, leading to heavy air-conditioning loads, strains grid operators' ability to keep electricity flowing, as happened earlier this month in the Northeast region of the United States. Lux analyst Ted Sullivan says demand response is the absolute winner on cost, but it's really limited to only one or two percent of capacity.
A recent report from Lux Research says these periods of peak demand have created an entry point for three types of technologies to curb electricity usage: demand response, solar, and storage...
Demand response systems curtail electricity usage primarily at commercial and industrial facilities. By temporarily adjusting the temperature or lighting in a retail store or factory, demand-response providers lighten the load on the grid for a few hours. The method is cost-effective but is limited to about 100 hours a year...
Centralized solar plants have a higher capacity factor and their output tends to coincide with hot, sunny days. However, without storage, solar becomes difficult to justify because centralized plants often require stand-by power generation and construction of transmission lines... Meanwhile, utility-scale storage, particularly with batteries, remains very expensive and projects are often tricky to advance under current utility regulations, which are now being reconsidered..."
Augmented Reality: A Point of Interest for the Web
Phil Archer, Blog
A recent workshop, Augmented Reality on the Web, held in Barcelona has sparked a good deal of debate within and around W3C. As the final report shows, the workshop brought together many different companies and organizations working on, or with a direct interest in, the field of 'Augmented Reality'...
One outcome is clear: we need a method for representing data about points of interest and proposals are advancing to achieve this in a new POI Working Group. Quite what data needs to be represented concerning Points of Interest depends on who you ask. For some it's a question of annotating a given point on the Earth's surface where the longitude, latitude and altitude are all key identifiers. For others it's more a question of the point at a given distance and angle from an object that may or may not be static as seen by an observer who may themselves also be moving.
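Purely as an illustration of those two viewpoints (the types and field names below are assumptions of this newsletter, not anything proposed at the workshop or by the future working group), a point of interest might carry either an absolute geodetic position or a position relative to some, possibly moving, anchor object:

    // Illustrative data types only; not a W3C proposal.
    class GeodeticPosition {
        double latitudeDegrees;
        double longitudeDegrees;
        double altitudeMetres;     // height above a reference surface (assumed convention)
    }

    class RelativePosition {
        String anchorObjectId;     // the object the point is defined against; it may be moving
        double distanceMetres;     // how far from the anchor
        double bearingDegrees;     // at what angle, as seen by the (possibly moving) observer
    }

    class PointOfInterest {
        String name;
        GeodeticPosition geodetic; // set when the POI is a fixed point on the Earth's surface
        RelativePosition relative; // set when the POI is defined relative to another object
    }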
Different communities are involved here: as well as the augmented reality community, the linked data community has a keen interest. There are other facets to the discussion too and this is what will make the POI working group's work interesting...
The workshop also recommended that a new POI WG should go further and consider the wider picture of how AR does, or might, relate to the Web. Privacy is a major concern; device APIs are critical enablers; do CSS and SVG have sufficient power to support AR functions? Even the use of HTTP as a transport mechanism is questioned by some, given the real-time nature of AR..."
See also: the Workshop Final Report
Veracode Upgrades Cloud-Based Application Security Testing
Vance McCarthy, Integration Developer News
Veracode Inc. is making it even easier for developers to bring cloud-based security testing into their software development lifecycle. The company is integrating its SecurityReview application security testing SaaS directly into Java, .Net, C/C++, ColdFusion, and PHP development environments...
SecurityReview's approach integrates security practices more directly into the software development lifecycle, enabling developers to verify an application's security before deployment in a simple and cost-effective way. The service offers unlimited scans on any number of internal applications, lowering cost and complexity and expanding the reach of security. SecurityReview is available as a subscription service from the cloud, and offers features such as: (1) Application Portfolio Dashboard with a centralized view of risk and security information to better manage the overall security review and testing process; (2) Automated Code Review (binary static analysis), which reviews the final integrated application, including libraries and third-party components, for the most accurate detection of commonly occurring security vulnerabilities; (3) Dynamic Analysis, which supports automated web vulnerability scanning; (4) Access to Open Source Ratings Database; (5) Executive, Security and Developer Reports, which provide summaries and detailed reports to support the activities of security offices, engineering managers and developers..."
According to the Veracode announcement: "Developers can now upload applications automatically and download line-of-code specific vulnerability identification and remediation instructions directly to defect tracking systems and integrated development environments (IDEs). Results are often 100 percent lower in false positives than alternative on-premise source code tools. By delivering the benefits of cloud-based static binary and dynamic web application testing to local development environments, Veracode makes accurate, reliable application security testing accessible to all developers, not just security experts...
In some cases, developers have been handed down expensive, complicated on-premise security tools with high false positive rates that require them to be security tool experts and waste precious cycles finding the real problems. Alternatively, they receive manual testing results reports from third-parties that are disconnected from their development processes. Veracode's pragmatic approach to cloud-based security testing and training, which is integrated into local development environments, enables developers to focus on writing secure code on time and within budget..."
See also: Veracode SecurityReview
Extensions to the IODEF-Document Class for Reporting Phishing
Announcement, IETF Staff
The IETF RFC Editor Team has announced the publication of the Standards Track specification Extensions to the IODEF-Document Class for Reporting Phishing as an approved IETF Proposed Standard, RFC 5901. This document "extends the Incident Object Description Exchange Format (IODEF) defined in RFC 5070 to support the reporting of phishing events, which is a particular type of fraud. These extensions are flexible enough to support information gleaned from activities throughout the entire electronic fraud cycle—from receipt of the phishing lure to the disablement of the collection site. Both simple reporting and complete forensic reporting are possible, as is consolidating multiple incidents. Appendix A presents the Phishing Extensions XML Schema corresponding to the Fraud Report XML Representation format... Elements, attributes, and parameters defined in the base IODEF specification were used whenever possible in the definition of the PhraudReport XML element...
Deception activities, such as receiving an email purportedly from a bank requesting you to confirm your account information, are an expanding attack type on the Internet. The terms phishing and fraud are used interchangeably in this document to characterize broadly-launched social engineering attacks in which an electronic identity is misrepresented in an attempt to trick individuals into revealing their personal credentials (e.g., passwords, account numbers, personal information, ATM PINs, etc.). A successful phishing attack on an individual allows the phisher (i.e., the attacker) to exploit the individual's credentials for financial or other gain. Phishing attacks have morphed from directed email messages from alleged financial institutions to more sophisticated lures that may also include malware.
To combat the rise in malicious activity on the Internet, service providers and investigative agencies are sharing more and more network and event data in a coordinated effort to identify perpetrators and compromised accounts, coordinate responses, and prosecute attackers. As the number of data-sharing parties increases, the number of party-specific tools, formats, and definitions multiply rapidly until they overwhelm the investigative and coordination abilities of those parties.
By using a common format, it becomes easier for an organization to engage in this coordination as well as correlation of information from multiple data sources or products into a cohesive view. As the number of data sources increases, a common format becomes even more important, since multiple tools would be needed to interpret the different sources of data. A big win in a common format is the ability to automate many of the analysis tasks and significantly speed up the response and prosecution activities..."
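To show, at the level of plumbing only, how such an extension travels inside an IODEF document, the sketch below builds a skeletal IODEF-Document whose Incident carries a PhraudReport inside an AdditionalData element, IODEF's generic extension hook. This is non-normative: the phishing-extension namespace URI is an assumption, required IODEF children such as IncidentID are omitted for brevity, and both the exact placement of PhraudReport and its (much richer) content model are governed by RFC 5901.

    import javax.xml.parsers.DocumentBuilderFactory;
    import javax.xml.transform.OutputKeys;
    import javax.xml.transform.Transformer;
    import javax.xml.transform.TransformerFactory;
    import javax.xml.transform.dom.DOMSource;
    import javax.xml.transform.stream.StreamResult;

    import org.w3c.dom.Document;
    import org.w3c.dom.Element;

    public class PhraudReportSkeleton {
        // The IODEF namespace is registered by RFC 5070; the phishing-extension
        // namespace below is an ASSUMPTION for illustration -- check RFC 5901.
        private static final String IODEF_NS = "urn:ietf:params:xml:ns:iodef-1.0";
        private static final String PHISH_NS = "urn:ietf:params:xml:ns:iodef-phish-1.0";

        public static void main(String[] args) throws Exception {
            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder().newDocument();

            Element iodef = doc.createElementNS(IODEF_NS, "IODEF-Document");
            doc.appendChild(iodef);

            // A real Incident needs IncidentID, ReportTime, Assessment, and so on;
            // they are omitted here to keep the containment pattern visible.
            Element incident = doc.createElementNS(IODEF_NS, "Incident");
            incident.setAttribute("purpose", "reporting");
            iodef.appendChild(incident);

            // AdditionalData with dtype="xml" is IODEF's hook for schema extensions.
            Element additionalData = doc.createElementNS(IODEF_NS, "AdditionalData");
            additionalData.setAttribute("dtype", "xml");
            incident.appendChild(additionalData);

            additionalData.appendChild(doc.createElementNS(PHISH_NS, "PhraudReport"));

            // Serialize the skeleton so it can be inspected.
            Transformer t = TransformerFactory.newInstance().newTransformer();
            t.setOutputProperty(OutputKeys.INDENT, "yes");
            t.transform(new DOMSource(doc), new StreamResult(System.out));
        }
    }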
See also: references on the Incident Object Description and Exchange Format (IODEF)
Sponsors
XML Daily Newslink and Cover Pages sponsored by:
IBM Corporation | http://www.ibm.com |
ISIS Papyrus | http://www.isis-papyrus.com |
Microsoft Corporation | http://www.microsoft.com |
Oracle Corporation | http://www.oracle.com |
Primeton | http://www.primeton.com |
XML Daily Newslink: http://xml.coverpages.org/newsletter.html
Newsletter Archive: http://xml.coverpages.org/newsletterArchive.html
Newsletter subscribe: newsletter-subscribe@xml.coverpages.org
Newsletter unsubscribe: newsletter-unsubscribe@xml.coverpages.org
Newsletter help: newsletter-help@xml.coverpages.org
Cover Pages: http://xml.coverpages.org/