This issue of XML Daily Newslink is sponsored by:
Microsoft Corporation http://www.microsoft.com
- Locations with Locally-Defined Coordinate Reference Systems for PIDF-LO
- What Kind of Cloud Computing Project Would You Build with $32M?
- Microsoft Eyes Smart Grid With Utility Software
- RFID Tags Chat Their Way to Energy Efficiency
- Stream Control Transmission Protocol (SCTP): What, Why, and How
- IBM Encryption Breakthrough Could Secure Cloud Computing
- W3C Recharters eGovernment Interest Group for Open Government Data
- Development of a SOA Manifesto
Locations with Locally-Defined Coordinate Reference Systems for PIDF-LO
Martin Thomson and James Winterbottom (eds), IETF Internet Draft
Members of the IETF Geographic Location/Privacy (GEOPRIV) Working Group have published an initial -00 Internet Draft for the specification "Locations with Locally-Defined Coordinate Reference Systems for the Presence Information Data Format - Location Object (PIDF-LO)." The XML schema for the indoor location elements, presented in Section 10, also includes a definition of the 2-dimensional and 3-dimensional image-based coordinate systems and units of measure used in definitions of coordinate reference systems. Formal definitions from the GML standard are used.
Summary: "This memo describes a method for constructing a Presence Information Data Format - Location Object (PIDF-LO) document that contains location information using a locally-defined coordinate reference system (CRS). This form of representation allows for use of locally-defined coordinates with potential advantages for improved accuracy and usability in local context, in particular location applications that operate indoors. A framework for defining a local CRS is provided. A process for transformation of coordinates between the local CRS and the widely used World Geodetic System 1984 (WGS84) CRS is defined."
Background: "Providing location information in indoor environments presents new sets of technical challenges and use cases for location determination and representation. For use indoors, location information that is in a form specific to that locality can be both more accurate and more usable. The ability to specify relative coordinates simplifies the use of local applications, especially local mapping or navigation applications, which often rely on floor plan images or provide directions based on fixtures of the local environment. Within the confines of a building, or in any local context, location information might be determined in relation to fixtures in that environment... For example, a shopper uses the information contained in a PIDF-LO to identify the location of a store in a mall. The geodetic location information or civic address information (RFC 5139) helps the shopper identify the location of the mall. The relative, or indoor, location representation helps the shopper find the store within the mall. This information can be used together with a map of the mall, providing information in a form that is more readily usable to the shopper. The location of the store or the shopper can be overlaid on the provided map, aiding in finding the store..."
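For building-scale deployments, a local CRS of this kind can be related to WGS84 by a simple planar approximation. The sketch below is illustrative only: the reference point and the flat-earth math are assumptions for the example, not the transformation pipeline the draft actually specifies.

```python
import math

# Hypothetical reference point anchoring the local CRS (not from the
# draft): WGS84 latitude/longitude of the local origin, in degrees.
REF_LAT, REF_LON = 40.7128, -74.0060
EARTH_RADIUS_M = 6378137.0  # WGS84 equatorial radius

def local_to_wgs84(x_east_m, y_north_m):
    """Convert local Cartesian offsets (meters) to WGS84 degrees.

    Uses a flat-earth (equirectangular) approximation, adequate only
    for building-scale extents such as a mall or campus.
    """
    dlat = math.degrees(y_north_m / EARTH_RADIUS_M)
    dlon = math.degrees(
        x_east_m / (EARTH_RADIUS_M * math.cos(math.radians(REF_LAT))))
    return REF_LAT + dlat, REF_LON + dlon

def wgs84_to_local(lat, lon):
    """Inverse mapping: WGS84 degrees back to local offsets in meters."""
    y = math.radians(lat - REF_LAT) * EARTH_RADIUS_M
    x = math.radians(lon - REF_LON) * EARTH_RADIUS_M * math.cos(
        math.radians(REF_LAT))
    return x, y

# Round trip: a store 120 m east, 45 m north of the mall entrance.
lat, lon = local_to_wgs84(120.0, 45.0)
x, y = wgs84_to_local(lat, lon)
```

In the mall scenario above, a map server holding only the local floor-plan coordinates could use such a mapping to emit geodetic coordinates for clients that do not understand the local CRS.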
What Kind of Cloud Computing Project Would You Build with $32M?
Michael Cooney, Network World
The US Department of Energy announced that it will spend $32 million on a project that will deploy a large cloud computing test bed with thousands of Intel Nehalem CPU cores and evaluate commercial cloud offerings from Amazon, Microsoft, and Google. Ultimately the project, known as Magellan, will examine cloud computing as a cost-effective and energy-efficient way for scientists to accelerate discoveries in a variety of disciplines, including analysis of scientific data sets in biology, climate change, and physics...
While shared resources are not new to high-end scientific computing, smaller computational problems are often run on departmental Linux clusters with software customized for the science application... Cloud computing centralizes the resources to gain efficiency of scale and permits scientists to scale up to solve larger science problems while still allowing the system software to be configured as needed for individual application requirements. DOE is funding the project with American Recovery and Reinvestment Act money that will be divided equally between its Argonne National Laboratory and Lawrence Berkeley National Laboratory. The combined set of systems will create a cloud testbed that scientists can use for their computations while also testing the effectiveness of cloud computing for their particular research problems..."
Microsoft Eyes Smart Grid With Utility Software
Martin LaMonica, CNET News.com
Microsoft announced "that it has developed an architecture tailored for utility smart-grid programs. The Smart Energy Reference Architecture (SERA) is meant to give utilities a blueprint for integrating and modernizing their IT systems. Microsoft said that its software will work with devices specific to the power industry and help utilities better handle an anticipated wave of real-time data...
Governments around the world are offering billions of dollars to entice utilities to upgrade their electricity distribution networks. These smart-grid programs can take many forms: smart meters that transmit information every few minutes to utilities; sensors on power lines to spot outages; or routers in substations to transmit information back to utilities. In nearly every case, there's a large IT component to smart-grid programs because utilities expect to collect more usage information from customers in order to run their distribution grids more efficiently..."
From the announcement: "The Microsoft SERA for the smart energy ecosystem will help create a world where thousands of smart devices can seamlessly plug into the grid thanks to common standards and interoperability frameworks, just as the plug-and-play model allows thousands of devices to seamlessly plug into PCs today. Consequently, utility industry systems integrators such as Accenture are leading proponents of the Microsoft SERA for smart grids... Significantly, Microsoft has been working closely with key power industry partners to ensure that SERA addresses power utilities' IT infrastructure needs. Alstom Power, for example, has demonstrated its commitment to Microsoft by fully embracing SERA and sees this move as the first step in providing solutions for the new challenge raised by smart grids..."
See also: the Microsoft SERA announcement
RFID Tags Chat Their Way to Energy Efficiency
George Lawton, IEEE News
"Basic RFID technology is widely used to provide unique identifiers for merchandise, pallets, animals, passports, and so on, and to enable contactless payment cards and keys. Now researchers are working on a way to make RFID technology even smarter by using the network to store computational data... Researchers at UMass Amherst and RSA are developing a new architecture for computational RFID (CRFID) technology that securely offloads storage to the network. The development includes Cryptographic Computational Continuation Passing (CCCP), a technique they think will make it more efficient to store program data in the reader rather than the tag for many CRFID applications... CRFID and CCCP could extend the use of RFID into low-maintenance sensors and improve the security of contactless payment systems and ID cards.
The novelty of CCCP lies in the observation that it can be more economical to store intermittent data on the reader between power cycles than on the RFID tag itself. CCCP gives the tag a boost in computational power as a tradeoff for local storage. The one concern about this approach is that an untrusted reader might compromise the data; the use of cryptography helps to ensure data integrity. There is a delicate balance between the amount of power required to encrypt the data and transmit it to the reader for storage, compared with the amount of power required to store it locally. The University of Massachusetts and RSA researchers believe that it's more efficient to transmit data sizes above 64 bytes to the reader for storage..."
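The tradeoff can be sketched as a toy energy model. Every constant below is an invented placeholder (arbitrary energy units), tuned so the break-even lands at the 64-byte figure the researchers cite; real costs are hardware- and cipher-specific.

```python
# Toy energy model of the CCCP storage-vs-transmit tradeoff.
# All constants are illustrative placeholders, not measured figures
# from the UMass Amherst / RSA work.
E_FLASH_PER_BYTE = 4.0    # write one byte to the tag's own flash
E_OFFLOAD_PER_BYTE = 3.0  # encrypt + backscatter one byte to the reader
E_OFFLOAD_FIXED = 64.0    # fixed cost per offload: nonce, MAC, framing

def cost_store_locally(n_bytes):
    """Energy to hold n bytes in tag flash across a power cycle."""
    return E_FLASH_PER_BYTE * n_bytes

def cost_offload(n_bytes):
    """Energy to encrypt n bytes and ship them to the reader."""
    return E_OFFLOAD_FIXED + E_OFFLOAD_PER_BYTE * n_bytes

def cheaper_to_offload(n_bytes):
    """True when offloading to the reader costs less than local storage."""
    return cost_offload(n_bytes) < cost_store_locally(n_bytes)
```

The shape of the model, not the numbers, is the point: a fixed per-message overhead (encryption setup, framing) makes offloading uneconomical for small payloads, while the lower per-byte cost wins once payloads grow past the break-even size.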
Stream Control Transmission Protocol (SCTP): What, Why, and How
Preethi Natarajan, Fred Baker (et al), IEEE Internet Computing
"The Stream Control Transmission Protocol (SCTP) is a general-purpose IETF transport protocol with kernel implementations on various platforms. Similar to TCP, SCTP provides a connection-oriented, reliable, congestion and flow-controlled layer 4 channel. Unlike both TCP and UDP, however, SCTP offers new delivery options that better match diverse applications' needs. SCTP is a connection-oriented protocol supporting message-based transfer. It provides for: reliable data transfer; partially reliable data transfer; ordered data delivery; unordered data delivery; congestion and flow control; protection from spoofed SYN attacks; half-closed connections; multistreaming; multihoming; dynamic address reconfiguration.
SCTP Streams for HTTP Transfers: TCP provides in-order delivery within the byte stream; a lost piece of one HTTP response delays the delivery of successively received other responses until the sender retransmits the lost piece. This problem, known as head-of-line (HOL) blocking, is due to the fact that TCP can't logically separate independent HTTP responses in its transport and delivery mechanisms. When independent HTTP transactions are transmitted over different SCTP streams (within a single association), the SCTP receiver delivers these transactions to the application without inter-transaction HOL blocking. Evaluations using an SCTP-capable Apache Web server and Firefox browser show that SCTP streams eliminate HOL blocking and significantly improve Web response times in end-to-end paths that have high latency and loss rates. Ongoing work includes identifying other SCTP services that are useful for HTTP, standardizing HTTP over SCTP, and merging the SCTP-specific modifications to Apache and Firefox into the corresponding source repositories. A patch for adding SCTP support to Mozilla's portable runtime library is currently available.
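The effect of per-stream ordering can be shown with a small simulation. The code below is not SCTP itself, just a sketch of the delivery rule: within a stream, messages are released to the application only in sequence, so a loss on one stream never stalls another.

```python
from collections import defaultdict

def deliverable(arrived):
    """Given (stream_id, seq) pairs that have arrived, return those
    deliverable to the application: within each stream, only an
    unbroken run of sequence numbers starting at 0 is released.
    """
    by_stream = defaultdict(set)
    for stream, seq in arrived:
        by_stream[stream].add(seq)
    out = []
    for stream, seqs in by_stream.items():
        seq = 0
        while seq in seqs:
            out.append((stream, seq))
            seq += 1
    return sorted(out)

# Three HTTP responses, two segments each; segment (0, 0) was lost.
arrived = [(0, 1), (1, 0), (1, 1), (2, 0), (2, 1)]

# TCP-like: everything shares one byte stream, so the single loss
# blocks delivery of all five segments that did arrive.
tcp_view = [(0, 2 * s + q) for s, q in arrived]  # flatten onto stream 0
tcp_delivered = deliverable(tcp_view)

# SCTP-like: each response rides its own stream; only stream 0 stalls,
# and the four segments of responses 1 and 2 are delivered immediately.
sctp_delivered = deliverable(arrived)
```

With the single-stream (TCP-like) view nothing is deliverable until the lost segment is retransmitted, while the multistream view releases both complete responses at once, which is exactly the HOL-blocking difference described above.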
SCTP is maintained within the IETF's Transport Area working group (TSVWG). Ongoing standardization efforts within the TSVWG include specifying modular extensions, socket API extensions, and security enhancements for SCTP. Activities at the IETF's Behave working group include specifying network address translation (NAT) behavior for SCTP traffic and SCTP over UDP encapsulation to enable SCTP traversal through middleboxes that don't yet understand SCTP. Kernel-level SCTP implementations are available on various platforms, such as AIX, FreeBSD, HP-UX, Linux, Mac OS X, and Solaris/OpenSolaris. SCTP is available as a kernel driver for Windows XP and Vista..."
IBM Encryption Breakthrough Could Secure Cloud Computing
R. Colin Johnson, Smarter Technology
"Searching databases is usually done in the clear. And even if the query is encrypted, it has to be decrypted (revealing its contents) before it can be used by a search engine. What's worse is that databases themselves are stored as plaintext, available to anyone gaining access. The smarter way to handle sensitive information would be to encrypt the queries, encrypt the database, and search it in its encrypted form. That was impossible until now: IBM's T.J. Watson Research Center recently described a 'homomorphic' encryption scheme that allows encrypted data to be searched, sorted and processed without decrypting it...
Even the industry-standard RSA encryption scheme (named after its inventors Rivest, Shamir and Adleman) is partially homomorphic, in that it allows simple multiplication of two encrypted numbers to yield their product. What Gentry and IBM have succeeded in crafting is a fully homomorphic encryption scheme in which any mathematical operation can be made to work as expected. Fully homomorphic encryption schemes theoretically allow ciphertext to be manipulated as easily as plaintext, making it perfect for modern cloud computing, where your data is located remotely..."
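RSA's multiplicative homomorphism is easy to demonstrate with textbook-sized numbers. The sketch below uses toy parameters and no padding, purely to illustrate the property the article mentions; it is nowhere near usable in practice.

```python
# Textbook RSA on toy numbers (no padding) to show the multiplicative
# homomorphism: multiplying two ciphertexts yields a valid encryption
# of the product of the plaintexts.
p, q = 61, 53
n = p * q                 # modulus: 3233
phi = (p - 1) * (q - 1)   # 3120
e = 17                    # public exponent, coprime to phi
d = pow(e, -1, phi)       # private exponent via modular inverse (Python 3.8+)

def encrypt(m):
    return pow(m, e, n)

def decrypt(c):
    return pow(c, d, n)

m1, m2 = 7, 11
# Operate on ciphertexts only; no decryption happens along the way:
c_product = (encrypt(m1) * encrypt(m2)) % n
```

Here `decrypt(c_product)` recovers 77, the product of the two plaintexts, even though the multiplication was performed entirely on encrypted values. A fully homomorphic scheme extends this to addition and multiplication together, and hence to arbitrary computation.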
See also: Homomorphic Encryption
W3C Recharters eGovernment Interest Group for Open Government Data
Staff, W3C Announcement
W3C announced that the eGovernment Interest Group has been rechartered with a new focus on Open Government Data and Education/Outreach. Initial Chairs include Kevin Novak (American Institute of Architects), John Sheridan (UK National Archives), and Jose M. Alonso (CTIC Foundation).
The mission of the eGovernment Interest Group, part of the W3C eGovernment Activity, "is to document, advocate, coordinate and communicate best practices, solutions and approaches to improve the interface between citizens and government through effective standards-based use of the Web.
The eGov IG is encouraged by the increase in electronic government initiatives, including Data.gov and Recovery.gov in the United States, and several initiatives in Europe building on the momentum of the European Directive on the Re-use of Public Sector Information. Publishing data using open standards can help people efficiently share, combine, and expose government data. Online availability of government data increases accountability, contributes valuable information about the world, and enables governments and countries to function more efficiently. The eGov IG will serve as the W3C conduit to document, validate and communicate standards and best practices that relate to the presentation, availability and interoperability of open government data and data systems. The IG will identify various scenarios in which government data is generated and/or compiled, and will develop and recommend models for OGD management. The eGov IG will work with governments, end users, and other interested parties to develop best practices and approaches to successfully publish government data in open formats...
The group is open to all, W3C Members and non-Members alike. W3C encourages participation from people around the world working on improving the interface between citizens and government. The eGov IG mailing list archives are publicly available and can be read by anyone. If you would like to contribute online to the work of the eGov IG, you may subscribe to the eGov IG mailing list..."
See also: the new eGov IG Charter
Development of a SOA Manifesto
Mark Little, InfoQ
Steve Ross-Talbot points out that there's some new work going on in defining a SOA Manifesto. Along with Steve, the working group consists of people from IBM, Oracle, Red Hat and others. According to the SOA Manifesto pages, the manifesto will be 'A formal declaration of the principles, intentions and ambitions of service-orientation and the service-oriented architectural model'.
As a community we need to arrive at a consensus that helps people understand what service-orientation really means, the principles behind it (business as well as architecture), and what constitutes a SOA, perhaps related to one of the SOA reference models? Although this may seem obvious, as Steve goes on to point out: '[...] in my world I often have to deal with people who equate WS-* to SOA and equate ESB to SOA. Which of course is not the case. They may help and even constrain an SOA, but they do not make one'..."
Steve: "'Manifesto' is a bit of a craze these days. I noticed that there's a manifesto for cloud and for internet and so on. But oddly enough nothing for SOA. As a participant I started to look into what is out there and I must admit I am really surprised at how little there is for SOA. Of course we have patterns and principles and even governance for SOA, but as yet no one has written a manifesto, personal or otherwise, to share with the industry as a whole. In arriving at some manifesto for SOA which will help focus people on the key aspects of what service-orientation means and what it means to have a service-oriented architecture, there are some key areas that need work..."
See also: the SOA Manifesto web site
XML Daily Newslink and Cover Pages sponsored by:
Sun Microsystems, Inc. http://sun.com
XML Daily Newslink: http://xml.coverpages.org/newsletter.html
Newsletter Archive: http://xml.coverpages.org/newsletterArchive.html
Newsletter subscribe: email@example.com
Newsletter unsubscribe: firstname.lastname@example.org
Newsletter help: email@example.com
Cover Pages: http://xml.coverpages.org/