This issue of XML Daily Newslink is sponsored by:
IBM Corporation http://www.ibm.com
- vCard Format Extension: To Represent the Social Network Information of an Individual
- Cloud Computing Versus Grid Computing
- Highways Agency Core Components Analysis: A Process for the Harmonisation of Data Concepts
- Lack of Standards Could Stymie Smart Grid
- Using E4X on the Server-Side with Jaxer
- Oracle Expands Enterprise Manager 10g Middleware Duties
- MODUS: Minimum Open Documents Using Standards
vCard Format Extension: To Represent the Social Network Information of an Individual
Robins George and Alexey Melnikov (eds), IETF Internet Draft
This initial version -00 draft document defines an extension to the vCard data format for representing and exchanging a variety of social network information about an individual. Social networks have become commonplace. Well-organized social network information allows the vCard owner to import, or preferably subscribe to, profile information from any existing profile when he/she joins a new network. Data portability remains one of the incentives behind the development of this format. The Social Network Properties discussed in this memo include: (1) OpenID: OpenID is an open, decentralized user identification standard, allowing users to log onto many services with the same digital identity. Value type: a single URI value. Because an OpenID takes the form of a URL, it eliminates the need for multiple usernames across different websites and simplifies the online experience. (2) PersonalProfileDocument: A personal profile document page. Value type: a single URI value. A PersonalProfileDocument is a Document used to describe properties of the person who is its maker. Just one person is described in the document, i.e., the person who made it and who is its primary topic; the document is made available through the Web. (3) accountServiceHomepage: Indicates a homepage of the service provider for this online account. Value type: a single URI value. (4) depiction: A depiction of something. Value type: a single value; the default is a binary value, which can also be reset to a URI value. A common use of depiction (and depicts) is to indicate the contents of a digital image, for example the people or objects represented in an online photo gallery. The basic notion of 'depiction' could also be extended to deal with multimedia content (video clips, audio). (5) geekcode: A description of the person. Value type: text.
The Geek Code specification provides a somewhat frivolous and willfully obscure mechanism for characterizing technical expertise, interests and habits. (6) interest: The main interests of the person in the social network. Value type: one or more text values separated by a COMMA character. A person's purposes in joining a social network may include finding business partners, friends, etc. (7) topicInterest: A thing of interest to this person. Value type: one or more text values separated by a COMMA character. The purpose is to link a person to something that is a topic of their interest (rather than, as with interest, to a page that is about such a topic). (8) onlineChatAccount: An online chat account. Value type: one or more URI values. An onlineChatAccount is an online account devoted to chat/instant messaging...
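The properties above can be sketched as ordinary vCard content lines. The following is an illustrative assembly of such an entry; the property names (OPENID, INTEREST, ONLINECHATACCOUNT) are assumptions based on the draft's prose, and a -00 draft may well change them:

```python
# Illustrative sketch: assembling a vCard 4.0-style entry carrying the
# draft's social-network properties. Property names here are assumptions
# taken from the draft's prose, not a confirmed final syntax.

def make_social_vcard(fullname, openid, chat_accounts, interests):
    """Return a vCard string with the draft's social-network properties."""
    lines = [
        "BEGIN:VCARD",
        "VERSION:4.0",
        f"FN:{fullname}",
        f"OPENID:{openid}",                 # value type: a single URI value
        f"INTEREST:{','.join(interests)}",  # comma-separated text values
    ]
    for uri in chat_accounts:               # value type: one or more URI values
        lines.append(f"ONLINECHATACCOUNT:{uri}")
    lines.append("END:VCARD")
    return "\r\n".join(lines)               # vCard lines end with CRLF

card = make_social_vcard(
    "Jane Example",
    "https://openid.example/jane",
    ["xmpp:jane@example.org"],
    ["standards", "photography"],
)
print(card)
```

Per the draft, a consuming application could subscribe to these URIs to keep the owner's profile information synchronized across networks.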
Cloud Computing Versus Grid Computing
Judith Myerson, IBM developerWorks
To get cloud computing to work, you need three things: thin clients (or clients with a thick-thin switch), grid computing, and utility computing. Grid computing links disparate computers to form one large infrastructure, harnessing unused resources. Utility computing is paying for what you use on shared servers like you pay for a public utility (such as electricity, gas, and so on). With grid computing, you can provision computing resources as a utility that can be turned on or off. Cloud computing goes one step further with on-demand resource provisioning. This eliminates over-provisioning when used with utility pricing. It also removes the need to over-provision in order to meet the demands of millions of users. A consumer can get service from a full computer infrastructure through the Internet. This type of service is called Infrastructure as a Service (IaaS). Internet-based services such as storage and databases are part of the IaaS. Other types of services on the Internet are Platform as a Service (PaaS) and Software as a Service (SaaS). PaaS offers full or partial application development that users can access, while SaaS provides a complete turnkey application, such as Enterprise Resource Management through the Internet... The IaaS divides into two types of usage: public and private. Amazon EC2 uses public server pools in the infrastructure cloud. A more private cloud service uses groups of public or private server pools from an internal corporate data center. You can use both types to develop software within the environment of the corporate data center, and, with EC2, temporarily extend resources at low cost—say for testing purposes. The mix may provide a faster way of developing applications and services with shorter development and testing cycles. With EC2, customers create their own Amazon Machine Images (AMIs) containing an operating system, applications, and data, and they control how many instances of each AMI run at any given time. 
Customers pay for the instance-hours (and bandwidth) they use, adding computing resources at peak times and removing them when they are no longer needed. The EC2, Simple Storage Service (S3), and other Amazon offerings scale up to deliver services over the Internet in massive capacities to millions of users... This article helps you plan ahead for working with the cloud by knowing how cloud computing compares to grid computing, how you can resolve issues in cloud and grid computing, and what security issues exist with data recovery and managing private keys in a pay-on-demand environment. Potential consumers' demands for increased capacities over the Internet present a challenge for the developers and other members of a project team. Being aware of and resolving the issues of Web application design and potential security issues can make your team's experiences trouble-free.
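The utility-pricing model described above reduces to simple arithmetic: cost scales with the instance-hours and bandwidth actually consumed, so scaling back to zero after a peak ends the charges. A minimal sketch, using hypothetical rates rather than Amazon's actual EC2/S3 prices:

```python
# Minimal sketch of utility ("pay for what you use") pricing as described
# above. The rates are hypothetical placeholders, not Amazon's actual prices.

def utility_bill(instance_hours, gb_transferred,
                 rate_per_hour=0.10, rate_per_gb=0.17):
    """Cost of running on-demand instances plus bandwidth consumed."""
    return instance_hours * rate_per_hour + gb_transferred * rate_per_gb

# Ten instances for a 4-hour peak, then scaled back to zero: the customer
# pays only for the 40 instance-hours actually consumed, with no
# over-provisioned idle capacity left on the bill.
peak_cost = utility_bill(instance_hours=10 * 4, gb_transferred=25)
print(f"${peak_cost:.2f}")  # $8.25 at these example rates
```

The contrast with traditional provisioning is that the second argument to scale is time, not purchased hardware: the same function prices a brief test-cycle burst and a sustained production load.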
Highways Agency Core Components Analysis: A Process for the Harmonisation of Data Concepts
Ian Cornwell and Alastair Dunsmore, White Paper for Review
A posting from Andrew Schoka includes a review document "Highways Agency Core Components Analysis: A Process for the Harmonisation of Data Concepts" - Data Registries Technical Note 233480/TN/16 (March 19, 2008). The poster "welcomes any opportunity to engage in a discussion of this paper with the idea of providing the author feedback as well as possibly applying it to UBL." The paper presents the Highways Agency's Core Components Analysis process as developed by Mott MacDonald and the Highways Agency to encourage harmonisation of data standards, specifications and system interfaces. In the past decades, many systems have been developed in related and overlapping areas. Due to the range of varying requirements and developer preferences, and a lack of standards, there is much diversity in the methods of data documentation and data representation. The result is the duplication of development and data collection, wasting development costs and losing the efficiency of data interoperability and reuse. Harmonisation in this context is the process that increases the alignment of data definitions across related systems, leading to benefits in reuse, in interoperability and in development costs... UN/CEFACT has published the "Core Components Technical Specification", part of the ebXML Framework. The key idea that supports harmonisation is the separation of 'core components', which have no specific business context, from 'business information entities', which apply in specific business contexts... The UK Highways Agency has also derived a process of Core Components Analysis to encourage harmonisation of data concepts. This was initially developed independently from the UN/CEFACT TBG 17 guidance, but produced many similarities to that guidance. However, it differs in scope and in detail from the process applied by UN/CEFACT. The UN/CEFACT process aims to ensure global interoperability. 
The Highways Agency approach is more focussed on incremental improvements to legacy systems and specifications. The process uses an extended ISO 14817 metadata registry implementation..."
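The CCTS separation the paper builds on can be sketched in a few lines: a core component carries no business context, and a business information entity is the same component qualified for one. The names and fields below are illustrative, not drawn from the UN/CEFACT Core Component Library:

```python
# Illustrative sketch of the CCTS idea described above: a 'core component'
# has no specific business context, while a 'business information entity'
# (BIE) is that component qualified for a specific context. Names and
# fields are illustrative, not from the actual Core Component Library.

def make_bie(core_component, qualifier, context):
    """Derive a business information entity from a core component."""
    bie = dict(core_component)   # a BIE restricts and reuses, never redefines
    bie["name"] = f"{qualifier}_ {core_component['name']}"
    bie["context"] = context
    return bie

# A context-free core component...
address_cc = {"name": "Address. Details",
              "properties": ["Street", "City", "Post Code", "Country"]}

# ...qualified into a context-specific BIE for one business domain.
uk_address_bie = make_bie(address_cc, "UK",
                          context={"geopolitical": "United Kingdom"})
print(uk_address_bie["name"])  # UK_ Address. Details
```

Harmonisation benefits fall out of this structure: systems that disagree at the BIE level can still be related through the shared core component they both derive from.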
From the Schoka note: "ISO Technical Committee 204, Intelligent Transport Systems (ITS), recently published a Technical Report entitled 'Intelligent transport systems — Systems architecture — Harmonization of ITS data concepts' as ISO TR25100. The scope of the TR is the harmonization of data concepts that are being managed by data registries and data dictionaries such as those described in ISO 14817:2002, a domain-specific implementation of ISO 11179. The foundation data definition approach is ISO 15000-5, Core Component Technical Specification (CCTS). The TR compares and contrasts a number of approaches to data harmonization, including that of UN/CEFACT TBG17; however, it makes a distinction between inventing core components from requirements and discerning core components from existing systems. A key activity is the ongoing work being done by the UK Highways Agency and their supporting contractor, Mott MacDonald. I had a recent opportunity to interact with a principal investigator of that work and he is soliciting feedback regarding his fundamental approach to data harmonization and whether it should be extended to reflect more detailed formulations of how data elements from different projects can be related. That work would be reflected in an update to ISO TR25100. I have attached his seminal document of core component analysis and invite you to review it. I believe that some of the concepts presented in the paper could well offer benefits to how independent UML implementations could be harmonized amongst themselves as well as with the UN/CEFACT Core Component Library..."
See also: the posting to the UBL TC list
Lack of Standards Could Stymie Smart Grid
Stephanie Condon, CNET News.com
The Department of Energy will be pushing out $4.5 billion for smart grid investments as part of the federal government's economic stimulus plan, but unless smart-grid standards are developed quickly, the government risks wasting its money on soon-obsolete technologies that could be incompatible with one another, regulators and industry representatives warned Congress Tuesday. Integrating information technology into the nation's electric grids could enable consumers to monitor and reduce their electric usage and help electric companies locate and respond to power outages, among other benefits, said Fred Butler, a commissioner on the New Jersey Board of Public Utilities. However, Butler told the Senate Energy and Natural Resources Committee, "if we do not do this correctly, we come in danger of not coming even close to meeting those aspirations." Congress may want to consider withholding money for smart grid demonstration projects or the matching grants promised in the stimulus package until fuller standards are put in place [or] Congress may also want to consider taking action to expand the federal government's authority to enforce smart grid standards... Successfully integrating interoperable smart-grid technology into the electric grid will require standards on a number of issues, including security, reliability, data sharing, and privacy. Standards could be developed for a number of other facets of the smart grid as well, such as charging standards for electric hybrid vehicles and open architecture standards... There are so many standards to consider, said Patrick Gallagher, deputy director of the National Institute of Standards and Technology (NIST), that his organization's primary responsibility is simply prioritizing the order in which standards should be developed. "What's desperately needed is an overall roadmap by which we can decide which standards affect regulatory concerns or technical challenges and need to be addressed right away," he said.
Using E4X on the Server-Side with Jaxer
Michael Galpin, IBM developerWorks
Oracle Expands Enterprise Manager 10g Middleware Duties
Charles Babcock, InformationWeek
Oracle has upgraded its Enterprise Manager 10g software to include more dashboard-like abilities. The software can manage Oracle's JD Edwards, Siebel Systems, PeopleSoft, and other applications and now reaches down into its middleware infrastructure, including WebLogic Server, Oracle's lead application server formerly supplied by BEA Systems... In addition to WebLogic and Fusion middleware, Enterprise Manager can now monitor and manage several additional Oracle products, including Oracle Enterprise Service Bus, its message transformation and connection system; Oracle Coherence, its pooled, in-memory data management system; Oracle Beehive, its collaborative communications and conferencing product; and Oracle BPEL Process Manager, its business process management product based on Business Process Execution Language. Release 5 extends Enterprise Manager's ability to support SOA operations by monitoring the Oracle BPEL Process Manager. It reports on instances of executing BPEL and the speed with which they executed, offering a tool for improving service delivery. Enterprise Manager can deploy the Oracle Enterprise Service Bus and manage the deployment of services to run on it. It provides configuration management for the service bus. Release 5 integrates configuration information from WebLogic Server, Enterprise Service Bus, and BPEL Process Manager, making it easier to resolve configuration issues across an SOA environment... Release 5 allows a manager to administer Beehive's collaborative services individually or as a group. Its management pack has the same auto-discovering, monitoring, analysis, and reporting features as other parts of Enterprise Manager. But it can offer a view of Beehive services from either the centralized manager's approach or the end user's perspective, such as the response time seen by an individual user seeking a Beehive service...
The Coherence pack added to Release 5 allows a manager to run a large set of servers using Coherence as a cluster. It includes auto-discovery of Coherence operations, monitoring, reporting, and events management. The Release 5 upgrade also includes a new management pack for Oracle Communications Billing and Revenue Management that can be used to reduce the cost of running the application.
See also: BPEL references
MODUS: Minimum Open Documents Using Standards
Rick Jelliffe, O'Reilly Technical
The issue of document conformance is of course one of the core and perennial tasks that any standards group deals with. The issue only goes away when the standard dies. And each time there is a new group of stakeholders, the issue may need to be revisited, tweaked or augmented. That is just the nature of the business. Conformance is hard. ISO standards have a constraint that only "verifiable" statements can be made in normative text: no airy-fairy fluff. And I certainly belong to the camp that says that the clauses in IT standards (in particular document standards) should not only be "verifiable" but that they should be objectively and automatically verifiable in standard ways. In other words, standards should limit themselves to constraints that can be expressed in schemas, as much as possible, and schema languages therefore need to be smart enough to cope with the kinds of constraints that do in fact crop up with documents. Hence Schematron: indeed, as I have previously written here, it is even possible to write Schematron schemas that can be converted to ISO standard text... I recently wrote a piece, "Conformance classes should mirror stakeholder usage clusters", and have been tracking and commenting on various issues with ODF and OOXML conformance recently. It seems to me that there is a piece missing. It applies to ODF, OOXML and other standard formats. Rather than describe the problem in much more detail, I think it would be better just to give my first stab at a solution. It certainly has constraints which I think are at a higher level than we can expect schema languages to validate. Hence: MODUS - Minimum Open Documents Using Standards.
A document is a MODUS document when the following constraints are all true: (1) Only international standard container formats (OPC, JFIF, ODF's); (2) Only standard formats used for parts; (3) Only namespaces and values defined in public documentation; (4) All data defined by the standard should be available at least in eponymous standard form; alternative standard formats are allowed, but no extensions or nonstandard formats; (5) All metadata defined by the standard should be available at least in eponymous standard form; alternatives possible and extensions allowed; (6) All data and metadata static, with calculated values cached; (7) Data and metadata should be represented in the most direct way available according to the standard, with reasonable leeway; (8) Appropriate use of accessibility, security and internationalization features. To put this negatively: No non-standard formats; No non-standard parts; No undocumented or proprietary formats, parts, elements, attributes, values, functions; No data only available in non-standard format; No metadata covered by the standard only available in non-standard format; No dynamic or external data or metadata; No obfuscated or convoluted code; No data that unnecessarily creates disabilities, insecurity or disadvantage...
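Some of these constraints are indeed mechanically checkable. A minimal sketch of constraints (2) and (3) against an ODF-style Zip package follows; the part-name and namespace whitelists are illustrative placeholders, not the actual MODUS rules:

```python
import io
import xml.etree.ElementTree as ET
import zipfile

# Sketch of mechanically checking two MODUS-style constraints against an
# ODF-style Zip package: (2) only standard parts, and (3) only namespaces
# defined in public documentation. The whitelists below are illustrative
# placeholders, not normative MODUS lists.

STANDARD_PARTS = {"mimetype", "content.xml", "styles.xml", "meta.xml"}
DOCUMENTED_NS = {"urn:oasis:names:tc:opendocument:xmlns:office:1.0"}

def modus_violations(package_bytes):
    """Return human-readable violations found in a Zip-packaged document."""
    violations = []
    with zipfile.ZipFile(io.BytesIO(package_bytes)) as z:
        for name in z.namelist():           # constraint (2): standard parts only
            if name not in STANDARD_PARTS:
                violations.append(f"non-standard part: {name}")
        if "content.xml" in z.namelist():   # constraint (3): documented namespaces
            root = ET.fromstring(z.read("content.xml"))
            for el in root.iter():
                ns = el.tag[1:].split("}")[0] if el.tag.startswith("{") else ""
                if ns not in DOCUMENTED_NS:
                    violations.append(f"undocumented namespace: {ns}")
    return violations
```

Constraints such as (7), the "most direct representation", are exactly the higher-level kind the piece says we cannot expect schema languages, or simple checkers like this one, to verify.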
XML Daily Newslink and Cover Pages sponsored by:
Sun Microsystems, Inc. http://sun.com
XML Daily Newslink: http://xml.coverpages.org/newsletter.html
Newsletter Archive: http://xml.coverpages.org/newsletterArchive.html
Newsletter subscribe: email@example.com
Newsletter unsubscribe: firstname.lastname@example.org
Newsletter help: email@example.com
Cover Pages: http://xml.coverpages.org/