The OASIS Cover Pages: The Online Resource for Markup Language Technologies
Last modified: April 27, 2007
XML Daily Newslink. Friday, 27 April 2007

A Cover Pages Publication http://xml.coverpages.org/
Provided by OASIS and Sponsor Members
Edited by Robin Cover


This issue of XML Daily Newslink is sponsored by:
Sun Microsystems, Inc. http://sun.com



W3C Announces New Service Modeling Language (SML) Working Group
Staff, W3C Announcement

W3C has announced the launch of its Service Modeling Language (SML) Working Group as part of the Extensible Markup Language (XML) Activity. John Arwe (IBM) and Pratul Dublish (Microsoft) will chair the Working Group, which is chartered to produce W3C Recommendations for SML, adding extensions to the W3C XML Schema language for inter-document references and user-defined constraints. The first face-to-face meeting will be 11-13 June 2007 in Redmond, Washington, USA, hosted by Microsoft.

This combination of inter-document references and user-defined constraints is very useful in building complex multi-document models that capture structure, constraints, and relationships. In the management domain, such models are typically used to automate configuration, deployment, monitoring, capacity planning, change verification, desired configuration management, root-cause analysis for faults, and so on. The facilities defined by this Working Group are expected to be of general use with arbitrary XML vocabularies, but the first major use of SML will be to model the structure, relationships, and constraints of complex information technology services and systems.

Several common and domain-specific models have been built using the Member Submission version of SML, and many more are under development. Further, several products and services based on SML are expected to ship in the near future. In addition, SML is relevant to other standardization efforts that need SML expression of models. To meet these immediate needs, the Service Modeling Language should be standardized in a timely fashion. The Working Group will therefore be schedule-driven, and the W3C Recommendation for SML shall remain compatible, to the extent possible, with existing SML models. The charter features an aggressive schedule and a tightly constrained scope designed to ensure that the SML Working Group will meet its schedule.
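The inter-document reference mechanism the charter describes can be pictured as a small model fragment. The element names and the namespace URI below are invented placeholders for illustration; only the general sml:ref/sml:uri pattern follows the scheme of the SML Member Submission.

```xml
<!-- Hypothetical model document: a WebService instance whose HostedOn
     element is not inline content but an SML reference to a Host
     instance defined in a separate model document. All names and the
     namespace URI here are placeholders, not taken from the spec. -->
<WebService xmlns:sml="http://example.org/sml-placeholder">
  <Name>OrderEntry</Name>
  <HostedOn sml:ref="true">
    <sml:uri>http://example.org/models/hosts.xml</sml:uri>
  </HostedOn>
</WebService>
```

User-defined constraints (for instance, "every WebService must reference exactly one Host") would then be layered on top of references like this one, which is what makes multi-document models checkable as a whole.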

See also: the recent news story


Human Readable Resource Identifiers
Norman Walsh and Richard Tobin (eds), IETF Internet Draft

The syntactic constraints of IRIs (RFC 3987) and URIs (RFC 3986) mandate that certain common punctuation characters (such as spaces, quotation marks, and various sorts of delimiters) must be percent encoded. However, it is often inconvenient for authors to encode these characters. Historically, XML system identifiers and, more generally, the values of XML attributes that are intended to contain IRIs or URIs have allowed authors to use these characters literally. Several XML-related specifications use strings which are interpreted as IRIs, but which allow the use of characters that must be escaped in a legal IRI, such as delimiters and a few other ASCII characters. Examples include XML System Identifiers, the 'href' attribute in XLink, and XML Base attributes. These specifications all describe, with slightly different wording, the same algorithm for converting such a string to an IRI. The purpose of this RFC is to provide a single definition which can be referenced by these specifications, and to provide a name for strings of this type: Human Readable Resource Identifiers. A Human Readable Resource Identifier is a sequence of Unicode characters that can be converted into an IRI by the application of a few simple encoding rules. Internationalized Resource Identifiers (IRIs) extend URIs by allowing unescaped non-ASCII characters. Human Readable Resource Identifiers go further by allowing various ASCII characters that are illegal in both URIs and IRIs. By escaping these characters, Human Readable Resource Identifiers can be converted to IRIs, which can in turn be converted to URIs if required. [Note: Work on this I-D was done initially in the W3C XML Core Working Group.]
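The "few simple encoding rules" amount to percent-encoding the handful of ASCII characters that an HRRI permits but an IRI does not. A minimal Python sketch, with the character set taken from the draft's general description (space, quotation mark, angle brackets, curly braces, and a few others) and treated here as illustrative rather than normative:

```python
# Convert an HRRI-style string to an IRI by percent-encoding only the
# ASCII characters that are legal in an HRRI but illegal in an IRI.
# The `unsafe` set below is an illustrative reading of the draft.
def hrri_to_iri(hrri: str) -> str:
    unsafe = ' <>"{}|\\^`'
    return "".join(
        "".join(f"%{b:02X}" for b in ch.encode("utf-8")) if ch in unsafe else ch
        for ch in hrri
    )

print(hrri_to_iri("http://example.org/my docs/intro"))
# http://example.org/my%20docs/intro
```

Note that non-ASCII characters pass through untouched: they are already legal in an IRI, and only the subsequent IRI-to-URI step (per RFC 3987) would encode them.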

See also: the xml-core-wg thread


NIH Updates Grants Submission Client
Joab Jackson, Government Computer News

The National Institutes of Health has upgraded its client for electronically checking grant information using the Electronic Business XML (ebXML) standard, according to a posting on the ebXML Forum blog. The NIH Office of Extramural Research upgraded the client, which is designed to allow external systems to interact with NIH's eReceipts Exchange servers through a Web service interface. This version should use less memory and run more quickly. The new S2Sclient uses version 5 of the open source Apache Tomcat application server software and requires version 1.5 of the Java Development Kit. In February 2007, NIH started requiring that all major grant proposals be submitted electronically. The shift marks a major milestone in NIH's transition to receiving all grant applications electronically. NIH began with the electronic submission of Small Business Innovation Research applications last December [2006]. Since that time, NIH has received more than 18,000 unique grant applications. The transition to electronic submission requires that two systems with their own registration and validation processes work together: Grants.gov, the government's single online portal for finding and applying for federal funding, and eRA Commons, the system that allows applicants to interact electronically with NIH. Organizations using forms-based submission will rely on the PureEdge forms viewer provided free of charge by Grants.gov; organizations desiring a system-to-system approach can work with Grants.gov to develop their own data exchange system (XML datastream). NIH Web Services: NIH offers web services for querying the status of grant applications, verifying person information details, updating person information details, and requesting validation response messages. The eRA eXchange is the system for the transfer of grant applications and other grant-related data.
The eXchange enables authorized grantee institution or service-provider systems to transmit grant applications as Extensible Markup Language (XML) data streams through Grants.gov to NIH. Likewise, the eXchange allows NIH to send XML-based data streams to these service providers and institutions. The eXchange uses Simple Object Access Protocol (SOAP) with Attachments (SwA) over Hypertext Transfer Protocol over Secure Sockets Layer (HTTPS).
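A SOAP-with-Attachments exchange of the kind described above starts from an ordinary SOAP envelope POSTed over HTTPS, with the application document carried as additional MIME parts. The sketch below only builds such an envelope; the operation name, namespace, and tracking number are hypothetical placeholders, not NIH's actual schema.

```python
# Build a minimal SOAP 1.1 envelope. In a real SwA exchange this XML
# part would travel as the root part of a multipart/related HTTPS POST,
# with the grant application attached as further MIME parts.
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def build_envelope(payload: ET.Element) -> bytes:
    ET.register_namespace("soap", SOAP_NS)
    envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
    body.append(payload)
    return ET.tostring(envelope, encoding="utf-8")

# Hypothetical status-query payload (not an actual eRA operation):
payload = ET.Element("{urn:example:era}GetApplicationStatus")
ET.SubElement(payload, "{urn:example:era}TrackingNumber").text = "GRANT-0001"
print(build_envelope(payload).decode("utf-8"))
```

In practice the envelope and attachments would be sent with an HTTPS client that supports multipart/related content, which is the part SwA adds on top of plain SOAP.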

See also: e-applications


Why E-Books Are Bound to Fail
Mike Elgan, Computerworld

E-books, those flat electronic tablets designed for reading downloadable, software-based books, are often packed with advanced displays and other leading-edge technology. Every time a new e-book comes out, a ripple of chatter spreads through the gadget enthusiast community. Technology news sites cover such product and research announcements like major news, similar to the announcement of a new iPod or smart phone. Engadget and Gizmodo blog them without fail. Even The New York Times tech columnist David Pogue and The Wall Street Journal tech columnist Walt Mossberg have taken the time to test and review e-books. Companies like Sony, Panasonic, Hitachi and Fujitsu have devoted millions of dollars over the past couple of decades to developing what they hope will be a device that replaces the paper book: the first disruptive shift in the way people read books since the Gutenberg Bible in the 15th century. A lineup of the major e-books (almost) on the market: Sony Reader; eRead StarEBook; Jinke Electronics HanLin eBook; iRex iLiad; Panasonic Words Gear; Bookeen Cybook; Hitachi Albirey; Fujitsu Flepia. Unfortunately, these products, as well as the whole product category, are destined for failure. They're expensive: the hardware costs hundreds of dollars. Worse, books tend not to be hugely discounted in electronic form: [one example] costs $11.20 on Amazon.com, while the same book in electronic format on eBooks.com costs $9.95, so you save $1.25. Another huge barrier to the growth of the e-book market is that everyone already has alternatives: you can read written content on your PC (in fact, you're doing it right now), on tablet PCs, laptops, cell phones and PDAs. [Note: Open XML-delivery formats exist, but that doesn't make the e-book products open. Elgan might have elaborated on the DRM problem: a paper-print book you purchased can be shared with anyone, or re-sold.]


How To Navigate a Sea of SOA Standards
Bob Violino, CIO Magazine

While the potential benefits of SOA are clear, like the ability to reuse existing assets, the standards picture looks anything but settled. Not only did Forrester Research count some 115 standards floating around SOA and Web services in its most recent study on the topic, but it also found that just confirming which vendors support which standards is nearly impossible. Yet CIOs must press ahead with SOA projects in order to meet business needs. Hong Zhang, director and chief architect of IT Architectures and Standards at General Motors, has been balancing the standards dilemma with ongoing SOA work for several years. For its part, General Motors learned in its early SOA efforts to identify which standards were most important to what the company was trying to achieve. For GM today, the most important specs are those that help standardize the interfaces among services across the well-defined service layers (presentation, business process and so on). The next most important are those that help standardize the implementation of the services within each of the service layers. As part of developing its enterprisewide SOA strategy, the company is identifying which of the SOA standards relevant to its needs are mature, which should be monitored and which are mandatory. Among these, GM is looking at WS-I Basic Profile 1.1 for enterprisewide interoperability. After this, the company will be able to make a well-informed decision about which vendors and products to use in its broad rollout of SOA. Another SOA adopter, TD Banknorth, has taken a strategy of prioritizing standards adopted by vendors recognized as market leaders in the SOA space (for example, webMethods) and standards recognized by several key standards organizations.


SAXing up the Markup Validator: from Validator to Conformance Checker
Olivier Thereaux, W3C QA News and Articles

The next version of the W3C Markup Validator is gearing up for an upcoming release, via a beta test period of two weeks or more. This new version has improvements in pretty much every area: reliability of validation results, new features, speed improvements, better UI... Many of the improvements and changes have been made possible by using a number of new or updated tools and libraries... Validation, roughly speaking, is the process of comparing a document written in a certain language against a machine-readable grammar for that language. So, when the validator checks a document written in HTML 4.01 Strict, it doesn't actually know any of the prose that one can find in the HTML 4.01 Specification; it just knows the machine-readable grammar (called a DTD in the case of HTML, and of most markup languages standardized to date). In some ways, that is a good thing: prose can be ambiguous, a DTD is not. But there are some things you cannot define, or enforce, with a DTD: for example, attribute values are defined as being of a certain type (identifier, URI, character data), but the values themselves cannot be enforced with a DTD. As long as the validator remains a validator stricto sensu, this will remain one of its main limitations... What if the validator knew about the prose in the HTML specifications? Would it not solve the problem? It would help. Of course, the validator would no longer be a validator, but would instead enter the realm of conformance checkers. The latest version of the validator can now spot XHTML documents missing the xmlns attribute on the root html element. The technical side of the issue is now closed, but most of the problems are on the road ahead: (1) The wardens of a certain definition of "validator" will probably not be pleased by such a blatant drift from formal validation into conformance checking. (2) Some users of the validator will be puzzled to see their once-validating documents now rejected by the validator.
That is a natural reaction, particularly from users who tend to consider the validator a 'reference', forgetting that any software may have bugs and ignoring the too-often-seen note that 'the validator's XML support has some limitations'. One way to please everyone may be to issue only warnings, not errors...
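The xmlns check described above is easy to picture: parse the document and verify that the root element really is html in the XHTML namespace. A rough Python illustration of that behaviour (a sketch of the check, not the validator's actual code):

```python
# Sketch: flag an XHTML document whose root element is not in the
# XHTML namespace, i.e. whose <html> lacks the xmlns attribute.
import xml.etree.ElementTree as ET

XHTML_NS = "http://www.w3.org/1999/xhtml"

def check_root_namespace(document: str) -> list[str]:
    root = ET.fromstring(document)
    if root.tag != "{%s}html" % XHTML_NS:
        return ["root element %r is not html in the XHTML namespace; "
                "is the xmlns attribute missing?" % root.tag]
    return []

print(check_root_namespace("<html><head/><body/></html>"))   # one warning
print(check_root_namespace(
    '<html xmlns="http://www.w3.org/1999/xhtml"><head/><body/></html>'))  # []
```

A DTD cannot express this rule, since DTDs are not namespace-aware, which is exactly why the check pushes the tool from validation toward conformance checking.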

See also: Validator 0.8.0 upgrade notes


The Open Sourcing of FLEX
Kurt Cagle, O'Reilly Opinion

Adobe announced a few days ago that they would be open sourcing the FLEX API and framework... Adobe and Microsoft have long been engaged in a quiet cold war that has, at its base, control of the way that information is presented: how documents are laid out and fonts are displayed, how vector graphics work in two, three and four dimensions (assuming time as the fourth), how we build user interfaces for everything from game programming to advertisements to forms. Adobe, Microsoft and the W3C have each established differing approaches to this problem of presentation, the first two by creating proprietary standards and technologies on top of them, the last by creating open standards and encouraging others to use these standards to build the technologies. The open-sourcing of Flex represents a fairly dramatic shift in this particular struggle, one which will likely have ramifications lasting for years. The catalyst was the rebranding of XAML/Avalon, first as Windows Presentation Foundation (WPF) and now as the newly rechristened 'Silverlight'. Silverlight has its problems, but it is undeniably powerful, and the combination of XML technology and the support for rich graphics places it squarely in the middle of the presentation space. Moreover, with a formal name comes a greater focus and more money for marketing, and I suspect that the light is finally coming on, not just at the product team level but throughout the company, that Microsoft cannot lose this fight. With the release of the Flex API, Adobe essentially provides a universal framework for vector graphics that works on any platform, not just on Windows. It means that I as a programmer can build applications that are highly performant and work across platforms, can do so from a Linux box or a Mac laptop, and can work well with my XML data streams.
This is about more than just pretty vector graphics; this determines the toolkits that developers will end up using for all of their applications, knowing that they can work just as readily within a browser as within a standalone application... I'd like to say that I think this will be a good thing for SVG, but I cannot figure out how.


IBM Calls for New SOA Registry Standard
Joe McKendrick, ZDNet Blog

IBM spokespeople are saying that the UDDI standard for registries isn't cutting it, and the "time is now" for a new registry standard more focused on today's SOA realities. In the meantime, IBM will be offering a proprietary solution. In a new report in ITWeek, IBM managers state that SOAs have stretched the Universal Description, Discovery and Integration (UDDI) web services standard to the limit, and that it's time for a new standard. Burton Group's Anne Thomas Manes had just issued a report that IBM's WebSphere Service Registry and Repository (WSRR) 6.0.1 doesn't fully support UDDI, the commonly accepted standard behind SOA registries. IBM, however, says that UDDI was originally designed for Web services, which invoke point-to-point connections across the network. (In fact, it was designed to be the "Yellow Pages" of the e-business world.) But what enterprises need now is a registry standard that addresses the building-block, enterprise approach of SOA, Big Blue says. SOAs require different information about services than do Web services, IBM claimed. According to Sunil Murthy, a manager for WebSphere Service Registry and Repository at IBM's Software Group, UDDI will not allow for role-based access to services, does not let companies manage a service's life cycle to enable governance, and does not allow for services to be searched. The IBM representatives quoted in the article could not predict what a new registry standard would or should look like, but said vendors should take their time in sorting things out.

See also: Burton


Selected from the Cover Pages, by Robin Cover

First W3C Working Draft for Mathematical Markup Language (MathML) Version 3.0

W3C has announced the publication of a First Public Working Draft of the Mathematical Markup Language (MathML) Version 3.0. The Working Group was re-chartered to enhance MathML to better support internationalization of mathematics, accessibility, semantic encoding of mathematics, Unicode alignment, and precise control of rendering for print publishing. MathML is an XML application for encoding both mathematical notation and the semantic structure of mathematical content. The goal of MathML is to enable mathematics to be served, received, and processed on the World Wide Web, just as HTML has enabled this functionality for text. According to the specification abstract, MathML can be used to encode both mathematical notation and mathematical content. About thirty-five (35) of the MathML tags describe abstract notational structures, while about another one hundred and seventy (170) provide a way of unambiguously specifying the intended meaning of an expression. Additional chapters discuss how the MathML content and presentation elements interact, and how MathML renderers might be implemented and should interact with browsers. The WD document addresses the issue of special characters used for mathematics, their handling in MathML, their presence in Unicode, and their relation to fonts. While MathML is human-readable, in all but the simplest cases authors use equation editors, conversion programs, and other specialized software tools to generate MathML.
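The notation/content split described above can be seen in a small example: both fragments below encode x² + 1, the first as notation (how it looks), the second as structure (what it means). This is a generic MathML sketch, not an example taken from the Working Draft.

```xml
<!-- Presentation markup: the notational structure of x^2 + 1. -->
<math xmlns="http://www.w3.org/1998/Math/MathML">
  <mrow>
    <msup><mi>x</mi><mn>2</mn></msup>
    <mo>+</mo>
    <mn>1</mn>
  </mrow>
</math>

<!-- Content markup: the intended meaning, plus(power(x, 2), 1). -->
<math xmlns="http://www.w3.org/1998/Math/MathML">
  <apply>
    <plus/>
    <apply><power/><ci>x</ci><cn>2</cn></apply>
    <cn>1</cn>
  </apply>
</math>
```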


Sponsors

XML Daily Newslink and Cover Pages are sponsored by:

BEA Systems, Inc. http://www.bea.com
IBM Corporation http://www.ibm.com
Primeton http://www.primeton.com
SAP AG http://www.sap.com
Sun Microsystems, Inc. http://sun.com

XML Daily Newslink: http://xml.coverpages.org/newsletter.html
Newsletter Archive: http://xml.coverpages.org/newsletterArchive.html
Newsletter subscribe: newsletter-subscribe@xml.coverpages.org
Newsletter unsubscribe: newsletter-unsubscribe@xml.coverpages.org
Newsletter help: newsletter-help@xml.coverpages.org
Cover Pages: http://xml.coverpages.org/



Document URI: http://xml.coverpages.org/newsletter/news2007-04-27.html
Robin Cover, Editor: robin@oasis-open.org