The OASIS Cover Pages: The Online Resource for Markup Language Technologies
Last modified: June 04, 2007
XML Daily Newslink. Monday, 04 June 2007

A Cover Pages Publication http://xml.coverpages.org/
Provided by OASIS and Sponsor Members
Edited by Robin Cover


This issue of XML Daily Newslink is sponsored by:
BEA Systems, Inc. http://www.bea.com



XML Joins The Force
Trudy Walsh, Government Computer News

Florida is home to about 400 law enforcement agencies, including city, county and university police departments, sheriffs and district attorneys. All of them have their own budgets, computer-aided dispatch systems and record management systems. What Florida's law enforcement community needed was to get better control of its metadata. The Florida Department of Law Enforcement (FDLE) established eight metadata planners, one for each of the state's seven regions plus one for the Department of Corrections. FDLE decided to take the plunge into the National Information Exchange Model (NIEM), an interagency framework for sharing information using Extensible Markup Language (XML), an open standard that allows exchange of information regardless of computer systems or platforms. NIEM builds on the Global Justice XML Data Model, the law enforcement data standard developed by the Justice Department. Florida is the first state to use a relational version of NIEM, Phillips said. A Sypherlink tool is helping Florida develop a statewide data dictionary and a central data warehouse. This has enabled FDLE to do predictive analysis and complex analytics. [NIEM note: "The fundamental building block of NIEM is the data component. Data components are the basic business data elements that represent real-world objects and concepts. Information exchanged between agencies can be broken down into individual components—for example, information about people, places, material things, and events. Sources of data components include data models, databases, data dictionaries, schemas, and exchanges. In NIEM, these objects and constructs are represented using XML Schema for the purpose of consistent definition and transmission of information exchange packages (IEPs). The model, however, is independent of any particular technology and in the future could be depicted in any number of representations."]
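To make the "data component" idea concrete, here is a minimal sketch of consuming a NIEM-style exchange package with Python's standard library. The namespace URI and element names are illustrative stand-ins, not real NIEM vocabulary; actual NIEM schemas define their own namespaces and component names.

```python
import xml.etree.ElementTree as ET

# Hypothetical namespace, loosely modeled on NIEM-style person components.
NS = {"nc": "http://example.org/niem-core"}

doc = """<exchange xmlns:nc="http://example.org/niem-core">
  <nc:Person>
    <nc:PersonName>Jane Doe</nc:PersonName>
    <nc:PersonBirthDate>1970-01-01</nc:PersonBirthDate>
  </nc:Person>
</exchange>"""

root = ET.fromstring(doc)
# Pull each reusable data component out of the exchange package.
people = [
    {
        "name": p.findtext("nc:PersonName", namespaces=NS),
        "birth_date": p.findtext("nc:PersonBirthDate", namespaces=NS),
    }
    for p in root.findall("nc:Person", NS)
]
print(people)
```

Because every agency agrees on the same component definitions, a receiver can extract person, place, or event data this way without knowing anything about the sender's internal record management system.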

See also: the NIEM web site


Draft Approach to Data Classification Moving Forward
Jason Miller, Federal Computer Week

The Homeland Security Department is circulating a draft approach to tagging information in a consistent way across all sectors of government. While DHS works on this approach, the Justice Department last week issued Version 2.0 of the National Information Exchange Model. NIEM will enable cross-domain information sharing so the sender and receiver understand the information in the same way. Kshemendra Paul, DOJ's chief architect, said now that Version 2.0 is available, the goal is to implement it nationwide: "We are seeing an uptick across the state and local governments and the federal environment; we've gone through a lot of change and we are working to generate value. We want the adopters to get business value, which means greater degree of interoperability and building an information exchange at a lower cost." Paul cited several examples of states just beginning to incorporate NIEM. He said Florida has implemented NIEM at its statewide and regional fusion centers, while New York state police use it across the Criminal Justice Information Sharing network.


Compound Information Objects: The OAI-ORE Perspective
Carl Lagoze and Herbert Van de Sompel, OAI Technical Report

Compound information objects are aggregations of distinct information units that when combined form a logical whole, for example, a multi-page web document with an HTML table of contents that points to multiple interlinked HTML individual pages. Information systems such as repository and content management systems provide architectural support for storage of, identification of, and access to compound objects and their aggregated information units, or components. In most systems, the components of an object may vary according to semantic type (article, book, video, dataset, etc.), media type (text, image, audio, video, mixed), and media format (PDF, XML, MP3, etc.). Information systems can leverage the Web architecture to publish the components of a compound object and thereby make them available to web clients and services. But due to the absence of commonly accepted standards, the notion of an identified compound object with a distinct boundary and typed relationships among its component resources is lost. We propose the use of named graphs as a mechanism to address this problem. We describe the concept of the publication of named graphs, issues related to named graph publishing, and the manner in which named graphs can be discovered by web agents and clients. Building on this foundation, the work of OAI-ORE can proceed to specify various modeling issues related to named graphs for compound objects. In addition, various serialization syntaxes need to be evaluated and prototyped. Initially we may choose a single serialization syntax for prototype work, and then expand later to multiple syntaxes. Named graph discovery must be further detailed, and bootstrap vocabularies for relationship typing and node typing need to be defined. Individual communities can then specialize the specifications according to their needs.
These specializations include (a) development of a variety of vocabularies for expressing types of links between resources denoted by the nodes in a named graph; (b) development of a variety of vocabularies for expressing properties of resources denoted by the nodes in a named graph, especially semantic type, media type, and media format.
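As a rough illustration of the named-graph idea, the sketch below models a compound object as a named set of typed edges between component resources. The URIs and relationship terms ("hasPart", "next") are made up for illustration; OAI-ORE's actual vocabularies were still to be defined at the time of this report.

```python
# A named graph describing a compound object: the graph name identifies the
# aggregation as a whole, and typed edges relate its component resources.
graph_name = "http://example.org/aggregations/article-1"
triples = [
    ("http://example.org/article-1/toc.html", "hasPart",
     "http://example.org/article-1/page1.html"),
    ("http://example.org/article-1/toc.html", "hasPart",
     "http://example.org/article-1/page2.html"),
    ("http://example.org/article-1/page1.html", "next",
     "http://example.org/article-1/page2.html"),
]

def related(graph, rel):
    """All (subject, object) pairs connected by a given relationship type."""
    return [(s, o) for s, p, o in graph if p == rel]

# The aggregation's boundary is exactly the set of nodes in the named graph,
# which is what plain web publishing of the components loses.
print(related(triples, "hasPart"))
```

A web agent that discovers the graph can then recover both the object's boundary (its node set) and the typed relationships among components, rather than seeing only isolated linked pages.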

See also: OAI Object Reuse and Exchange


CAM: A Swiss Army Knife for XML Structures
Daniel Rubio, TechTarget.com

Web services' bread and butter consists primarily of exchanging XML structures between applications in order to fulfill interoperability requirements. Prior to such exchanges taking place, most XML data undergoes a series of steps that include validation and transformation, many of which are solved using staple XML approaches like Schemas, DTDs, XPath, and XSL. Up next we will explore yet another technique developed by OASIS which nicely complements many of the aforementioned procedures, its name: CAM (Content Assembly Mechanism). CAM's primary objective is to define, compose and validate XML content through the use of specialized templates which allow the application of contextual business rules to any XML structure. Before we dig into what actually constitutes these contextual business rules and why they are useful, the first thing you should realize about CAM is that it's rooted in the same principles as many other validation and transformation techniques you may already be familiar with—such as using XPath and structures similar to those of Schemas. So in this sense CAM is an easily digestible approach which doesn't require climbing a whole new learning curve. So what is a contextual business rule for XML? It's an elaborate piece of logic which can be applied to raw XML data prior to being used in an actual Web service or application. Such business rules can in turn be used to enforce a particular XML structure (validate) or transform XML fragments to fit a predetermined form... [Note: OASIS recently announced that the Content Assembly Mechanism Specification Version 1.1 has been approved as an OASIS Standard.]
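To show what "contextual" means here, the sketch below mimics the idea of a context-dependent rule ("when this condition holds, require this structure") using plain Python and XPath-style lookups. It is not CAM template syntax; real CAM templates express such rules declaratively inside the template itself.

```python
import xml.etree.ElementTree as ET

# Illustrative rule: a US address must carry a zip element. The element
# names and the rule itself are invented for this example.
doc = ET.fromstring(
    "<order><address><country>US</country><zip>75201</zip></address></order>"
)

def check_us_zip(root):
    """Contextual rule: when country is US, zip becomes mandatory."""
    for addr in root.findall("address"):
        if addr.findtext("country") == "US" and addr.find("zip") is None:
            return False
    return True

print(check_us_zip(doc))  # True for this document
```

A plain schema would have to declare `zip` as either always required or always optional; the contextual rule makes the requirement conditional on the data itself, which is exactly the gap CAM templates aim to fill.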

See also: the CAM CS version balloted


Microsoft Orcas Beta 1 Hints at Killer IDE on the Horizon
Martin Heller, InfoWorld

While the Visual Studio team at Microsoft has been burning the midnight oil for some 18 months to bring us Orcas Beta 1, the CLR (Common Language Runtime) team has been hammering away on .Net Framework 3.5 Beta 1. Happily, all the effort appears to be paying off. Microsoft has three major goals for Orcas: improve developer productivity; manage application lifecycles through TFS (Team Foundation Server); and employ the latest technologies—not just improved support for .Net Framework 3.5, but also WPF (Windows Presentation Foundation), ASP.Net AJAX, and Silverlight. One major improvement is a new design surface for WPF applications. This surface uses the familiar drag, drop, and set properties paradigm, but improves on past designers by displaying the XAML source simultaneously with the graphical design pane. Other designer improvements are a revamped Web designer with better CSS support, and an updated C++ designer that supports the Vista look and feel as well as the thousands of new Vista APIs. The inclusion of Visual Studio Tools for Office in the core product is welcome, but more of a packaging decision than a technical improvement. On the other hand, the Orcas multitargeting facility is something I've wanted for years: I might finally be able to delete my old Visual Studio versions and develop for all versions of the .Net Framework from one IDE. I haven't been able to crash Orcas Beta 1. Download: upward of 5GB of material.


Standard Web Services Stack Remains Elusive SOA Goal
Rich Seeley, SearchWebServices.com

While some vendors say they would like to see everyone agree on a single Web services stack—the protocols used to define, locate, implement and make Web services interact—it does not appear likely to happen. Even advocates of the open source Apache Axis 2.0 Web services stack, which is now part of IBM WebSphere, don't expect all the vendors to settle on one standard stack. Paul Fremantle, co-founder and vice president of technology at open source Web services startup WSO2, a member of the Apache Software Foundation and an evangelist for Axis, said, 'I don't think it will become a standard.' Bradley F. Shimmin, principal analyst of application infrastructure at Current Analysis LLC, agreed that standardization on a single Web services stack is unlikely given competing stacks from different vendors and the heterogeneous environments of most customers. 'I don't think that will ever happen. I don't see how it could happen. It's like assuming that software will never get versioned.' As co-chair of the OASIS Technical Committee standardizing Web Services Reliable Messaging, Fremantle supports standards, but still sees pluses in the reality that there is no Web services stack standard. While Axis 2.0 runs on WebSphere, as well as WebLogic from BEA Systems Inc., and Apache's own Tomcat, and has demonstrated interoperability with Microsoft .NET, Fremantle notes that BEA and JBoss, a division of Red Hat Inc., have chosen to develop their own Web services stacks. BEA offers SALT 1.1, a native Tuxedo Web service stack built on an open-standard SOAP implementation. JBossWS is a JAX-WS compliant Web services stack developed to be part of JBoss' Java EE5 support...

See also: Reliable Messaging


EXI Performance Testing Framework Now Available for Download
WG Members, W3C Testing Framework

Members of W3C's Efficient XML Interchange Working Group have released a framework for evaluating properties of alternate XML formats. The Working Group was chartered to define an alternative encoding of the XML Information Set that addresses the requirements identified by the XML Binary Characterization Working Group, while maintaining the existing interoperability between XML applications and XML specifications. The EXI framework is a testing framework developed by the W3C EXI working group for the purpose of obtaining empirical data about format properties. In this release, the framework can be used to measure Processing Efficiency and Compactness of several XML and binary XML candidates. The download includes over 100 documents ranging from a few bytes to several megabytes and covering over 20 different schemas, taken from the over ten thousand samples used by the Working Group for their own measurements. Results and analysis from this framework for eight candidate binary XML formats are to be published by the EXI Working Group in July 2007. The EXI framework is built on top of another framework called Japex, which provides basic functionality for drawing charts, generating XML and HTML reports, etc. Note that neither the candidates (except for JAXP, which is part of the Java Runtime System) nor Japex are part of this distribution. See the Downloading Dependencies section for more information on how to download additional software components. The EXI framework currently includes drivers for several Java and C/C++ candidates submitted to the EXI WG. The Java drivers use the SAX API; the C/C++ drivers use either a SAX-like API or a typed API (data binding). The framework is designed to run even if some of these candidates are not available on your system.
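As a very rough stand-in for the kind of Compactness measurement the framework automates, the sketch below compares a plain XML document's size with its gzip-compressed size. Real EXI candidates are schema-aware binary encodings rather than general-purpose compressors, so this only illustrates the metric, not any candidate format.

```python
import gzip

# A repetitive XML document of the sort that benefits most from a more
# compact encoding; the element names are invented for this example.
xml_bytes = (
    b"<log>" + b"<entry code='200' path='/index.html'/>" * 200 + b"</log>"
)
compressed = gzip.compress(xml_bytes)
ratio = len(compressed) / len(xml_bytes)
print(f"raw={len(xml_bytes)}B gzip={len(compressed)}B ratio={ratio:.2f}")
```

The framework's drivers do the analogous thing for each candidate encoder, then aggregate the ratios across the 100-plus test documents so candidates can be compared on the same corpus.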

See also: the WG Charter


Ajax-Powered Google Maps Mashup Tutorial
Peter Laird, BEA Dev2Dev Tutorial

A new era of Web development is in full swing, and it is called Web 2.0. This has ushered in a new set of prototypical Web applications, including blogs, wikis, and mashups. Mashups are the focus of this tutorial, and you will see how a sample mashup can be built using a common set of technologies. This set includes JavaScript, Ajax, REST, JSON, and the Google Maps API. As a Web developer, it is important to understand how these tools fit together. A mashup is created when several data sources and services are "mashed up" (combined) to create something new, or add value in some way. In this tutorial I use these tools to easily build what is the ultimate Hello World mashup: a Google mashup. The tutorial shows how to create a Google Maps mashup combining the mapping data supplied by Google Maps with a location data service that you can build yourself. I combine HTML, JavaScript, XMLHttpRequest, the Google API, and JSON to create a working example of a Hello World mashup. While this can be overwhelming the first time through, you will quickly become comfortable with this collection of technologies. Web 2.0 is a major trend in Web application development, and this pattern of creating a mashup can be applied as you work on Web 2.0 projects. To narrow the scope of this tutorial, I look only at the JavaScript approach when building the mashup. JavaScript is no toy: it is a powerful client-side programming language that has certainly come of age. Cross-browser support has improved dramatically as standards have evolved, making it a viable choice. A powerful event mechanism is included to enable JavaScript to respond to user interactions in the browser.
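The location data service the tutorial has you build yourself could be sketched server-side as below; the mashup's XMLHttpRequest fetches this JSON and hands the coordinates to the Google Maps API. This is a hypothetical stand-in, not the tutorial's own code, and the location names and coordinates are invented.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Invented sample data; a real service would query a database.
LOCATIONS = [
    {"name": "Office", "lat": 37.40, "lng": -122.10},
    {"name": "Cafe", "lat": 37.41, "lng": -122.08},
]

class LocationHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Serve the location list as JSON for the browser-side mashup.
        body = json.dumps(LOCATIONS).encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

def make_server(port=8000):
    return HTTPServer(("localhost", port), LocationHandler)

# make_server().serve_forever()  # uncomment to serve the mashup's data
```

On the browser side, the JavaScript callback simply parses the JSON response and drops one map marker per entry, which is the whole "mashing up" step.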


An Introduction to Haskell, Part 1: Why Haskell
Adam Turoff, O'Reilly ONLamp.com

If you are a professional programmer, then Haskell is in your future. In 1987, this statement would have been equally true about Smalltalk. Today, 20 years later, object-oriented programming, class hierarchies, model-view-controller patterns, and other ideas from Smalltalk are now commonplace, even though Smalltalk never saw widespread adoption. Haskell is in a similar situation today. It is an esoteric language, yet stable and reliable enough for serious work. Perhaps you will learn Haskell, or even use Haskell on a real-world project. Even if you don't, ideas from Haskell are slowly finding their way into mainstream languages like Java, C#, C++, Perl, Python, and Ruby. Haskell is poised to have a strong influence over the next 20 years of programming and language design, just as Smalltalk has had an influence during the last 20 years. Haskell is a general purpose, purely functional programming language featuring static typing, higher order functions, polymorphism, typeclasses and monadic effects. If you are an academic or a language designer, these terms may be meaningful. If you are a professional programmer, terms like static typing and polymorphism may sound familiar, even if this definition isn't particularly elucidating. Fundamentally, Haskell is a functional programming language, which means it is based on a form of lambda calculus. Because ideas from the functional programming world are appearing in mainstream languages, it is more important than ever to understand these techniques.


Sponsors

XML Daily Newslink and Cover Pages are sponsored by:

BEA Systems, Inc. http://www.bea.com
IBM Corporation http://www.ibm.com
Primeton http://www.primeton.com
SAP AG http://www.sap.com
Sun Microsystems, Inc. http://sun.com

XML Daily Newslink: http://xml.coverpages.org/newsletter.html
Newsletter Archive: http://xml.coverpages.org/newsletterArchive.html
Newsletter subscribe: newsletter-subscribe@xml.coverpages.org
Newsletter unsubscribe: newsletter-unsubscribe@xml.coverpages.org
Newsletter help: newsletter-help@xml.coverpages.org
Cover Pages: http://xml.coverpages.org/



Document URI: http://xml.coverpages.org/newsletter/news2007-06-04.html  —  Legal stuff
Robin Cover, Editor: robin@oasis-open.org