This issue of XML Daily Newslink is sponsored by:
IBM Corporation http://www.ibm.com
- W3C Last Call: Scalable Vector Graphics (SVG) Tiny 1.2 Specification
- Using METS, PREMIS, and MODS Standards for Archiving eJournals
- Take Advantage of Web 2.0 for Next-Generation BPM 2.0
- Using International Standards to Develop a Union Catalogue for Archives in Germany
- In Digital Age, Federal Files Blip Into Oblivion
- Sun Launches Open Suite for SWIFT Supporting SWIFTNet Integration
- Expert System Guns for Google with Semantic Search, Advertising
W3C Last Call: Scalable Vector Graphics (SVG) Tiny 1.2 Specification
Ola Andersson, Robin Berjon (et al., eds), W3C Technical Report
W3C announced that the SVG Working Group has published the Last Call Working Draft of the "Scalable Vector Graphics (SVG) Tiny 1.2 Specification." Comments are welcome through 13-October-2008. SVG is a language for describing two-dimensional graphics and graphical applications in XML. SVG documents can be interactive and dynamic. Animations can be defined and triggered either declaratively (i.e., by embedding SVG animation elements in SVG content) or via scripting. SVG 1.1 is a W3C Recommendation and forms the core of the current SVG developments. Under its current charter, the SVG Working Group continues to develop the SVG 1.x format, producing a modular XML tagset usable in mixed-XML-namespace documents. SVG is a graphics format that has been implemented in viewers and authoring tools and has been adopted by the content authoring community as a replacement for many current uses of raster graphics. SVG 1.1 is suitable for desktop and palmtop systems, while the SVG Mobile Profiles, SVG Basic and SVG Tiny, are suitable for mobile and other resource-limited devices. SVG Tiny 1.2 is a profile of SVG intended for implementation on a range of devices, from cellphones and PDAs to desktop and laptop computers, and consists of all of the features defined within the specification. As a baseline specification, it allows for superset profiles (e.g., SVG Full 1.2), which include all of the Tiny profile but add other features; subset profiles; and special-purpose profiles, which combine some modules from this specification with other features as needed to meet particular industry requirements.
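For illustration, a minimal SVG Tiny 1.2 document with a declaratively defined animation (a hypothetical sketch, not an example taken from the specification) might look like:

```xml
<!-- Hypothetical SVG Tiny 1.2 sketch (invented for illustration) -->
<svg xmlns="http://www.w3.org/2000/svg" version="1.2" baseProfile="tiny"
     viewBox="0 0 100 100">
  <!-- Declarative animation: the circle glides across the canvas
       with no scripting involved -->
  <circle cx="20" cy="50" r="10" fill="navy">
    <animate attributeName="cx" from="20" to="80"
             dur="3s" repeatCount="indefinite"/>
  </circle>
</svg>
```

The same motion could instead be driven from script via the DOM, which is the second triggering mechanism the specification describes.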
The SVG Working Group plans to submit this Last Call specification for consideration as a W3C Proposed Recommendation once the Working Group has demonstrated at least two interoperable implementations for each test in the SVG Tiny 1.2 test suite; furthermore, at least one of the passing implementations must be on a mobile platform. The SVG Working Group, working closely with the developer community, expects to show these implementations by October 2008. This estimate is based on the Working Group's preliminary implementation report. The Working Group expects to revise this report over the course of the Last Call period.
Using METS, PREMIS, and MODS Standards for Archiving eJournals
Angela Dappert and Markus Enders, D-Lib Magazine
As institutions turn towards developing archival digital repositories, many decisions on the use of metadata have to be made. In addition to deciding on the more traditional descriptive and administrative metadata, particular care needs to be given to the choice of structural and preservation metadata, as well as to integrating the various metadata components. This article reports on the use of METS structural, PREMIS preservation and MODS descriptive metadata for the British Library's eJournal system. In order to understand metadata needs, it is helpful to understand the business processes and data structures. eJournals present a difficult domain for two reasons. The first is that eJournals are structurally complex. For each journal title, new issues are released at intervals. They may contain varying numbers of articles and other publishing matter. Articles are submitted in a variety of formats, which might vary from article to article within a single issue. The second reason is that the production of eJournals is outside the control of the digital repository and is done without the benefit of standards for the structure of submission packages, file formats, metadata formats and vocabulary, publishing schedules, errata, etc. As a consequence, systems that handle eJournals need to accommodate a great variety of processes and formats. This article presents a solution that can accommodate the complexity and variety found in eJournals. Fortunately, there has been a substantial amount of work over recent years to define metadata specifications that can support complex cases such as eJournals. The Metadata Encoding and Transmission Standard (METS) provides a robust and flexible way to define digital objects. The Metadata Object Description Schema (MODS) provides ways to describe objects, and builds on the library community's MARC tradition.
Finally, the Preservation Metadata: Implementation Strategies (PREMIS) data dictionary provides ways of describing objects and processes that are essential for digital preservation. These three metadata specifications are all built on an XML foundation. Their user communities and underlying approaches also have much in common. All of them are content-type independent, which makes it possible to define shared usage guidelines for the various content-types held in the archival store... No single existing metadata schema accommodates the representation of descriptive, preservation and structural metadata. This article shows how we use a combination of METS, PREMIS and MODS to represent eJournal Archival Information Packages in a write-once archival system.
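The way the three schemas nest can be sketched with a skeletal METS document (a hypothetical example with invented IDs and file names, not the British Library's actual profile): MODS descriptive metadata sits in a dmdSec, PREMIS preservation metadata in an amdSec, and the structMap ties the issue/article hierarchy to the files:

```xml
<!-- Hypothetical skeleton; IDs and file names are invented for illustration -->
<mets xmlns="http://www.loc.gov/METS/"
      xmlns:mods="http://www.loc.gov/mods/v3"
      xmlns:premis="info:lc/xmlns/premis-v2"
      xmlns:xlink="http://www.w3.org/1999/xlink">
  <!-- Descriptive metadata: MODS wrapped in a METS dmdSec -->
  <dmdSec ID="dmd-article1">
    <mdWrap MDTYPE="MODS">
      <xmlData>
        <mods:mods>
          <mods:titleInfo><mods:title>Example article title</mods:title></mods:titleInfo>
        </mods:mods>
      </xmlData>
    </mdWrap>
  </dmdSec>
  <!-- Preservation metadata: PREMIS wrapped in an amdSec -->
  <amdSec ID="amd-article1">
    <techMD ID="tech-file1">
      <mdWrap MDTYPE="PREMIS">
        <xmlData>
          <premis:object><!-- file-level preservation details --></premis:object>
        </xmlData>
      </mdWrap>
    </techMD>
  </amdSec>
  <!-- The files that make up the article -->
  <fileSec>
    <fileGrp>
      <file ID="file1" MIMETYPE="application/pdf" ADMID="tech-file1">
        <FLocat LOCTYPE="URL" xlink:href="article1.pdf"/>
      </file>
    </fileGrp>
  </fileSec>
  <!-- Structural metadata linking issue and article to metadata and files -->
  <structMap TYPE="logical">
    <div TYPE="issue">
      <div TYPE="article" DMDID="dmd-article1" ADMID="amd-article1">
        <fptr FILEID="file1"/>
      </div>
    </div>
  </structMap>
</mets>
```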
Take Advantage of Web 2.0 for Next-Generation BPM 2.0
Pradip Roychowdhury and Diptiman Dasgupta, IBM developerWorks
To achieve business agility and flexibility, most organizations are adopting BPM as an approach for simulating, modeling, executing, monitoring, and managing key domain-aligned business processes by abstracting away the underlying technical systems, application logic, and other aspects of IT. Some organizations have started this initiative as part of a Service-Oriented Architecture (SOA) transformation of their IT infrastructure (a bottom-up approach), while others use out-of-the-box BPM tooling to first model and simulate the business processes and then deploy them into the runtime environment (a top-down approach). In both cases, organizations face challenges, including multiple and disparate data sources, multiple and isolated business processes, and lengthy and complex integration work. Also, most of the BPM tools available in the market lack support for business users, business-to-business (B2B) scenarios, and end-to-end traceability between the business model and the IT components running in the production environment. This has created a technological understanding gap between business users and IT professionals. Business is best understood by the business analysts and subject-matter experts (SMEs), and a business model created by them should transform easily into reality through effective use of tooling. This article describes how the advent of Web 2.0 and its enabling technologies offers a way forward in this long-disputed matter by reducing business analysts' dependency on their IT counterparts for building and managing process models... Major features of BPM 2.0 include a rich user experience (interfaces use Ajax, RSS feeds, and Representational State Transfer (REST) APIs to provide a reasonably lightweight Web UI), process tagging, and a lightweight integration model.
The products in the BPM 2.0 space, also known as Business Process Management Suites (BPMS), have adopted Business Process Modeling Notation (BPMN) and Business Process Execution Language (BPEL) as modeling and execution standards. Earlier, most workflow-centric BPM products were based on proprietary modeling notations and were incompatible with each other. There were also multiple standards for executable processes, such as XLANG from Microsoft and Web Services Flow Language (WSFL) from IBM. However, with the acceptance of BPMN as the standard notation for modeling executable business processes, this issue has been resolved in BPM 2.0, which has adopted BPEL as the standard for process execution... BPM 2.0 is Web 2.0-centric, which introduces challenges for organizations similar to those of Web 2.0. Previously, there was limited information sharing between departments within an organization, meaning there wasn't much agility in business processes. In BPM 2.0, content no longer remains centralized within the organization. Using Web 2.0, BPM 2.0 increases users' participation in creating, modifying, and managing a business process. To do this, every user has access to business content, and each user can change it, which can make content administration within an organization more decentralized.
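To make the execution standard concrete: a minimal, schematic WS-BPEL 2.0 process skeleton (a hypothetical sketch; a real process would also declare partner link types, variables, and message types in an accompanying WSDL) looks roughly like this:

```xml
<!-- Hypothetical WS-BPEL 2.0 process skeleton; names are invented -->
<process name="ApproveOrder"
         targetNamespace="http://example.org/bpel/approve"
         xmlns="http://docs.oasis-open.org/wsbpel/2.0/process/executable">
  <sequence>
    <!-- Wait for an incoming order request; creates a new process instance -->
    <receive partnerLink="client" operation="submitOrder" createInstance="yes"/>
    <!-- Call out to an approval service -->
    <invoke partnerLink="approvalService" operation="approve"/>
    <!-- Return the outcome to the caller -->
    <reply partnerLink="client" operation="submitOrder"/>
  </sequence>
</process>
```

In a BPMS, a process like this would typically be generated from a BPMN diagram drawn by a business analyst and then deployed to the engine for execution.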
See also: BPEL references
Using International Standards to Develop a Union Catalogue for Archives in Germany
Andres Imhof, D-Lib Magazine
This article reports on a current project of the Bundesarchiv (National Archives of Germany), funded by the Deutsche Forschungsgemeinschaft. The project brings together in a collaborative portal the finding aids for archival records on the SED (Socialist Unity Party of Germany) and the FDGB (Free German Trade Union Federation) held by five eastern German state archives. To accomplish this, the locally available, heterogeneous data formats must be transformed into a common profile of the international standard format for inventories: Encoded Archival Description (EAD). In addition to EAD, Encoded Archival Context (EAC) is applied for the presentation of the provenance of the archival materials, and the Encoded Archival Guide (EAG) is used for the information on the archives themselves... The union catalogue uses Lucene, open source technology from the Apache Software Foundation. The Lucene technology is able to index content in numerous formats. For the union catalogue being created by our project, Extensible Markup Language (XML) documents are processed. For the archival records of the SED and FDGB network, the standardised description format Encoded Archival Description (EAD) is used, which is coded in XML syntax. We use the latest version, EAD 2002, an international standard developed in the USA and now established in Europe as well. EAD 2002 aims to be more compatible with the rules of the General International Standard Archival Description (ISAD(G)), adopted in 2000 by the International Council on Archives (ICA). However, EAD 2002 offers substantially more descriptive elements than ISAD(G). The Bundesarchiv project applies Encoded Archival Context (EAC) for the description of provenance, where provenance can refer to persons, corporate bodies and families. EAC is not yet an agreed standard.
It exists as a 2004 working draft and conforms to the ICA standard for authority files in archives, the International Standard Archival Authority Record for Corporate Bodies, Persons, and Families (ISAAR(CPF)). The EAC Working Group was supported by the LEAF (Linking and Exploring Authority Files) project, in which libraries and archives cooperate. Currently, the EAC Working Group of the Society of American Archivists is in charge of the further development of EAC... Even if a structured, international standard format like EAD is used for the production of online archival inventories, archival content remains heterogeneous compared with the content of libraries. The reasons for this are the differences between library and archive materials, and the differing needs and traditions of libraries and archives. If, in an information portal, a single object with a flat description structure meets a fonds with hierarchically arranged description data, this has to be accounted for conceptually for effective user search and navigation, and to cope equally with the differing qualities of publications and inventories.
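A skeletal EAD 2002 finding aid (a hypothetical sketch with invented titles and numbers, not taken from the Bundesarchiv project) shows how hierarchically arranged description data is encoded, from the fonds-level archdesc down through nested components:

```xml
<!-- Hypothetical EAD 2002 sketch; identifiers and titles are invented -->
<ead>
  <eadheader>
    <eadid>DE-Beispiel-0001</eadid>
    <filedesc>
      <titlestmt><titleproper>Example finding aid</titleproper></titlestmt>
    </filedesc>
  </eadheader>
  <archdesc level="fonds">
    <did>
      <unittitle>Records of an SED district leadership</unittitle>
      <unitdate normal="1946/1989">1946-1989</unitdate>
    </did>
    <!-- dsc holds the hierarchically arranged components of the fonds -->
    <dsc>
      <c01 level="series">
        <did><unittitle>Minutes and protocols</unittitle></did>
        <c02 level="file">
          <did>
            <unitid>Nr. 123</unitid>
            <unittitle>Protocols of the secretariat, 1950</unittitle>
          </did>
        </c02>
      </c01>
    </dsc>
  </archdesc>
</ead>
```

It is this nesting of c01/c02 components that has no counterpart in a flat, single-object library record, which is the integration problem the article describes.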
In Digital Age, Federal Files Blip Into Oblivion
Robert Pear, New York Times
Countless federal records are being lost to posterity because U.S. federal employees, grappling with a staggering growth in electronic records, do not regularly preserve the documents they create on government computers, send by e-mail and post on the Web. Federal agencies have rushed to embrace the Internet and new information technology, but their record-keeping efforts lag far behind. Moreover, federal investigators have found widespread violations of federal record-keeping requirements. Many federal officials admit to a haphazard approach to preserving e-mail and other electronic records of their work. Indeed, many say they are unsure what materials they are supposed to preserve. This confusion is causing alarm among historians, archivists, librarians, Congressional investigators and watchdog groups that want to trace the decision-making process and hold federal officials accountable. With the imminent change in administrations, the concern about lost records has become more acute... Richard Pearce-Moses, a former president of the Society of American Archivists, said, 'My biggest worry is that even with the best and brightest minds working on this problem, the risks are so great that we may lose significant portions of our history.' The Web site of the Environmental Protection Agency lists more than 50 broken links that once connected readers to documents on depletion of the ozone layer of the atmosphere. At least 20 documents have been removed from the Web site of the United States Commission on Civil Rights... For the federal government, the challenge of preserving records grows each month, as employees create billions of e-mail messages. E-mail often replaces telephone conversations and meetings that would not have been recorded in the past. In an effort to save money, federal agencies are publishing fewer reports on paper and posting more on the Web. Increasingly, federal officials use blogs, podcasts and videos to announce and defend their policies. 
Growing numbers of federal employees do government business outside the office on personal computers, using portable flash drives and e-mail services like Google Gmail and Microsoft Hotmail. In the past, clerks put most important government records in central agency files. But record-keeping has become decentralized, and the government has fewer clerical employees. Federal employees say they store many official records on desktop computers, so the records are not managed in a consistent way. Kenneth Thibodeau, director of the electronic records archives program at the National Archives, said that 32 million White House e-mail messages had been preserved as records of the Clinton administration. He expects to receive hundreds of millions from the Bush White House...
Sun Launches Open Suite for SWIFT Supporting SWIFTNet Integration
Staff, Sun Microsystems Announcement
Sun Microsystems has announced the launch of the Sun Open Suite for SWIFT, a complete end-to-end software and hardware infrastructure solution to help corporate treasuries and financial institutions kick-start fast and secure SWIFTNet integration. Corporations, banks and securities firms initiate new SWIFT projects for various reasons. For example, SWIFTNet integration could help simplify government regulation compliance efforts, streamline industry initiatives and speed replacement of legacy software. Whether starting new SWIFT projects or consolidating existing SWIFTNet access points, the Sun Open Suite for SWIFT solution could help minimize implementation risk and cost, simplify adoption of new SWIFT technology standards and speed deployment of SWIFT FIN-based message flows and new XML-based SWIFT Solutions. In order to connect to SWIFT, organizations need an integration solution that enables back-office applications to communicate with SWIFT. This allows for automation and standardization of financial transactions, thereby lowering costs, reducing operational risks and eliminating inefficiencies... The Open Suite for SWIFT solution comprises Sun's leading-edge financial services products (the Sun Java Composite Application Platform Suite (Java CAPS), hardware, operating systems, and high-availability clustering options) and a partner network for rapid deployment, all brought together to suit the specific needs of SWIFT participants, including those preparing to join SWIFT and those consolidating or upgrading current SWIFT infrastructure. Java CAPS was recently certified with the SWIFTReady Financial EAI Gold Label for the tenth consecutive year, confirming that the Sun solution is fully compliant with the current requirements of the SWIFT network. 
[Also] Grupo Acotel has announced that it will launch new SWIFT-ready services to help corporate and bank customers in Spain better manage financial communications via the Society for Worldwide Interbank Financial Telecommunication (SWIFT) platform. Based on the Sun Microsystems Open Suite for SWIFT, the Acotel solution is a cost-effective, fully managed service that allows customers to process payments and facilitate SWIFT communications including account statements, payments, transfers and debit services without the need to set up, integrate and support complex infrastructure.
See also: SWIFT and ISO 15022
Expert System Guns for Google with Semantic Search, Advertising
Clint Boulton, eWEEK
Would-be Google AdSense rival Expert System launches its Cogito Semantic Advertiser tool, which discerns the meaning and context of words to provide more relevant ads. By leveraging semantic technologies, Expert System joins a cadre of search providers that includes Microsoft-owned Powerset, Hakia, Yedda and Zoomix... Expert System, which makes software that interprets information in text, has launched Cogito Semantic Advertiser, a tool that, when paired with Expert's semantic search engine, processes the meaning of text to ensure that an ad's placement is relevant to its assigned Web pages. Expert System is the latest vendor trying to nibble at Google's search advertising market share by offering an alternative to Google AdSense, which uses keyword frequency to place ads. Another upstart, Proximic, debuted earlier in 2008 to attack Google on this front, but Proximic leverages a contextual ad-based approach without semantics. So, what problem is Expert System trying to solve? J. Brooke Aker, CEO of Italy-based Expert System's U.S. subsidiary, showed how Google's AdSense program will occasionally place misleading or inappropriate ads for certain articles. For example, next to a New York Times science story on a jaguar, the Google AdSense algorithm picked out ads for Jaguar automobiles. In another article on an airline disaster, AdSense showed ads about low airfare rates for vacations. The algorithm didn't intend to provide results in poor taste, of course, and neither did Google's human engineers. The problem is that AdSense relies on keyword frequency but doesn't drill down into the semantics—the meaning in the words. Cogito Semantic Advertiser attempts to go further by using semantic intelligence to analyze the text on each page and ensure that ads are placed appropriately to increase click-through rates. 
Cogito Semantic Advertiser understands content through four key methodologies: studying the morphology of words; looking at parts of speech; analyzing sentence logic, i.e., reducing sentences to subject, verb and object; and disambiguation, which in the case of the jaguar story paired with the Jaguar car ad would determine whether the text referred to a car or an animal. Cogito Semantic Advertiser also provides details on actual use of specific Web sites, so companies can tailor their messages and coordinate the timing of ad placements.
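The disambiguation step can be illustrated with a toy sketch (plain Python with hand-made sense and cue-word lists; this shows only the general idea, not Expert System's actual Cogito algorithm): score each candidate sense of an ambiguous word by how many of its cue words co-occur on the page, and pick the best-scoring sense before matching ads.

```python
# Toy word-sense disambiguation by context overlap -- an illustration of
# the general idea, not Expert System's Cogito algorithm. The sense
# inventory and cue words below are invented for this example.

SENSES = {
    "jaguar": {
        "animal": {"habitat", "prey", "rainforest", "species", "cat"},
        "car": {"engine", "sedan", "dealer", "horsepower", "luxury"},
    }
}

def disambiguate(word, context_words):
    """Pick the sense whose cue words overlap most with the page text."""
    scores = {
        sense: len(cues & set(context_words))
        for sense, cues in SENSES[word].items()
    }
    return max(scores, key=scores.get)

# A page about the animal: keyword frequency alone sees only "jaguar",
# but the surrounding words settle which sense is meant.
page = ["the", "jaguar", "stalks", "prey", "through", "the",
        "rainforest", "habitat"]
print(disambiguate("jaguar", page))  # -> animal
```

A real system would draw its sense inventory and cues from a large semantic network rather than a hand-built dictionary, but the matching principle is the same: the ad is keyed to the resolved sense, not the raw keyword.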
XML Daily Newslink and Cover Pages sponsored by:
Sun Microsystems, Inc. http://sun.com
XML Daily Newslink: http://xml.coverpages.org/newsletter.html
Newsletter Archive: http://xml.coverpages.org/newsletterArchive.html
Newsletter subscribe: email@example.com
Newsletter unsubscribe: firstname.lastname@example.org
Newsletter help: email@example.com
Cover Pages: http://xml.coverpages.org/