The OASIS Cover Pages: The Online Resource for Markup Language Technologies
Created: August 14, 2002.
News: Cover Stories

SALT Forum Contributes Speech Application Language Tags Specification to W3C.

An announcement from the SALT Forum describes the contribution of the Speech Application Language Tags (SALT) specification to the World Wide Web Consortium (W3C). The SALT Forum has "asked the W3C Multimodal Interaction Working Group and Voice Browser Working Group to review the SALT specification as part of their development of standards for promoting multimodal interaction and voice-enabling the Web." The contribution is said to further the SALT Forum's goal of "establishing an open, royalty-free standard for speech-enabling multimodal and telephony applications."

On July 15, 2002 the SALT Forum announced the public release of SALT Version 1.0. Version 1.0 of the specification covers three broad areas of capability: speech output, speech input, and call control. The specification's 'prompt' tag allows SALT-based applications to play audio and synthetic speech directly, while the 'listen' and 'bind' tags provide speech recognition capabilities by collecting and processing spoken user input. In addition, the specification's call control object can be used to give SALT-based applications the ability to place, answer, transfer, and disconnect calls, along with advanced capabilities such as conferencing. The SALT specification thus "defines a set of lightweight tags as extensions to commonly used Web-based markup languages. This allows developers to add speech interfaces to Web content and applications using familiar tools and techniques. The SALT specification is designed to work equally well on a wide variety of computing and communicating devices."
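The tags described above can be sketched in a short fragment. The element names ('prompt', 'listen', 'bind') come from the announcement; the surrounding page structure, namespace URI, grammar file, and form field names are illustrative assumptions rather than excerpts from the specification:

```xml
<!-- Hypothetical sketch of SALT extending an HTML page.
     Namespace URI, grammar source, and field names are illustrative. -->
<html xmlns:salt="http://www.saltforum.org/2002/SALT">
  <body>
    <form id="travel">
      <input name="city" type="text" />
    </form>

    <!-- 'prompt' plays synthetic speech or audio output to the user -->
    <salt:prompt id="askCity">
      Which city are you flying to?
    </salt:prompt>

    <!-- 'listen' collects spoken input against a grammar;
         'bind' copies the recognition result into the HTML form field -->
    <salt:listen id="getCity">
      <salt:grammar src="cities.grxml" />
      <salt:bind targetelement="city" value="//city" />
    </salt:listen>
  </body>
</html>
```

In a running page these speech objects would be activated from script (e.g., starting the prompt and listen objects in sequence); the fragment above shows only the declarative markup, which is the "lightweight tags as extensions to commonly used Web-based markup languages" idea the announcement describes.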

From the SALT Forum announcement:

There is a growing interest in multimodal applications, including the use of speech technologies to expand the reach of Web content. "Many companies within the speech technology industry view standards as key to speeding acceptance of speech-enabled multimodal applications," said Bill Meisel, president of TMA Associates. "The contribution of the SALT specification to the W3C gives the group a robust starting point for creating a single standard for multimodal applications, which is ultimately in the best interest for the speech industry."

"We respect the standards efforts of the W3C and are pleased to bring the SALT specification to W3C Working Groups for their consideration," said SALT Forum representative Martin Dragomirecky. "By making a comprehensive royalty-free contribution we hope to accelerate their efforts targeting a new class of mobile devices that support multiple modes of interaction."

The SALT Forum brings together a diverse group of companies sharing a common interest in developing and promoting speech technologies for multimodal and telephony applications. Founded in 2001 by Cisco, Comverse, Intel, Microsoft, Philips and SpeechWorks, the Forum seeks to develop a royalty-free standard that augments existing XML-based markup languages to provide spoken access to many forms of content through a wide variety of devices.

W3C Multimodal Interaction Activity: "The Multimodal Interaction Activity is extending the Web user interface to allow multiple modes of interaction, offering users the choice of using their voice, or the use of a key pad, keyboard, mouse, stylus or other input device. For output, users will be able to listen to spoken prompts and audio, and to view information on graphical displays. The Working Group is developing markup specifications for synchronization across multiple modalities and devices with a wide range of capabilities. The specifications should be implementable on a royalty-free basis."

W3C Voice Browser Activity: "The W3C Voice Browser working group is defining a suite of markup languages covering dialog, speech synthesis, speech recognition, call control and other aspects of interactive voice response applications. VoiceXML is a dialog markup language designed for telephony applications, where users are restricted to voice and DTMF (touch tone) input. The other specifications are being designed for use in a variety of contexts, and not just with VoiceXML. Further work is anticipated on enabling their use with other W3C markup languages such as XHTML, XForms and SMIL. This will be done in conjunction with other W3C working groups, including the new Multimodal Interaction Activity..."

Document URI: http://xml.coverpages.org/ni2002-08-14-a.html  —  Legal stuff
Robin Cover, Editor: robin@oasis-open.org