An announcement from the SALT Forum describes the contribution of the Speech Application Language Tags (SALT) specification to the World Wide Web Consortium (W3C). The SALT Forum has "asked the W3C Multimodal Interaction Working Group and Voice Browser Working Group to review the SALT specification as part of their development of standards for promoting multimodal interaction and voice-enabling the Web." The contribution is said to further the SALT Forum's goal of "establishing an open, royalty-free standard for speech-enabling multimodal and telephony applications."

On July 15, 2002, the SALT Forum announced the public release of SALT Version 1.0. Version 1.0 of the SALT specification covers three broad areas of capability: speech output, speech input and call control. The specification's 'prompt' tag allows SALT-based applications to play audio and synthetic speech directly, while the 'listen' and 'bind' tags provide speech recognition capabilities by collecting and processing spoken user input. In addition, the specification's call control object can be used to give SALT-based applications the ability to place, answer, transfer and disconnect calls, along with advanced capabilities such as conferencing. The SALT specification thus "defines a set of lightweight tags as extensions to commonly used Web-based markup languages. This allows developers to add speech interfaces to Web content and applications using familiar tools and techniques. The SALT specification is designed to work equally well on a wide variety of computing and communicating devices."
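As an illustration of how these tags compose, the following is a minimal sketch in the style of the SALT 1.0 examples, not text from the specification itself: the namespace URI, element IDs, grammar file name and result path are assumptions made for this example. It plays a synthesized prompt, recognizes the user's spoken answer against a grammar, and binds the result into an ordinary HTML form field.

```html
<html xmlns:salt="http://www.saltforum.org/2002/SALT">
  <body onload="askCity()">
    <!-- HTML form field that will receive the recognized value -->
    <form id="travel">
      <input name="txtBoxCity" type="text" />
    </form>

    <!-- Speech output: play a synthesized prompt -->
    <salt:prompt id="promptCity">Which city would you like to fly to?</salt:prompt>

    <!-- Speech input: recognize against a grammar and copy the result into
         the HTML field via bind (grammar file and XPath are hypothetical) -->
    <salt:listen id="recoCity">
      <salt:grammar src="cities.grxml" />
      <salt:bind targetelement="txtBoxCity" value="//city" />
    </salt:listen>

    <script type="text/javascript">
      // Start the prompt, then activate recognition
      function askCity() {
        promptCity.Start();
        recoCity.Start();
      }
    </script>
  </body>
</html>
```

Because the tags are simple extensions to the host markup, the same page can be driven by keyboard and mouse when speech is unavailable, which is the multimodal pattern the specification targets.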
From the SALT Forum announcement:
There is a growing interest in multimodal applications, including the use of speech technologies to expand the reach of Web content. "Many companies within the speech technology industry view standards as key to speeding acceptance of speech-enabled multimodal applications," said Bill Meisel, president of TMA Associates. "The contribution of the SALT specification to the W3C gives the group a robust starting point for creating a single standard for multimodal applications, which is ultimately in the best interest for the speech industry."
"We respect the standards efforts of the W3C and are pleased to bring the SALT specification to W3C Working Groups for their consideration," said SALT Forum representative Martin Dragomirecky. "By making a comprehensive royalty-free contribution we hope to accelerate their efforts targeting a new class of mobile devices that support multiple modes of interaction."
The SALT Forum brings together a diverse group of companies sharing a common interest in developing and promoting speech technologies for multimodal and telephony applications. Founded in 2001 by Cisco, Comverse, Intel, Microsoft, Philips and SpeechWorks, the Forum seeks to develop a royalty-free standard that augments existing XML-based markup languages to provide spoken access to many forms of content through a wide variety of devices.
W3C Multimodal Interaction Activity: "The Multimodal Interaction Activity is extending the Web user interface to allow multiple modes of interaction, offering users the choice of using their voice, or the use of a key pad, keyboard, mouse, stylus or other input device. For output, users will be able to listen to spoken prompts and audio, and to view information on graphical displays. The Working Group is developing markup specifications for synchronization across multiple modalities and devices with a wide range of capabilities. The specifications should be implementable on a royalty-free basis."
W3C Voice Browser Activity: "The W3C Voice Browser working group is defining a suite of markup languages covering dialog, speech synthesis, speech recognition, call control and other aspects of interactive voice response applications. VoiceXML is a dialog markup language designed for telephony applications, where users are restricted to voice and DTMF (touch tone) input. The other specifications are being designed for use in a variety of contexts, and not just with VoiceXML. Further work is anticipated on enabling their use with other W3C markup languages such as XHTML, XForms and SMIL. This will be done in conjunction with other W3C working groups, including the new Multimodal Interaction Activity..."
Principal references:
- Announcement 2002-08-13: "SALT Forum Contributes Speech Application Language Tags Specification Version 1.0 to World Wide Web Consortium. Contribution Furthers Goal of Establishing Open, Royalty-Free Standard for Speech-Enabling Multimodal and Telephony Applications."
- "SALT Forum Submits Multimodal Spec to W3C." By Ephraim Schwartz. In InfoWorld (August 13, 2002).
- "SALT Forum Submits Spec to W3C." By Dennis Callaghan. In eWEEK (August 13, 2002).
- W3C Multimodal Interaction Activity
- W3C Voice Browser Activity
- Speech Application Language Tags (SALT) Forum website
- SALT information from Microsoft
- SALT News from Microsoft
- "SALT Forum Publishes Speech Application Language Tags (SALT) Version 1.0." News item July 17, 2002.
- "Speech Application Language Tags (SALT)" - Main reference page.