Title: MATCH: MULTIMODAL ACCESS TO CITY HELP
Authors: Michael Johnston, Srinivas Bangalore, Gunaranjan Vasireddy
Abstract:
Interfaces to mobile information access devices need to allow users to interact using whichever mode or combination of modes is most appropriate, given their preferences, the task at hand, and the physical and social environment. This paper describes a multimodal application architecture that facilitates rapid prototyping of flexible next-generation multimodal interfaces. Our sample application, MATCH (Multimodal Access To City Help), provides a mobile multimodal speech-pen interface to restaurant and subway information for New York City. Finite-state multimodal language processing technology enables input by pen, by speech, or by integrated combinations of the two. The system also features multimodal generation capabilities, providing speech output synchronized with dynamic graphical displays.