EUROSPEECH 2001 Scandinavia
7th European Conference on Speech Communication and Technology

Aalborg, Denmark
September 3-7, 2001


Towards SMIL as a Foundation for Multimodal, Multimedia Applications

Jennifer L. Beckham (2), Giuseppe Di Fabbrizio (1), Nils Klarlund (1)

(1) AT&T Labs - Research, USA; (2) University of Wisconsin, USA

Rich and interactive multimedia applications, in which audio, video, graphics, and text are precisely synchronized under timing constraints, are becoming ubiquitous. Multimodal applications further extend the concept of user interaction by combining different modalities, such as speech recognition, speech synthesis, and gestures. However, authoring dialog-capable multimodal, multimedia services is a very difficult task. In this paper, we argue that SMIL is an ideal substrate for extending multimedia applications with multimodal facilities. SMIL as it stands, however, is not a general notation for controlling media and input-mode resources. We show that all that is needed is a few natural extensions to SMIL, along with the addition of a simple reactive programming language that we call ReX. Our language is designed to be maximally compatible with existing W3C recommendations through a generic event system based on DOM and an expression language based on XPath.
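To illustrate the kind of declarative timing constraints SMIL expresses, here is a minimal sketch using elements from the SMIL 1.0 recommendation (`par`, `seq`, `audio`, `img`, `text` and the `src`, `dur`, `begin` attributes are standard SMIL; the media file names are hypothetical):

```xml
<smil>
  <body>
    <!-- <par> renders its children in parallel; <seq> renders them in sequence -->
    <par>
      <audio src="welcome.wav"/>           <!-- hypothetical audio prompt -->
      <seq>
        <img src="logo.png" dur="2s"/>     <!-- show a logo for 2 seconds -->
        <text src="menu.txt" begin="2s"/>  <!-- then display the menu text -->
      </seq>
    </par>
  </body>
</smil>
```

Here the audio prompt plays while the visual sequence unfolds, with the menu text timed relative to the start of its `seq` container; extending such documents with input modalities (e.g. speech recognition) is the kind of facility the paper's proposed SMIL extensions and ReX address.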


Bibliographic reference.  Beckham, Jennifer L. / Fabbrizio, Giuseppe Di / Klarlund, Nils (2001): "Towards SMIL as a foundation for multimodal, multimedia applications", in EUROSPEECH-2001, 1363-1366.