4th International Conference on Spoken Language Processing
Philadelphia, PA, USA
In this paper, several analyses relating facial motion to perioral muscle behavior and speech acoustics are described. The results suggest that linguistically relevant visual information is distributed over large regions of the face and can be modeled from the same control source as the acoustics.
Bibliographic reference. Vatikiotis-Bateson, E. / Munhall, K. G. / Kasahara, Y. / Garcia, F. / Yehia, H. (1996): "Characterizing audiovisual information during speech", In ICSLP-1996, 1485-1488.