Auditory-Visual Speech Processing (AVSP) 2010

Hakone, Kanagawa, Japan
September 30-October 3, 2010

Infants Match Auditory and Visual Speech in Schematic Point-Light Displays

Christine Kitamura, Jeesun Kim

MARCS Auditory Laboratories, University of Western Sydney, Australia

Infants’ sensitivity to visual prosodic motion in infant-directed speech was examined by testing whether 8-month-olds can match an audio-only sentence with its visual-only schematic point-light display. The visual stimuli were sentence pairs of equal duration but unequal syllable number, recorded using Optotrak. Twelve of the fourteen 8-month-olds tested looked longer at visual speech motion that matched the audio version of a sentence. This result suggests that infants can perceive the underlying speech gestures signalled by schematic point-light displays and, more importantly, that they are sensitive to, and able to extract, the syllable structure of speech from a talker’s moving face and head.

Index Terms: infant perception, head and face motion, A-V matching


Bibliographic reference. Kitamura, Christine / Kim, Jeesun (2010): "Infants match auditory and visual speech in schematic point-light displays", in AVSP-2010, paper S6-1.