INTERSPEECH 2011
12th Annual Conference of the International Speech Communication Association

Florence, Italy
August 27-31, 2011

Visual Speech Speeds Up Auditory Identification Responses

Tim Paris, Jeesun Kim, Chris Davis

University of Western Sydney, Australia

Auditory speech perception is more accurate when combined with visual speech. Recent ERP studies suggest that visual speech helps 'predict' which phoneme will be heard via feedback from visual to auditory areas, with more visually salient articulations associated with greater facilitation. Two experiments tested this hypothesis with a speeded auditory identification measure. Stimuli consisted of the sounds 'apa', 'aka' and 'ata', paired with matched and mismatched videos that showed the talker's whole face or upper face (control). The percentage of matched audiovisual (AV) videos was set at 85% in Experiment 1 and 15% in Experiment 2. In both experiments, responses to matched whole-face stimuli were faster than responses to both upper-face and mismatched stimuli. Furthermore, visually salient phonemes ('aPa') showed a greater reduction in reaction times than ambiguous ones ('aKa'). The current study supports the proposal that visual speech speeds up the processing of auditory speech.

Bibliographic reference. Paris, Tim / Kim, Jeesun / Davis, Chris (2011): "Visual speech speeds up auditory identification responses", in INTERSPEECH-2011, pp. 2469-2472.