Auditory-Visual Speech Processing (AVSP) 2010

Hakone, Kanagawa, Japan
September 30-October 3, 2010

Brain Regions Differentially Involved with Multisensory and Visual only Speech Gesture Information

Daniel E. Callan

ATR Neural Information Analysis Laboratories, Kyoto, Japan

In this study, a vowel identification task controlling for intelligibility confounds was conducted using audio-visual stimuli at multiple signal-to-noise levels, as well as visual-only stimuli, to investigate the neural processes underlying the use of visual gesture information in speech perception. The fMRI results suggest that visual speech gesture information may facilitate speech perception through multiple distinct neural processes: multisensory integration in the superior temporal gyrus/sulcus (STG/S) and internal simulation of speech production in premotor cortex (PMC).

Index Terms: multisensory integration, internal model, fMRI, superior temporal gyrus/sulcus (STG/S), premotor cortex (PMC)

Bibliographic reference: Callan, Daniel E. (2010): "Brain regions differentially involved with multisensory and visual-only speech gesture information", in AVSP-2010, paper S5-1.