Auditory-Visual Speech Processing (AVSP) 2009
University of East Anglia, Norwich, UK
Using event-related potentials (ERPs, Experiment 2) and reaction times (RTs, Experiment 1), the present study examined interlanguage differences between Japanese and English in audiovisual speech perception [1, 2, 3, 4]. There were auditory-only (AO) and congruent auditory-visual (AV) conditions. In Experiment 1, RTs showed opposite tendencies between the English-language (EL) and Japanese-language (JL) groups for the AO-AV relationship: the additional congruent visual information speeded up speech perception for the EL group but slowed it down for the JL group. Thus, the visual influence was facilitatory for the EL group but disruptive for the JL group. In Experiment 2, different ERP patterns were found between the EL and JL groups: whereas the visual influence was sustained (maintained from N1 to P2) in the EL group, it was transient (limited to N1) in the JL group. The ERP and RT data were both consistent with the reported interlanguage differences indicating that JL perceivers use visual information to a lesser extent than EL perceivers do.
Index Terms: native language, audiovisual speech perception, event-related potentials (ERPs)
Bibliographic reference. Hisanaga, Satoko / Sekiyama, Kaoru / Igasaki, Tomohiko / Murayama, Nobuki (2009): "Audiovisual speech perception in Japanese and English: inter-language differences examined by event-related potentials", In AVSP-2009, 38-42.