Processing of visuo-auditory prosodic information in cochlear-implanted deaf patients

Pascal Barone, Mathieu Marx, Anne Lasfargues-Delannoy


Linguistic prosody has received little attention in studies of cochlear-implanted (CI) patients. Our study investigated how CI patients discriminate a question from a statement based only on linguistic prosodic cues in three conditions: visual, auditory, and visuo-auditory. The results demonstrate that CI patients do not outperform normal-hearing subjects (NHS) in the visual-only condition, but that they have better multisensory integration skills than controls. During audiovisual stimulation, CI patients fuse the auditory and visual information, enabling better discrimination of prosodic cues. This study confirms the importance of further research into CI patients' prosodic discrimination skills, notably concerning visual prosody and eye-tracking analysis.


DOI: 10.21437/AVSP.2017-16

Cite as: Barone, P., Marx, M., Lasfargues-Delannoy, A. (2017) Processing of visuo-auditory prosodic information in cochlear-implanted deaf patients. Proc. The 14th International Conference on Auditory-Visual Speech Processing, 84-88, DOI: 10.21437/AVSP.2017-16.


@inproceedings{Barone2017,
  author={Pascal Barone and Mathieu Marx and Anne Lasfargues-Delannoy},
  title={Processing of visuo-auditory prosodic information in cochlear-implanted deaf patients},
  year=2017,
  booktitle={Proc. The 14th International Conference on Auditory-Visual Speech Processing},
  pages={84--88},
  doi={10.21437/AVSP.2017-16},
  url={http://dx.doi.org/10.21437/AVSP.2017-16}
}