Multisensory Perception of Emotion for Human and Chimpanzee Expressions by Humans

Marina Kawase, Ikuma Adachi, Akihiro Tanaka


We examined how humans perceive multimodal affective expressions in chimpanzees and whether the underlying cognitive systems are similar to those for human expressions. In the experiment, we presented audiovisual stimuli in which the face and voice expressed congruent or incongruent emotions. Participants were instructed to ignore the vocal emotion and to judge the facial emotion. The results showed that for the chimpanzee stimuli, accuracy was marginally lower for incongruent than for congruent stimuli. Moreover, the congruency effect for positive human faces was marginally stronger than for positive chimpanzee faces. The gaze behavior results showed that participants focused on the eye area when presented with positive and negative facial expressions of humans and with negative facial expressions of chimpanzees, whereas they focused on both the eye and mouth areas when presented with positive facial expressions of chimpanzees. Our findings suggest that humans perceive the affective expressions of other species multisensorily, and that the underlying cognitive systems are not similar to those for human expressions.


DOI: 10.21437/AVSP.2017-22

Cite as: Kawase, M., Adachi, I., Tanaka, A. (2017) Multisensory Perception of Emotion for Human and Chimpanzee Expressions by Humans. Proc. The 14th International Conference on Auditory-Visual Speech Processing, 115-118, DOI: 10.21437/AVSP.2017-22.


@inproceedings{Kawase2017,
  author={Marina Kawase and Ikuma Adachi and Akihiro Tanaka},
  title={Multisensory Perception of Emotion for Human and Chimpanzee Expressions by Humans},
  year=2017,
  booktitle={Proc. The 14th International Conference on Auditory-Visual Speech Processing},
  pages={115--118},
  doi={10.21437/AVSP.2017-22},
  url={http://dx.doi.org/10.21437/AVSP.2017-22}
}