EEG-Based Short-Time Auditory Attention Detection Using Multi-Task Deep Learning

Zhuo Zhang, Gaoyan Zhang, Jianwu Dang, Shuang Wu, Di Zhou, Longbiao Wang


A healthy listener can attend to a single speaker in a multi-speaker scenario; however, people with hearing impairments often lack this ability. Auditory attention detection based on electroencephalography (EEG) is therefore a possible way to help hearing-impaired listeners identify the attended speech. Many previous studies used linear models or deep learning to decode the attended speech, but cross-subject decoding accuracy remains low, especially over short time windows. In this study, we propose a multi-task learning model based on convolutional neural networks (CNN) that simultaneously decodes attention and reconstructs the attended temporal amplitude envelopes (TAEs) within a 2 s window. The experimental results show that, compared to the traditional linear method, both subject-specific and cross-subject decoding performance improved substantially. In particular, cross-subject decoding accuracy rose from 56% to 82% in the 2 s condition of the dichotic listening experiment. Furthermore, analysis of the channel contribution map revealed that the frontal and temporal regions of the brain are more important for the detection of auditory attention. In summary, the proposed method is promising for neuro-steered hearing aids, helping hearing-impaired listeners achieve faster and more accurate attention detection.
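The multi-task idea described above can be illustrated with a minimal numerical sketch: a shared convolutional feature extractor processes a short EEG window and feeds two heads, one producing a binary attention decision and one reconstructing the attended TAE. Everything below is an assumption for illustration only: the channel count, sampling rate, layer sizes, and random weights are hypothetical and do not reflect the paper's actual architecture or trained parameters.

```python
import numpy as np

# Illustrative sketch of a shared-backbone multi-task model for a 2 s
# EEG window. All shapes, filter counts, and weights are assumed, not
# taken from the paper.

rng = np.random.default_rng(0)

N_CHANNELS = 64   # EEG channels (assumed)
N_SAMPLES = 256   # 2 s window at an assumed 128 Hz sampling rate
N_FILTERS = 8     # shared temporal conv filters (assumed)
KERNEL = 17       # temporal kernel length (assumed)

def shared_conv(eeg):
    """Shared backbone: temporal convolution over all channels + ReLU."""
    w = rng.standard_normal((N_FILTERS, N_CHANNELS, KERNEL)) * 0.01
    out_len = N_SAMPLES - KERNEL + 1
    feats = np.empty((N_FILTERS, out_len))
    for f in range(N_FILTERS):
        acc = np.zeros(out_len)
        for c in range(N_CHANNELS):
            # reverse the kernel so np.convolve performs cross-correlation
            acc += np.convolve(eeg[c], w[f, c][::-1], mode="valid")
        feats[f] = np.maximum(acc, 0.0)
    return feats

def classification_head(feats):
    """Task 1: which of two speakers is attended (probability of speaker 1)."""
    pooled = feats.mean(axis=1)                       # global average pooling
    logit = pooled @ rng.standard_normal(N_FILTERS)   # linear readout
    return 1.0 / (1.0 + np.exp(-logit))               # sigmoid

def reconstruction_head(feats):
    """Task 2: reconstruct the attended temporal amplitude envelope."""
    w = rng.standard_normal(N_FILTERS) * 0.1
    return w @ feats    # one envelope value per time step

eeg_window = rng.standard_normal((N_CHANNELS, N_SAMPLES))
feats = shared_conv(eeg_window)
p_speaker1 = classification_head(feats)   # scalar in (0, 1)
envelope = reconstruction_head(feats)     # length 240 envelope estimate
```

In training, the two heads would share the backbone and be optimized jointly, e.g. with a weighted sum of a classification loss and an envelope-reconstruction loss; the envelope task can act as a regularizer that forces the shared features to track the attended speech.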


DOI: 10.21437/Interspeech.2020-2013

Cite as: Zhang, Z., Zhang, G., Dang, J., Wu, S., Zhou, D., Wang, L. (2020) EEG-Based Short-Time Auditory Attention Detection Using Multi-Task Deep Learning. Proc. Interspeech 2020, 2517-2521, DOI: 10.21437/Interspeech.2020-2013.


@inproceedings{Zhang2020,
  author={Zhuo Zhang and Gaoyan Zhang and Jianwu Dang and Shuang Wu and Di Zhou and Longbiao Wang},
  title={{EEG-Based Short-Time Auditory Attention Detection Using Multi-Task Deep Learning}},
  year=2020,
  booktitle={Proc. Interspeech 2020},
  pages={2517--2521},
  doi={10.21437/Interspeech.2020-2013},
  url={http://dx.doi.org/10.21437/Interspeech.2020-2013}
}