Depthwise Separable Convolutional ResNet with Squeeze-and-Excitation Blocks for Small-Footprint Keyword Spotting

Menglong Xu, Xiao-Lei Zhang


One difficult problem of keyword spotting is how to miniaturize its memory footprint while maintaining high precision. Although convolutional neural networks have been shown to be effective for the small-footprint keyword spotting problem, they still need hundreds of thousands of parameters to achieve good performance. In this paper, we propose an efficient model based on depthwise separable convolution layers and squeeze-and-excitation blocks. Specifically, we replace the standard convolution with the depthwise separable convolution, which reduces the number of parameters of the standard convolution without significant performance degradation. We further improve the performance of the depthwise separable convolution by reweighting the output feature maps of the first convolution layer with a so-called squeeze-and-excitation block. We compared the proposed method with five representative models in two experimental settings on the Google Speech Commands dataset. Experimental results show that the proposed method achieves state-of-the-art performance. For example, it achieves a classification error rate of 3.29% with 72K parameters in the first experiment, significantly outperforming the comparison methods at a similar model size. It also achieves an error rate of 3.97% with only 10K parameters, slightly better than the state-of-the-art comparison method of a similar model size.
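The parameter saving that motivates the replacement of standard convolutions can be made concrete with a simple count: a k×k standard convolution needs k·k·C_in·C_out weights, whereas a depthwise separable convolution needs only k·k·C_in (depthwise) plus C_in·C_out (pointwise). The sketch below illustrates this arithmetic; the layer sizes are hypothetical examples, not taken from the paper, and biases and batch-norm parameters are ignored.

```python
def standard_conv_params(k, c_in, c_out):
    """k x k standard convolution: every output channel filters every input channel."""
    return k * k * c_in * c_out


def ds_conv_params(k, c_in, c_out):
    """Depthwise separable convolution: one k x k spatial filter per input
    channel, followed by a 1x1 pointwise convolution that mixes channels."""
    depthwise = k * k * c_in   # per-channel spatial filtering
    pointwise = c_in * c_out   # 1x1 cross-channel mixing
    return depthwise + pointwise


if __name__ == "__main__":
    # Hypothetical 3x3 layer with 64 input and 64 output channels.
    std = standard_conv_params(3, 64, 64)  # 36864
    ds = ds_conv_params(3, 64, 64)         # 4672
    print(f"standard: {std}, separable: {ds}, reduction: {std / ds:.1f}x")
```

For this example layer the separable factorization uses roughly an eighth of the weights, which is why stacking such layers keeps the overall footprint in the tens of thousands of parameters rather than hundreds of thousands.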


DOI: 10.21437/Interspeech.2020-1045

Cite as: Xu, M., Zhang, X. (2020) Depthwise Separable Convolutional ResNet with Squeeze-and-Excitation Blocks for Small-Footprint Keyword Spotting. Proc. Interspeech 2020, 2547-2551, DOI: 10.21437/Interspeech.2020-1045.


@inproceedings{Xu2020,
  author={Menglong Xu and Xiao-Lei Zhang},
  title={{Depthwise Separable Convolutional ResNet with Squeeze-and-Excitation Blocks for Small-Footprint Keyword Spotting}},
  year=2020,
  booktitle={Proc. Interspeech 2020},
  pages={2547--2551},
  doi={10.21437/Interspeech.2020-1045},
  url={http://dx.doi.org/10.21437/Interspeech.2020-1045}
}