RECOApy: Data Recording, Pre-Processing and Phonetic Transcription for End-to-End Speech-Based Applications

Adriana Stan


Deep learning enables the development of efficient end-to-end speech processing applications while bypassing the need for expert linguistic and signal processing features. Yet, recent studies show that good quality speech resources and phonetic transcription of the training data can enhance the results of these applications. In this paper, the RECOApy tool is introduced. RECOApy streamlines the steps of data recording and pre-processing required in end-to-end speech-based applications. The tool implements an easy-to-use interface for prompted speech recording, spectrogram and waveform analysis, utterance-level normalisation and silence trimming, as well as grapheme-to-phoneme conversion of the prompts in eight languages: Czech, English, French, German, Italian, Polish, Romanian and Spanish.
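The utterance-level normalisation and silence trimming mentioned above can be illustrated with a minimal sketch. This is only an assumption about the general technique (peak normalisation plus an amplitude threshold on the signal's edges), not RECOApy's actual implementation; the function names and parameter values are hypothetical.

```python
# Illustrative sketch: peak normalisation and leading/trailing
# silence trimming for a mono utterance stored as a list of
# float samples in [-1, 1]. Not RECOApy's actual code.

def normalise(samples, target_peak=0.95):
    """Scale the utterance so its absolute peak equals target_peak."""
    peak = max(abs(s) for s in samples)
    if peak == 0.0:
        return list(samples)
    gain = target_peak / peak
    return [s * gain for s in samples]

def trim_silence(samples, threshold=0.01):
    """Drop leading/trailing samples whose amplitude is below threshold."""
    start = 0
    while start < len(samples) and abs(samples[start]) < threshold:
        start += 1
    end = len(samples)
    while end > start and abs(samples[end - 1]) < threshold:
        end -= 1
    return samples[start:end]

signal = [0.0, 0.001, 0.2, -0.4, 0.1, 0.0]
trimmed = trim_silence(normalise(signal))
```

In practice such processing would operate on waveform arrays loaded from audio files; the list-based version above only shows the two operations in their simplest form.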

The grapheme-to-phoneme (G2P) converters are deep neural network (DNN)-based architectures trained on lexicons extracted from the Wiktionary online collaborative resource. Because the languages differ in their degree of orthographic transparency, as well as in the number of available phonetic entries, the DNNs' hyperparameters are optimised with an evolution strategy. The phoneme and word error rates of the resulting G2P converters are presented and discussed. The tool, the processed phonetic lexicons and the trained G2P models are made freely available.
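The phoneme error rate used to evaluate G2P converters is conventionally the Levenshtein edit distance between the predicted and reference phoneme sequences, normalised by the reference length; the word error rate counts a word as wrong if any phoneme differs. A minimal sketch of this standard metric (the paper does not specify its exact scoring code, so this is an assumption about the usual definition):

```python
# Hedged sketch: phoneme error rate (PER) as normalised Levenshtein
# distance over phoneme symbols. Illustrative only; not the paper's
# actual evaluation script.

def edit_distance(ref, hyp):
    """Levenshtein distance between two phoneme sequences."""
    prev = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        curr = [i]
        for j, h in enumerate(hyp, 1):
            curr.append(min(prev[j] + 1,              # deletion
                            curr[j - 1] + 1,          # insertion
                            prev[j - 1] + (r != h)))  # substitution
        prev = curr
    return prev[-1]

def per(ref, hyp):
    """Phoneme error rate: edit distance over reference length."""
    return edit_distance(ref, hyp) / len(ref)

# Hypothetical example: one deleted phoneme out of seven.
ref = ["r", "e", "k", "o", "a", "p", "i"]
hyp = ["r", "e", "k", "o", "p", "i"]
```

Averaging `per` over a test lexicon gives the corpus-level PER; the word error rate is the fraction of entries with `edit_distance(ref, hyp) > 0`.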


 DOI: 10.21437/Interspeech.2020-1184

Cite as: Stan, A. (2020) RECOApy: Data Recording, Pre-Processing and Phonetic Transcription for End-to-End Speech-Based Applications. Proc. Interspeech 2020, 586-590, DOI: 10.21437/Interspeech.2020-1184.


@inproceedings{Stan2020,
  author={Adriana Stan},
  title={{RECOApy: Data Recording, Pre-Processing and Phonetic Transcription for End-to-End Speech-Based Applications}},
  year={2020},
  booktitle={Proc. Interspeech 2020},
  pages={586--590},
  doi={10.21437/Interspeech.2020-1184},
  url={http://dx.doi.org/10.21437/Interspeech.2020-1184}
}