13th Annual Conference of the International Speech Communication Association

Portland, OR, USA
September 9-13, 2012

Real-time Visualization of English Pronunciation on an IPA Chart Based on Articulatory Feature Extraction

Yurie Iribe (1), Takurou Mori (1), Kouichi Katsurada (1), Goh Kawai (2), Tsuneo Nitta (1)

(1) Graduate School of Engineering, Toyohashi University of Technology, Aichi, Japan
(2) Research Faculty of Media and Communication, Hokkaido University, Hokkaido, Japan

In recent years, Computer Assisted Pronunciation Training (CAPT) systems have been developed that can help Japanese learners study foreign languages. We have been developing a pronunciation training system that evaluates and corrects a learner's pronunciation by extracting articulatory features (AFs). In this paper, we propose a novel pronunciation training system that can plot the place and manner of articulation of a learner's pronunciation on an International Phonetic Alphabet (IPA) chart in real time. First, the proposed system converts input speech into AF sequences by using multi-layer neural networks (MLNs). Then, the AF sequences are converted into x-y coordinates and plotted on an IPA chart to show the learner's articulation in real time. Lastly, we investigate plotting accuracy on the IPA chart through experimental evaluation.
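The mapping from AF sequences to chart coordinates can be illustrated with a minimal sketch. The class inventories, grid layout, and weighting scheme below are illustrative assumptions, not the authors' implementation: a frame's per-class AF scores are turned into x-y coordinates by taking the score-weighted average of each class's position on a simplified IPA consonant grid.

```python
# Minimal sketch (assumption): mapping a consonant articulatory-feature
# vector to x-y coordinates on a simplified IPA consonant chart.
# The place/manner inventories and grid positions are hypothetical.

PLACES = ["bilabial", "alveolar", "velar"]    # chart columns (x axis)
MANNERS = ["plosive", "fricative", "nasal"]   # chart rows (y axis)

def af_to_xy(place_scores, manner_scores):
    """Convert per-class AF scores into chart coordinates by taking
    the score-weighted average of each class's grid index."""
    def expectation(scores):
        total = sum(scores)
        return sum(i * s for i, s in enumerate(scores)) / total
    x = expectation(place_scores)   # 0 = leftmost place column
    y = expectation(manner_scores)  # 0 = topmost manner row
    return x, y

# Example: a frame scored strongly alveolar and fricative (s-like)
x, y = af_to_xy([0.1, 0.8, 0.1], [0.1, 0.85, 0.05])
```

In a real-time setting this computation would run per frame, with the resulting point animated on the chart as the learner speaks.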

Index Terms: pronunciation training, articulatory feature, IPA chart


Bibliographic reference: Iribe, Yurie / Mori, Takurou / Katsurada, Kouichi / Kawai, Goh / Nitta, Tsuneo (2012): "Real-time visualization of English pronunciation on an IPA chart based on articulatory feature extraction", in INTERSPEECH-2012, 1271-1274.