4th International Conference on Spoken Language Processing

Philadelphia, PA, USA
October 3-6, 1996

Language-model Look-ahead for Large Vocabulary Speech Recognition

S. Ortmanns, Hermann Ney, A. Eiden

Lehrstuhl für Informatik VI, RWTH Aachen, University of Technology, Aachen, Germany

In this paper, we present an efficient look-ahead technique that incorporates language-model knowledge at the earliest possible stage of the search process. This so-called language-model look-ahead is built into the time-synchronous beam-search algorithm using a tree-organized pronunciation lexicon together with a bigram language model. The technique exploits the full knowledge of the bigram language model by distributing its probabilities over the nodes of the lexical tree for each predecessor word. We also present a method for handling the resulting memory requirements. Recognition experiments on the 20 000-word North American Business task (Nov. '94) demonstrate that, compared with unigram look-ahead, the acoustic search effort can be reduced by a factor of 5 without loss in recognition accuracy.
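The core idea of distributing bigram probabilities over the lexical tree can be sketched as follows. This is a minimal illustration, not the authors' implementation: the lexicon, phoneme sequences, bigram values, and names such as `Node` and `fill_lookahead` are all made up for the example. For a fixed predecessor word, each tree node is assigned the maximum bigram probability over all words reachable below it, computed bottom-up, so that pruning during beam search can use a language-model score before a word end is reached.

```python
# Hypothetical sketch of bigram language-model look-ahead over a
# tree-organized pronunciation lexicon. All lexicon entries and
# probabilities below are invented for illustration.

class Node:
    def __init__(self):
        self.children = {}      # phoneme -> Node
        self.word = None        # word ending at this node, if any
        self.lookahead = 0.0    # look-ahead score for the current predecessor

def add_word(root, phonemes, word):
    """Insert a word's pronunciation into the prefix tree."""
    node = root
    for p in phonemes:
        node = node.children.setdefault(p, Node())
    node.word = word

def fill_lookahead(node, bigram):
    """Bottom-up pass: set each node's look-ahead score to the maximum
    bigram probability over all words reachable below (or at) that node."""
    best = bigram.get(node.word, 0.0) if node.word else 0.0
    for child in node.children.values():
        best = max(best, fill_lookahead(child, bigram))
    node.lookahead = best
    return best

# Toy lexicon (invented pronunciations)
root = Node()
add_word(root, ["s", "ih", "t"], "sit")
add_word(root, ["s", "eh", "t"], "set")
add_word(root, ["s", "eh", "l"], "sell")

# Assumed bigram probabilities p(w | predecessor) for one predecessor word
bigram = {"sit": 0.1, "set": 0.3, "sell": 0.05}

fill_lookahead(root, bigram)
s_node = root.children["s"]
print(s_node.lookahead)                 # 0.3 (best word below "s" is "set")
print(s_node.children["ih"].lookahead)  # 0.1 (only "sit" lies below "s ih")
```

Because the scores depend on the predecessor word, one such pass is needed per active predecessor; the paper's memory-handling method addresses the cost of storing these per-predecessor tables.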


Bibliographic reference. Ortmanns, S. / Ney, H. / Eiden, A. (1996): "Language-model look-ahead for large vocabulary speech recognition", in ICSLP-1996, 2095-2098.