4th International Conference on Spoken Language Processing
Philadelphia, PA, USA
In this paper, an extension of n-grams is proposed in which the memory of the model (n) is not fixed a priori. Instead, large memories are accepted first, and merging criteria are then applied to reduce the model complexity and to ensure reliable estimations. The results show that the perplexity obtained with x-grams is smaller than that of n-grams. Furthermore, the complexity is smaller than that of trigrams and can approach that of bigrams.
Bibliographic reference. Bonafonte, Antonio / Mariño, José B. (1996): "Language modeling using x-grams", In ICSLP-1996, 394-397.
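The abstract describes a variable-memory n-gram in which long contexts are first collected and then merged with shorter ones when they add little predictive information. The sketch below is not the authors' algorithm; it is a minimal illustration of that general idea, assuming a simple criterion (KL divergence between a context's next-token distribution and that of its shorter suffix) and hypothetical helper names (`train_counts`, `prune_contexts`).

```python
from collections import defaultdict
import math

def train_counts(tokens, max_order):
    """Collect context -> next-token counts for context lengths 0..max_order-1."""
    counts = defaultdict(lambda: defaultdict(int))
    for i, tok in enumerate(tokens):
        for k in range(max_order):
            if i - k < 0:
                break
            ctx = tuple(tokens[i - k:i])  # the k tokens preceding position i
            counts[ctx][tok] += 1
    return counts

def distribution(count_dict):
    """Normalize raw counts into a probability distribution."""
    total = sum(count_dict.values())
    return {w: c / total for w, c in count_dict.items()}

def kl_divergence(p, q, vocab, eps=1e-9):
    """Approximate KL(p || q) over a shared vocabulary, with epsilon smoothing."""
    return sum(p.get(w, eps) * math.log(p.get(w, eps) / q.get(w, eps))
               for w in vocab)

def prune_contexts(counts, threshold):
    """Keep a long context only if its prediction differs enough from its
    shorter suffix context; otherwise merge it into (i.e. rely on) the suffix."""
    vocab = set()
    for nexts in counts.values():
        vocab.update(nexts)
    kept = {}
    for ctx, nexts in counts.items():
        if not ctx:
            kept[ctx] = nexts  # always keep the empty (unigram) context
            continue
        suffix = ctx[1:]  # drop the oldest token: the shorter backing-off context
        p = distribution(nexts)
        q = distribution(counts[suffix])
        if kl_divergence(p, q, vocab) > threshold:
            kept[ctx] = nexts
    return kept
```

Raising the threshold merges more contexts, trading perplexity for a smaller model, which mirrors the complexity/reliability trade-off the abstract describes.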