4th International Conference on Spoken Language Processing

Philadelphia, PA, USA
October 3-6, 1996

Entropy Coded Vector Quantization with Hidden Markov Models

Tadashi Yonezaki (1), Kiyohiro Shikano (2)

(1) Telecom Research Lab., Matsushita Communication Industrial Co., Ltd., Yokohama, Japan
(2) Graduate School of Information Science, Nara Institute of Science and Technology, Nara, Japan

We propose a new vector quantization approach that combines Hidden Markov Models (HMMs) with an entropy coding scheme. The entropy code is selected according to the speech state modeled by the HMMs, so the proposed approach can adaptively allocate a suitable number of bits to each codeword. This approach achieves a coding gain of about 0.3 dB in cepstrum distance (8-state HMMs); in other words, an 8-bit codebook is represented with an average code length of about 6.5 bits. We also investigate robustness to channel errors. The HMMs and the entropy coding system, which would seem vulnerable to channel errors, are augmented to be robust, so that the influence of channel errors is reduced to one-third.
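The core idea, entropy coding whose codeword lengths depend on the HMM state, can be illustrated with a minimal sketch. The function names and the toy state-conditional distributions below are illustrative assumptions, not taken from the paper: each HMM state is assumed to use the same 8-entry codebook with a differently skewed codeword distribution, so a per-state Huffman code beats the 3-bit fixed-length code on average.

```python
import heapq

def huffman_code_lengths(probs):
    """Huffman code lengths (in bits) for a discrete distribution."""
    # Each heap entry: (probability, tiebreaker, list of symbol indices).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    tiebreak = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:      # merging two subtrees adds one bit
            lengths[s] += 1    # to every codeword inside them
        heapq.heappush(heap, (p1 + p2, tiebreak, s1 + s2))
        tiebreak += 1
    return lengths

def average_code_length(probs, lengths):
    """Expected codeword length under the given distribution."""
    return sum(p * l for p, l in zip(probs, lengths))

# Hypothetical state-conditional codeword distributions (assumed, for
# illustration only): each HMM state favors different codebook entries.
state_dists = {
    "state_0": [0.5, 0.25, 0.125, 0.0625, 0.03125,
                0.015625, 0.0078125, 0.0078125],
    "state_1": [0.0078125, 0.0078125, 0.015625, 0.03125,
                0.0625, 0.125, 0.25, 0.5],
}

for state, probs in state_dists.items():
    lengths = huffman_code_lengths(probs)
    avg = average_code_length(probs, lengths)
    # A fixed-length code for 8 codewords needs 3 bits; the
    # state-conditional entropy code needs fewer bits on average.
    print(f"{state}: {avg:.3f} bits average vs 3 bits fixed")
```

Applied per state in this way, the decoder can recover the variable-length code as long as it tracks the same state sequence as the encoder, which is why channel errors in the state information need the robustness measures the abstract mentions.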


Bibliographic reference. Yonezaki, Tadashi / Shikano, Kiyohiro (1996): "Entropy coded vector quantization with hidden Markov models", in ICSLP-1996, 310-313.