EUROSPEECH 2001 Scandinavia
7th European Conference on Speech Communication and Technology
2nd INTERSPEECH Event

Aalborg, Denmark
September 3-7, 2001


Scaled Likelihood Linear Regression for Hidden Markov Model Adaptation

Frank Wallhoff, Daniel Willett, Gerhard Rigoll

Gerhard-Mercator-University Duisburg, Germany

In the context of continuous Hidden Markov Model (HMM) based speech recognition, linear regression approaches have become popular for adapting the acoustic models to a specific speaker's characteristics. The well-known Maximum Likelihood Linear Regression (MLLR) and Maximum A Posteriori Linear Regression (MAPLR) are two such approaches, differing primarily in the training objective they maximize. Besides these, there exists another well-known training objective, Maximum Mutual Information (MMI). By combining the MMI objective with a linear regression of the HMM mean values, our research group developed a new adaptation technique that we call Scaled Likelihood Linear Regression (SLLR). In this approach, the correct model sequence is discriminated framewise against the competing ones. Like all techniques using MMI objectives, this adaptation is computationally very expensive compared to techniques using ordinary ML-based objectives. This paper therefore addresses the problem of an appropriate approximation technique to speed up the adaptation by pruning the computation for tiny values in the discrimination objective. To further explore the potential of this adaptation technique and its approximation, the performance is measured on the LVCSR system DUDeutsch, developed by our research group at Duisburg University, and additionally on the 1993 WSJ adaptation tests of native and nonnative speakers for the supervised case.
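The framewise discrimination described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: all names are hypothetical, diagonal-covariance Gaussians stand in for the full acoustic model, and the per-frame state posterior plays the role of the scaled likelihood. Each frame contributes an MMI-style weight of +1 for the correct state minus its posterior, and minus the posterior for every competing state; the pruning approximation zeroes contributions below a threshold before they enter the transform accumulators.

```python
import numpy as np

def sllr_frame_weights(frames, means, variances, state_seq, prune=1e-4):
    """Framewise scaled-likelihood (MMI-style) weights with pruning.

    Sketch under simplifying assumptions; names are hypothetical.
    frames: (T, D) observations; means, variances: (S, D) diagonal
    Gaussians, one per state; state_seq: correct state index per frame.
    Returns a (T, S) weight matrix with tiny entries pruned to zero.
    """
    T, _ = frames.shape
    S = means.shape[0]
    weights = np.zeros((T, S))
    for t in range(T):
        x = frames[t]
        # log-likelihood of the frame under every state's Gaussian
        ll = -0.5 * np.sum((x - means) ** 2 / variances
                           + np.log(2.0 * np.pi * variances), axis=1)
        # "scaled likelihood": each state's posterior given the frame
        post = np.exp(ll - ll.max())
        post /= post.sum()
        # MMI-style weight: +1 for the correct state minus its posterior,
        # minus the posterior alone for all competing states
        w = -post
        w[state_seq[t]] += 1.0
        # pruning approximation: drop contributions too tiny to matter
        w[np.abs(w) < prune] = 0.0
        weights[t] = w
    return weights
```

In an SLLR-style update these weights would replace the ML occupation counts in the usual MLLR accumulators for estimating the mean transform; without pruning, the weights of each frame sum to zero, so only frames where the correct and competing states genuinely disagree contribute.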


Bibliographic reference.  Wallhoff, Frank / Willett, Daniel / Rigoll, Gerhard (2001): "Scaled likelihood linear regression for hidden Markov model adaptation", in EUROSPEECH-2001, 1229-1232.