INTERSPEECH 2006 - ICSLP
We propose Minimum Divergence (MD) as a new error measure for discriminative training. To focus on improving discrimination between any two given acoustic models, we redefine errors in terms of the Kullback-Leibler Divergence (KLD) between them. The new measure can be regarded as a modified version of Minimum Phone Error (MPE), but with a higher resolution than a purely symbol-matching-based criterion. Experimental recognition results show that the new MD-based training yields relative word error rate reductions of 57.8% and 6.1% on the TIDigits and Switchboard databases, respectively, compared with the ML-trained baseline systems. The recognition performance of MD is also consistently better than that of MPE.
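To make the KLD-based error measure concrete, the following is a minimal sketch of the standard closed-form KL divergence between two diagonal-covariance Gaussian densities, the kind of term a divergence between acoustic (HMM state) models would build on. The function name and the diagonal-Gaussian assumption are illustrative; the abstract does not specify the paper's exact divergence computation.

```python
import numpy as np

def kl_gaussian_diag(mu_p, var_p, mu_q, var_q):
    """Closed-form KL(p || q) for two diagonal-covariance Gaussians.

    KL(p || q) = 0.5 * sum_d [ log(var_q/var_p)
                               + (var_p + (mu_p - mu_q)^2) / var_q
                               - 1 ]

    This is a standard result; a model-level divergence (as in
    MD training) would aggregate such terms over states/mixtures.
    """
    mu_p, var_p = np.asarray(mu_p, float), np.asarray(var_p, float)
    mu_q, var_q = np.asarray(mu_q, float), np.asarray(var_q, float)
    return 0.5 * np.sum(
        np.log(var_q / var_p)
        + (var_p + (mu_p - mu_q) ** 2) / var_q
        - 1.0
    )

# KL of a density with itself is zero; otherwise it is positive
# and asymmetric, unlike a 0/1 symbol-match score.
print(kl_gaussian_diag([0.0, 1.0], [1.0, 2.0], [0.0, 1.0], [1.0, 2.0]))
```

Unlike a binary phone-label match, this divergence grows continuously with acoustic dissimilarity, which is the "higher resolution" the abstract refers to.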
Bibliographic reference. Du, Jun / Liu, Peng / Soong, Frank K. / Zhou, Jian-Lai / Wang, Ren-Hua (2006): "Minimum divergence based discriminative training", In INTERSPEECH-2006, paper 1703-Thu2A1O.2.