A Recursive form of the Markov Model (RMM) is defined in which every state can itself be modeled by a Markov Model. It is a hierarchic model that can be used to describe analytically the structure of human speech at all possible levels: syntactic-semantic, lexical, phonetic, and sub-phonetic. This model is a generalization of the Hidden Markov Model. A recursive form of the Forward-Backward algorithm is derived which can be used for training with the new model. A new scaling approach is introduced to make state pruning possible and to prevent numerical underflow without resorting to logarithms (as is usual in HMMs). The same improvement is possible with the Viterbi algorithm for recognition. To show the relation with a conventional Hidden Markov Model, an example RMM is given that is shown to be identical to an HMM. Simulation results are also given for a practically useful example.
Keywords: Speech Recognition, Hidden Markov Model, Forward-Backward algorithm
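The underflow problem the abstract refers to arises because forward probabilities shrink exponentially with the length of the observation sequence. As a point of comparison for the paper's scaling approach, the sketch below shows the standard per-frame scaling of the forward recursion for a conventional (non-recursive) HMM: each frame's forward vector is renormalized to sum to one, and the log-likelihood is recovered from the scale factors afterwards, so no logarithms are needed inside the recursion itself. This is a generic illustration of scaling, not the paper's recursive variant; the function and variable names are ours.

```python
import numpy as np

def forward_scaled(pi, A, B, obs):
    """Scaled forward pass for a conventional HMM.

    pi : (N,)   initial state probabilities
    A  : (N, N) transition matrix, A[i, j] = P(state j | state i)
    B  : (N, M) emission matrix,   B[i, k] = P(symbol k | state i)
    obs: sequence of observation-symbol indices

    Returns the scaled forward variables and the per-frame scale
    factors. Because every alpha[t] is renormalized to sum to 1,
    the recursion never underflows, yet no logarithm is taken
    inside the loop.
    """
    T, N = len(obs), len(pi)
    alpha = np.zeros((T, N))
    scale = np.zeros(T)

    # Initialization
    alpha[0] = pi * B[:, obs[0]]
    scale[0] = alpha[0].sum()
    alpha[0] /= scale[0]

    # Induction with per-frame renormalization
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
        scale[t] = alpha[t].sum()
        alpha[t] /= scale[t]  # keeps values well inside floating-point range

    return alpha, scale

def log_likelihood(scale):
    # log P(obs) is the sum of the log scale factors,
    # since the product of the scales equals the unscaled likelihood.
    return np.log(scale).sum()
```

Scaling also makes pruning natural: after renormalization, states whose scaled forward probability falls below a fixed threshold can be dropped from the next frame's computation without any per-state log arithmetic.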
Bibliographic reference. Nijtmans, J. J. (1991): "A new recursive Markov model with a new state pruning approach for large vocabulary continuous speech recognition", In EUROSPEECH-1991, 659-663.