Publications (2)
ABSTRACT: In this paper we present results from a net-like structure for Hidden Markov Models, applied to speech recognition. The network topology is a Recurrent Neural Network in which each time step is identified with a layer. Backpropagation techniques are used to train the RNN-HMM. Two training criteria are used: Maximum Likelihood and Competitive Training. The Maximum Likelihood estimation algorithm using backpropagation yields the same updating equations as the Baum-Welch algorithm used in HMMs. Competitive Training is based on the probability of correctly labelling the sequences, derived from the Maximum Likelihood measures. Our results show that the best procedure is to train first with Maximum Likelihood estimation and then re-estimate with Competitive Training.
04/2006: pages 361-369;
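The Maximum Likelihood quantity that both Baum-Welch and the gradient-based (backpropagation) training maximize is the probability of the observation sequence under the model, computed with the standard HMM forward algorithm. The sketch below illustrates that computation on a toy model; all parameter values and the function name are illustrative assumptions, not taken from the paper.

```python
def forward_likelihood(pi, A, B, observations):
    """Toy HMM forward algorithm: returns P(O | model).

    pi[i]  : initial state probabilities
    A[i][j]: transition probability from state i to state j
    B[i][o]: probability of emitting symbol o in state i
    (All values below are illustrative, not from the paper.)
    """
    n = len(pi)
    # Initialisation: alpha_1(i) = pi_i * b_i(o_1)
    alpha = [pi[i] * B[i][observations[0]] for i in range(n)]
    # Induction: alpha_{t+1}(j) = (sum_i alpha_t(i) * a_ij) * b_j(o_{t+1})
    for o in observations[1:]:
        alpha = [sum(alpha[i] * A[i][j] for i in range(n)) * B[j][o]
                 for j in range(n)]
    # Termination: P(O) = sum_i alpha_T(i)
    return sum(alpha)

# Two-state toy model with two observation symbols (0 and 1).
pi = [0.6, 0.4]
A = [[0.7, 0.3],
     [0.4, 0.6]]
B = [[0.9, 0.1],
     [0.2, 0.8]]

print(forward_likelihood(pi, A, B, [0, 1, 0]))
```

Maximum Likelihood training (whether by Baum-Welch re-estimation or by backpropagating through the equivalent recurrent network) adjusts `pi`, `A`, and `B` to increase this likelihood on the training sequences.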
ABSTRACT: Nowadays a great research effort is being devoted to the development of continuous speech recognition (CSR) systems, both ANN-based and HMM-based. So far, the HMM-based systems seem to have the best performance, although the ANN-based ones are being developed more quickly than the HMM-based ones, thanks to the new topologies that are being tested with increasingly good results.
12/1994: pages 116-119;