Training Hidden Markov Models with Multiple Observations-A Combinatorial Method

Xiaolin Li, Marc Parizeau, Réjean Plamondon · IEEE Trans. Pattern Anal. Mach. Intell.
Hidden Markov models (HMMs) are stochastic models capable of statistical learning and classification. They have been applied in speech recognition and handwriting recognition because of their great adaptability and versatility in handling sequential signals. However, because these models have a complex structure and the data sets involved usually contain uncertainty, it is difficult to analyze the multiple-observation training problem without certain assumptions. For many years…
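The multiple-observation training problem the abstract describes can be illustrated with a minimal sketch: Baum-Welch reestimation that pools expected counts over several sequences under an independence assumption. This is an illustrative implementation, not the paper's combinatorial method; all function names are our own.

```python
import numpy as np

def forward_backward(A, B, pi, obs):
    """Scaled forward-backward pass for one discrete observation sequence."""
    T, N = len(obs), len(pi)
    alpha = np.zeros((T, N)); beta = np.zeros((T, N)); c = np.zeros(T)
    alpha[0] = pi * B[:, obs[0]]
    c[0] = alpha[0].sum(); alpha[0] /= c[0]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
        c[t] = alpha[t].sum(); alpha[t] /= c[t]
    beta[-1] = 1.0
    for t in range(T - 2, -1, -1):
        beta[t] = (A @ (B[:, obs[t + 1]] * beta[t + 1])) / c[t + 1]
    return alpha, beta, c

def baum_welch_multi(A, B, pi, sequences, n_iter=20):
    """Baum-Welch that accumulates expected counts across several
    observation sequences, assuming the sequences are independent."""
    N, M = B.shape
    for _ in range(n_iter):
        A_num = np.zeros((N, N)); A_den = np.zeros(N)
        B_num = np.zeros((N, M)); B_den = np.zeros(N)
        pi_acc = np.zeros(N)
        for obs in sequences:
            alpha, beta, c = forward_backward(A, B, pi, obs)
            gamma = alpha * beta
            gamma /= gamma.sum(axis=1, keepdims=True)
            pi_acc += gamma[0]
            for t in range(len(obs) - 1):
                xi = alpha[t][:, None] * A * B[:, obs[t + 1]] * beta[t + 1] / c[t + 1]
                xi /= xi.sum()
                A_num += xi; A_den += gamma[t]
            for t, o in enumerate(obs):
                B_num[:, o] += gamma[t]; B_den += gamma[t]
        A = A_num / A_den[:, None]
        B = B_num / B_den[:, None]
        pi = pi_acc / len(sequences)
    return A, B, pi
```

The key point is that per-sequence statistics (the gamma and xi counts) are summed before renormalizing, rather than reestimating from each sequence separately.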


Training Second-Order Hidden Markov Models with Multiple Observation Sequences

This article introduces a new HMM2 with multiple observable sequences, assuming that all the observable sequences are statistically correlated, and shows that the model training equations can be easily derived with an independence assumption.

Learning discrete Hidden Markov Models from state distribution vectors

A new polynomial-time algorithm for supervised learning of the parameters of a first order HMM from a state probability distribution (SD) oracle and a hybrid learning algorithm for approximating HMM parameters from a dataset composed of strings and their corresponding state distribution vectors are developed.

Modified Baum Welch Algorithm for Hidden Markov Models with Known Structure

Several approaches for modifying the Baum Welch Algorithm are shown and the results of all training methods are compared.
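One common way to exploit known structure in reestimation (an assumed illustration, not necessarily this paper's modification) is to project the transition matrix onto the allowed topology after each Baum-Welch update: forbidden transitions are zeroed and each row is renormalized.

```python
import numpy as np

def project_to_structure(A, mask):
    """Zero out transitions the known structure forbids, then
    renormalize each row so A remains a stochastic matrix."""
    A = np.where(mask, A, 0.0)
    rows = A.sum(axis=1, keepdims=True)
    return A / np.where(rows > 0.0, rows, 1.0)

# Example: enforce a left-to-right (upper-triangular) topology.
mask = np.triu(np.ones((3, 3), dtype=bool))
A = np.full((3, 3), 1.0 / 3.0)
A_lr = project_to_structure(A, mask)
```

Because Baum-Welch never moves a zero transition probability away from zero, initializing with the masked structure and reapplying the projection keeps the model inside the known topology throughout training.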

A novel training method for HMM2 with multiple observation sequences

A novel training method for HMM2 with multiple observable sequences is proposed, assuming that all the observable sequences are driven by a common hidden sequence, and an associated objective function is built using the Lagrange multiplier method.

Generalized multi-stream hidden markov models

This dissertation develops, implements, and tests multi-stream continuous and discrete hidden Markov model (HMM) algorithms that are validated on various applications, including Australian Sign Language, audio classification, face classification, and, more extensively, the problem of landmine detection using ground-penetrating radar data.

A generalized hidden Markov model and its applications in recognition of cutting states

A generalized hidden Markov model (GHMM) in the context of generalized interval probability theory is proposed, which provides a concise representation for the two kinds of uncertainty simultaneously.

Hidden Markov Models Training Using Hybrid Baum Welch - Variable Neighborhood Search Algorithm

The results show that the VNS-BWA has better performance in finding the optimal parameters of HMM models, enhancing their learning capability and classification performance.

A survey of techniques for incremental learning of HMM parameters


An extension of the Hidden Markov Model is developed that addresses two of the most important challenges of financial time series modeling, non-stationarity and non-linearity, and includes a novel exponentially weighted Expectation-Maximization (EM) algorithm to handle these two challenges.

A New Algorithm for Automatic Configuration of Hidden Markov Models

This paper presents a procedure that addresses the problem of automatically configuring HMM's with the following advantages: better convergence characteristics than the standard Baum-Welch algorithm, automatic reduction of model size to the right complexity fit, better generalization, and relative insensitivity to the initial model size.

Smooth On-Line Learning Algorithms for Hidden Markov Models

A simple learning algorithm for Hidden Markov Models (HMMs) is presented together with a number of variations; these are proved to be exact or approximate gradient-optimization algorithms with respect to likelihood, log-likelihood, or cross-entropy functions, and as such are usually convergent.

Sophisticated topology of hidden Markov models for cursive script recognition

An adaptation of hidden Markov models (HMM) to automatic recognition of unrestricted handwritten words and many interesting details of a 50,000 vocabulary recognition system for US city names are described.

A comparison between continuous and discrete density hidden Markov models for cursive handwriting recognition

The surprising result of the investigation was the fact that discrete density models led to better results than continuous models, although this is generally not the case for HMM-based speech recognition systems.

HMM Based On-Line Handwriting Recognition

A more sophisticated handwriting recognition system is built that achieves a writer-independent recognition rate of 94.5% on 3,823 unconstrained handwritten word samples from 18 writers covering a 32-word vocabulary.

An introduction to the application of the theory of probabilistic functions of a Markov process to automatic speech recognition

This paper presents several of the salient theoretical and practical issues associated with modeling a speech signal as a probabilistic function of a (hidden) Markov chain, and focuses on a particular class of Markov models, which are especially appropriate for isolated word recognition.

A speaker-independent, syntax-directed, connected word recognition system based on hidden Markov models and level building

This paper shows how to integrate efficient and accurate speech modeling methods and network search procedures to give a speaker-independent, syntax-directed, connected word recognition system which requires only a modest amount of computation, and whose performance is comparable to that of previous recognizers requiring an order of magnitude more computation.

A Fast Statistical Mixture Algorithm for On-Line Handwriting Recognition

A probabilistic framework suitable for the derivation of a fast statistical mixture algorithm for automatic recognition of unconstrained handwriting is developed, and both writer-dependent and writer-independent recognition results are found to be competitive with their elastic matching counterparts.

An HMM/MLP Architecture for Sequence Recognition

A hybrid architecture of hidden Markov models (HMMs) and a multilayer perceptron (MLP) is presented that exploits the discriminative capability of a neural network classifier while using the HMM formalism to capture the dynamics of input patterns.

Recognition of handwritten word: first and second order hidden Markov model based approach

  • A. Kundu, Yang He, P. Bahl
  • Computer Science
    Proceedings CVPR '88: The Computer Society Conference on Computer Vision and Pattern Recognition
  • 1988
The handwritten word recognition problem is modeled in the framework of the hidden Markov model (HMM), and the Viterbi algorithm is used to recognize the sequence of letters constituting the word.
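The Viterbi decoding step used in the entry above can be sketched as a minimal log-domain implementation for a discrete-output HMM (an illustrative sketch, not the paper's recognizer):

```python
import numpy as np

def viterbi(A, B, pi, obs):
    """Most likely hidden state path for a discrete-output HMM,
    computed in the log domain to avoid underflow."""
    T, N = len(obs), len(pi)
    logA, logB, logpi = np.log(A), np.log(B), np.log(pi)
    delta = np.zeros((T, N))            # best log-score ending in each state
    psi = np.zeros((T, N), dtype=int)   # backpointers
    delta[0] = logpi + logB[:, obs[0]]
    for t in range(1, T):
        scores = delta[t - 1][:, None] + logA   # N x N predecessor scores
        psi[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) + logB[:, obs[t]]
    # Backtrack from the best final state.
    path = [int(delta[-1].argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(psi[t][path[-1]]))
    return path[::-1]
```

In a letter-level word model, each hidden state corresponds to a letter hypothesis, so the decoded state path directly spells out the recognized word.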