A Hidden Markov Model is a finite set of states, each of which is
associated with a (generally multidimensional) probability distribution.
Transitions among the states are governed by a set of probabilities
called transition probabilities. In a particular state, an
outcome or observation can be generated according to the
associated probability distribution. It is only the outcome, not the
state, that is visible to an external observer; the states are therefore
``hidden'' from the outside, hence the name Hidden Markov Model.
In order to define an HMM completely, the following elements are needed.

The number of states of the model, $N$, and the number of observation
symbols in the alphabet, $M$.

A set of state transition probabilities $A = \{a_{ij}\}$,
\[
a_{ij} = p(q_{t+1} = j \mid q_t = i), \qquad 1 \le i, j \le N,
\]
where $q_t$ denotes the current state.
Transition probabilities should satisfy the normal stochastic
constraints,
\[
a_{ij} \ge 0, \qquad 1 \le i, j \le N,
\]
and
\[
\sum_{j=1}^{N} a_{ij} = 1, \qquad 1 \le i \le N.
\]

A probability distribution in each of the states, $B = \{b_j(k)\}$,
\[
b_j(k) = p(o_t = v_k \mid q_t = j), \qquad 1 \le j \le N,\ 1 \le k \le M,
\]
where $v_k$ denotes the $k$th observation symbol in the
alphabet, and $o_t$ the current parameter vector.
The following stochastic constraints must be satisfied:
\[
b_j(k) \ge 0, \qquad 1 \le j \le N,\ 1 \le k \le M,
\]
and
\[
\sum_{k=1}^{M} b_j(k) = 1, \qquad 1 \le j \le N.
\]
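As a concrete illustration (not part of the original definition), the discrete parameters and their stochastic constraints can be sketched in Python with NumPy; the matrices below are hypothetical two-state examples.

```python
import numpy as np

# Hypothetical discrete HMM with N = 2 states and M = 3 observation symbols.
A = np.array([[0.7, 0.3],          # a_ij = p(q_{t+1} = j | q_t = i)
              [0.4, 0.6]])
B = np.array([[0.1, 0.4, 0.5],     # b_j(k) = p(o_t = v_k | q_t = j)
              [0.6, 0.3, 0.1]])

def is_stochastic(matrix):
    """Check the normal stochastic constraints: all entries
    non-negative and every row summing to one."""
    return bool(np.all(matrix >= 0) and np.allclose(matrix.sum(axis=1), 1.0))

print(is_stochastic(A), is_stochastic(B))
```

Row-stochasticity is the only structural requirement here: each row of $A$ is a distribution over next states, and each row of $B$ a distribution over symbols.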
If the observations are continuous, we must use a continuous
probability density function instead of a set of discrete
probabilities. In this case we specify the parameters of the
probability density function. Usually the density is approximated by a
weighted sum of $M$ Gaussian distributions $\mathcal{N}$,
\[
b_j(o_t) = \sum_{m=1}^{M} c_{jm}\,
\mathcal{N}(\mu_{jm}, \Sigma_{jm}, o_t),
\]
where $c_{jm}$ are the mixture coefficients, and $\mu_{jm}$ and
$\Sigma_{jm}$ are the mean vector and covariance matrix of the $m$th
mixture component in state $j$.
The coefficients $c_{jm}$ should satisfy the stochastic constraints,
\[
c_{jm} \ge 0, \qquad 1 \le j \le N,\ 1 \le m \le M,
\]
and
\[
\sum_{m=1}^{M} c_{jm} = 1, \qquad 1 \le j \le N,
\]
so that the density is properly normalised,
$\int_{-\infty}^{\infty} b_j(x)\,dx = 1$.
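A minimal sketch of evaluating such a mixture density, assuming scalar observations and hypothetical parameters for a single state $j$ with $M = 2$ components:

```python
import numpy as np

def gaussian_pdf(x, mean, var):
    """Density of a univariate Gaussian N(mean, var) at x."""
    return np.exp(-0.5 * (x - mean) ** 2 / var) / np.sqrt(2.0 * np.pi * var)

def mixture_density(x, c, mu, var):
    """b_j(x) = sum_m c_jm * N(mu_jm, var_jm, x) for one state j."""
    return sum(cm * gaussian_pdf(x, m, v) for cm, m, v in zip(c, mu, var))

# Hypothetical mixture: weights are non-negative and sum to one.
c   = [0.3, 0.7]
mu  = [0.0, 2.0]
var = [1.0, 0.5]
print(mixture_density(1.0, c, mu, var))
```

Because the weights sum to one and each component is a normalised density, the mixture itself integrates to one, as the constraints require.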
Finally, the initial state distribution $\pi = \{\pi_i\}$, where
\[
\pi_i = p(q_1 = i), \qquad 1 \le i \le N.
\]

Therefore we can use the compact notation
\[
\lambda = (A, B, \pi)
\]
to denote an HMM with discrete probability distributions, and
\[
\lambda = (A, c_{jm}, \mu_{jm}, \Sigma_{jm}, \pi)
\]
to denote one with continuous densities.
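To illustrate why the states are ``hidden'', here is a sketch of sampling from a discrete model $\lambda = (A, B, \pi)$; all parameter values are hypothetical, and an external observer would see only the returned observations, never the state sequence.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical discrete HMM lambda = (A, B, pi): N = 2 states, M = 3 symbols.
A  = np.array([[0.7, 0.3], [0.4, 0.6]])
B  = np.array([[0.1, 0.4, 0.5], [0.6, 0.3, 0.1]])
pi = np.array([0.5, 0.5])

def sample(T):
    """Generate (states, observations) of length T; only the
    observations are visible to an external observer."""
    states, obs = [], []
    q = rng.choice(len(pi), p=pi)                   # q_1 ~ pi
    for _ in range(T):
        states.append(int(q))
        obs.append(int(rng.choice(B.shape[1], p=B[q])))  # o_t ~ b_q(.)
        q = rng.choice(len(A), p=A[q])              # q_{t+1} ~ row q of A
    return states, obs

states, obs = sample(10)
print(obs)
```

The state sequence is generated internally but discarded from the observer's point of view; inferring it from the observations alone is exactly the decoding problem that algorithms such as Viterbi address.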