Computational Intelligence, SS08
2 VO 442.070 + 1 RU 708.070

Hidden Markov Models

The parameters of a Markov model (NOT a hidden Markov model) are:
The set of states.
The prior probabilities (probabilities to start in a certain state).
The state transition probabilities.
The emission probabilities.
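
For reference, in common textbook notation (not taken from the course notes) a hidden Markov model with states $S_1, \ldots, S_N$ is written $\lambda = (A, B, \pi)$, with transition probabilities $a_{ij} = P(q_{n+1} = S_j \mid q_n = S_i)$, emission probabilities $b_j(o) = P(o_n = o \mid q_n = S_j)$, and prior probabilities $\pi_i = P(q_1 = S_i)$. A plain (non-hidden) Markov model, whose states are observed directly, has no emission distribution.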
Find the correct statements.
The (first-order) Markov assumption means that the probability of an event at time $n$ only depends on the event at time $n-1$.
An ergodic HMM allows transitions from each state to any other state.
For speech recognition, ergodic HMMs are usually used to model phoneme sequences in words.
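
For a state sequence $q_1, q_2, \ldots$, the first-order Markov assumption referred to in the statements above can be written as
$P(q_n \mid q_{n-1}, q_{n-2}, \ldots, q_1) = P(q_n \mid q_{n-1})$,
i.e. the full history is summarized by the immediately preceding state.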
Viterbi algorithm
The Viterbi algorithm finds the most likely state sequence for a given observation sequence and a given HMM.
The Viterbi algorithm finds the most likely state sequence for a given HMM.
The Viterbi algorithm computes the likelihood of an observation sequence with respect to an HMM (considering all possible state sequences).
In the Viterbi algorithm, at each time step and for each state, only one path leading to this state (the survivor path) and its metric are stored for further processing.
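
The following is a minimal sketch of the Viterbi recursion (a hypothetical Python/NumPy illustration, not the course implementation): at each time step and for each state, only the metric of the best path ending in that state and a back-pointer to its predecessor (the survivor information) are stored; the most likely state sequence is then recovered by backtracking.

import numpy as np

def viterbi(pi, A, B, obs):
    """Most likely state sequence for a given observation sequence and HMM.

    pi  : (N,)   prior state probabilities
    A   : (N, N) transition probabilities, A[i, j] = P(state j | state i)
    B   : (N, M) emission probabilities,  B[j, k] = P(symbol k | state j)
    obs : (T,)   observed symbol indices
    """
    N, T = len(pi), len(obs)
    delta = np.zeros((T, N))            # metric of the best path ending in each state
    psi = np.zeros((T, N), dtype=int)   # back-pointer to the predecessor on that path

    delta[0] = pi * B[:, obs[0]]        # initialization with the priors
    for t in range(1, T):
        for j in range(N):
            scores = delta[t - 1] * A[:, j]    # extend every survivor path to state j
            psi[t, j] = np.argmax(scores)      # keep only the best predecessor ...
            delta[t, j] = scores[psi[t, j]] * B[j, obs[t]]   # ... and its metric

    # backtrack along the stored survivor pointers
    path = [int(np.argmax(delta[-1]))]
    for t in range(T - 1, 0, -1):
        path.append(int(psi[t, path[-1]]))
    return path[::-1]

In practice the products are usually replaced by sums of log-probabilities to avoid numerical underflow for long observation sequences.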