probability, hidden-markov-models, viterbi

Define a hidden Markov model for words


I'm attempting to define a hidden Markov model and predict whether a given sequence of words is correct using the Viterbi algorithm (https://en.wikipedia.org/wiki/Viterbi_algorithm). To aid understanding, I've attempted to define the model parameters:

The letters in the corpus are abbd. From this I've defined:

states: a, b, b, d

trans_p (transition probabilities):
There are 4 letters in the corpus, so
 a : 1/4
 b : 2/4
 d : 1/4

emit_p (emission probabilities):
 count(a->b) / count(a) = 1/1 = 1
 count(b->b) / count(b) = 1/2 = 1/2
 count(b->d) / count(b) = 1/2 = 1/2
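
For reference, this is roughly how I computed the counts above (a minimal sketch in Python; the corpus string abbd is just the example above, and the variable names are placeholders for my actual data):

```python
from collections import Counter

corpus = "abbd"  # the example corpus above

# Unigram counts: how often each letter occurs, and its relative frequency.
letter_counts = Counter(corpus)
total = sum(letter_counts.values())
letter_freq = {c: n / total for c, n in letter_counts.items()}
# -> {'a': 0.25, 'b': 0.5, 'd': 0.25}

# Bigram counts: how often letter x is immediately followed by letter y.
bigram_counts = Counter(zip(corpus, corpus[1:]))
# count(x->y) / count(x), i.e. the probability of y given the previous letter x.
cond_next = {
    (x, y): n / letter_counts[x] for (x, y), n in bigram_counts.items()
}
# -> {('a', 'b'): 1.0, ('b', 'b'): 0.5, ('b', 'd'): 0.5}
```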

Is the above correct?

Do I need to define transition probabilities for d? Do I need to define emission probabilities for b->a and b->d?

I also referred to https://stats.stackexchange.com/questions/212961/calculating-emission-probability-values-for-hidden-markov-model-hmm, which helped me define the emission probabilities.


Solution

  • I think you are confusing emission probabilities with transition probabilities. When defining an HMM, you need to define the hidden states, the observables, the transition probabilities between states, the emission probabilities from each state to each observable, and the initial state probabilities.

    If they appear in your corpus, I suppose that a, b and d are your observables, not your states. You need to define relevant hidden states to complete your HMM. If you can observe the states directly, then your Markov model is not hidden; it is a plain Markov model and there is no need for the Viterbi algorithm. A sketch of what a complete HMM definition and a Viterbi decode could look like is given below.
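
To make that concrete, here is a minimal sketch of a fully specified HMM plus a Viterbi decode. The two hidden states ("Valid", "Invalid") and every probability value below are made up purely for illustration; only the observables a, b and d come from the question.

```python
# Hypothetical HMM: states and all probabilities below are invented for illustration.
states = ("Valid", "Invalid")
observables = ("a", "b", "d")

start_p = {"Valid": 0.6, "Invalid": 0.4}
trans_p = {
    "Valid":   {"Valid": 0.7, "Invalid": 0.3},
    "Invalid": {"Valid": 0.4, "Invalid": 0.6},
}
emit_p = {
    "Valid":   {"a": 0.25, "b": 0.5, "d": 0.25},
    "Invalid": {"a": 0.4,  "b": 0.2, "d": 0.4},
}


def viterbi(obs, states, start_p, trans_p, emit_p):
    """Return the most likely hidden-state path for obs and its probability."""
    # V[t][s] = (probability of the best path ending in state s at time t,
    #            the state at time t-1 on that path)
    V = [{s: (start_p[s] * emit_p[s][obs[0]], None) for s in states}]
    for t in range(1, len(obs)):
        V.append({})
        for s in states:
            prob, prev = max(
                (V[t - 1][p][0] * trans_p[p][s] * emit_p[s][obs[t]], p)
                for p in states
            )
            V[t][s] = (prob, prev)

    # Backtrack from the most probable final state.
    last = max(V[-1], key=lambda s: V[-1][s][0])
    best_prob = V[-1][last][0]
    path = [last]
    for t in range(len(obs) - 1, 0, -1):
        path.append(V[t][path[-1]][1])
    return list(reversed(path)), best_prob


path, prob = viterbi(list("abbd"), states, start_p, trans_p, emit_p)
print(path, prob)
```

With a real model, trans_p and emit_p would be estimated from data labelled with the hidden states (or learned, e.g. with Baum-Welch), and the returned path is the most likely hidden-state sequence for the observed letters.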