I don't know whether this is the correct forum for this, but here goes:
I'm trying to implement a Hidden Markov Model (HMM) to find the most likely state sequence/path for a training file.
So far I have the mel-frequency cepstral coefficients (MFCCs) of a signal, and I want to train on them so that I can then compare two data sets using the Viterbi algorithm to find the best path. But I have a few points of confusion:
If I estimate $\Pi$ (the initial state distribution), can I then use the forward-backward algorithm to find the probability of being in a given state?
Instead of using the forward-backward algorithm, can I use the Viterbi algorithm to find that probability?
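To make the difference between the two algorithms concrete, here is a minimal sketch on an invented 2-state HMM with 3 discrete observation symbols (all numbers are made up for illustration, not taken from my data). The forward algorithm sums over all state paths to get the total sequence probability, while Viterbi maximises to get the probability of the single best path:

```python
import numpy as np

# Toy 2-state HMM with 3 discrete observation symbols
# (all numbers invented for illustration).
pi = np.array([0.6, 0.4])            # initial state distribution (Pi)
A = np.array([[0.7, 0.3],            # transition probabilities A[i, j]
              [0.4, 0.6]])
B = np.array([[0.5, 0.4, 0.1],       # emission probabilities B[i, k]
              [0.1, 0.3, 0.6]])
obs = [0, 2, 1]                      # an example observation sequence

# Forward algorithm: total probability P(O | lambda),
# summing over *all* state paths.
alpha = pi * B[:, obs[0]]
for o in obs[1:]:
    alpha = (alpha @ A) * B[:, o]
seq_prob = alpha.sum()

# Viterbi algorithm: probability of the single *best* state path,
# maximising instead of summing, with backpointers for decoding.
delta = pi * B[:, obs[0]]
back = []
for o in obs[1:]:
    trans = delta[:, None] * A       # trans[i, j] = delta[i] * A[i, j]
    back.append(trans.argmax(axis=0))
    delta = trans.max(axis=0) * B[:, o]
best_prob = delta.max()

# Backtrack the best path from the final state.
path = [int(delta.argmax())]
for ptr in reversed(back):
    path.append(int(ptr[path[-1]]))
path.reverse()

print(seq_prob, best_prob, path)     # best_prob is always <= seq_prob
```

So the two answer different questions: the forward pass gives "how likely is this observation sequence at all", and Viterbi gives "which single state path explains it best".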
If I'm trying to identify two probable outcomes and $N = 2$, what would $M$ be? I assume that $N$ is the number of states and $M$ the number of observations? But the feature matrix I have is $N = 13$, $M = 450$, so those values cannot simply be plugged in as the HMM's $N$ and $M$. Do I therefore feed the training data to the forward-backward algorithm, which estimates the probability of each state, giving me a final output that I can then compare with the Viterbi decoder?