I have been learning about HMMs recently and got confused by the training problem (estimating the model parameters and the hidden states given an observation sequence).
As far as I know, both Viterbi training and Baum-Welch (the forward-backward algorithm) are used to estimate the model parameters and hidden states in an EM fashion, and the forward-backward algorithm is used in the E step to estimate the transition/emission probabilities. What about estimating the hidden state sequence in the E step? Is it also estimated with the forward-backward algorithm, or with the Viterbi algorithm?
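To make the question concrete, here is a minimal numpy sketch (the HMM numbers are toy values I made up) that computes both candidate E-step quantities for the same model: the soft state posteriors gamma from forward-backward, and the single most likely path from Viterbi. My question is which of these the E step is supposed to use for the hidden states:

```python
import numpy as np

# Toy 2-state HMM over a binary alphabet (hypothetical numbers, just for illustration)
A = np.array([[0.7, 0.3],
              [0.4, 0.6]])          # transition probabilities
B = np.array([[0.9, 0.1],
              [0.2, 0.8]])          # emission probabilities (rows: states, cols: symbols)
pi = np.array([0.6, 0.4])           # initial state distribution
obs = [0, 0, 1, 1, 0]               # observed symbol sequence

T, N = len(obs), len(pi)

# --- Forward-backward: posterior marginals gamma[t, i] = P(state_t = i | obs) ---
alpha = np.zeros((T, N))
alpha[0] = pi * B[:, obs[0]]
for t in range(1, T):
    alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]

beta = np.zeros((T, N))
beta[-1] = 1.0
for t in range(T - 2, -1, -1):
    beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])

gamma = alpha * beta
gamma /= gamma.sum(axis=1, keepdims=True)   # soft (probabilistic) state assignments

# --- Viterbi: the single jointly most likely state sequence ---
delta = np.zeros((T, N))
psi = np.zeros((T, N), dtype=int)
delta[0] = pi * B[:, obs[0]]
for t in range(1, T):
    scores = delta[t - 1][:, None] * A      # scores[i, j]: best path ending in i, then moving to j
    psi[t] = scores.argmax(axis=0)
    delta[t] = scores.max(axis=0) * B[:, obs[t]]

path = np.zeros(T, dtype=int)
path[-1] = delta[-1].argmax()
for t in range(T - 2, -1, -1):
    path[t] = psi[t + 1, path[t + 1]]

print(gamma)   # soft posteriors from forward-backward
print(path)    # hard state sequence from Viterbi
```

In Baum-Welch the M step would re-estimate A and B from the soft counts (gamma and the pairwise posteriors), whereas Viterbi training would recount transitions/emissions along `path`, if I understand correctly.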
I'd really appreciate it if anyone could share some insight.