Prediction with Hidden Markov Model (HMM) examples


#1

Hi,

Thank you for the great examples of how to perform inference in a hidden Markov model with Pyro (in pyro/examples/hmm.py).

I wonder what the best practice is for the following tasks. After running the HMM on some input data for t steps:

  1. Infer the probabilities of the hidden state and observation at step t+1.
  2. Infer the probabilities of some of the observations conditioned on a given subset of the other observations.

Thank you and kind regards,
Jan


#2

Hi @jan,
I believe you can infer those probabilities using TraceEnum_ELBO.compute_marginals(). To condition on previous observations, you’ll probably either need to pass data with some Nones into the model() or use poutine.condition to fix those observations.
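Independent of Pyro’s API, the marginals for task (1) come down to the classic HMM forward algorithm (which is what the dynamic programming computes under the hood). Here is a minimal pure-Python sketch of next-step prediction; the transition/emission matrices and the function names are made up for illustration, not taken from pyro/examples/hmm.py:

```python
def forward(trans, emit, init, obs):
    """Filtered state distribution p(z_t | x_{1:t}) after seeing obs."""
    n = len(init)
    belief = [init[k] * emit[k][obs[0]] for k in range(n)]
    total = sum(belief)
    belief = [b / total for b in belief]
    for x in obs[1:]:
        # one-step prediction through the transition matrix...
        pred = [sum(belief[j] * trans[j][k] for j in range(n)) for k in range(n)]
        # ...then reweight by the emission likelihood and renormalize
        belief = [pred[k] * emit[k][x] for k in range(n)]
        total = sum(belief)
        belief = [b / total for b in belief]
    return belief

def predict_next(trans, emit, belief):
    """p(z_{t+1} | x_{1:t}) and p(x_{t+1} | x_{1:t}) from the filtered belief."""
    n = len(belief)
    next_state = [sum(belief[j] * trans[j][k] for j in range(n)) for k in range(n)]
    next_obs = [sum(next_state[k] * emit[k][s] for k in range(n))
                for s in range(len(emit[0]))]
    return next_state, next_obs

# Toy 2-state, 2-symbol HMM (hypothetical numbers).
trans = [[0.9, 0.1], [0.2, 0.8]]   # p(z_{t+1} | z_t)
emit = [[0.7, 0.3], [0.1, 0.9]]    # p(x_t | z_t)
init = [0.5, 0.5]                  # p(z_1)

belief = forward(trans, emit, init, obs=[0, 0, 1])
next_state, next_obs = predict_next(trans, emit, belief)
```

Both returned lists are proper distributions (they sum to 1), so you can read off the t+1 probabilities directly.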

Note that Pyro’s dynamic programming is intended to be applied to entire time series at once, typically inside an SVI or HMC training loop that trains over multiple time series. Pyro’s implementation does not save state while running sequentially through the data, so it won’t be a great solution for, e.g., predicting the next state in a control problem (there you’d instead use Pyro to train an amortized guide, e.g. a neural net, and use that neural net for prediction). We’re working on making filtering-style prediction easier in Pyro, but it’s a long way from making it into a Pyro release.
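To make the "Nones in the data" idea concrete for task (2), here is a filtering-style sketch in plain Python: unobserved slots are marked None, get marginalized out (no emission update), and we report the predictive distribution over symbols at each missing slot. Names and matrices are hypothetical; note this only conditions on observations *before* each missing slot — conditioning on later observations too requires a backward pass, which is the kind of whole-series marginal that compute_marginals handles for you:

```python
def condition_on_subset(trans, emit, init, obs):
    """Forward pass over obs, where entries may be None (unobserved).
    Returns {t: p(x_t | observed prefix)} for each None slot."""
    n = len(init)
    pred = list(init)          # p(z_t | observations before t)
    marginals = {}
    for t, x in enumerate(obs):
        if x is None:
            # unobserved slot: report its predictive distribution,
            # and marginalize it out (belief update is skipped)
            marginals[t] = [sum(pred[k] * emit[k][s] for k in range(n))
                            for s in range(len(emit[0]))]
            belief = pred
        else:
            # observed slot: reweight by the emission likelihood
            belief = [pred[k] * emit[k][x] for k in range(n)]
            total = sum(belief)
            belief = [b / total for b in belief]
        # one-step prediction for the next time step
        pred = [sum(belief[j] * trans[j][k] for j in range(n)) for k in range(n)]
    return marginals

# Same toy 2-state, 2-symbol HMM (hypothetical numbers).
trans = [[0.9, 0.1], [0.2, 0.8]]
emit = [[0.7, 0.3], [0.1, 0.9]]
init = [0.5, 0.5]

marginals = condition_on_subset(trans, emit, init, obs=[0, None, 1])
# marginals[1] is p(x_1 | x_0 = 0), a distribution over the 2 symbols
```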