Deep Markov Model - Implementation

Hello all,

I am trying to adapt the DMM implementation in Pyro to a different dataset where the observations are continuous values rather than binary (as in the polyphonic music dataset used in the current implementation). In the current implementation, the emitter uses a Bernoulli distribution to model the binary observations, and the transition uses a Normal distribution to model the latent variables. This link gives some details about the code.

Things tried:
In my implementation I cannot use a Bernoulli likelihood, so I have tried using a Normal distribution to model the continuous observation values. To do this, I implemented something similar to the transition logic to produce the loc and scale for the Normal distribution; that is, I used the same architecture as the transition network but fed it the current z_t.
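For concreteness, here is a minimal sketch of what such a Gaussian emitter might look like. The class and parameter names (GaussianEmitter, x_dim, z_dim, emission_dim) are illustrative assumptions, not taken from the Pyro example; the key point is passing the scale head through a softplus with a small floor so it can never become zero or negative.

```python
import torch
import torch.nn as nn

class GaussianEmitter(nn.Module):
    """Illustrative emitter for continuous observations: maps the latent
    z_t to the loc and scale of a Normal likelihood. Names are hypothetical,
    mirroring the structure of the Pyro DMM transition network."""

    def __init__(self, x_dim, z_dim, emission_dim):
        super().__init__()
        self.lin_z_to_hidden = nn.Linear(z_dim, emission_dim)
        self.lin_hidden_to_loc = nn.Linear(emission_dim, x_dim)
        self.lin_hidden_to_scale = nn.Linear(emission_dim, x_dim)
        self.relu = nn.ReLU()
        self.softplus = nn.Softplus()

    def forward(self, z_t):
        hidden = self.relu(self.lin_z_to_hidden(z_t))
        loc = self.lin_hidden_to_loc(hidden)
        # softplus keeps the scale strictly positive; the small additive
        # floor guards against underflow to zero, which is one common cause
        # of "The parameter scale has invalid values"
        scale = self.softplus(self.lin_hidden_to_scale(hidden)) + 1e-4
        return loc, scale
```

If your current emitter produces the scale with a plain linear layer (or a softplus that can underflow), that alone could explain the error appearing after a few epochs once the weights drift.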

However, the negative log-likelihood (NLL) is not decreasing much. After around 12 epochs it throws the error "The parameter scale has invalid values", and the NLL blows up. I set validate_args to False; with that there is no error, but the NLL values are NaN. There is also no error when z_dim is less than 5.

What could be the cause of this error after a few epochs?

Can we use emitter logic similar to the transition logic for a dataset with non-binary observations?

@jpchen @martinjankowiak @fritzo @eb8680 @osazuwa

Thanks in advance.

You can use any observation likelihood you choose. The trouble is that this class of models can be tricky to optimize, especially if the sequences are long. Consequently it is generally important to tune hyperparameters: learning rates, etc.
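One concrete knob worth trying alongside a smaller learning rate is gradient clipping; the Pyro DMM example itself trains with pyro.optim.ClippedAdam for this reason. The snippet below is a toy sketch in plain PyTorch showing the mechanics (the parameter and loss are made up for illustration); in the DMM you would clip over the combined parameters of the model and guide networks.

```python
import torch

# Hypothetical single parameter with a deliberately large gradient,
# standing in for the scale-network weights that blow up.
param = torch.nn.Parameter(torch.tensor([1000.0]))
optimizer = torch.optim.Adam([param], lr=1e-4)  # smaller lr than the default

loss = (param ** 2).sum()
loss.backward()
# cap the gradient norm before the update so one bad minibatch cannot
# push the parameters into a region where the scale becomes invalid
torch.nn.utils.clip_grad_norm_([param], max_norm=10.0)
optimizer.step()
```

With clipping in place, a single pathological batch degrades one step instead of derailing the whole run, which often pushes the NaN onset past the point where training has stabilized.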

Thank you, I will try tuning the hyperparameters and get back.