Arbitrary mixture models and discrete latent variable enumeration


I am trying to build a mixture model using two different distributions as the likelihood of my data: a Normal and an Exponential. I have read through the “Inference with discrete latent variables” and the “Gaussian mixtures” guides, and most of the mixture-model forum entries (I think “Vectorizable mixture models” is the most relevant). However, I fail to understand how to combine enumeration with arbitrary likelihood functions for the data. It seems to me that all the examples use mixture components from the same distribution family, so it is simple to index the distribution parameters with the draw from the categorical distribution.

Is there a way to enumerate and use different distributions? If there are many ways to do so, what would be the most “Pyronic” one?

Thank you,

You can use sequential (non-parallel) enumeration, but the cost will be exponential in the number of enumerated sample sites.

import torch
import pyro
import pyro.distributions as dist

def model(data):
    # Sequentially enumerated discrete latent: Pyro runs the model once per value of i.
    i = pyro.sample("i", dist.Categorical(torch.tensor([0.5, 0.5])),
                    infer={"enumerate": "sequential"})
    assert i.shape == ()
    if i == 0:
        pyro.sample("x", dist.Normal(0., 1.), obs=data)
    else:
        pyro.sample("x", dist.Exponential(1.), obs=data)
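For intuition, enumerating the branch above marginalizes the latent index, so the model's likelihood is the standard mixture density log p(x) = logsumexp_k(log w_k + log p_k(x)). A minimal stdlib-only sketch of that computation (hand-written Normal and Exponential log densities, weights matching the model above; this is an illustration of the math, not Pyro's internal implementation):

```python
import math

def normal_logpdf(x, loc=0.0, scale=1.0):
    # Log density of Normal(loc, scale).
    return -0.5 * ((x - loc) / scale) ** 2 - math.log(scale * math.sqrt(2 * math.pi))

def exponential_logpdf(x, rate=1.0):
    # Log density of Exponential(rate); support is x >= 0.
    return math.log(rate) - rate * x if x >= 0 else float("-inf")

def mixture_logpdf(x, w=(0.5, 0.5)):
    # log p(x) = logsumexp_k(log w_k + log p_k(x)), marginalizing the latent k.
    terms = [math.log(w[0]) + normal_logpdf(x),
             math.log(w[1]) + exponential_logpdf(x)]
    m = max(terms)  # stabilize the logsumexp
    return m + math.log(sum(math.exp(t - m) for t in terms))
```

Nothing restricts the component densities here to the same family, which is exactly why enumeration over the branch works: each branch just contributes its own log probability.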