Train error in loss_and_grads in Bayesian Neural Net

I’ve been playing around with Pyro for about a week now, but unfortunately I can’t get my model to train properly. My setup is somewhat akin to the SS-VAE example, except that my approach is fully supervised.

My model sets priors and samples from a multinomial likelihood, where the probabilities come from a softmax over Gaussian latent variables.

My guide contains a neural net that predicts mu and sigma for those latents, and then samples the latent variables from the resulting Gaussians.

Like so:

    def model(self, input_batch, labels):
        batch_size = input_batch.size(0)
    
        # Sample from priors (generative side)
        with pyro.iarange('independent'):
            ...

            # Pre-binned histogram observed
            counts = torch.sum(labels, dim=1)  # Sum over target columns
            likelihood = pyro.distributions.multinomial(ps=probabilities, n=counts)
            
    def guide(self, input_batch, labels):
        ...

I have no trouble running the model and guide on their own. The failure happens after evaluating them, inside loss_and_grads:

    ---------------------------------------------------------------------------
    AttributeError                            Traceback (most recent call last)
    <ipython-input-57-93fd337a0d5c> in <module>()
    ----> 1 train()
    ...

    AttributeError: 'Variable' object has no attribute 'log_pdf'

Why does it somehow think the trace object is a Variable? Traceback attached.
Please let me know if I’m approaching this as intended. I’m using Pyro 0.1.2 and PyTorch 0.2.
Thanks!

There is a subtle bug here in how you’re using the distributions. pyro.distributions.multinomial(ps=probabilities, n=counts) returns a sample from the multinomial, so your likelihood line draws a fresh sample instead of conditioning on the observed counts (which is presumably why loss_and_grads ends up calling log_pdf on a Variable). If you want to instantiate a distribution object, you need the capitalized form: pyro.distributions.Multinomial(ps=probabilities, n=counts). In an observe statement, you can do either of the following, which are equivalent:

pyro.sample("obs", dist.multinomial, parameters, obs=observation)
pyro.sample("obs", dist.Multinomial(parameters), obs=observation)

See here for more information on the functional/object forms of the distributions.
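To make that concrete, here is a minimal sketch of both forms for your multinomial case, following the signatures above; probabilities, total_count, and obs_counts are illustrative names, and this assumes the 0.1.x-era ps/n keyword arguments:

    import pyro
    import pyro.distributions as dist

    def model_functional(probabilities, total_count, obs_counts):
        # Functional form: pass the lowercase distribution plus its
        # parameters; the obs keyword scores the observation rather
        # than drawing a new sample.
        pyro.sample("obs", dist.multinomial,
                    ps=probabilities, n=total_count, obs=obs_counts)

    def model_object(probabilities, total_count, obs_counts):
        # Object form: instantiate the capitalized distribution first,
        # then condition on the observed counts.
        pyro.sample("obs", dist.Multinomial(ps=probabilities, n=total_count),
                    obs=obs_counts)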

Also, for your Bayesian NN, you may want to consider using random_module, which lets you lift the parameters of an nn.Module into samples from a prior you provide.
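If it helps, here is a rough sketch of that pattern; the net, its parameter names, and the prior scales are all made up for illustration, dist.Normal is assumed to take (mean, std), and on PyTorch 0.2 you would wrap the tensors in Variable:

    import torch
    import torch.nn as nn
    import pyro
    import pyro.distributions as dist

    # A hypothetical one-layer net; the sizes are illustrative only.
    net = nn.Linear(10, 1)

    # Priors over each named parameter; the keys follow PyTorch's
    # parameter names for nn.Linear, and the shapes match the
    # parameters they replace.
    priors = {
        'weight': dist.Normal(torch.zeros(1, 10), torch.ones(1, 10)),
        'bias': dist.Normal(torch.zeros(1), torch.ones(1)),
    }

    # random_module lifts the net's parameters into random variables
    # drawn from the priors; calling the lifted module returns a
    # concrete nn.Module with sampled weights, usable in your model.
    lifted_module = pyro.random_module("bnn", net, priors)
    sampled_net = lifted_module()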

Gotcha, thanks so much!