Draw module from NUTS posterior

Say I have some model that contains a lifted module:

import pyro
import pyro.distributions as dist
from pyro.infer.mcmc import MCMC, NUTS

def model(X, module, prior):
    lifted = pyro.random_module('module', module, prior)()
    f = lifted(X)
    # 'f' must be a pyro.sample site for poutine.condition to observe it;
    # a Normal likelihood with an assumed fixed noise scale is used here
    return pyro.sample('f', dist.Normal(f, 0.1))

def conditioned_model(X, y, module, prior):
    return pyro.poutine.condition(model, data={'f': y})(X, module, prior)

nuts_kernel = NUTS(conditioned_model, step_size=1e-2, adapt_step_size=True)
posterior = MCMC(nuts_kernel, num_samples=500, warmup_steps=500)
posterior = posterior.run(X_train, y_train, module, prior)

Is there a way now to use the posterior to generate samples of the module?

The Bayesian regression tutorial shows a way to do it with VI, but I haven’t been able to find an example using MCMC.

pyro.random_module registers the distribution parameters for the weights and biases of the NN with the global param store, and uses the current parameter values to sample the weights and biases of the NN. While HMC/NUTS does not need to register parameters with the param store, the same technique should still work, except that the samples will be generated by the integrator and scored against the prior distribution.
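
For concreteness, here is a minimal sketch of what module and prior in the snippet above might look like, assuming the module is a plain nn.Linear(1, 1) (the unprefixed keys 'weight' and 'bias' are that layer's parameter names):

import torch
import torch.nn as nn
import pyro
import pyro.distributions as dist

# hypothetical module and prior for the snippet above; each entry in
# `prior` maps a parameter name of `module` to the prior distribution
# that HMC/NUTS will score the sampled values against
module = nn.Linear(1, 1)
prior = {
    'weight': dist.Normal(torch.zeros(1, 1), torch.ones(1, 1)),
    'bias': dist.Normal(torch.zeros(1), torch.ones(1)),
}

# pyro.random_module returns a callable; each call draws one set of
# weights/biases from `prior` and returns an nn.Module carrying them
lifted = pyro.random_module('module', module, prior)()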

I will give it a try tomorrow and confirm whether this works as expected. I think it will, but let me know if you have run into any issues. The bigger concern is probably the scalability of generating samples from the integrator for even moderately sized neural networks.

It seems to work during sampling, in the sense that it runs. However, doing it this way also seems to cause a memory leak that doesn’t happen when I place priors over the parameters directly without using a module.

My question though was, assuming I’ve finished sampling, how would I generate examples of the module from the posterior?

You should be able to record the samples from the posterior distribution of the weights/biases using EmpiricalMarginal, or just extract them from the trace as you sample. If you mean the distribution of the output of the NN, you can use the TracePredictive class, provided that the NN’s output corresponds to the return value of the model. Otherwise, you might just have to set the NN’s weights/biases to the sampled values and build an empirical distribution over the output.
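
For example, once posterior.run(...) has finished, something along these lines should work. The 'module$$$weight' site name is an assumption based on how pyro.random_module prefixes parameter names with the name it was given; print one trace's nodes.keys() to confirm the exact names in your model.

import torch
from pyro.infer import EmpiricalMarginal

weight_site = 'module$$$weight'  # assumed site name; verify against your trace

# option 1: wrap the posterior in an EmpiricalMarginal over that site
weight_marginal = EmpiricalMarginal(posterior, sites=weight_site)
print(weight_marginal.mean)

# option 2: pull the raw sampled values out of the stored execution traces
weight_samples = torch.stack([
    tr.nodes[weight_site]['value'] for tr in posterior.exec_traces
])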

Ok. I’d been sampling the parameters from EmpiricalMarginal, but I was hoping for a way to sample the module instead of having to assign the parameters manually.
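
For anyone who does end up assigning the parameters manually, a rough sketch under the same assumptions as above (sample sites prefixed with 'module$$$', and posterior holding the finished MCMC run):

import copy

# take one execution trace and copy its sampled weights/biases into a
# fresh copy of the original module; the state_dict keys are recovered by
# stripping the assumed 'module$$$' prefix from the sample site names
tr = posterior.exec_traces[0]
sampled_module = copy.deepcopy(module)
state_dict = {
    name.split('$$$')[-1]: node['value'].detach()
    for name, node in tr.nodes.items()
    if name.startswith('module$$$')
}
sampled_module.load_state_dict(state_dict)

# sampled_module now behaves like one draw of the network from the posterior
y_pred = sampled_module(X_train)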