How to infer inputs of a neural network via MCMC?

Also, could you elaborate on what you meant by ... before def forward(self, x)? Do you mean I should do something like pyro.module('my_model', self.my_neural_net)?

That depends on your problem. If your neural network's weights are fixed, there's no need to do anything Pyro-specific; just define a standard PyTorch __init__ method that builds self.my_neural_net. pyro.module is only needed when you want Pyro to register the network's parameters, e.g. so they can be learned during inference.
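
Here is a minimal sketch of what that could look like: a fixed-weight network inside an nn.Module whose forward() places a prior on the unknown input "x" and a likelihood on the observation. The network (a toy Linear layer), the prior, the Gaussian likelihood, and all shapes are illustrative assumptions, not the exact model from this thread.

```python
import torch
import pyro
import pyro.distributions as dist
from pyro.infer import MCMC, NUTS


class MyModel(torch.nn.Module):
    def __init__(self):
        super().__init__()
        # Fixed-weight network; in practice you would load pre-trained weights.
        # No pyro.module call is needed because these weights are not learned.
        self.my_neural_net = torch.nn.Linear(10, 3)
        for p in self.my_neural_net.parameters():
            p.requires_grad_(False)

    def forward(self, y_obs):
        # Prior over the unknown network input we want to infer with MCMC.
        x = pyro.sample("x", dist.Normal(torch.zeros(10), 1.0).to_event(1))
        # Deterministic pass through the fixed network.
        loc = self.my_neural_net(x)
        # Likelihood linking the network output to the observed data.
        pyro.sample("y", dist.Normal(loc, 0.1).to_event(1), obs=y_obs)
        return x


# Example usage with a toy observation (shapes are illustrative):
y_obs = torch.randn(3)
mcmc = MCMC(NUTS(MyModel()), num_samples=300, warmup_steps=100)
mcmc.run(y_obs)
posterior_x = mcmc.get_samples()["x"]
```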

My model outputs a likelihood score and I want MCMC to operate on that.

If you want to run MCMC directly on a custom unnormalized log-density rather than on a Pyro model with pyro.sample statements, you can pass a function that computes the potential energy (the negative unnormalized log-density) for a dict of parameter values to the potential_fn argument of the MCMC kernel, as illustrated in this forum thread.
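
A minimal sketch of that pattern is below. The log_likelihood function is a stand-in for your trained model's score, and the parameter name "x", its shape, and the sampler settings are all illustrative assumptions. Note that initial_params must be supplied when no Pyro model is given.

```python
import torch
from pyro.infer import MCMC, NUTS


def log_likelihood(x):
    # Stand-in for the network that returns a log-likelihood score;
    # replace this with your own model's output.
    return -0.5 * (x ** 2).sum()


def potential_fn(params):
    # potential_fn receives a dict of parameter tensors and must return the
    # potential energy, i.e. the NEGATIVE unnormalized log-density.
    return -log_likelihood(params["x"])


kernel = NUTS(potential_fn=potential_fn)
mcmc = MCMC(
    kernel,
    num_samples=500,
    warmup_steps=200,
    # Required when using potential_fn instead of a Pyro model.
    initial_params={"x": torch.zeros(10)},
)
mcmc.run()
samples = mcmc.get_samples()["x"]
```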

When returning a sample in the forward() function you wrote, how can I constrain my samples to lie in [0, 1]?

Do you mean samples of "x"? If so, you can apply a SigmoidTransform to the distribution of "x" via pyro.distributions.TransformedDistribution. Alternatively, you could use a Bernoulli likelihood and return the mean rather than binarized samples, as in the VAE example.
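
For the first option, a small sketch (the base Normal, its shape, and scale are illustrative assumptions) would look like this:

```python
import torch
import pyro
import pyro.distributions as dist
from pyro.distributions.transforms import SigmoidTransform

# Base distribution on the real line, pushed through a sigmoid so every
# sample of "x" lands in (0, 1).
base = dist.Normal(torch.zeros(10), torch.ones(10)).to_event(1)
constrained = dist.TransformedDistribution(base, [SigmoidTransform()])

# Inside your forward()/model, this would replace the unconstrained prior:
x = pyro.sample("x", constrained)
```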
