Hi everyone,

I would like to ask a question about Bayesian inference for the Bayesian regression model. I am confused about the inference in the evaluation part. Bayesian prediction requires marginalizing out the parameters, but I am not sure how Pyro does that. In detail, for the model evaluation part,

```python
predictive = Predictive(model, guide=guide, num_samples=800,
                        return_sites=("linear.weight", "obs", "_RETURN"))
samples = predictive(x_data)
pred_summary = summary(samples)
```
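For context, `summary` here is a helper that reduces each returned sample site to summary statistics. A minimal numpy sketch of what such a helper might look like (hypothetical; the version in the Pyro regression tutorial operates on torch tensors, and the statistics chosen here are assumptions):

```python
import numpy as np

def summary(samples):
    """Reduce each site's samples (sample dimension first) to summary statistics."""
    stats = {}
    for name, arr in samples.items():
        arr = np.asarray(arr, dtype=float)
        stats[name] = {
            "mean": arr.mean(axis=0),
            "std": arr.std(axis=0),
            "5%": np.percentile(arr, 5, axis=0),
            "95%": np.percentile(arr, 95, axis=0),
        }
    return stats

# Stand-in for Predictive output: 800 posterior-predictive draws of 10 observations.
fake = {"obs": np.random.default_rng(0).normal(0.0, 1.0, size=(800, 10))}
print(summary(fake)["obs"]["mean"].shape)  # (10,)
```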

If I want to do a Bayesian prediction, I need to compute p(y|x) and marginalize out all the parameters, which are `linear.weight` here.

p(y|x) = ∫_{θ} p(y|x,θ) p(θ|D) dθ

where p(θ|D) is the posterior distribution of the parameters obtained via SVI. Since the integral is intractable, we can approximate it by

p(y|x) ≈ (1/S) Σ_{s=1}^{S} p(y|x, θ_s), where θ_s ~ p(θ|D)
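To make the Monte Carlo estimate concrete, here is a small numpy sketch for a hypothetical 1-D regression y = θx + ε with known noise scale σ (the model, σ, and the stand-in posterior draws are all assumptions for illustration; `theta_samples` plays the role of draws θ_s from the SVI guide):

```python
import numpy as np

rng = np.random.default_rng(0)
theta_samples = rng.normal(loc=2.0, scale=0.1, size=800)  # stand-in for θ_s ~ p(θ|D)
sigma = 0.5                                               # assumed known noise scale

def predictive_density(y, x, theta_samples, sigma):
    """Monte Carlo estimate p(y|x) ≈ (1/S) Σ_s N(y; θ_s·x, σ²)."""
    mu = theta_samples * x  # one Gaussian mixture-component mean per posterior draw
    comp = np.exp(-0.5 * ((y - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    return comp.mean()

print(predictive_density(6.0, 3.0, theta_samples, sigma))  # high: near the predictive mode
print(predictive_density(20.0, 3.0, theta_samples, sigma))  # ~0: far from all components
```

Each posterior draw contributes one Gaussian component, so the estimate is exactly the mixture-of-Gaussians view of p(y|x).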

I want to get argmax_y p(y|x), and p(y|x) can be viewed as a mixture of Gaussians. Does Pyro provide a technique to approximate p(y|x) by marginalizing out all the parameters and then obtain the most likely y? I saw that Pyro can draw several samples of the parameters, and for each parameter sample we can generate a y; finally, we can use the mean of all the y values as the prediction. But I am afraid that a prediction obtained this way does not maximize p(y|x). I would like to know what Pyro can do for marginalizing out all the parameters.
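To illustrate why the sample mean and argmax_y p(y|x) can differ, here is a numpy sketch that grid-searches the Monte Carlo mixture density for its mode. The bimodal stand-in posterior is hypothetical, chosen precisely so that the mean falls between the two modes while the argmax lands on the heavier one:

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical bimodal posterior draws θ_s: two well-separated modes, unequal weight.
theta_samples = np.concatenate([rng.normal(1.0, 0.05, 500),
                                rng.normal(3.0, 0.05, 300)])
sigma = 0.1   # assumed known noise scale
x = 1.0

mu = theta_samples * x
grid = np.linspace(mu.min() - 1.0, mu.max() + 1.0, 4000)
# Unnormalized mixture density on the grid: mean over components exp(-(y-μ_s)²/2σ²).
# The shared normalizing constant does not affect the argmax.
dens = np.exp(-0.5 * ((grid[:, None] - mu[None, :]) / sigma) ** 2).mean(axis=1)
y_map = grid[np.argmax(dens)]  # argmax_y p(y|x): lands on the heavier mode (~1.0)
y_mean = mu.mean()             # mean of sampled predictions: between modes (~1.75)
print(y_map, y_mean)
```

With a unimodal, symmetric posterior the two estimates coincide; the gap only opens up when p(y|x) is skewed or multimodal, which is exactly the mixture-of-Gaussians situation described above.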

Thank you so much for everyone’s help.