Pyro GP MCMC result mismatch and Markov chain thinning

I am trying to use the MCMC function to infer a GP regression model with the following code:

import time

import torch
import pyro
import pyro.distributions as dist
import pyro.contrib.gp as gp
from pyro.infer import MCMC, NUTS, Predictive

# record computational time
st = time.time()

pyro.clear_param_store()

# generate an RBF kernel
RBF_kernel = gp.kernels.RBF(input_dim=1)
RBF_kernel.set_prior("variance", dist.Uniform(torch.tensor(1.), torch.tensor(500.)))
RBF_kernel.set_prior("lengthscale", dist.Uniform(torch.tensor(1.), torch.tensor(3000.)))

# generate a GP regression model conditioned on RSL data
gpr = PSTHM.GPRegression_V(X, y, RBF_kernel, noise=torch.tensor(y_sigma**2), jitter=1e-5)
gpr = gpr.double()
hmc_kernel = NUTS(gpr.model)
mcmc = MCMC(hmc_kernel, num_samples=2500, warmup_steps=500)
mcmc.run()

The code works well and gives me a nice result. If I print gpr.kernel.variance, it is a random variable drawn from the posterior distribution; here is an example output:

gpr.kernel.variance
tensor(129.2338)

However, when I run a forward GP regression for my new X data like:

x_test = torch.arange(-500, 2025, 5., requires_grad=True)
y_mean2, y_var2 = gpr(x_test.double(), full_cov=True, noiseless=True)

the kernel variance changes:

gpr.kernel.variance
tensor(480.2222, dtype=torch.float64, grad_fn=)

I am wondering why that is. Also, if I want to thin the MCMC samples, say keeping every fifth point of the Markov chain, how can I define a new regression based on the thinned MCMC samples?

By default, the forward method uses parameters obtained from approximate inference (SVI). For MCMC, you need to use Predictive, as in this tutorial: Inferences for Deep Gaussian Process models in Pyro | fehiepsi's blog

Hi, thanks for this information! But the Predictive function only returns the mean prediction; how can I get the covariance for all ensemble members?

Predictive will return all sample/deterministic sites for you. Just follow the tutorial and add a sample statement for your covariance.