Fully Bayesian GP Regression in Pyro

I would like to implement a fully Bayesian Gaussian process regression in Pyro. Below are the target function and the synthetic data.

import torch

# The target function:
def f(x):
    return torch.sin(20 * x) + 2 * torch.cos(14 * x) - 2 * torch.sin(6 * x)

# Generation of the synthetic data set:
xs = torch.tensor([-1.0, -0.5, 0.0, 0.5, 1.0])
epsilon = torch.normal(0.0, 0.0004, size=(5,))
ys = f(xs) + epsilon
# The data set consists of xs and ys.

The kernel used is the RBF kernel:

k\left(x, x^{\prime}\right)=\sigma_{s}^{2} \exp \left(-\frac{\left\|x-x^{\prime}\right\|^2}{2 \sigma_{l}^{2}}\right).

Here \sigma_{s}^{2} is the signal variance and \sigma_{l}^{2} is the squared length scale; k\left(x, x^{\prime}\right) measures the similarity between two inputs. Since this is a fully Bayesian Gaussian process regression, I place the following priors on the parameters \sigma_{s}^{2} and \sigma_{l}^{2}: \sigma_{l}^{2} \sim \mathrm{Lognormal}(-1,1) and \sigma_{s}^{2} \sim \mathrm{Lognormal}(0,2).
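In PyTorch I would write this kernel roughly as follows (the helper name rbf_kernel and the exact parameterization by the variance and squared length scale are my own choices):

import torch

def rbf_kernel(x1, x2, variance, lengthscale_sq):
    # Pairwise squared Euclidean distances for 1-D inputs.
    sqdist = (x1.unsqueeze(-1) - x2.unsqueeze(0)) ** 2
    # sigma_s^2 * exp(-||x - x'||^2 / (2 * sigma_l^2))
    return variance * torch.exp(-0.5 * sqdist / lengthscale_sq)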

How do I implement this fully Bayesian regression model with the data and kernel above? Can you give me an example of fully Bayesian GP regression in Pyro that uses NUTS to sample from the posterior?

Unfortunately, the documentation is not sufficient for me. I have been working on this project for a long time with the goal of learning Pyro for GP regression, so far without any luck.

I suggest following the NumPyro example.

PyTorch-based implementations of HMC (like Pyro's) are necessarily somewhat slow.

Thanks, but I'm OK with it being slow. My objective is to learn Pyro (for other reasons).
I'm not familiar with NumPyro.

Your best bet is probably to look at this Pyro tutorial to set up the MCMC boilerplate and at the above-mentioned NumPyro tutorial for an example of how to implement a GP.
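For what it's worth, here is a minimal sketch of how those pieces could fit together, assuming the rbf_kernel helper from the question, a zero-mean MultivariateNormal likelihood, and a small diagonal jitter for numerical stability (the jitter value, site names, and sample counts are illustrative choices, not anything prescribed by Pyro):

import torch
import pyro
import pyro.distributions as dist
from pyro.infer import MCMC, NUTS

def gp_model(x, y):
    # Priors from the question: sigma_l^2 ~ LogNormal(-1, 1), sigma_s^2 ~ LogNormal(0, 2).
    lengthscale_sq = pyro.sample("lengthscale_sq", dist.LogNormal(-1.0, 1.0))
    variance = pyro.sample("variance", dist.LogNormal(0.0, 2.0))

    # Kernel matrix at the training inputs, with jitter on the diagonal so the
    # Cholesky factorization inside MultivariateNormal stays numerically stable.
    K = rbf_kernel(x, x, variance, lengthscale_sq) + 1e-4 * torch.eye(x.shape[0])

    # Zero-mean GP prior evaluated at the training inputs, conditioned on the targets.
    pyro.sample("y_obs", dist.MultivariateNormal(torch.zeros(x.shape[0]), covariance_matrix=K), obs=y)

# NUTS boilerplate: sample the kernel hyperparameters from their posterior.
nuts_kernel = NUTS(gp_model)
mcmc = MCMC(nuts_kernel, num_samples=500, warmup_steps=500)
mcmc.run(xs, ys)
posterior_samples = mcmc.get_samples()  # dict holding the "lengthscale_sq" and "variance" draws

Predictions at new inputs can then be formed by computing the usual GP conditional mean and covariance for each posterior draw of the hyperparameters and averaging over the draws.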