Simple question about a minimal example for VariationalGP

When I try

import torch
import pyro.contrib.gp as gp
kernel = gp.kernels.RBF(1)
likelihood = gp.likelihoods.Gaussian()
X = torch.tensor([1., 2., 3., 4., 5.]).double()
gpmodel = gp.models.VariationalGP(X, None, kernel, likelihood)
gpmodel.set_data(X, None)
gpmodel(X)  # returns the prior mean and variance

I invariably get

(tensor([0., 0., 0., 0., 0.], dtype=torch.float64, grad_fn=<AddBackward0>),
 tensor([1., 1., 1., 1., 1.], dtype=torch.float64, grad_fn=<SumBackward1>))

Is that the expected result @fehiepsi?

Also, I was looking through the tutorials (Gaussian Processes — Pyro Tutorials 1.7.0 documentation) and noticed that even though the lengthscale is set to 10, the draws from the prior look like white noise.

I would have expected much smoother draws from the prior with that lengthscale.

I think so. Currently, when y is None, the GP models return the prior mean/variance (the 0/1 values you see) of the latent f. For a generative GP model, it feels more natural to me to return a distribution Normal(loc, scale) given an input. For example, the forward method also returns mean and variance. You can take an extra step to draw a sample from that Normal distribution if you want.
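The extra sampling step can be sketched like this, using the prior loc/var values from the output above (the loc/var tensors are stand-ins for whatever the model returns):

```python
import torch

# stand-ins for the (loc, var) pair returned by gpmodel(X) above
loc = torch.zeros(5, dtype=torch.float64)
var = torch.ones(5, dtype=torch.float64)

# draw a sample of the latent f from Normal(loc, scale)
f_sample = torch.distributions.Normal(loc, var.sqrt()).sample()
```

Here `scale` is the standard deviation, hence the `var.sqrt()`.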

expected much smoother draws from the prior

I guess this is because the noise is too large?