GPRegression: lower limit on lengthscale

Hi, we have experimental observations where it is physically impossible to have a “lengthscale” below a certain value. However, when I try to add this limit to kernel priors,

import torch
import pyro.distributions as dist

kernel.set_prior(
    "lengthscale",
    dist.Uniform(
        torch.tensor(lscale[0]),
        torch.tensor(lscale[1])
    ).to_event()
)

where lscale = [5., 20.] for 1D or lscale = [[5., 5., 5.], [20., 20., 20.]] for 3D, and then run SparseGPRegression with this kernel, it doesn't optimize the lengthscale parameter during the SVI steps. It just stays at the lscale[0] value (while the amplitude and noise get optimized). If I change the lower limit to any value below 1., something like .99, then it starts optimizing the lengthscale parameter as well. I tried it with the RBF, RationalQuadratic, and Matern kernels and it is always the same. I am wondering how I can then put a lower limit on the lengthscale parameter (which is based on my knowledge of the physical system that I study)? Thanks!
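
For reference, here is a minimal sketch of the full setup (the toy data, optimizer settings, and step count below are illustrative placeholders, not the exact code from my notebook):

import torch
import pyro.contrib.gp as gp
import pyro.distributions as dist
from pyro.infer import SVI, Trace_ELBO
from pyro.optim import Adam

# toy 1D data (illustrative only)
X = torch.linspace(0., 10., 100)
y = torch.sin(X) + 0.1 * torch.randn(100)
Xu = X[::10].clone()  # inducing points for the sparse approximation

lscale = [5., 20.]
kernel = gp.kernels.RBF(input_dim=1)
kernel.set_prior(
    "lengthscale",
    dist.Uniform(torch.tensor(lscale[0]), torch.tensor(lscale[1])).to_event()
)

gpr = gp.models.SparseGPRegression(X, y, kernel, Xu=Xu)
svi = SVI(gpr.model, gpr.guide, Adam({"lr": 0.005}), loss=Trace_ELBO())
for step in range(2000):
    svi.step()  # lengthscale stays pinned at lscale[0] = 5.0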

Hi @ziatdinovmax, how about setting a constraint on your lengthscale

from torch.distributions import constraints
kernel.set_constraint("lengthscale", constraints.interval(5., 20.))

to see if it works?

If I change the lower limit to any value below 1., something like .99, then it starts optimizing the lengthscale parameter as well

It is likely a bug, but I have no idea why. I'll test this issue this weekend. Maybe you can try first by setting a similar prior in the GP example (Gaussian Processes — Pyro Tutorials 1.8.6 documentation)?

Hi @fehiepsi, thanks for your quick reply! The suggested option seems to produce the same “bug”. Here's a link to a reproducible Colab notebook with synthetic data:

If I specify, say, a (1., 20.) interval, it gets stuck at 1.0 and makes poor “predictions” on the test data. But if I change it to (.99, 20.), it optimizes the lengthscale properly and produces good results. Maybe I am missing something obvious…

Thanks @ziatdinovmax! The issue is that the default initial value of 1.0 is outside the prior's support. I believe that simply changing the kernel's initial lengthscale value will solve this issue.
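
For example, something like this (just a sketch with an RBF kernel; adapt it to whatever kernel and bounds you use in your notebook):

import torch
import pyro.contrib.gp as gp
import pyro.distributions as dist

# initialize the lengthscale inside the Uniform(5., 20.) support
# (e.g. at the midpoint) instead of the default 1.0
kernel = gp.kernels.RBF(input_dim=1, lengthscale=torch.tensor(12.5))
kernel.set_prior(
    "lengthscale",
    dist.Uniform(torch.tensor(5.), torch.tensor(20.)).to_event()
)

With the initial value inside the interval, SVI should no longer get stuck at the boundary.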

@fehiepsi thanks!