Is there a way to obtain inducing inputs found in Variational Sparse GP?

Hi all,

I’m new to Pyro and I was able to run Variational Sparse GP on my Poisson-likelihood data with one-dimensional input (it’s a Gaussian Cox process, where I model the non-linear intensity function as a GP).

But I have my hierarchical GP model built in Stan, and I was thinking it would be nice if I could use the inducing inputs found by Pyro’s VSGP. (Not sure if a hierarchical GP can be implemented in Pyro…)

Can it be done?
Thanks in advance!

I think named_pyro_params is what you are looking for. It returns all parameters of your GP module. Alternatively, you can just access a parameter like an attribute of a Python class.

I’m not sure what hierarchical GP means here. If you want to define priors for parameters like lengthscale, variance, noise, inducing points, etc., then it is pretty straightforward (see the GP tutorial).


Thanks for the reply, but I’m still quite unsure. Does that mean I shouldn’t build the model using the sample code provided in the Pyro documentation?

import torch
import pyro
import pyro.distributions as dist
import pyro.contrib.gp as gp
from pyro.infer import HMC, MCMC

# initialize the inducing inputs
Xu = (torch.arange(120.) / 8.).float()
X = X.float()  # 473 integer numbers ranging from 1 to 13
y = y.float()  # 473 Poisson observations at each X value

# initialize the kernel, likelihood, and model
kernel = gp.kernels.RBF(input_dim=1)
# GP priors: when enabled, these raised a "singular U" error
# (U being the Cholesky factor of the covariance matrix, I guess?)
# kernel.variance = pyro.nn.PyroSample(dist.LogNormal(0, 1))
# kernel.lengthscale = pyro.nn.PyroSample(dist.LogNormal(0, 1))

likelihood = gp.likelihoods.Poisson()
vsgp = gp.models.VariationalSparseGP(X, y, kernel, Xu=Xu, likelihood=likelihood)

hmc_kernel = HMC(vsgp.model)
mcmc = MCMC(hmc_kernel, num_samples=10)
mcmc.run()  # run() is needed before get_samples()

posterior_sample = mcmc.get_samples()

The posterior sample returns the log-intensity function values at the optimized inducing inputs. Is that correct? Can the latter method (accessing the attribute) be used here?
Please excuse my ignorance, I am pretty new to Python in general and googling doesn’t seem to be very helpful :frowning:

The Variational…GP classes are intended for variational inference (rather than MCMC), so I expect using MCMC here will fail. With some ugly tricks relying on implementation details, you can make it work:

  • set u_scale_tril to None
  • block the latent site u
  • set prior for u_loc and perform inference for u_loc (rather than u)

I wouldn’t recommend following that path. I think a better solution would be to re-implement the model method. We have implemented all the conditional math, so all you need to do is mimic it and replace:

f_loc, f_var = conditional(..., self.u_loc, self.u_scale_tril, ...)

with

u = pyro.sample("u", dist.Normal(0, 1).expand([num_inducing]).to_event())
f_loc, f_var = conditional(..., u, None, ...)