Incorporating GP regression into a larger generative model

I’m trying to express a generative model where Gaussian processes are interspersed with parametric distributions. I followed along with the tutorial here, but didn’t see a clear way to adapt the implementation to what I’m trying to do.

Here’s some pseudocode that I hope clarifies the kind of generative process I’m trying to express.

f ~ GaussianProcess()

for i in range(n):
    U[i] ~ Uniform(0,1)
    X[i] ~ f(U[i])
    Y[i] ~ Uniform(X[i], X[i] + 1)

observe(Y == y_obs)

Thanks!

Hi @switty, could you please clarify the notation X[I], U_i so I don’t misunderstand your model? In the meantime, if I understand correctly, your model is

f ~ GaussianProcess()

X ~ Uniform(0, 1, shape=n)
Y ~ f(X)
Z ~ Uniform(Y, Y + 1)

observe(Z == z_obs)

If that is the case, then you can define a similar model as follows:

import torch
import pyro
import pyro.distributions as dist
import pyro.contrib.gp as gp

kernel = gp.kernels.RBF(input_dim=1)  # assumed kernel; pick whichever suits your data
X = torch.nn.Parameter(...)  # initial values for the latent inputs, e.g. shape (n, 1)
gpr = gp.models.GPRegression(X, y=None, kernel=kernel)
gpr.set_prior("X", dist.Uniform(torch.zeros(n), 1))

def model(z_obs):
    f_loc, f_var = gpr.model()  # GP prior over f, evaluated at X
    Y = pyro.sample("Y", dist.Normal(f_loc, f_var.sqrt()).to_event())
    pyro.sample("Z", dist.Uniform(Y, Y + 1).to_event(), obs=z_obs)

You can take a look at the GPLVM tutorial to see how to train your model.
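In case it helps, here is a minimal training sketch using SVI with an autoguide; the guide choice, optimizer, learning rate, and step count are assumptions on my part, not something prescribed by the tutorial.

from pyro.infer import SVI, Trace_ELBO
from pyro.infer.autoguide import AutoDiagonalNormal
from pyro.optim import Adam

# AutoDiagonalNormal builds a mean-field guide over every latent site
# in model ("X", the kernel hyperparameters, and "Y").
guide = AutoDiagonalNormal(model)
svi = SVI(model, guide, Adam({"lr": 0.01}), loss=Trace_ELBO())

for step in range(2000):
    loss = svi.step(z_obs)
    if step % 200 == 0:
        print(f"step {step}  ELBO loss {loss:.2f}")

One practical caveat: because Uniform(Y, Y + 1) has bounded support, the ELBO is -inf whenever a sampled Y is incompatible with z_obs, so initialization can matter here.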

In case you have a larger generative model for X (that is, X is not drawn from a simple Uniform prior but generated by a bigger probabilistic model), you can do the following:

def model(z_obs):
    X = generative_model(...)   # X is produced by your larger probabilistic model
    gpr.set_data(X, None)       # swap the sampled X in as the GP inputs
    f_loc, f_var = gpr.model()
    ...
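As a purely hypothetical illustration of such an upstream model (the hierarchical prior and the site name "X_input" are invented for this example; in this variant you would also skip gpr.set_prior("X", ...), since X now comes from the outer model):

def generative_model():
    # illustrative only: a hierarchical prior over the GP inputs
    scale = pyro.sample("scale", dist.LogNormal(0.0, 1.0))
    return pyro.sample("X_input", dist.Normal(torch.zeros(n), scale).to_event(1))

With that in place, model(z_obs) proceeds exactly as above with X = generative_model().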

Apologies for the ambiguity in my original model. I’ve now edited it to fix the typos.

This looks very promising, thank you!