Namespace collision for params of GPRegression

I’ve got two GPRegression models that are trained separately and then fed into a higher-level model to produce a final result. I’m running into a parameter namespace collision for:

  • kernel.lengthscale
  • kernel.variance
  • noise

Other than sprinkling pyro.clear_param_store() throughout my code, is there a technique for scoping parameters in a namespace for GPRegression? How do people deal with production model serving, where I may have many models served from the same process?
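For context, here is a minimal reproduction of the collision. The toy data and fit loop are only illustrative (they follow the standard Pyro GP training pattern of optimizing the ELBO over gpr.model / gpr.guide); the point is that two bare GPRegression models are both root modules, so they both register kernel.lengthscale, kernel.variance and noise in the global param store:

import torch
import pyro
import pyro.contrib.gp as gp

# Toy data, just to illustrate the naming collision.
x = torch.linspace(0.0, 1.0, 20)
y1 = torch.sin(6 * x)
y2 = torch.cos(6 * x)

gpr_a = gp.models.GPRegression(x, y1, gp.kernels.RBF(input_dim=1))
gpr_b = gp.models.GPRegression(x, y2, gp.kernels.RBF(input_dim=1))

loss_fn = pyro.infer.Trace_ELBO().differentiable_loss

def fit(gpr, steps=50):
    opt = torch.optim.Adam(gpr.parameters(), lr=0.05)
    for _ in range(steps):
        opt.zero_grad()
        loss = loss_fn(gpr.model, gpr.guide)
        loss.backward()
        opt.step()

fit(gpr_a)
fit(gpr_b)

# Both models map to the same three keys (kernel.lengthscale, kernel.variance,
# noise), so the second model collides with the first.
print(list(pyro.get_param_store().keys()))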

Never mind, I figured it out: wrapping the GP in a PyroModule scopes the parameter names under the attribute name.

import torch
import pyro
import pyro.contrib.gp as gp


class TavgRBF(pyro.nn.PyroModule):

    def __init__(self, x, y, variance=None, lengthscale=None, noise=None):
        super().__init__()
        self.tavg_kernel = gp.kernels.RBF(input_dim=1, variance=variance, lengthscale=lengthscale)
        self.tavg_gpr = gp.models.GPRegression(x, y, self.tavg_kernel, noise=noise)

    def train(self, num_steps=2500, patience=60, min_delta=0.05, lr=0.5, gamma=0.1):
        train_gp(self.tavg_gpr, num_steps=num_steps, patience=patience,
                 min_delta=min_delta, lr=lr, gamma=gamma)


# week_data and tavg_scaled are the training inputs/targets prepared earlier.
tavg_rbf = TavgRBF(week_data, tavg_scaled,
                   variance=torch.tensor(5.), lengthscale=torch.tensor(10.),
                   noise=torch.tensor(1.))
tavg_rbf.train()

for key, value in pyro.get_param_store().named_parameters():
    print(f"{key}: {value}")

Results in:

early stopping at step 124 with loss tensor(2078.0801, grad_fn=<AddBackward0>)
tavg_gpr.kernel.lengthscale: 0.6094868183135986
tavg_gpr.kernel.variance: -0.7992058396339417
tavg_gpr.noise: -0.6146970391273499
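The train_gp helper used above isn’t shown. For completeness, a minimal sketch of that kind of helper, assuming Adam on the ELBO, patience-based early stopping keyed on min_delta, and an exponential learning-rate decay driven by gamma (the names and defaults are illustrative, not the exact code I ran):

import torch
import pyro

def train_gp(gpr, num_steps=2500, patience=60, min_delta=0.05, lr=0.5, gamma=0.1):
    optimizer = torch.optim.Adam(gpr.parameters(), lr=lr)
    # Spread a total decay factor of `gamma` over num_steps.
    scheduler = torch.optim.lr_scheduler.ExponentialLR(optimizer, gamma=gamma ** (1.0 / num_steps))
    loss_fn = pyro.infer.Trace_ELBO().differentiable_loss

    best_loss, stale_steps = float("inf"), 0
    for step in range(num_steps):
        optimizer.zero_grad()
        loss = loss_fn(gpr.model, gpr.guide)
        loss.backward()
        optimizer.step()
        scheduler.step()

        # Early stopping: quit once the loss hasn't improved by min_delta
        # for `patience` consecutive steps.
        if best_loss - loss.item() > min_delta:
            best_loss, stale_steps = loss.item(), 0
        else:
            stale_steps += 1
            if stale_steps >= patience:
                print(f"early stopping at step {step} with loss {loss}")
                break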

For future reference: if you have two PyroModules as part of your model, they need to be attributes of a top-level PyroModule; if they are only attributes of a top-level nn.Module, the Pyro sample site names may conflict.
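A minimal sketch of that pattern, with two GPs as attributes of one top-level PyroModule (the class and attribute names tavg_gpr / prcp_gpr are illustrative); each attribute name becomes the parameter prefix, so nothing collides:

import pyro
import pyro.contrib.gp as gp

class WeatherGPs(pyro.nn.PyroModule):
    def __init__(self, x, y_tavg, y_prcp):
        super().__init__()
        # Each GP is scoped by its attribute name, e.g.
        # tavg_gpr.kernel.lengthscale vs prcp_gpr.kernel.lengthscale.
        self.tavg_gpr = gp.models.GPRegression(x, y_tavg, gp.kernels.RBF(input_dim=1))
        self.prcp_gpr = gp.models.GPRegression(x, y_prcp, gp.kernels.RBF(input_dim=1))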