Question about SparseGPRegression Model in Pyro GP

The forum topic above has a complete example of how to use a model/guide pair to do mini-batch GPLVM. I think there are also other forum topics where model and guide pairs are used. Also, this notebook does SVI inference over a model and a guide (an AutoGuide) for a GPLVM model.

Hi Du,

My example program started from the example in your notebook. Could you change your example to use a structured model and guide? That would be very helpful.

J.

I followed GPLVM to define a new one; see below. Do you think it is right?

# Define a different prior
class myGPLVM(Parameterized):
    def __init__(self, base_model, L):
        super(myGPLVM, self).__init__()
        if base_model.X.dim() != 2:
            raise ValueError("GPLVM model only works with 2D latent X, but got "
                             "X.dim() = {}.".format(base_model.X.dim()))
        self.base_model = base_model
        self.X = PyroSample(
            dist.MultivariateNormal(
                base_model.X.new_zeros(base_model.X.shape[1], base_model.X.shape[0]),
                precision_matrix=L,
            ).to_event()
        )
        self.autoguide("X", dist.MultivariateNormal)
        self.X_loc.data = base_model.X.t()

    @pyro_method
    def model(self):
        self.mode = "model"
        # X sampled from the prior is passed into base_model
        self.base_model.set_data(self.X.t(), self.base_model.y)
        self.base_model.model() 
        self.X_loc.data = self.base_model.X.t()
        
    @pyro_method
    def guide(self):
        self.mode = "guide"
        # X sampled from the guide is passed into base_model
        self.base_model.set_data(self.X.t(), self.base_model.y)
        self.base_model.guide()

    def forward(self, **kwargs):
        self.mode = "guide"
        self.base_model.set_data(self.X.t(), self.base_model.y)
        return self.base_model(**kwargs)

Since it is not possible to step into PyroSample() with a debugger, I am not sure what exactly happens inside this call. It seems to me that after

self.X = PyroSample(dist.MultivariateNormal(base_model.X.new_zeros(base_model.X.shape[1], base_model.X.shape[0]), precision_matrix=L).to_event())

there is no attribute X on self.

-J

@junbin.gao Your solution looks much better than my suggestion. (It seems that the statement self.X_loc.data = self.base_model.X.t() is not needed in def model(self), though?)

Could you make your example to use structured model and guide?

It is defined in Cell 11 of that notebook.

there is no attribute X on self

I am not sure. You can print(self.X) in your self.model and self.guide to inspect its value. Basically, X = self.X is just a shortcut for

X = pyro.sample(some_name, the_prior_defined_in_PyroSample)

Unfortunately, if I remove self.X_loc.data = self.base_model.X.t(), then X_loc does not change anymore. How do I make X a trainable variable?

Hmm, it is tricky. I don’t know why…

Hahaha, found the error: I missed optimiser.step(). Thanks!

But I ran into trouble with

loss = loss_fn(gplvm.model, gplvm.guide) + regularizer*reg_over(gplvm.X_loc)

which produces the following error:

RuntimeError: Trying to backward through the graph a second time, but the saved intermediate results have already been freed. Specify retain_graph=True when calling backward the first time.

How shall I add this regularizer reg_over(gplvm.X_loc) to the loss?

Here you are: pyro.factor

Thanks Du. Where shall I add this pyro.factor? In the model definition? Shall I take the log of this factor myself, or will Pyro handle the log?

I think I am getting to understand pyro slowly.

J.

I think the best way to find an answer is to search for it, either in the documentation or on the forum. You can find how to use pyro.factor here, or from the documentation linked in my last comment.