Mini batching with Bayesian GPLVM

Yeah, I think we need it to make the inference work. @vr308 Could you try adding a scale handler for X_minibatch in both the model and the guide:

```python
with pyro.poutine.scale(scale=X.size(0) / batch_size):
    X_minibatch = pyro.sample("X", ...)
```

and use `svgp = VariationalSparseGP(..., num_data=X.size(0))`? You might also want to use `TraceMeanField_ELBO`, as in the DKL example, if training is highly unstable.