I am trying to infer the mean of a Gaussian distribution with a fixed scale using variational inference.
```python
import torch
import pyro
import pyro.distributions as dist

# hyperparameters for the prior over the mean
loc_loc_0 = 30.0
loc_scale_0 = 1.0   # prior scale over the mean
scale = 1.0         # fixed observation scale

def model(data):
    # prior over loc
    z_loc = pyro.sample('z_loc', dist.Normal(loc=loc_loc_0, scale=loc_scale_0))
    with pyro.iarange('observe_data', size=len(data)) as ind:
        pyro.sample('obs', dist.Normal(loc=z_loc, scale=scale),
                    obs=data.index_select(0, ind))
```
```python
def guide(data):
    # register the single variational parameter with Pyro
    variational_loc = pyro.param("variational_loc", torch.tensor(100.0))
    pyro.sample('z_loc', dist.Normal(loc=variational_loc, scale=loc_scale_0))
```
Once I have learnt `variational_loc`, generating samples from the learnt model seems to require writing another stochastic program that uses it:
```python
def learnt_model():
    loc = pyro.sample('inferred_loc',
                      dist.Normal(loc=pyro.param('variational_loc'), scale=loc_scale_0))
    return pyro.sample('obs', dist.Normal(loc=loc, scale=scale))
```
My question is: how can I reuse the original `model()` instead of writing a new, nearly identical `learnt_model()`?
My thinking was along the lines of using `pyro.condition` with `pyro.iarange`, something like this:
```python
def model(x):
    s = pyro.param("s", torch.tensor(0.5))
    z = pyro.sample("z", dist.Normal(x, s))
    return z ** 2

@pyro.condition(data={"z": 1.0})
def model(x):
    s = pyro.param("s", torch.tensor(0.5))
    z = pyro.sample("z", dist.Normal(x, s))
    return z ** 2
```
Instead of conditioning on `z`, I want to condition on a number of iid data points, so as to obtain a model that I can then run VI on together with a guide. But I do not know exactly how I should implement this.
Thank you for your time!