How to use pyro.condition with pyro.iarange on iid data points

I'm trying to infer the mean of a Gaussian distribution with fixed scale using variational inference:

import torch
import pyro
import pyro.distributions as dist

# hyperparameters for the prior (values here are placeholders)
loc_loc_0 = 30.0
loc_scale_0 = 10.0
scale = 1.0  # fixed observation scale

def model(data):
    # prior over loc
    z_loc = pyro.sample('z_loc', dist.Normal(loc=loc_loc_0, scale=loc_scale_0))
    with pyro.iarange('observe_data', size=len(data)) as ind:
        pyro.sample('obs', dist.Normal(loc=z_loc, scale=scale),
                    obs=data.index_select(0, ind))

def guide(data):
    # register the single variational parameter with Pyro
    variational_loc = pyro.param("variational_loc", torch.tensor(100.0))
    pyro.sample('z_loc', dist.Normal(loc=variational_loc, scale=loc_scale_0))

Once I have learnt variational_loc, in order to generate samples from the learnt model I need to write another stochastic program that uses it:

def learnt_model():
    loc = pyro.sample('inferred_loc', dist.Normal(loc=pyro.param('variational_loc'), scale=1))
    return pyro.sample('obs', dist.Normal(loc=loc, scale=scale))

My question is:

How to reuse the original model() instead of writing a new similar learnt_model() ?

My thinking was along the lines of using pyro.condition with pyro.iarange, something similar to this:

def model(x):
    s = pyro.param("s", torch.tensor(0.5))
    z = pyro.sample("z", dist.Normal(x, s))
    return z ** 2

@pyro.condition(data={"z": 1.0})
def model(x):
    s = pyro.param("s", torch.tensor(0.5))
    z = pyro.sample("z", dist.Normal(x, s))
    return z ** 2

Instead of conditioning on z, I want to condition on a number of iid data points to create a model that we can run VI on together with a guide. But I do not know exactly how I should implement this.

Thank you for your time!

Once I have learnt variational_loc, in order to generate samples from the learnt model I need to write another stochastic program that uses it.

One way to do this would be to separate the data from the model specification: don't use the obs keyword in the model, and instead condition on obs when learning the parameters with the guide.

def model(N):
    # prior of loc
    z_loc = pyro.sample('z_loc', dist.Normal(loc=loc_loc_0, scale=loc_scale_0))   
    with pyro.iarange('observe_data', N):
        pyro.sample('obs', dist.Normal(loc=z_loc, scale=scale))

def conditioned_model(data):
    return poutine.condition(model, data={"obs": data})(len(data))

# learn params by conditioning on the data
svi = SVI(conditioned_model, guide...)
for _ in range(num_steps):
    svi.step(...)

If you are using Pyro’s dev branch, then you can run SVI to get a TracePosterior object that can be used to generate posterior samples from the latent variables or newly generated return values, as follows:

# sample 1000 data points from the posterior
svi = SVI(model, guide, num_samples=1000)
posterior = svi.run(data)
samples = EmpiricalMarginal(posterior, sites=["z_loc", "obs"]).get_samples_and_weights()[0]

If you are using Pyro’s current release, then you will need to extract the values from the trace directly and cannot call svi.run. In that case, you can do something like this:

# sample data from the (approximate) posterior by replaying the model
# against guide traces; tracing the model alone would sample from the prior
latent_samples = []
observed_samples = []
for i in range(1000):
    guide_trace = poutine.trace(guide).get_trace(data)
    trace = poutine.trace(poutine.replay(model, trace=guide_trace)).get_trace(N)
    latent_samples.append(trace.nodes["z_loc"]["value"])
    observed_samples.append(trace.nodes["obs"]["value"])
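As a side note, the collected Python lists can then be stacked into tensors for summary statistics. A tiny sketch with toy stand-in values (in practice the lists would come from the sampling loop above):

```python
import torch

# toy stand-ins for the latent_samples / observed_samples lists
latent_samples = [torch.tensor(float(i)) for i in range(4)]
observed_samples = [torch.tensor([float(i), float(i) + 1.0]) for i in range(4)]

latent = torch.stack(latent_samples)       # shape: (num_samples,)
observed = torch.stack(observed_samples)   # shape: (num_samples, N)
posterior_mean = latent.mean()             # Monte Carlo estimate of E[z_loc]
```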

Thank you! That was very helpful :)

Hi, thanks again for answering me. I now realize that if I want to use pyro.iarange() with the subsample_size option, your approach doesn't work. In other words, is there a way to use pyro.condition together with subsampling?

Thanks!

In other words, is there a way to use pyro.condition together with subsampling?

What’s the error that you are getting? I think you’ll need to pass the subsample size into the model, but otherwise it should work.