Samples from prior distribution

Is there an elegant way to extract samples from the prior distributions in a Pyro model? All the examples I’ve seen show posterior samples via MCMC().get_samples(), and posterior predictive and prior predictive samples via the Predictive class.

if your model has return values, you can call it directly and get those (the Normal distributions below are just placeholders):

import pyro
import pyro.distributions as dist

def model():
    pyro.sample("z", dist.Normal(0., 1.))  # any prior here
    x = pyro.sample("x", dist.Normal(0., 1.))
    return x

x = model()

of course, this will only get you the returned site(s). if you want all the sites, you can use pyro.poutine.trace:

model_trace = pyro.poutine.trace(model).get_trace()  # pass your model's args here, if any

# inspect the structure of model_trace to pull out what you want, e.g.
for name, site in model_trace.nodes.items():
    if site["type"] == "sample":
        print(name, site["value"])
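Continuing from the model_trace above, you can also collect all the sampled values into a dict in one go:

samples = {name: site["value"]
           for name, site in model_trace.nodes.items()
           if site["type"] == "sample"}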

You can also use Predictive as a convenience utility to draw samples from the prior by passing an empty dict as the posterior_samples argument, which essentially does what @martinjankowiak’s snippet above is doing. An additional advantage is that if all the batch dimensions are annotated correctly with pyro.plate, you can pass parallel=True to draw a single vectorized sample, which might be faster for more complex models.

import torch
import pyro
import pyro.distributions as dist
from pyro.infer import Predictive

def model(x, y=None):
    ...
    pyro.sample('y', dist.Normal(0., 1.), obs=y)


# draw 100 samples from the prior
x = torch.randn(10)  # placeholder inputs
prior_samples = Predictive(model, {}, num_samples=100)(x)
print(prior_samples)
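For example, here is a minimal sketch of what a plate-annotated model looks like so that parallel=True works; the regression-style likelihood is purely illustrative, not from the original post:

import torch
import pyro
import pyro.distributions as dist
from pyro.infer import Predictive

def model(x, y=None):
    b = pyro.sample("b", dist.Normal(0., 1.))
    # annotate the batch dimension so Predictive can vectorize over samples
    with pyro.plate("data", x.shape[0]):
        pyro.sample("y", dist.Normal(b * x, 1.), obs=y)

x = torch.randn(10)
# one vectorized draw of 100 prior samples instead of a Python loop
prior_samples = Predictive(model, {}, num_samples=100, parallel=True)(x)
print(prior_samples["y"].shape)  # torch.Size([100, 10])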

Hi,
Applying this approach (with a very simple model) I get the following error:
ValueError: The parameter loc has invalid values
My x values don’t seem to have any problem, and calling Predictive for the posterior samples works fine.

Does this ring a bell?

Thanks

Please post the model code; otherwise we won’t be able to help.

Hi,
I finally figured it out. My model has three random variables (a, b, c), which are combined as

a * log(b * data + c)

It seems that I was obtaining negative values for the log argument. Clamping the range of b * data + c to positive values solved the issue; a sketch is below.
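A minimal sketch of the fix; the Normal priors for a, b, and c are hypothetical, since the original model was not posted:

import torch
import pyro
import pyro.distributions as dist

def model(data, y=None):
    # hypothetical priors; substitute your own
    a = pyro.sample("a", dist.Normal(0., 1.))
    b = pyro.sample("b", dist.Normal(0., 1.))
    c = pyro.sample("c", dist.Normal(0., 1.))
    # clamp so the argument of log stays strictly positive
    loc = a * torch.log(torch.clamp(b * data + c, min=1e-6))
    pyro.sample("y", dist.Normal(loc, 1.), obs=y)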

I am still puzzled that drawing the posterior samples did not raise the same issue.

Thanks.

Hi everyone,

I hope it’s okay to jump in on this old topic… I have a super basic follow-up question related to this. Suppose I have something like…

import torch
import pyro
import pyro.distributions as dist
from pyro.infer import Predictive

def model(var=torch.tensor(1.0)):
    variance = pyro.param("variance", var)
    x = pyro.sample("x", dist.Normal(loc=torch.tensor(0.), scale=variance))

data = Predictive(model, {}, num_samples=1000)()

… how can I adjust this to forward-simulate the model with different (non-default) variance parameters? In other words, how can I pass parameters to the model when using Predictive?

I tried

data = Predictive(lambda: model(var=2.0), {}, num_samples=1000)()

and

def modified_model(**kwargs):
    return model(var=2.0, **kwargs)

data = Predictive(modified_model, {}, num_samples=1000)()

… but (surprisingly to me) neither seems to work. Both variations return simulated values from the model with the default parameter value.

I am also open to other approaches / constructions; I am simply looking for a clean way to sample from a (prior) model with different values for some hyperparameters. If there is an easier way to achieve this, I’d be more than happy to learn about it!

this should work fine if you clear the global parameter store appropriately between invocations, use a local param scope context manager to avoid param collisions, etc. once pyro.param has registered "variance", later calls return the stored value and ignore the initial value you pass in, which is why both of your variations fall back to the first value. a sketch is below.
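For instance, reusing the model from the question above:

import torch
import pyro
from pyro.infer import Predictive

pyro.clear_param_store()  # forget the previously registered "variance"
data = Predictive(lambda: model(var=torch.tensor(2.0)), {}, num_samples=1000)()
print(data["x"].std())  # now reflects scale=2.0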


Ahem, thanks a lot! I am now remembering that I ran into the exact same problem last year.