"do" operator after inferring parameter posteriors


I am curious if there is a recommended way to use “numpyro.handlers.do” on a model after inferring the posteriors of the model. The workflow in my mind is: build a model, infer the parameter posteriors using some data and MCMC, use the learned posteriors and numpyro.handlers.do to estimate an interventional distribution.

My current workaround is just to pass the posteriors to the model after inference, but this feels a bit dirty. I’ve included some code below to show the workflow I have implemented currently.

def model(marriage, age, divorce, learned_params=None):
    if learned_params is None:
        a = numpyro.sample("a", dist.Normal(0.0, 0.2))
        bM = numpyro.sample("bM", dist.Normal(0.0, 0.5))
        bA = numpyro.sample("bA", dist.Normal(0.0, 0.5))
        sigma = numpyro.sample("sigma", dist.Exponential(1.0))
    else:
        a = numpyro.sample("a", dist.Normal(learned_params["a"], 0.2))
        bM = numpyro.sample("bM", dist.Normal(learned_params["bM"], 0.5))
        bA = numpyro.sample("bA", dist.Normal(learned_params["bA"], 0.5))
        sigma = numpyro.sample("sigma", dist.Exponential(learned_params["sigma"]))

    A = bA * age
    M = bM * marriage
    mu = a + M + A
    numpyro.sample("obs", dist.Normal(mu, sigma), obs=divorce)

# Run MCMC
mcmc = MCMC(NUTS(model), num_warmup=1000, num_samples=1000)
mcmc.run(random.PRNGKey(0), marriage, age, divorce)

# Get MCMC results and use the do operator
intervened_model = numpyro.handlers.do(model, data={"M": 5.})
seeded_model = numpyro.handlers.seed(intervened_model, random.PRNGKey(1))
seeded_model(None, None, None, mcmc.get_samples())


I’m not familiar with “interventional distributions” — could you explain what you’re hoping to get as an end result here? From your code snippet, it seems you’re hoping to adjust the mean of your prior using learned_params. That seems more along the lines of an SVI application than an MCMC one. Can you elaborate for my benefit?

If I’m understanding correctly, this is a simple multi-linear regression model with unknown inherent variance, i.e.

y = c + m_1 x_1 + m_2 x_2 \pm \sigma

Where you’re inferring parameters \theta = \{c, m_1, m_2, \sigma \}, and you’re then hoping to generate new samples consistent with the posterior. If that’s correct, you might have an easier time using the infer.Predictive object.

Apologies if I’ve misunderstood your goal here.