How do I constrain a distribution using a second distribution?

import torch
import pyro
import pyro.distributions as dist

x3_mean = 5e-4
x3_stddev = 1e-4
x4_mean = 3e-4    # example values so the snippet runs; use your own
x4_stddev = 5e-5

# Define a guide (variational distribution) for the latent variables
def guide(y):
    # Variational parameters for x3
    x3_loc = pyro.param("x3_loc", torch.tensor(x3_mean))
    x3_scale = pyro.param("x3_scale", torch.tensor(x3_stddev), constraint=dist.constraints.positive)
    x3 = pyro.sample("x3", dist.Normal(x3_loc, x3_scale))

    ### How do I set a constraint with respect to x3?
    x4_loc = pyro.param("x4_loc", torch.tensor(x4_mean), constraint=dist.constraints.interval(1e-6, x3_loc))
    x4_scale = pyro.param("x4_scale", torch.tensor(x4_stddev), constraint=dist.constraints.interval(1e-6, x3_scale))
    x4 = pyro.sample("x4", dist.Normal(x4_loc, x4_scale))

I would like the mean and std of x4 to always be equal to or smaller than those of x3.


You can do it by reparameterization, e.g.:

x4_scale_factor = pyro.param("x4_scale_factor", torch.tensor(0.5), constraint=dist.constraints.unit_interval)
x4_scale = x3_scale * x4_scale_factor
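One way to see why this works: `constraints.unit_interval` stores an unconstrained real-valued parameter and maps it through a sigmoid, so the factor always lies in (0, 1) and the product can never exceed `x3_scale`, no matter where the optimizer moves the underlying parameter. A minimal pure-Python sketch of that idea (no Pyro needed; the sigmoid here mirrors the transform Pyro applies internally, and the value of `x3_scale` is just an example):

```python
import math

def sigmoid(u):
    # maps any real number into the open interval (0, 1)
    return 1.0 / (1.0 + math.exp(-u))

x3_scale = 1e-4  # example positive scale, as in the question

# the optimizer is free to move u anywhere on the real line,
# yet the resulting x4_scale can never exceed x3_scale
for u in (-10.0, -1.0, 0.0, 1.0, 10.0):
    factor = sigmoid(u)           # always in (0, 1)
    x4_scale = x3_scale * factor
    assert 0.0 < x4_scale < x3_scale
```

The same trick applies to the mean: reparameterize `x4_loc` as `x3_loc` times a unit-interval factor, and the ordering holds by construction rather than by a constraint the optimizer has to fight.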

This seems to work! And adding a constraint solves the convergence issue I had in my other post.

Thanks for your kind help @ordabayev. Pyro seems to be a really cool package. Looking forward to using it more!
