Constraining a parameter to be positive (like sigma in N(loc, sigma)) with AutoDiagonalNormal

Hi, I am a bit confused about how to keep sigma positive when estimating it as in the Pyro tutorial below (Bayesian Regression - Introduction (Part 1) — Pyro Tutorials 1.8.6 documentation) using AutoDiagonalNormal. In my opinion, just giving sigma a Uniform(0, 10) prior cannot keep the posterior positive. Should we use a truncated-normal guide or something else? Thank you very much!

import torch
import torch.nn as nn
import pyro
import pyro.distributions as dist
from pyro.nn import PyroModule, PyroSample

class BayesianRegression(PyroModule):
    def __init__(self, in_features, out_features):
        super().__init__()
        self.linear = PyroModule[nn.Linear](in_features, out_features)
        self.linear.weight = PyroSample(dist.Normal(0., 1.).expand([out_features, in_features]).to_event(2))
        self.linear.bias = PyroSample(dist.Normal(0., 10.).expand([out_features]).to_event(1))

    def forward(self, x, y=None):
        sigma = pyro.sample("sigma", dist.Uniform(0., 10.))
        mean = self.linear(x).squeeze(-1)
        with pyro.plate("data", x.shape[0]):
            obs = pyro.sample("obs", dist.Normal(mean, sigma), obs=y)
        return mean
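
For context, the tutorial attaches an AutoDiagonalNormal guide to this model roughly as follows (a minimal sketch; in_features=3, the learning rate, and the step count are placeholders rather than the tutorial's exact values):

from pyro.infer import SVI, Trace_ELBO
from pyro.infer.autoguide import AutoDiagonalNormal
from pyro.optim import Adam

model = BayesianRegression(in_features=3, out_features=1)
guide = AutoDiagonalNormal(model)

svi = SVI(model, guide, Adam({"lr": 0.01}), loss=Trace_ELBO())
for step in range(1000):
    svi.step(x_data, y_data)  # x_data, y_data: your training tensors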

Hi @everli, under the hood, latent variables are transformed to their unconstrained spaces, and a Normal guide is constructed to approximate the posterior of the unconstrained variables given the data. In other words, the output of the Normal guide is transformed back to each variable's support. So, assuming a sample from the Normal guide gives you an unconstrained sigma = -10, then for a positive-support site the actual value of sigma would be exp(-10).
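
To see the mechanism concretely, you can apply the same bijection Pyro uses outside of inference (a small sketch using torch.distributions.biject_to; for a positive-support site the bijection is exp):

import torch
from torch.distributions import biject_to, constraints

# bijection from the unconstrained real line to the positive reals
to_positive = biject_to(constraints.positive)   # ExpTransform
unconstrained = torch.tensor(-10.0)
print(to_positive(unconstrained))               # exp(-10), about 4.54e-05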


Hi @fehiepsi, many thanks. I found this line of code: scale_constraint = constraints.softplus_positive (pyro.infer.autoguide.guides — Pyro documentation). Does this mean that Pyro uses softplus to constrain parameters to be positive?
Another question: can we build a guide in which some parameters (e.g. the neural net weights) use an autoguide and other parameters (e.g. sigma in N(loc, sigma)) use a manual guide? I am not sure this is possible, because the autoguide assigns specific distributions to all parameters. Thanks again!

You’re right. In the master branch, Pyro uses softplus as the default transform for the scale parameter. To use an autoguide for only part of the model, you can use the block handler to expose/hide some latent sites in your model:

auto_guide = AutoDiagonalNormal(pyro.poutine.block(model, ...))
def guide(data):
    auto_guide(data)
    # manual guide for remaining variables
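
For example, applied to the BayesianRegression model above, hiding "sigma" from the autoguide and handling it manually might look like this (a sketch only; the point-estimate Delta guide for sigma and the parameter name sigma_loc are illustrative choices, not the only option):

import torch
import pyro
import pyro.distributions as dist
from pyro.distributions import constraints
from pyro.infer.autoguide import AutoDiagonalNormal

model = BayesianRegression(in_features=3, out_features=1)

# the autoguide covers every latent site except "sigma"
auto_guide = AutoDiagonalNormal(pyro.poutine.block(model, hide=["sigma"]))

def guide(x, y=None):
    auto_guide(x, y)  # weights and bias
    # manual point-estimate guide for sigma, kept inside the Uniform(0, 10) support
    sigma_loc = pyro.param("sigma_loc", torch.tensor(1.0),
                           constraint=constraints.interval(0., 10.))
    pyro.sample("sigma", dist.Delta(sigma_loc))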

Hi @fehiepsi, thanks for your reply. I used the block handler with the autoguide and it works well.

I am still a bit confused by your statement that "…a sample of the Normal guide gives you (unconstrained) sigma = -10, the actual value of sigma will be exp(-10)". I also found the discussion of exp() vs. softplus (Softplus transform as a more numerically stable way to enforce positive constraint · Issue #855 · pyro-ppl/numpyro · GitHub). I want to confirm: if a variable has a prior ~ U(0, 10), will both samples of the Normal guide, say -0.5 and 0.5, be transformed as exp(-0.5) and exp(0.5), or only the negative one as exp(-0.5)? I am sorry that I couldn't find the source code that explains this.
Thank you very much.

Hi @everli, the softplus transform is used for the scale parameter of the guide itself. We still use the exp transform for latent sites with positive support. For a bounded support like interval(0, 10), the transform is the composition of a sigmoid and an affine(loc=0, scale=10) transform: given a value in unconstrained space, sigmoid maps it to a value in (0, 1), and multiplying by 10 gives a value in (0, 10). So both -0.5 and 0.5 are mapped into (0, 10); neither goes through exp.
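
In code, the bijection for an interval(0, 10) support looks like this (a small sketch using torch.distributions.biject_to; both a negative and a positive unconstrained sample land inside (0, 10)):

import torch
from torch.distributions import biject_to, constraints

to_interval = biject_to(constraints.interval(0., 10.))  # sigmoid, then scale by 10
u = torch.tensor([-0.5, 0.5])
print(to_interval(u))   # tensor([3.7754, 6.2246]), i.e. 10 * sigmoid(u)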
