# Constraining `sigma` to its support in the Bayesian Regression tutorial

I’m working through the Bayesian Regression (Part 1) tutorial.

The tutorial says:

> To look at the distribution of the latent parameters more clearly, we can make use of the `AutoDiagonalNormal.quantiles` method which will unpack the latent samples from the autoguide, and automatically constrain them to the site’s support (e.g. the variable `sigma` must lie in `(0, 10)`).

The point estimate of `sigma` in the param store is -2.2371 (in unconstrained space), but after this constraint is applied, the median of `sigma` reported by `guide.quantiles` is 0.9647.

```python
guide.requires_grad_(False)

for name, value in pyro.get_param_store().items():
    print(name, pyro.param(name))

# AutoDiagonalNormal.loc Parameter containing:
# tensor([-2.2371, -1.8097, -0.1691,  0.3791,  9.1823])
# AutoDiagonalNormal.scale tensor([0.0551, 0.1142, 0.0387, 0.0769, 0.0702])

guide.quantiles([0.25, 0.5, 0.75])

# {'sigma': [tensor(0.9328), tensor(0.9647), tensor(0.9976)],
#  'linear.weight': [tensor([[-1.8868, -0.1952,  0.3272]]),
#                    tensor([[-1.8097, -0.1691,  0.3791]]),
#                    tensor([[-1.7327, -0.1429,  0.4309]])],
#  'linear.bias': [tensor([9.1350]), tensor([9.1823]), tensor([9.2297])]}
```
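As a sanity check (plain Python, no Pyro needed): each marginal quantile in unconstrained space is just `NormalDist(loc, scale).inv_cdf(q)` per coordinate. For the real-valued sites (`linear.weight`, `linear.bias`) these already match the dict above, while `sigma`'s entry (-2.2371 at the median) still has to be pushed through the support transform.

```python
from statistics import NormalDist

# loc/scale taken from the param store dump above; the ordering
# [sigma, weight_0, weight_1, weight_2, bias] is an assumption, but it is
# consistent with the printed quantiles.
loc   = [-2.2371, -1.8097, -0.1691, 0.3791, 9.1823]
scale = [ 0.0551,  0.1142,  0.0387, 0.0769, 0.0702]

for q in (0.25, 0.5, 0.75):
    # Quantile of each independent Normal factor, still in unconstrained space.
    row = [NormalDist(l, s).inv_cdf(q) for l, s in zip(loc, scale)]
    print(q, [round(v, 4) for v in row])
```

At `q = 0.5` the row equals `loc` exactly, so the unconstrained median of `sigma` is -2.2371; the 0.9647 in the dict comes from whatever transform maps that value into `(0, 10)`.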

What is the formula/mechanism behind this automatic transformation to `sigma`’s support?

My first guess was the softplus function, `softplus(x) = log(1 + exp(x))`, but `softplus(-2.2371) ≈ 0.1015`, which doesn’t match. The numbers instead fit a scaled sigmoid: the autoguide maps each latent to its site’s support with `biject_to(site["fn"].support)` from `torch.distributions.constraint_registry`, and for `constraints.interval(0., 10.)` that transform is a `SigmoidTransform` followed by an `AffineTransform(loc=0., scale=10.)`, i.e. `sigma = 10 * sigmoid(x)`, which gives `10 * sigmoid(-2.2371) ≈ 0.9647`.
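A quick numeric check of both candidates (plain Python, no torch needed): softplus does not reproduce 0.9647 from the unconstrained median -2.2371, while `10 * sigmoid(x)`, the transform that `biject_to(constraints.interval(0., 10.))` implements, does.

```python
import math

# Unconstrained median of sigma (its loc in the param store);
# the constrained median reported by guide.quantiles is 0.9647.
x = -2.2371

softplus = math.log1p(math.exp(x))            # log(1 + exp(x))
scaled_sigmoid = 10.0 / (1.0 + math.exp(-x))  # 10 * sigmoid(x)

print(f"softplus:     {softplus:.4f}")        # ~0.1015, does not match
print(f"10 * sigmoid: {scaled_sigmoid:.4f}")  # ~0.9647, matches
```

The same scaled sigmoid also reproduces the 0.25/0.75 quantiles (0.9328 and 0.9976) when applied to the corresponding unconstrained quantiles.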