# Sampling Posterior from Autoguide

I am working on a very basic univariate linear regression in Pyro and I'm a little confused about sampling from the posterior.

Here is my code:

```python
import torch
import pyro
import pyro.distributions as dist
from pyro.infer import SVI, Trace_ELBO
from pyro.infer.autoguide import AutoDiagonalNormal

height = torch.tensor(df['height'].values, dtype=torch.float)
weight = torch.tensor(df['weight'].values, dtype=torch.float)
xbar = weight.mean()

# Model: height ~ Normal(alpha + beta * (weight - xbar), sigma)
def model(weight, height):
    alpha = pyro.sample("alpha", dist.Normal(torch.tensor(178.), torch.tensor(20.)))
    beta = pyro.sample("beta", dist.Normal(torch.tensor(0.), torch.tensor(10.)))
    sigma = pyro.sample("sigma", dist.Uniform(0., 50.))

    mu = pyro.deterministic("mu", alpha + beta * (weight - xbar))
    with pyro.plate("plate", len(height)):
        pyro.sample("height", dist.Normal(mu, sigma), obs=height)

guide = AutoDiagonalNormal(model)
svi = SVI(model, guide, pyro.optim.Adam({"lr": 0.01}), loss=Trace_ELBO())

n_steps = 1000
for step in range(n_steps):
    svi.step(weight, height)
    if step % 100 == 0:
        print('.', end='')
```

After inference, I'd like to sample the parameters. When I take the quantiles of the guide, the values make sense:

```python
guide.quantiles([0.25, 0.5, 0.75])
```
```
{'alpha': [tensor(154.4493), tensor(154.6422), tensor(154.8350)],
 'beta': [tensor(0.8780), tensor(0.9028), tensor(0.9276)],
 'sigma': [tensor(4.8655), tensor(5.0128), tensor(5.1640)]}
```

But when I sample from `guide.get_posterior()` or inspect the base distribution, the `sigma` values do not match the quantiles and are negative:

```python
guide.get_posterior().sample((3,))
```
```
tensor([[154.7828,   0.8699,  -2.2352],
        [154.9283,   0.9046,  -2.2036],
        [154.7986,   0.9121,  -2.1130]])
```
```python
guide.get_posterior().base_dist.__dict__
```
```
{'loc': Parameter containing:
```

---

The `sigma` samples you are seeing are from the unconstrained space. To get samples from the constrained space `(0, 50)` you need to apply a transform. This can be done automatically using `Predictive`. Please see e.g. this post.