# Binomial Regression, pyro.get_param_store returns point estimates

Hi! I am new to Pyro and am trying to build a binomial regression model by modifying the introduction tutorial.
The model tries to fit `n_trun ~ Binomial(total_count=n_read, probs=y)`,
where `y = Sigmoid(Linear(fr_gc, fr_u))`.

I used auto_guide to generate the guide function.

However, when I use `pyro.get_param_store().items()` to get the parameters, it does not contain any standard deviations; it returns point estimates

``````
a 6.6135216
b_gc 0.2669241
b_u 0.06167726
``````

instead of the loc/scale pairs shown in the tutorial:

``````
AutoNormal.locs.a 2.213837
AutoNormal.scales.a 0.007959719
AutoNormal.locs.bA -0.25122392
AutoNormal.scales.bA 0.018179549
AutoNormal.locs.bR -0.02946356
AutoNormal.scales.bR 0.0051593184
AutoNormal.locs.bAR 0.039513554
AutoNormal.scales.bAR 0.009952072
AutoNormal.locs.sigma -4.4010673
AutoNormal.scales.sigma 0.05531578
``````

Also, using `pyro.plate` to sample returns nothing.

I have been struggling with this for days and cannot find a solution. Please help. Thanks!

Here is my code.

``````python
def model(fr_gc, fr_u, n_read, n_trun=None):
    a = pyro.param("a", dist.Normal(0., 10.))  # name, value (when initialized) to return a tensor, or a tensor
    b_gc = pyro.param("b_gc", dist.Normal(0., 1.))
    b_u = pyro.param("b_u", dist.Normal(0., 1.))

    prob = torch.special.expit(a + b_gc * fr_gc + b_u * fr_u)
    # sigma = pyro.sample("sigma", dist.Uniform(0., 10.))

    with pyro.plate("data", len(fr_gc)):  # equivalent to a for loop
        # generates observed data from latent variables using primitive functions
        return pyro.sample("obs", dist.Binomial(total_count=n_read, probs=prob), obs=n_trun)
        # return pyro.sample("obs", dist.Normal(prob, sigma), obs=n_trun)

%%time
pyro.clear_param_store()

# These should be reset each training loop.
auto_guide = pyro.infer.autoguide.AutoNormal(model)  # pre-defined distribution over each hidden parameter

elbo = pyro.infer.Trace_ELBO()
svi = pyro.infer.SVI(model, auto_guide, adam, elbo)  # we use Adam (SGD) to optimize the ELBO (SVI = stochastic variational inference)

losses = []
for step in range(1000 if not smoke_test else 2):  # Consider running for more steps.
    loss = svi.step(fr_gc, fr_u, n_read, n_trun)
    losses.append(loss)
    if step % 100 == 0:
        logging.info("Elbo loss: {}".format(loss))
``````

Instead of writing `a = pyro.param("a", dist.Normal(0., 10.))`, you should use the `pyro.sample` function:

``````python
a = pyro.sample("a", dist.Normal(0., 10.))
``````

and do the same for the other parameters. `pyro.param` registers a single learnable tensor in the param store, which is why you only see point estimates. `pyro.sample` declares a latent random variable, and that is what lets `AutoNormal` fit a `loc` and a `scale` for each one.