Hello

I have my own model and data for inferring the parameters of a multinomial distribution, but it doesn't work.

So I tried to redo the SVI Part I tutorial problem with a multinomial instead, to see if that fares better, but it doesn't work either, so I'm trying to figure out what's wrong.

```
import torch
import pyro
import pyro.distributions as pdist
from pyro.distributions import constraints
from pyro.infer import SVI, Trace_ELBO

# generate 2000 multinomial observations (10 trials each) with true probs (0.6, 0.4)
data = []
for i in range(2000):
    data.append(pyro.sample("obs_{}".format(i), pdist.Multinomial(10, probs=torch.Tensor([0.6, 0.4]))))

def model(data):
    alpha0 = 10
    alpha1 = 10
    f = pyro.sample("latent", pdist.Beta(alpha0, alpha1))
    ok = torch.Tensor([f, 1 - f])
    for i in range(len(data)):
        pyro.sample("obs_{}".format(i), pdist.Multinomial(10, probs=ok), obs=data[i])

def guide(data):
    qalpha0 = pyro.param("qalpha0", torch.Tensor([15]), constraint=constraints.positive)
    qalpha1 = pyro.param("qalpha1", torch.Tensor([15]), constraint=constraints.positive)
    pyro.sample("latent", pdist.Beta(qalpha0, qalpha1))

adam_params = {"lr": 0.001, "betas": (0.9, 0.999)}
optimizer = pyro.optim.Adam(adam_params)
svi = SVI(model, guide, optimizer, loss=Trace_ELBO())

n_steps = 4000
for step in range(n_steps):
    svi.step(data)
    if step % 100 == 0:
        print(step)

print(pyro.param("qalpha0") / (pyro.param("qalpha0") + pyro.param("qalpha1")))
```

Basically I'm doing the same thing as in the tutorial, except that the shape of the data and the distribution change (Multinomial instead of Bernoulli). The problem is that the inferred f given by the last line is roughly 0.5 instead of 0.6. I've already considered the size of the data, but with len(data) = 2000 the Bernoulli version from the SVI tutorial recovers 0.6 perfectly. Also, the computation time is pretty high.
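As a quick sanity check (not part of the original post), data generated this way really does encode f ≈ 0.6: the empirical frequency of the first category can be recovered directly from the counts, so the data itself is not the issue. A minimal sketch using plain `torch.distributions`:

```python
import torch
from torch.distributions import Multinomial

# draw 2000 multinomial observations of 10 trials each,
# with true category probabilities (0.6, 0.4)
counts = Multinomial(10, probs=torch.tensor([0.6, 0.4])).sample((2000,))

# empirical frequency of the first category over all 2000 * 10 trials
empirical_f = counts[:, 0].sum() / counts.sum()
print(empirical_f)  # close to 0.6
```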

What's wrong?