So I am using the tutorial here as the basis for my code: https://pyro.ai/examples/bayesian_regression_ii.html#Comparing-Posterior-Distributions

In the tutorial, ruggedness, log_gdp and is_cont_africa are 1-dimensional with 170 samples each, so these vectors have shape (170,). mean has shape (s,) while sigma is just a scalar.

I am now adapting this tutorial to my data, which has an x of shape (s, features) and y of shape (s, labels). Basically it is the same, just with one additional dimension. I have created a polynomial regression term which seems to work out (n = labels, m = features):

```
# A, B: quadratic and linear coefficients, shape (labels, features)
A = pyro.sample("a", dist.Normal(torch.zeros(n, m), 10. * torch.ones(n, m)))
B = pyro.sample("b", dist.Normal(torch.zeros(n, m), 10. * torch.ones(n, m)))
# C: per-label intercept, shape (labels,)
C = pyro.sample("c", dist.Normal(torch.zeros(n), 10. * torch.ones(n)))
# (n, m) @ (m, s) -> (n, s); add the intercept and transpose to (s, n)
mean = (torch.mm(A, (x * x).T) + torch.mm(B, x.T) + C.view(-1, 1)).T
sigma = pyro.sample("sigma", dist.Uniform(torch.zeros(n), 10. * torch.ones(n)))
```
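To double-check the broadcasting here, a quick standalone sanity check with dummy sizes (s = 8, m = 5, n = 3 are just placeholders for my real dimensions, and the zero tensors stand in for the sampled coefficients):

```python
import torch

# Placeholder sizes: s = samples, m = features, n = labels
s, m, n = 8, 5, 3

x = torch.randn(s, m)
A = torch.zeros(n, m)   # stands in for the sampled "a"
B = torch.zeros(n, m)   # stands in for the sampled "b"
C = torch.zeros(n)      # stands in for the sampled "c"

# Same computation as in the model:
# (n, m) @ (m, s) -> (n, s), then transpose to (s, n)
mean = (torch.mm(A, (x * x).T) + torch.mm(B, x.T) + C.view(-1, 1)).T

print(mean.shape)  # torch.Size([8, 3]), i.e. (s, labels)
```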

mean is of shape (s, labels), sigma is of shape (labels,), analogous to the tutorial.

However, this code causes problems:

```
with pyro.plate("data", x.shape[0]):
    pyro.sample("obs", dist.Normal(mean, sigma), obs=y)
```

In my case features = 180, labels = 177, s = 10540.

```
ValueError: Shape mismatch inside plate('data') at site obs dim -1, 10540 vs 177
Trace Shapes:
 Param Sites:
Sample Sites:
       a dist 177 180 |
        value 177 180 |
       b dist 177 180 |
        value 177 180 |
       c dist     177 |
        value     177 |
   sigma dist     177 |
        value     177 |
    data dist         |
        value   10540 |
```

What am I getting wrong here? I have placed some asserts to make sure the shapes are identical to the tutorial except for the extra dimension. Thanks so much!

It seems that *dist.Normal(mean, sigma)* is the culprit. In the tutorial this is *dist.Normal(shape (170,), scalar)*, while for me it is *dist.Normal(shape (s, labels), shape (labels,))*. I guess it has to look different somehow?