Hi Folks,

I am trying to use Pyro's `Predictive` class with `parallel=True`. For some reason it does not work in parallel, but it works just fine without it.

Here is my dummy setup.

```python
import pyro
import torch
import pyro.distributions as dist

# Model and guide
def model(X, obs=None):
    weight = pyro.sample('weight', dist.Normal(0., 1.).expand([2]).to_event(1))
    y_mean = torch.matmul(X, weight.T)
    sigma = pyro.sample('sigma', dist.Uniform(0., 1.))
    with pyro.plate('data', X.shape[0]):
        y = pyro.sample('y', dist.Normal(y_mean, sigma), obs=obs)

guide = pyro.infer.autoguide.AutoDiagonalNormal(model)

# Sample training data
X = torch.ones((1000, 2))
X[:, 1] = pyro.sample('X', dist.Normal(0., 1.).expand([1000])).detach()
y = X[:, 1] * 2 + X[:, 0] * 3

pyro.clear_param_store()
svi = pyro.infer.SVI(model, guide, pyro.optim.Adam({'lr': 1e-2}), loss=pyro.infer.Trace_ELBO())
losses = []
num_steps = 2500
for t in range(num_steps):
    loss = svi.step(X, y)
    losses.append(loss)
    # Print loss periodically
    if t % 100 == 0:
        print('step: {} loss: {:.3f}'.format(t, loss))

# With parallel=False (works)
predictive = pyro.infer.Predictive(model, guide=guide, num_samples=10000, parallel=False)
samples = predictive(X)

# With parallel=True (fails)
predictive = pyro.infer.Predictive(model, guide=guide, num_samples=10000, parallel=True)
samples = predictive(X)
```

I have tried the following to fix this:

- Putting everything in one single plate
- Creating a new plate that wraps the code above inside the model

But nothing worked. Is there a bug in my current implementation?
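In case it helps narrow things down, here is a plain-tensor sketch of what I suspect is going on. My assumption is that with `parallel=True` the sampled `weight` picks up an extra leading sample dimension (the exact shapes below are guesses), in which case the `matmul`/`.T` combination no longer broadcasts, while an elementwise multiply-and-sum does:

```python
import torch

X = torch.ones((5, 2))
w_serial = torch.randn(2)          # weight shape in the serial case
w_batched = torch.randn(10, 1, 2)  # assumed shape with an extra sample dimension

# Serial case: both formulations give the same result shape
print(torch.matmul(X, w_serial).shape)  # torch.Size([5])
print((X * w_serial).sum(-1).shape)     # torch.Size([5])

# Batched case: multiply-and-sum broadcasts over the sample dimension
print((X * w_batched).sum(-1).shape)    # torch.Size([10, 5])
```

If that assumption is right, rewriting `y_mean` in a broadcasting-friendly way might be the fix, but I would like to confirm whether my model is at fault or this is a bug.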