Thank you for the clarification @neerajprad.

The whole model was slightly more complicated, and that was the reason for using *irange*:

### Data:

```
def genData():
    x0 = np.random.randn(10) * 1.0 + 15.0
    x1 = np.random.randn(10) * 1.0 + 20.0
    x2 = np.random.randn(10) * 1.0 + 10.0
    x3 = np.random.randn(10) * 1.0 + 0.0
    x4 = np.random.randn(10) * 1.0 + 20.0
    x5 = np.random.randn(10) * 1.0 + 0.0
    return torch.tensor([[x0[i], x1[i], x2[i], x3[i], x4[i], x5[i]] for i in range(10)])
```
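As an aside, the row-by-row list comprehension can be written as a single column stack (a sketch with stand-in columns; equivalent up to dtype):

```python
import numpy as np
import torch

# stand-in columns with the same means as x0..x5 above
cols = [np.random.randn(10) * 1.0 + m for m in (15.0, 20.0, 10.0, 0.0, 20.0, 0.0)]

# stack the six columns into a (10, 6) matrix in one call
data = torch.tensor(np.stack(cols, axis=1))
assert data.shape == (10, 6)  # one row per sample, one column per feature
```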

### First model attempt using irange:

```
def model(data):
    a_mu = pyro.sample("a_mu", Normal(loc=0.0, scale=1.0))
    a_sigma = pyro.sample("a_sigma", Normal(loc=0.0, scale=1.0))
    b_mu = pyro.sample("b_mu", Normal(loc=0.0, scale=1.0))
    b_sigma = pyro.sample("b_sigma", Normal(loc=0.0, scale=1.0))
    for i in pyro.irange("range", data.shape[0]):
        x1_x0 = data[i][1] - data[i][0]
        x0 = data[i][0]
        x1 = data[i][1]
        x2 = data[i][2]
        x3 = data[i][3]
        x4 = data[i][4]
        x5 = data[i][5]
        pyro.sample(f"data_{i}",
                    Normal(loc=(x4 - x0) * a_mu + (x5 - x0) * a_sigma
                               + (x2 - x0) * b_mu + (x3 - x0) * b_sigma,
                           scale=1.0),
                    obs=x1_x0)
```

### Model expressed as matrix (after re-arranging the equation):

```
def model_V2(data):
    a_mu = pyro.sample("a_mu", Normal(loc=0.0, scale=10.0))
    a_sigma = pyro.sample("a_sigma", Normal(loc=0.0, scale=10.0))
    b_mu = pyro.sample("b_mu", Normal(loc=0.0, scale=10.0))
    b_sigma = pyro.sample("b_sigma", Normal(loc=0.0, scale=10.0))
    # torch.stack keeps the sampled values in the graph;
    # torch.tensor([...]) would not accept tensor elements here
    w = torch.stack([-a_mu - a_sigma - b_mu - b_sigma + 1,
                     torch.tensor(-1.0),
                     b_mu, b_sigma, a_mu, a_sigma])
    y = torch.matmul(data, w)
    pyro.sample("data", Normal(loc=y, scale=torch.ones(10)), obs=torch.zeros(10))
```

I believe model and model_V2 are equivalent (I may have made a mistake re-arranging the equation, but conceptually they should be equal).
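The equivalence can be checked numerically with plain tensors (a sketch; the parameter values below are arbitrary stand-ins for the sampled ones):

```python
import torch

torch.manual_seed(0)
data = torch.randn(10, 6)
# arbitrary stand-ins for the sampled parameter values
a_mu, a_sigma, b_mu, b_sigma = 0.3, -0.7, 1.2, 0.5

# per-row loc and observation from the loop in `model`
loc = ((data[:, 4] - data[:, 0]) * a_mu + (data[:, 5] - data[:, 0]) * a_sigma
       + (data[:, 2] - data[:, 0]) * b_mu + (data[:, 3] - data[:, 0]) * b_sigma)
obs = data[:, 1] - data[:, 0]

# matmul form from `model_V2`
w = torch.tensor([-a_mu - a_sigma - b_mu - b_sigma + 1, -1.0,
                  b_mu, b_sigma, a_mu, a_sigma])
y = data @ w

# Normal(loc, 1).log_prob(obs) depends only on (obs - loc)**2, and
# y = loc - obs, so observing 0 under Normal(y, 1) gives the same density
assert torch.allclose(y, loc - obs, atol=1e-6)
```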

Is my assumption correct?

This version seems much slower, though.

Going back to your comment (just for my understanding), I guess I can rewrite your version using *expand_by* (which to me seems more readable):

```
def model(data):
    mu = pyro.sample("mu", Normal(loc=torch.tensor(0.0), scale=torch.tensor(1.0)))
    pyro.sample("data", Normal(loc=mu, scale=torch.tensor(1.0)).expand_by([10]), obs=data)
```
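For intuition, the same broadcasting can be seen with plain `torch.distributions`: on a scalar distribution, Pyro's `expand_by([10])` behaves like `.expand(torch.Size([10]))` (a sketch, without pyro):

```python
import torch
from torch.distributions import Normal

mu = torch.tensor(0.0)
# expanding the scalar Normal gives a batch of 10 i.i.d. distributions
d = Normal(loc=mu, scale=torch.tensor(1.0)).expand(torch.Size([10]))
assert d.batch_shape == torch.Size([10])

# log_prob of a 10-element observation yields one value per data point
obs = torch.randn(10)
assert d.log_prob(obs).shape == torch.Size([10])
```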

Again: am I right?

Thank you for your time.