Hello,
I would like to train models using Bayesian inference (for example with NUTS), in which the coefficients act at different levels. Here is an example:
level 1 | level 2 | a0  | a1  | y
A       | A1      | 1.0 | 1   | 10
A       | A1      | 9.2 | 0.9 | 13.2
A       | A2      | 1.3 | 1.1 | 13.4
A       | A2      | 1.4 | 1   | 12
B       | B1      | 1.0 | 2   | 9
[...]
I need a way to train linear models to predict y. The trick is that the coefficients are fitted at different levels: I need one coefficient on a0 for each distinct level 1 group, and one coefficient on a1 for each distinct level 2 group.
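Written out (just my own notation to sketch what I mean: one coefficient $\beta$ per level 1 group, one $\gamma$ per level 2 group, and a shared intercept $b$):

$$ y_i = \beta_{\mathrm{level1}(i)} \cdot a0_i + \gamma_{\mathrm{level2}(i)} \cdot a1_i + b + \varepsilon_i $$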
Any idea how I can write such a model with Pyro?
I’m not quite sure what ‘level’ means here, but it sounds a bit like a mixed-effects model? Maybe something like:
import torch
from torch.distributions import constraints

import pyro
import pyro.distributions as dist

# group indices and inputs taken from the table above
level1 = torch.LongTensor([0, 0, 0, 0, 1])
level2 = torch.LongTensor([0, 0, 1, 1, 2])
a0 = torch.tensor([1., 9.2, 1.3, 1.4, 1.])
a1 = torch.tensor([1., 0.9, 1.1, 1., 2.])
y = torch.tensor([10., 13.2, 13.4, 12., 9.])

def model():
    weight0 = pyro.param("weight0", torch.zeros(2))  # one coefficient per distinct level 1 group
    weight1 = pyro.param("weight1", torch.zeros(3))  # one coefficient per distinct level 2 group
    bias = pyro.param("bias", torch.tensor(0.))
    noise = pyro.param("noise", torch.tensor(1.),
                       constraint=constraints.positive)
    with pyro.plate("data", len(y)):
        prediction = a0 * weight0[level1] + a1 * weight1[level2] + bias
        pyro.sample("y", dist.Normal(prediction, noise), obs=y)
That seems to be exactly what I needed! It remains to be seen whether having weight0 with a dimension of around 10K and weight1 around 90K makes it impossible to train.
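For reference, the kind of training loop I plan to try (just a sketch, assuming the `model` above; since it only uses pyro.params, an empty guide turns SVI into straight maximum-likelihood fitting, and the learning rate and step count are arbitrary):

from pyro.infer import SVI, Trace_ELBO
from pyro.optim import Adam

def guide():
    pass  # no latent sample sites in the model above, so an empty guide is enough

svi = SVI(model, guide, Adam({"lr": 0.01}), loss=Trace_ELBO())
for step in range(2000):  # step count is arbitrary
    loss = svi.step()
    if step % 500 == 0:
        print(step, loss)

print(pyro.param("weight0").shape, pyro.param("weight1").shape)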