I’m trying to run multiple guides, each with a different (random) initialization, using a for loop. But the initial value seems to be the same each time the guide is called. How do I fix this?
def guide():
    var_mu_1 = pyro.param('var_mu_1', torch.randn(1))
    var_mu_sig_1 = pyro.param('var_mu_sig_1', torch.tensor(1.),
                              constraint=constraints.positive)
    pyro.sample('mu_1', dist.Normal(loc=var_mu_1, scale=var_mu_sig_1))
I want torch.randn to produce a different value each time I call guide in SVI.