- Setting `has_rsample=True` returns the following error:

```
TypeError: factor() got an unexpected keyword argument 'has_rsample'
```
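For context, the statement that triggers this looks roughly like the following (a hypothetical reconstruction; `log_factor` stands in for the actual log-density term in my model):

```python
# hypothetical reconstruction of the failing statement;
# `log_factor` is a placeholder for the real log-density tensor
pyro.factor("my_factor", log_factor, has_rsample=True)
```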
- For the custom objective function, I have the following setup:
```python
import torch
import pyro
import pyro.distributions as dist
from torch.distributions import constraints

def model(data):
    # prior over the latent x
    x_loc = torch.zeros(N * 3)
    x_scale = 2 * torch.ones(N * 3)
    x = pyro.sample("x", dist.Normal(x_loc, x_scale).to_event(1))
    ....

def guide(data):
    # variational posterior over x with learnable location and scale
    x_loc = pyro.param("x_loc", torch.rand(N * 3))
    x_scale = pyro.param("x_scale", 0.5 * torch.ones(N * 3), constraint=constraints.positive)
    x = pyro.sample("x", dist.Normal(x_loc, x_scale).to_event(1))
```
```python
# differentiable ELBO loss that can be combined with extra terms
elbo_loss_fn = pyro.infer.Trace_ELBO().differentiable_loss

def loss_fn(data):
    elbo_loss = elbo_loss_fn(model, guide, data)
    # add an L2 penalty on the guide's location parameter
    x_loc = pyro.param("x_loc")
    reg_loss = L2_regularizer(x_loc)
    return elbo_loss + reg_loss
```
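For completeness, `L2_regularizer` is just a small helper along these lines (the `weight` argument is illustrative):

```python
def L2_regularizer(params, weight=1.0):
    # squared-L2 penalty on the parameter tensor; `weight` is an illustrative knob
    return weight * params.pow(2).sum()
```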
```python
# optimizer: note torch.optim.Adam takes keyword arguments,
# not an options dict like pyro.optim.Adam
optimizer = torch.optim.Adam(my_parameters, lr=0.001, betas=(0.90, 0.999))

for i in range(num_steps):
    loss = loss_fn(data=data_obs)
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
```
What I am confused about is how to obtain the `my_parameters` passed to the optimizer defined above. In my case, the parameters are defined inside the guide: `x_loc` and `x_scale`.
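Is the pattern from the "Customizing SVI objectives" tutorial the right way to do this, i.e. running the loss once under a parameter-only trace so the `pyro.param` statements register, and then handing the unconstrained tensors to the optimizer? A minimal sketch of what I mean, assuming that pattern applies here:

```python
from pyro import poutine

# run the loss once under a param-only trace so the pyro.param sites
# in the guide ("x_loc", "x_scale") get registered and captured
with poutine.trace(param_only=True) as param_capture:
    loss_fn(data=data_obs)

# collect the underlying unconstrained tensors to optimize
my_parameters = [
    site["value"].unconstrained()
    for site in param_capture.trace.nodes.values()
]

optimizer = torch.optim.Adam(my_parameters, lr=0.001, betas=(0.90, 0.999))
```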