I am not sure whether your implementation is correct.
Taking one example from the documentation:

```python
import torch
import pyro
from pyro.infer import SVI, TraceGraph_ELBO
from torch.utils.data import DataLoader

optimizer = torch.optim.SGD
scheduler = pyro.optim.ExponentialLR({'optimizer': optimizer, 'optim_args': {'lr': 0.01}, 'gamma': 0.1})
svi = SVI(model, guide, scheduler, loss=TraceGraph_ELBO())
for i in range(epochs):
    for minibatch in DataLoader(dataset, batch_size):
        svi.step(minibatch)
    scheduler.step()
```
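Note that `torch.optim.SGD` is passed in as a class, not an instance: Pyro's optimizer wrappers construct the underlying torch optimizers internally, one per learned parameter.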
`pyro.optim.Adam` is an optimizer, not a scheduler. Pyro optimizers don't expose a `.step()` method of their own; the parameter updates happen inside the `SVI` object when you call `svi.step()`. Perhaps take a look at the tutorial for SVI.
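For reference, the usual pattern looks roughly like this (a minimal sketch; `model`, `guide`, `data`, and `num_steps` are placeholders you would supply):

```python
import pyro
from pyro.infer import SVI, Trace_ELBO

# pyro.optim.Adam is configured with a dict of torch.optim.Adam arguments
adam = pyro.optim.Adam({"lr": 0.001})
svi = SVI(model, guide, adam, loss=Trace_ELBO())

for step in range(num_steps):
    # svi.step() runs the model and guide, computes the ELBO loss,
    # and applies the parameter update internally
    loss = svi.step(data)
```

If you would rather drive a plain torch optimizer yourself, Pyro also exposes a differentiable loss, as in the following pattern: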
```python
# define optimizer and loss function
optimizer = torch.optim.Adam(my_parameters, lr=0.001, betas=(0.90, 0.999))
loss_fn = pyro.infer.Trace_ELBO().differentiable_loss
# compute loss
loss = loss_fn(model, guide, model_and_guide_args)
loss.backward()
# take a step and zero the parameter gradients
optimizer.step()
optimizer.zero_grad()
```
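One wrinkle with this pattern is where `my_parameters` comes from, since Pyro creates parameters lazily. A hedged sketch of one way to collect them, assuming the same `model`, `guide`, and `model_and_guide_args` placeholders as above: evaluate the loss once so the parameters get registered, then read them out of Pyro's global param store.

```python
# Run the loss once so the model/guide register their parameters.
loss_fn(model, guide, model_and_guide_args)

# named_parameters() yields (name, unconstrained tensor) pairs; the
# unconstrained tensors are the leaves that actually carry gradients.
my_parameters = [p for _, p in pyro.get_param_store().named_parameters()]
optimizer = torch.optim.Adam(my_parameters, lr=0.001, betas=(0.90, 0.999))
```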