AttributeError: 'PyroOptim' object has no attribute 'step'

Hello,
For my Pyro model, I’ve defined my scheduler as follows:

scheduler = pyro.optim.Adam({'lr': 0.0055})

When I do:

scheduler.step()

The following error appears:

  File "mycode.py", line 347, in train_loop
    scheduler.step() # update the learning rate
AttributeError: 'PyroOptim' object has no attribute 'step'

How can I fix this? I wasn’t expecting this kind of error.

Thank you,

Hello,
I have a follow-up question on this topic.

optimizer_4 = pyro.optim.SGD
scheduler_4 = pyro.optim.ClippedAdam({'optimizer': optimizer_4, 'lr': 0.0055})
scheduler_4.step()

The line scheduler_4.step() still generates an error:


  File "<ipython-input-88-6ef9ffff63c9>", line 1, in <module>
    scheduler_4.step()

AttributeError: 'PyroOptim' object has no attribute 'step'

How can I solve this issue? Thank you,

My guess would be to use the scheduler_4 object as your optimizer directly in your SVI.
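For illustration, here is a minimal end-to-end sketch of that suggestion (the toy model, guide, and data are placeholders of mine, not from your code):

import torch
import pyro
import pyro.distributions as dist
from pyro.infer import SVI, Trace_ELBO

data = torch.randn(10) + 3.0  # toy observations

def model(data):
    loc = pyro.param("loc", torch.tensor(0.0))
    with pyro.plate("data", len(data)):
        pyro.sample("obs", dist.Normal(loc, 1.0), obs=data)

def guide(data):
    pass  # no latent sample sites in this toy model

optim = pyro.optim.ClippedAdam({'lr': 0.0055})  # note: no 'optimizer' key
svi = SVI(model, guide, optim, loss=Trace_ELBO())
for _ in range(100):
    svi.step(data)  # the gradient step happens in here; never optim.step()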

Hello,

Yes, in my code I have directly incorporated the scheduler_4 object into my SVI:

svi_delta = SVI(model, guide_delta, scheduler_4, loss=TraceEnum_ELBO(max_plate_nesting=0))

However, this error keeps appearing when I execute my train_loop.

I am not sure your implementation is correct. Here is an example from the documentation:

optimizer = torch.optim.SGD
scheduler = pyro.optim.ExponentialLR({'optimizer': optimizer, 'optim_args': {'lr': 0.01}, 'gamma': 0.1})
svi = SVI(model, guide, scheduler, loss=TraceGraph_ELBO())
for i in range(epochs):
    for minibatch in DataLoader(dataset, batch_size):
        svi.step(minibatch)
    scheduler.step()

Maybe you can try the above.
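If it helps, here is a self-contained version of that pattern that runs as-is (the toy model, guide, and dataset are placeholders of mine, not part of the docs example):

import torch
import pyro
import pyro.distributions as dist
from pyro.infer import SVI, TraceGraph_ELBO
from torch.utils.data import DataLoader, TensorDataset

dataset = TensorDataset(torch.randn(64) + 3.0)
epochs, batch_size = 5, 16

def model(batch):
    loc = pyro.param("loc", torch.tensor(0.0))
    with pyro.plate("data", len(batch)):
        pyro.sample("obs", dist.Normal(loc, 1.0), obs=batch)

def guide(batch):
    pass

optimizer = torch.optim.SGD
scheduler = pyro.optim.ExponentialLR({'optimizer': optimizer,
                                      'optim_args': {'lr': 0.01},
                                      'gamma': 0.1})
svi = SVI(model, guide, scheduler, loss=TraceGraph_ELBO())
for epoch in range(epochs):
    for (minibatch,) in DataLoader(dataset, batch_size):
        svi.step(minibatch)
    scheduler.step()  # works: this is a PyroLRScheduler, which does have .step()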

When I use SGD and ExponentialLR in my code, .step() does work.
However, with pyro.optim.Adam, .step() doesn’t seem to work…

pyro.optim.Adam is an optimizer, not a scheduler. Pyro optimizers don’t have a .step() method; the parameter update happens inside the svi object when you call svi.step(). Perhaps take a look at the SVI tutorial.
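A quick way to see the difference (runnable as-is):

import torch
import pyro

adam = pyro.optim.Adam({'lr': 0.0055})
print(hasattr(adam, 'step'))   # False: a plain PyroOptim is driven only by svi.step()

sched = pyro.optim.ExponentialLR({'optimizer': torch.optim.Adam,
                                  'optim_args': {'lr': 0.0055},
                                  'gamma': 0.9})
print(hasattr(sched, 'step'))  # True: schedulers expose .step() for per-epoch decay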


I’m a bit confused about this answer. This code:

# define optimizer and loss function
optimizer = torch.optim.Adam(my_parameters, {"lr": 0.001, "betas": (0.90, 0.999)})
loss_fn = pyro.infer.Trace_ELBO().differentiable_loss
# compute loss
loss = loss_fn(model, guide, model_and_guide_args)
loss.backward()
# take a step and zero the parameter gradients
optimizer.step()
optimizer.zero_grad()

is from the Pyro custom SVI objectives tutorial: Customizing SVI objectives and training loops — Pyro Tutorials 1.8.4 documentation

but throws this error.

@jspence could you paste some complete code to reproduce the error? Note that torch.optim.Adam has a different interface from pyro.optim.Adam.
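For reference, the interface difference looks roughly like this (params below is just a stand-in list of tensors):

import torch
import pyro

params = [torch.zeros(2, requires_grad=True)]

# torch.optim.Adam is constructed with the parameters plus keyword
# arguments, and exposes .step() / .zero_grad() for manual loops:
torch_adam = torch.optim.Adam(params, lr=0.001, betas=(0.90, 0.999))

# pyro.optim.Adam is constructed with a dict of optim args only; it holds
# no parameters up front and is meant to be handed to SVI, which steps it:
pyro_adam = pyro.optim.Adam({'lr': 0.001, 'betas': (0.90, 0.999)})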

@fritzo
Ah! Thank you so much. That’s exactly what it was. I had imported Adam from pyro.optim instead of torch.optim.