Dynamic Learning Rate

Hello!
I am currently trying to implement dynamic learning rates in Pyro, but I am not sure how to do it. I have been using the Pyro wrappers for the PyTorch schedulers, like scheduler = optim.StepLR(optim_params). However, it seems the scheduler should only update once per epoch, and I am not sure how to do that here. I read that there used to be a method to set the epoch manually, but it is deprecated. Any suggestions on how to use a scheduler with SVI?
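For concreteness, this is roughly what my setup looks like at the moment (the optimizer and the hyperparameters below are just placeholders):

```python
import torch
from pyro import optim

# Rough sketch of my current setup; all values are placeholders.
optim_params = {
    "optimizer": torch.optim.Adam,  # underlying PyTorch optimizer class
    "optim_args": {"lr": 0.01},     # arguments passed to that optimizer
    "step_size": 10,                # StepLR arguments
    "gamma": 0.5,
}
scheduler = optim.StepLR(optim_params)
```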
Thank you!

There is an example here, in the documentation for Pyro's LR scheduler wrappers: you pass the wrapped scheduler to SVI in place of a plain optimizer and call scheduler.step() yourself once per epoch in the training loop.
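A minimal, self-contained sketch of that pattern is below; the model, guide, data, and hyperparameters are only placeholders:

```python
import torch
import pyro
import pyro.distributions as dist
from pyro.distributions import constraints
from pyro.infer import SVI, Trace_ELBO
from pyro.optim import StepLR

pyro.clear_param_store()
data = torch.randn(100)  # toy data for the sketch

def model(batch):
    loc = pyro.sample("loc", dist.Normal(0., 10.))
    with pyro.plate("data", len(batch)):
        pyro.sample("obs", dist.Normal(loc, 1.), obs=batch)

def guide(batch):
    loc_q = pyro.param("loc_q", torch.tensor(0.))
    scale_q = pyro.param("scale_q", torch.tensor(1.), constraint=constraints.positive)
    pyro.sample("loc", dist.Normal(loc_q, scale_q))

# The wrapper takes the underlying optimizer class, its arguments,
# and the scheduler's own arguments in a single dict.
scheduler = StepLR({
    "optimizer": torch.optim.Adam,
    "optim_args": {"lr": 0.01},
    "step_size": 10,  # counted in scheduler.step() calls, i.e. epochs below
    "gamma": 0.5,
})

# Pass the scheduler to SVI where you would normally pass the optimizer.
svi = SVI(model, guide, scheduler, loss=Trace_ELBO())

batches = list(data.split(20))
for epoch in range(50):
    for batch in batches:
        svi.step(batch)   # one gradient update per minibatch
    scheduler.step()      # advance the learning-rate schedule once per epoch
```

Note that svi.step() only performs the gradient update; the learning rate changes only when you call scheduler.step(), so whether the schedule advances per epoch or per minibatch is entirely up to where you put that call.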

Thank you!
I understand how to do it now.