Manually updating the learning rate of ADAM optimizer

LS,

In an earlier post I indicated that I experienced some confusing behaviour when using the Adam optimiser in conjunction with the ReduceLROnPlateau scheduler. In short, I did not see the learning rate change when it should have according to the change in training loss; I only saw it change once the change in training loss exceeded the scheduler threshold.

I would like to try manually changing the optimiser learning rate, initially based on just the change in training loss (much like ReduceLROnPlateau does), but as a next step I want to introduce a secondary heuristic (a second quantity that, in my project, measures 'correctness'). Given the circumstances, being able to change the learning rate manually, without relying on a scheduler to handle it, seems the best way to go.
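
Roughly the kind of rule I have in mind is sketched below. The factor, thresholds and the 'correctness' quantity are placeholders from my own project, not anything taken from Pyro:

# Sketch of the manual learning-rate rule I have in mind.
# `correctness` is a project-specific quantity; the factor, thresholds
# and minimum learning rate below are placeholder values.
def propose_lr(current_lr, prev_loss, loss, correctness,
               factor=0.5, loss_threshold=1e-4,
               correctness_threshold=0.9, min_lr=1e-6):
    """Return a (possibly reduced) learning rate for the next steps."""
    # Reduce the learning rate when the training loss has stopped
    # improving, much like ReduceLROnPlateau would.
    if prev_loss - loss < loss_threshold:
        current_lr = max(current_lr * factor, min_lr)
    # Secondary heuristic: once the 'correctness' measure is high
    # enough, reduce the learning rate further to fine-tune gently.
    if correctness > correctness_threshold:
        current_lr = max(current_lr * factor, min_lr)
    return current_lr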

My question is then:

What is the most appropriate way to manually set the learning rate in the Pyro version of the Adam optimiser?

Or should I just do something like

#############
from pyro import optim

# Configure the optimiser with a starting learning rate
adam = optim.Adam({"lr": initial_learning_rate})

# Train for a while
<train>

# Then manually overwrite the learning rate
for param_group in adam.param_groups:
    param_group['lr'] = custom_lr
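
What makes me hesitate is that, as far as I understand, Pyro's optim.Adam is a wrapper (PyroOptim) around torch.optim.Adam rather than a torch optimiser itself, so it may not expose param_groups directly. If I understand the wrapper correctly, it keeps one torch optimiser per parameter in its optim_objs dict, so perhaps something like the sketch below is needed instead. This is my assumption about the internals, not something I have confirmed; adam and custom_lr are as above.

#############
# Assumed alternative: reach the per-parameter torch optimisers that
# the Pyro wrapper keeps in `optim_objs` (my assumption about the
# internals) and update their learning rates directly.
for torch_optim in adam.optim_objs.values():
    for param_group in torch_optim.param_groups:
        param_group['lr'] = custom_lr

Is one of these the intended way, or is there a cleaner API for this?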