Syntax for clip_args with pyro.optim.LambdaLR

I would like to use pyro.optim.LambdaLR to set the learning rate schedule of torch.optim.SGD and also clip gradients when their norms get too large. However, I have not managed to specify the clip_args successfully. The following call to SVI works:

Working code
import torch
import pyro
from pyro.infer import SVI, Trace_ELBO

optimizer = torch.optim.SGD
lambda1 = lambda epoch: (self._offset + epoch) ** (-self._fr)
self._optim = pyro.optim.LambdaLR({'optimizer': optimizer, 'optim_args': {'lr': self._offset ** (-self._fr)}, 'lr_lambda': lambda1})
self._elbo = Trace_ELBO()
self._svi = SVI(self.model, self.guide_rand_ss, self._optim, loss=self._elbo)

But when I do

Not working code

self._optim = pyro.optim.LambdaLR({'optimizer': optimizer, 'optim_args': {'lr': self._offset ** (-self._fr)}, 'lr_lambda': lambda1, 'clip_args': {'clip_norm': 10.0}})

I get the following TypeError when running self._svi.step():

Error message
TypeError: __init__() got an unexpected keyword argument 'clip_args'

If you can show me the right syntax that would be awesome, thanks in advance!

Hi, looks like you need to pass clip_args as a separate keyword argument to pyro.optim.LambdaLR:

pyro.optim.LambdaLR({'optimizer': optimizer, 'optim_args': {...}, 'lr_lambda': lambda1},
                    clip_args={'clip_norm': 10.0})
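For reference, here is a minimal self-contained sketch of the whole pattern. The model, guide, data, and the constants offset and fr are illustrative stand-ins rather than anything from the original code, but the LambdaLR call has the same shape:

import torch
import pyro
import pyro.distributions as dist
from pyro.infer import SVI, Trace_ELBO

# Toy model/guide for illustration only.
def model(data):
    loc = pyro.param("loc", torch.tensor(0.0))
    with pyro.plate("data", len(data)):
        pyro.sample("obs", dist.Normal(loc, 1.0), obs=data)

def guide(data):
    pass

offset, fr = 10.0, 0.6  # stand-ins for self._offset and self._fr

# Scheduler config goes in the first (dict) argument;
# clip_args is a separate keyword argument on the wrapper.
scheduler = pyro.optim.LambdaLR(
    {'optimizer': torch.optim.SGD,
     'optim_args': {'lr': offset ** (-fr)},
     'lr_lambda': lambda epoch: (offset + epoch) ** (-fr)},
    clip_args={'clip_norm': 10.0})

svi = SVI(model, guide, scheduler, loss=Trace_ELBO())

data = torch.randn(100)
for epoch in range(5):
    svi.step(data)
    scheduler.step()  # advance the LR schedule once per epoch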

Thanks, that did the job!