Problem with param store

Hi! I was trying to extract gradients for plotting, like in the GMM tutorial. In that tutorial, they register the hooks right before training (in particular, before running svi.step()):

from collections import defaultdict

import pyro

# Register hooks to monitor gradient norms.
gradient_norms = defaultdict(list)
for name, value in pyro.get_param_store().named_parameters():
    value.register_hook(lambda g, name=name: gradient_norms[name].append(g.norm().item()))
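
They then plot these norms after training, roughly like this (a matplotlib sketch, not the exact tutorial code):

from matplotlib import pyplot as plt

# `gradient_norms` is the dict populated by the hooks above.
plt.figure(figsize=(10, 4))
for name, norms in gradient_norms.items():
    plt.plot(norms, label=name)
plt.xlabel("SVI step")
plt.ylabel("gradient norm")
plt.yscale("log")
plt.legend(loc="best")
plt.title("Gradient norms during SVI")
plt.show()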

So I wrote an SSVAE following the same procedure as described in the VAE and SSVAE tutorials, and put these lines before training, yet pyro.get_param_store() gives me an empty dictionary.

So my question is: what is going on, and at what point does Pyro actually store the parameters? My understanding is that the parameters only get registered with Pyro when pyro.module("ss_vae", self) is executed, and that doesn't happen until svi.step() runs, so of course the param store is empty before that. (But that can't be the whole story, because gradient_norms is still an empty dictionary even when I check it after running svi.step().)
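
To illustrate what I mean, here's a toy sketch (a hypothetical little module, not my SSVAE) showing that the param store only gets filled once pyro.module actually runs:

import torch
import torch.nn as nn
import pyro

class Toy(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(2, 2)

    def model(self, x):
        # Parameters are copied into Pyro's param store only when this line executes.
        pyro.module("toy", self)
        return self.fc(x)

toy = Toy()
print(list(pyro.get_param_store().keys()))  # [] -- nothing registered yet
toy.model(torch.zeros(1, 2))                # executing the model calls pyro.module
print(list(pyro.get_param_store().keys()))  # now contains the fc weight and bias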

Hi, please provide enough code to reproduce your problem.

Hi, thanks for replying.

The code is just the SSVAE example from the Pyro GitHub repo; the only things I changed are:

  1. register the modules during SSVAE initialization:
...
    def setup_networks(self):
        ...
        self.encoder_y = ...
        self.encoder_z = ...
        self.decoder = ...
        pyro.module("ss_vae", self)
        ...
...
  2. register the hooks before training:
def run_inference_for_epoch(data_loaders, losses, periodic_interval_batches):
    '''
    ...
    '''
    gradient_norms = defaultdict(list)
    for name, value in pyro.get_param_store().named_parameters():
        value.register_hook(lambda g, name=name: gradient_norms[name].append(g.norm().item()))
    ...

And nothing is added to gradient_norms after running each svi.step().
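
For reference, this is the behaviour I'd expect from register_hook on a plain tensor (a toy sketch unrelated to the SSVAE): the hook fires during backward() and the list gets populated.

import torch
from collections import defaultdict

gradient_norms = defaultdict(list)

# Register a hook on a leaf tensor that requires grad.
w = torch.randn(3, requires_grad=True)
w.register_hook(lambda g: gradient_norms["w"].append(g.norm().item()))

loss = (w ** 2).sum()
loss.backward()
print(gradient_norms)  # the hook fired: {'w': [<some norm>]}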