Turn off optimization for certain neural network layers

Hey all!

Suppose I have a neural-network-based guide for doing inference in a Pyro generative program. The thing is, some layers of my guide are not supposed to be optimized. My current approach is to simply not register those layers with Pyro as a `pyro.module`. Say my guide has two neural modules, `self.g1` and `self.g2`: if at the beginning of my `forward()` I only register `g1`, i.e. `pyro.module("g1", self.g1)`, does that mean the optimizer won't update the parameters of `self.g2`? Thanks in advance!