BNN with option to "freeze" weights

So I have this architecture:

```python
import torch.nn as nn
import pyro.distributions as dist
from pyro.nn import PyroModule, PyroSample

HIDDEN_DIM = 16  # hidden width (actual value not shown in this post)

class BayesianRegression(PyroModule):
    def __init__(self):
        super().__init__()
        # First layer: 1 -> HIDDEN_DIM, with Normal priors on weight and bias
        self.linear1 = PyroModule[nn.Linear](1, HIDDEN_DIM)
        self.linear1.weight = PyroSample(dist.Normal(0., 1.).expand([HIDDEN_DIM, 1]).to_event(2))
        self.linear1.bias = PyroSample(dist.Normal(0., 10.).expand([HIDDEN_DIM]).to_event(1))
        # Second layer: HIDDEN_DIM -> 1, mapping back to a scalar output
        self.linear2 = PyroModule[nn.Linear](HIDDEN_DIM, 1)
        self.linear2.weight = PyroSample(dist.Normal(0., 1.).expand([1, HIDDEN_DIM]).to_event(2))
        self.linear2.bias = PyroSample(dist.Normal(0., 10.).expand([1]).to_event(1))
```

Is there a way to freeze the weights of the linear layers after one sampling (like a dropout layer at test time) in order to get a constant output?

@chaps I think you can simply replace the weights with constant values:

your_module.linear1.weight = constant / nn.Parameter(constant) / ...
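
For example, a rough sketch of what that could look like with your class (untested; it assumes that accessing a `PyroSample` attribute outside of a traced model run draws one sample from its prior, and that assigning the result back replaces the random site with a constant):

```python
import torch.nn as nn

model = BayesianRegression()

# Draw one set of weights/biases from the priors and re-assign them as
# plain nn.Parameters, so they are no longer re-sampled on each forward pass.
for layer in (model.linear1, model.linear2):
    w, b = layer.weight, layer.bias     # one draw from the priors
    layer.weight = nn.Parameter(w)      # frozen from now on
    layer.bias = nn.Parameter(b)

# Repeated forward passes now reuse these fixed weights, so the output for a
# given input stays constant.
```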

:smiley:

Thank you for the reply :slight_smile: