I’m defining a probabilistic model with some frozen parameters (i.e., the weights of a pre-trained CNN) and have been roughly following the Bayesian Regression tutorial. I’m able to train my model on CPU, but when I use GPUs I get the following error: `raise ValueError("can't optimize a non-leaf Variable")`

I followed the tutorial for initializing the optimizer; below is a sketch of my model, in which I’m trying to learn a distribution over `x_`.

```
import torch
import torch.nn as nn

class Model(nn.Module):
    def __init__(self, arch):
        super(Model, self).__init__()
        # The tensor I'm trying to learn a distribution over.
        self.x_ = torch.nn.Parameter(data=torch.rand(1, 3, 224, 224), requires_grad=True)
        # i.e., a pytorch pretrained network (weights frozen)
        self.cnn = get_model(arch)

    def forward(self):
        return self.cnn(self.x_)
```

I’ve tried the following:

- Defining a `per_param_callable` function that returns `{}` for all parameters that aren’t `x_`.
- Checking/setting the active params before `svi.step` is called (there are none before the first call).
- Passing a named parameter to the optimizer: `Adam({'params': 'x_', 'lr': 1e-2})` (an error about multiple param arguments is thrown, as I believe SVI sets the params itself).
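For reference, here is a minimal sketch of the `per_param_callable` from the first attempt above; the `(module_name, param_name)` signature is an assumption based on the Pyro optimizer docs at the time, and the `1e-2` learning rate mirrors the `Adam` call above:

```python
def per_param_callable(module_name, param_name):
    # Only x_ gets explicit hyperparameters; everything else
    # gets an empty dict (i.e., the optimizer's defaults).
    if param_name == "x_":
        return {"lr": 1e-2}
    return {}
```

This callable would then be passed to the optimizer as `pyro.optim.Adam(per_param_callable)`, but it didn’t resolve the non-leaf error either.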

Any ideas would be much appreciated.