- What tutorial are you running?

http://pyro.ai/examples/custom_objectives.html

- What version of Pyro are you using?

0.3.1

Hey all,

I’m trying to add a mean squared error loss to my objective for a variational autoencoder, so I’m following the custom objectives tutorial. I have a similar, otherwise-functioning version of the VAE tutorial code. Specifically, I’m trying to implement the “a lower level pattern” section:

```python
# define optimizer and loss function
optimizer = torch.optim.Adam(my_parameters, {"lr": 0.001, "betas": (0.90, 0.999)})
loss_fn = pyro.infer.Trace_ELBO.differentiable_loss
# compute loss
loss = loss_fn(model, guide)
loss.backward()
# take a step and zero the parameter gradients
optimizer.step()
optimizer.zero_grad()
```

I’d like some more detail on how to actually implement this in practice. This is what I have tried:

```python
optimizer = torch.optim.Adam(vae.parameters(), lr=1e-3)
elbo_loss_fn = pyro.infer.Trace_ELBO.differentiable_loss
for epoch in range(1000):
    epoch_loss = 0.
    for x, _ in train_dl:
        x = x.cuda()
        loss = elbo_loss_fn(model=vae.model, guide=vae.guide)
        loss.backward()
        optimizer.step()
        optimizer.zero_grad()
```

This gives me the error `TypeError: differentiable_loss() missing 1 required positional argument: 'self'`, and it never looks at the data `x`. I figure that once I can get the ELBO loss working, I can just take the MSE between `x` and its reconstruction and add it to the ELBO loss before computing gradients with `loss.backward()`.
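In case it helps pin down where I’m confused: the TypeError looks like it comes from referencing `differentiable_loss` on the `Trace_ELBO` class rather than on an instance, so when I call it with keyword arguments nothing is ever bound to `self`. A minimal Pyro-free sketch of what I think is the same failure mode (the class and method here are just stand-ins, not real Pyro code):

```python
class FakeElbo:
    """Stand-in for pyro.infer.Trace_ELBO; not actual Pyro code."""

    def differentiable_loss(self, model, guide):
        # Pretend to compute a loss from the model/guide pair.
        return "loss({}, {})".format(model, guide)


# Referencing the method on the class gives a plain function: nothing is
# bound to `self`, analogous to pyro.infer.Trace_ELBO.differentiable_loss.
loss_fn = FakeElbo.differentiable_loss
try:
    loss_fn(model="model", guide="guide")
except TypeError as e:
    # e.g. "differentiable_loss() missing 1 required positional argument: 'self'"
    print(e)

# Referencing it on an *instance* binds `self`, so the call succeeds.
loss_fn = FakeElbo().differentiable_loss
print(loss_fn(model="model", guide="guide"))
```

If that’s really all it is, then presumably the snippet intends something like `loss_fn = pyro.infer.Trace_ELBO().differentiable_loss` (note the parentheses), with the data passed through as extra arguments, e.g. `loss_fn(vae.model, vae.guide, x)` — but I’d appreciate confirmation that this is the intended usage.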