Would my code successfully retrieve the best trained guide from the training?

Hello,
I am relatively new to using PyTorch for deep learning.
Would my code below successfully retrieve the best trained guide (best_guide) from training (i.e. would my training loop let me get the best trained guide out of the training run)?

:S thank you,

import torch
import pyro
from pyro.infer import SVI, TraceEnum_ELBO

optimizer_args = {'lr': 0.00015}
optimizer_4 = torch.optim.Adam
scheduler_args = {'optimizer': optimizer_4,
                  'optim_args': optimizer_args,
                  'step_size': 1, 'gamma': 1.5}
scheduler_4 = pyro.optim.StepLR(scheduler_args)

# create the stochastic variational inference (SVI) object
my_svi = SVI(myPyroModel, my_guide, scheduler_4,
             loss=TraceEnum_ELBO(max_plate_nesting=0))

# turn on training mode
myPyroModel.train()

# TRAINING LOOP
total_svi_loss = 0              # running loss over the current logging window
best_svi_loss = float('inf')    # best average loss seen so far

for epoch in range(num_epoch):
    # calculate the loss and take a gradient step
    svi_loss = my_svi.step(input)

    # update the running loss with the loss from this step
    total_svi_loss = total_svi_loss + svi_loss

    if epoch % log_interval == 0 and epoch > 0:
        cur_svi_loss = total_svi_loss / log_interval
        print('| epoch {:3d} | loss {:5.4f} |'.format(epoch, cur_svi_loss))
        total_svi_loss = 0

        # keep track of the best loss and the guide that produced it
        if cur_svi_loss < best_svi_loss:
            best_svi_loss = cur_svi_loss
            best_guide = my_guide

I don’t think the code as written does what you want it to do.

Are you trying to freeze the guide at the point where you hit the best loss?
The idea of VI/MCMC is that the longer you let it run, the better it approximates your posterior (and in the MCMC case hopefully converges). However, it sounds like you expect a point in training where your loss will degenerate, which could mean you have dodgy priors. I would suggest doing prior predictive checks first, as sketched below.
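For example, a prior predictive check could look roughly like this (a minimal sketch, assuming your model is myPyroModel and takes the same input as your training loop):

from pyro.infer import Predictive

# with no guide and no posterior samples, Predictive runs the model with its
# latent sites drawn from the prior, i.e. a prior predictive simulation
prior_predictive = Predictive(myPyroModel, num_samples=500)
prior_samples = prior_predictive(input)   # dict: sample-site name -> tensor of draws

# compare the simulated observations against your real data, e.g. plot their
# histograms or summary statistics side by side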

But to give an answer: best_guide = my_guide only stores a reference to the my_guide object, not a copy of its learned state. When you call my_svi.step(input), the SVI object invokes the optimiser, which backpropagates and updates the weights of your tensor graph. Those weights are stored in a parameter store, and extracting them takes a bit of extra work; check http://docs.pyro.ai/en/0.2.1-release/parameters.html (and the two links it references).
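If you do want a best-so-far snapshot, a rough sketch (reusing the names from your loop; best_params is just a name I made up) is to save the param store state whenever the loss improves:

import copy
import pyro

# inside the training loop, right after computing cur_svi_loss:
if cur_svi_loss < best_svi_loss:
    best_svi_loss = cur_svi_loss
    # the learned values live in the param store, so snapshot its state
    best_params = copy.deepcopy(pyro.get_param_store().get_state())
    # or persist it to disk instead:
    # pyro.get_param_store().save('best_params.pt')

# after training, restore the best snapshot before predicting:
pyro.get_param_store().set_state(best_params)
# or, if it was saved to disk:
# pyro.clear_param_store()
# pyro.get_param_store().load('best_params.pt')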

Hello,

Thank you for your reply.
I have trained my guide with the training loop from my previous post, but when I try to make predictions with best_guide, the training doesn't seem to have had any effect… the accuracy of my "trained" Bayesian model is only 26%, even though my training log shows that the validation loss dropped significantly while my original training loop ran.

How should I fix my training loop so that I can make predictions with the trained guide? Below is what I tried (i.e. my previous code):

import torch
import pyro
from pyro.infer import Predictive, SVI, TraceEnum_ELBO

optimizer_args = {'lr': 0.00015}
optimizer_4 = torch.optim.Adam
scheduler_args = {'optimizer': optimizer_4,
                  'optim_args': optimizer_args,
                  'step_size': 1, 'gamma': 1.5}
scheduler_4 = pyro.optim.StepLR(scheduler_args)

# create the stochastic variational inference (SVI) object
my_svi = SVI(myPyroModel, my_guide, scheduler_4,
             loss=TraceEnum_ELBO(max_plate_nesting=0))

# turn on training mode
myPyroModel.train()

# TRAINING LOOP
total_svi_loss = 0              # running loss over the current logging window
best_svi_loss = float('inf')    # best average loss seen so far

for epoch in range(num_epoch):
    # calculate the loss and take a gradient step
    svi_loss = my_svi.step(input)

    # update the running loss with the loss from this step
    total_svi_loss = total_svi_loss + svi_loss

    if epoch % log_interval == 0 and epoch > 0:
        cur_svi_loss = total_svi_loss / log_interval
        print('| epoch {:3d} | loss {:5.4f} |'.format(epoch, cur_svi_loss))
        total_svi_loss = 0

        # keep track of the best loss and the guide that produced it
        if cur_svi_loss < best_svi_loss:
            best_svi_loss = cur_svi_loss
            best_guide = my_guide

# make predictions
myPyroModel.eval()
pred_obj = Predictive(myPyroModel, guide=best_guide,
                      num_samples=100)

mc_prediction_scores = pred_obj.call(...)

Thank you for your help,

best_guide, as I mentioned, isn't doing anything there; you could pass my_guide and it would have the same effect.
Check this Pyro example for using your trained posterior to make predictions:
https://pyro.ai/examples/gmm.html?highlight=predicting#Serving-the-model:-predicting-membership
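Roughly, the prediction side could look like this (a sketch reusing the names from your code; new_input and the 'obs' site name are placeholders for whatever arguments and observed site your model actually uses, and the param-store restore only matters if you saved a snapshot as described earlier):

from pyro.infer import Predictive

# the guide reads its parameters from the param store at call time, so make
# sure the store holds the trained (or restored best-so-far) values first
# pyro.get_param_store().set_state(best_params)

myPyroModel.eval()
predictive = Predictive(myPyroModel, guide=my_guide, num_samples=100)
samples = predictive(new_input)     # dict: site name -> tensor of predictive samples
obs_samples = samples['obs']        # placeholder name for your observed site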