Dynamically changing poutine.scale


I'm using LeNet-5 for MNIST classification, but the network's performance is extremely bad. I've seen people say KL annealing can help, i.e. scaling the KL term in the ELBO. One way to do that is with poutine.scale, as follows:

SVI(poutine.scale(model, scalar), poutine.scale(guide, scalar), opt, loss=Trace_ELBO())

Here scalar is the scale parameter, and during training we call svi.step().
But how can I change the value of scalar during training?
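One common annealing policy is to ramp the KL scale linearly from a small value up to 1.0 over a warm-up period. Here is a minimal, framework-free sketch of such a schedule (the function name and parameters are illustrative, not part of Pyro's API); the returned value would be the scale passed to poutine.scale at each step:

```python
def kl_anneal_factor(step, warmup_steps=1000, min_scale=0.01):
    """Linearly ramp the KL scale from min_scale up to 1.0 over warmup_steps.

    A hypothetical helper: call it with the current SVI step count and feed
    the result to poutine.scale(model, ...) / poutine.scale(guide, ...).
    """
    if step >= warmup_steps:
        return 1.0
    return min_scale + (1.0 - min_scale) * step / warmup_steps

# The scale starts small and reaches 1.0 once warm-up is complete:
print(kl_anneal_factor(0))     # -> 0.01
print(kl_anneal_factor(1000))  # -> 1.0
```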


Hi @yikuanlee_pyro, to fully control what you want to change, I recommend reading the KL annealing part of the custom objectives tutorial, or taking a look at the mixed HMM example. Please let me know if anything is still unclear. You can also define a wrapper for your model and guide, such as:

scalar = 1.0
counter = 0

def wrap_model(*args, **kwargs):
    # Apply the current KL scale to the model.
    return poutine.scale(model, scalar)(*args, **kwargs)

def wrap_guide(*args, **kwargs):
    # Declare the module-level variables we mutate, otherwise
    # the assignments below raise UnboundLocalError.
    global scalar, counter
    if counter == 10:
        scalar = new_scalar  # switch to a new scale after 10 steps
    counter += 1
    return poutine.scale(guide, scalar)(*args, **kwargs)
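If you'd rather avoid global state, the same idea can be packaged in a small class that owns the counter and computes the current scale. This is a framework-free sketch (the class name and parameters are illustrative, not Pyro API); the commented-out wrapper shows where poutine.scale would read the value:

```python
class KLAnnealer:
    """Holds the mutable KL-annealing state so wrappers can read it
    without global variables. Linearly ramps the scale from `start`
    to `end` over `warmup` calls to tick()."""

    def __init__(self, start=0.1, end=1.0, warmup=10):
        self.start, self.end, self.warmup = start, end, warmup
        self.counter = 0

    @property
    def scalar(self):
        # Fraction of warm-up completed, clamped to [0, 1].
        frac = min(self.counter / self.warmup, 1.0)
        return self.start + (self.end - self.start) * frac

    def tick(self):
        self.counter += 1

annealer = KLAnnealer()

# In a real setup the guide wrapper would advance the annealer and
# apply the current scale, e.g.:
#
# def wrap_guide(*args, **kwargs):
#     annealer.tick()
#     return poutine.scale(guide, annealer.scalar)(*args, **kwargs)
```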


Thanks, very useful. The wrapped guide is an easy and concise way to modify the KL scale.