I would like to use LBFGS with StaticSVI for MAP estimation instead of AutoGuide/AutoDelta combined with other optimizers, since the latter is not giving me stable results.
I have been investigating several threads on this topic and followed the examples, like this:
loss_fn = TraceEnum_ELBO(max_plate_nesting=2).differentiable_loss
# capture the guide's parameters by tracing one evaluation of the loss
with pyro.poutine.trace(param_only=True) as param_capture:
    loss_fn(model, global_guide, data_obs)
params = [site["value"].unconstrained() for site in param_capture.trace.nodes.values()]
optim = torch.optim.LBFGS(params, lr=0.1)
elbo = Trace_ELBO()
svi = StaticSVI(model, global_guide, optim, loss=elbo)
However, as expected, Pyro's SVI doesn't work with non-default optimizers and raises the error:
raise ValueError("Optimizer should be an instance of pyro.optim.PyroOptim class.")
ValueError: Optimizer should be an instance of pyro.optim.PyroOptim class.
So maybe I am missing something from the threads, but is there any way to make these two work together? I can make StaticSVI work with the default Pyro optimizers. Is there any plan to implement LBFGS?
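For reference, this is the pattern that does work for me with a default Pyro optimizer (a sketch; the lr value and step count are arbitrary):

# a working baseline with a wrapped Pyro optimizer
optim = pyro.optim.Adam({"lr": 0.01})
svi = StaticSVI(model, global_guide, optim, loss=Trace_ELBO())
for step in range(1000):
    svi.step(data_obs)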
Hi!
Thanks so much for your reply. I have Pyro's stable release; are StaticSVI and LBFGS implemented in Pyro's development release? They are not in either pyro.infer or pyro.optim in my Pyro version (I have upgraded to pyro-ppl 0.3.1.post1):
from static_svi import StaticSVI
global_guide = AutoDelta(model)
optim = torch.optim.LBFGS({})  # I have to use this
elbo = Trace_ELBO()
svi = StaticSVI(model, global_guide, optim, loss=elbo)
but this gives the error:
raise ValueError("optimizer got an empty parameter list")
Again, I am following your tutorial (test_static_svi.py). If I follow other tutorials, I feed it the parameters like this:
loss_fn = TraceEnum_ELBO(max_plate_nesting=2).differentiable_loss
with pyro.poutine.trace(param_only=True) as param_capture:
    loss_fn(model, global_guide, data_obs)
params = [site["value"].unconstrained() for site in param_capture.trace.nodes.values()]
optim = torch.optim.LBFGS(params)
elbo = Trace_ELBO()
svi = StaticSVI(model, global_guide, optim, loss=elbo)
LBFGS would be satisfied, but then StaticSVI complains:
raise ValueError("Optimizer should be an instance of pyro.optim.PyroOptim class.")
so… where are StaticSVI and LBFGS actually implemented within Pyro?
@artistworking StaticSVI is not merged into Pyro because it is not necessary (you can follow the discussion in that PR). Your error seems like a bug in that PR to me, so please replace pyro.optim.PyroOptim with torch.optim.Optimizer in that PR. For example,
if not isinstance(optim, torch.optim.Optimizer):
    raise ValueError("Optimizer should be an instance of torch.optim.Optimizer class.")
I replaced the exception check, but StaticSVI also calls Pyro's SVI and the error propagates there, so I also changed the check in the svi.py file. I made it a little more flexible, because otherwise other scripts where I use pyro.optim would not work:
optim_types = (torch.optim.Optimizer, pyro.optim.PyroOptim)
if not isinstance(optim, optim_types):
    raise ValueError("Optimizer should be an instance of torch.optim.Optimizer or pyro.optim.PyroOptim.")
Hmmm, but now I run into the next error:
File "/home/lys/Dropbox/PhD/Superpositioning/Calling_Superposition.py", line 63, in <module>
    T1, T2, R, M, X1, X2 = Run(data_obs, iterations, average)
File "/home/lys/Dropbox/PhD/Superpositioning/Superposition_StaticSVI_NotEarlyStop.py", line 230, in Run
    svi.step(data_obs)
File "/home/lys/Dropbox/PhD/Superpositioning/static_svi.py", line 58, in step
    self._setup(*args, **kwargs)
File "/home/lys/Dropbox/PhD/Superpositioning/static_svi.py", line 47, in _setup
    self._pt_optim = self.optim.pt_optim_constructor(self._params, **self.optim.pt_optim_args)
AttributeError: 'LBFGS' object has no attribute 'pt_optim_constructor'
and maybe this is a little beyond my skills, because I don't know about pt_optim_constructor…
@artistworking Sorry for the confusion. LBFGS is not exposed to pyro.optim by default. You’ll need to delete these lines to be able to use it. In addition, it seems that the PR uses pyro.optim, not torch.optim, so no need to make the change in my last comment.
Under the hood, self.optim.pt_optim_constructor is torch.optim.LBFGS; self._params is the list of your model/guide parameters; and self.optim.pt_optim_args holds LBFGS's hyper-parameters such as lr and max_iter.
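Concretely, that means you can build the wrapper yourself with pyro.optim.PyroOptim, which stores the constructor and its arguments under exactly those attribute names (a sketch, assuming the PR's static_svi.py and the model/guide/data from above; the lr and max_iter values are hypothetical):

import torch
import pyro.optim
from pyro.infer import Trace_ELBO
from static_svi import StaticSVI

# wrap the torch constructor so StaticSVI finds pt_optim_constructor
# and pt_optim_args on the optimizer object
optim = pyro.optim.PyroOptim(torch.optim.LBFGS, {"lr": 0.1, "max_iter": 20})
svi = StaticSVI(model, global_guide, optim, loss=Trace_ELBO())
svi.step(data_obs)  # _setup builds torch.optim.LBFGS(params, lr=0.1, max_iter=20)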