Stochastic gradient MCMC

SG-MCMC methods have gained increasing traction and are an attractive alternative to SVI for large datasets. Is there any work on implementing them in NumPyro or Pyro?


I’m not aware of any ongoing work on adding stochastic gradient MCMC algorithms to either Pyro or NumPyro. Contributions are always welcome! Feel free to open a feature request issue in either repository, especially if it’s something you’d be interested in working on yourself.

There appear to be some nice reference implementations in PyTorch available here, albeit no longer maintained: GitHub - MFreidank/pysgmcmc at pytorch

OK, thanks. I have not looked much at the MCMC part of Pyro yet, but this could clearly be a potential direction.

To my understanding, I could create a new MCMC algorithm by making a new kernel, as in the example here?

I would need to compute the gradients/potential function given the parameters for a batch of data instead of the whole likelihood. Could you point me to where to get that, the NumPyro way?

To my understanding, I could create a new MCMC algorithm by making a new kernel, as in the example

Yep, that’s right.
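For a rough idea of the shape, here is a hedged skeleton following the MCMCKernel interface (init / sample / sample_field) used in the NumPyro docs example; the random-walk proposal and all names are illustrative placeholders, not code from this thread:

```python
from collections import namedtuple

import jax
import jax.numpy as jnp
from numpyro.infer.mcmc import MCMCKernel

# State carried between steps: the current sample and the PRNG key.
RWState = namedtuple("RWState", ["u", "rng_key"])

class RandomWalkKernel(MCMCKernel):
    sample_field = "u"  # tells MCMC which state field holds the samples

    def __init__(self, potential_fn, step_size=0.1):
        self.potential_fn = potential_fn
        self.step_size = step_size

    def init(self, rng_key, num_warmup, init_params, model_args, model_kwargs):
        return RWState(init_params, rng_key)

    def sample(self, state, model_args, model_kwargs):
        u, rng_key = state
        rng_key, key_prop, key_accept = jax.random.split(rng_key, 3)
        # Gaussian random-walk proposal with a Metropolis accept/reject
        # step on the potential (negative log density).
        u_prop = u + self.step_size * jax.random.normal(key_prop, jnp.shape(u))
        accept_logprob = self.potential_fn(u) - self.potential_fn(u_prop)
        accept = jnp.log(jax.random.uniform(key_accept)) < accept_logprob
        return RWState(jnp.where(accept, u_prop, u), rng_key)
```

An SG-MCMC kernel would have the same structure, with sample performing a stochastic-gradient update on a minibatch instead of a Metropolis step.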

I would need to compute the gradients/potential function given the parameters for a batch of data instead of the whole likelihood. Could you point me to where to get that, the NumPyro way?

I believe you can just use numpyro.infer.util.log_density. If you open a NumPyro feature request issue, @fehiepsi should be able to provide much more detailed guidance.
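For concreteness, here is a minimal sketch of turning log_density into a minibatch potential function and differentiating it with JAX; the toy model and names are my own, not from the thread:

```python
import jax
import jax.numpy as jnp
import numpyro
import numpyro.distributions as dist
from numpyro.infer.util import log_density

# Hypothetical toy model: inferring a Gaussian mean.
def model(data):
    mu = numpyro.sample("mu", dist.Normal(0.0, 10.0))
    numpyro.sample("obs", dist.Normal(mu, 1.0), obs=data)

def potential_fn(params, batch):
    # log_density returns (log_joint, model_trace); the potential is the
    # negative log joint. Note: for SG-MCMC the minibatch log-likelihood
    # should additionally be rescaled by N / batch_size to stay unbiased.
    log_joint, _ = log_density(model, (batch,), {}, params)
    return -log_joint

batch = jnp.array([0.5, -0.2, 1.3])
grads = jax.grad(potential_fn)({"mu": jnp.array(0.0)}, batch)
```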

For general usage, you can have a get_batch function (or something like that) in the constructor of your SG-MCMC kernel; then, given a rng_key in the sample method, you use get_batch(rng_key) to get a minibatch of the model's args and kwargs. After that, you can proceed as in the current HMC implementation.
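Something like the following, where make_get_batch and the uniform subsampling scheme are just one possible reading of that suggestion:

```python
import jax
import jax.numpy as jnp

def make_get_batch(data, batch_size):
    n = data.shape[0]
    def get_batch(rng_key):
        # Draw batch_size indices uniformly without replacement.
        idx = jax.random.choice(rng_key, n, shape=(batch_size,), replace=False)
        # Return (model_args, model_kwargs) for this minibatch.
        return (data[idx],), {}
    return get_batch

get_batch = make_get_batch(jnp.arange(1000.0), batch_size=32)
batch_args, batch_kwargs = get_batch(jax.random.PRNGKey(0))
```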

Hi all,

Has a feature request been opened for this? I may want to work on it (with some guidance from @fehiepsi, hopefully).
There is a nice paper examining the performance of SG-MCMC methods, and it comes with an R + TensorFlow implementation (sgmcmc). I would be interested in porting one algorithm (SG-HMC) to NumPyro.

Hi,
I did not start on this, but focused first on implementing something closer to PyTorch directly. I've extracted it out here (taken from some other draft code): Two simple implementations of sg-mcmc · GitHub

It means that you would have to implement logprior() and loglik() directly in PyTorch, but for Gaussian priors that should not be too difficult. Of course, tracing etc. needs to be handled as well.
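As a hedged illustration of that (my own sketch, not the gist's code), a Gaussian logprior() and loglik() in PyTorch might look like:

```python
import torch

def logprior(params, prior_std=1.0):
    # Independent N(0, prior_std^2) prior over every parameter element,
    # up to an additive constant.
    return sum(-0.5 * (p / prior_std).pow(2).sum() for p in params)

def loglik(model, x, y, obs_std=1.0):
    # Gaussian observation model around the network's predictions.
    return torch.distributions.Normal(model(x), obs_std).log_prob(y).sum()
```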

Hi there,

Thank you for this. I will probably start working on a NumPyro kernel myself, building from the existing HMC kernel, as suggested by @fehiepsi. I'll see how far I can get and circle back.


Hi @Elchorro, regarding getting a batch of data: we have a subsample parameter, which allows you to get a minibatch. The SGLD algorithm in your reference should not be complicated to implement; SG-HMC seems more involved. If you have any questions, please let me know.
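To illustrate the subsample mechanism (a hedged sketch; the model is made up), numpyro.plate with subsample_size draws a random minibatch, and NumPyro rescales the minibatch log-likelihood by N / subsample_size inside the plate:

```python
import numpyro
import numpyro.distributions as dist

def model(data):
    mu = numpyro.sample("mu", dist.Normal(0.0, 10.0))
    # The plate subsamples 64 indices; the batch log-likelihood is scaled
    # by data.shape[0] / 64 so the estimate stays unbiased.
    with numpyro.plate("N", data.shape[0], subsample_size=64):
        batch = numpyro.subsample(data, event_dim=0)
        numpyro.sample("obs", dist.Normal(mu, 1.0), obs=batch)
```

And a single SGLD update in the sense of Welling & Teh (2011) could look like this, assuming potential_fn is a minibatch estimate of the negative log joint (all names here are illustrative):

```python
import jax
import jax.numpy as jnp

def sgld_step(rng_key, params, potential_fn, step_size):
    # Gradient step on the stochastic potential plus injected Gaussian
    # noise with variance equal to the step size.
    grads = jax.grad(potential_fn)(params)
    noise = jax.random.normal(rng_key, jnp.shape(params))
    return params - 0.5 * step_size * grads + jnp.sqrt(step_size) * noise
```

SG-HMC adds a momentum variable with friction and noise terms (Chen et al., 2014), which is the more involved part.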

By the way, it seems to me from your reference that SG-MCMC performance is pretty poor in terms of effective sample size per minute compared to regular HMC. It is probably more scalable, though… It would be interesting to see how its performance compares to SVI.