Stochastic gradient MCMC

SG-MCMC has gained increasing traction and is an attractive alternative to SVI methods for large datasets. Is there any work on implementing this in NumPyro or Pyro?

I’m not aware of any ongoing work on adding stochastic gradient MCMC algorithms to either Pyro or NumPyro. Contributions are always welcome! Feel free to open a feature request issue in either repository, especially if it’s something you’d be interested in working on yourself.

There appear to be some nice reference implementations in PyTorch available here, albeit no longer maintained: https://github.com/MFreidank/pysgmcmc/tree/pytorch

OK, thanks. I have not looked too much at the MCMC part of Pyro, but this could clearly be a potential direction.

To my understanding, I could create a new MCMC algorithm by making a new kernel, as in the example here?

I would need to compute the gradients/potential function given the parameters for a batch of data instead of the whole likelihood. Could you point me to where to get that in a NumPyro way?

To my understanding, I could create a new MCMC algorithm by making a new kernel, as in the example

Yep, that’s right.
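In case it's useful, a minimal skeleton following NumPyro's `MCMCKernel` interface could look something like this (the `SGLD` name, the `SGLDState` fields, and the `step_size` argument are placeholders for illustration, not an existing API):

```python
from collections import namedtuple

from numpyro.infer.mcmc import MCMCKernel

# Hypothetical state for an SG-MCMC kernel: iteration count, current
# parameters, and the PRNG key threaded through the sampler.
SGLDState = namedtuple("SGLDState", ["i", "params", "rng_key"])


class SGLD(MCMCKernel):
    """Skeleton of a stochastic gradient MCMC kernel (names illustrative)."""

    sample_field = "params"  # which field of the state holds the samples

    def __init__(self, potential_fn, step_size=1e-3):
        self._potential_fn = potential_fn
        self._step_size = step_size

    def init(self, rng_key, num_warmup, init_params, model_args, model_kwargs):
        return SGLDState(0, init_params, rng_key)

    def sample(self, state, model_args, model_kwargs):
        # The SG-MCMC update (e.g. a stochastic gradient Langevin step)
        # goes here; it should return a new SGLDState.
        raise NotImplementedError
```

An instance of such a kernel can then be passed to `numpyro.infer.MCMC` like any of the built-in kernels.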

I would need to compute the gradients/potential function given the parameters for a batch of data instead of the whole likelihood. Could you point me to where to get that in a NumPyro way?

I believe you can just use numpyro.infer.util.log_density. If you open a numpyro feature request issue, @fehiepsi should be able to provide much more detailed guidance.
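For example, a minibatch potential function could be sketched like this (the toy model, the `scale` handler trick for rescaling the minibatch likelihood, and the `potential_fn` signature are my assumptions, not an established NumPyro recipe):

```python
import jax
import jax.numpy as jnp
import numpyro
import numpyro.distributions as dist
from numpyro.handlers import scale
from numpyro.infer.util import log_density


def model(batch, n_total):
    # Toy model for illustration.
    loc = numpyro.sample("loc", dist.Normal(0.0, 1.0))
    # Rescale the minibatch likelihood so that the log joint is an
    # unbiased estimate of the full-data log joint.
    with scale(scale=n_total / batch.shape[0]):
        numpyro.sample("obs", dist.Normal(loc, 1.0), obs=batch)


def potential_fn(params, batch, n_total):
    # log_density returns (log joint, trace); the potential is its negation.
    log_joint, _ = log_density(model, (batch, n_total), {}, params)
    return -log_joint


# Gradient of the potential w.r.t. the (pytree of) parameters:
data = jnp.ones(1000)
params = {"loc": jnp.zeros(())}
grads = jax.grad(potential_fn)(params, data[:32], data.shape[0])
```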

For general use cases, you can have a get_batch function (or something like that) in the constructor of your SG-MCMC kernel. Then, given a rng_key in the sample method, you use get_batch(rng_key) to get a batch of the model's args and kwargs. After that, you can proceed as in the current HMC implementation.
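To make that concrete, a `sample` step along those lines might look roughly like this (vanilla SGLD, reusing the `SGLDState` and `potential_fn` sketched above; `get_batch` returning `(args, kwargs)` is a hypothetical helper, and a real implementation would also handle warmup, step size schedules, etc.):

```python
import jax
import jax.numpy as jnp
from jax import random


def sgld_sample(state, potential_fn, get_batch, step_size):
    """One SGLD step: params <- params - (eps/2) * grad + Normal(0, eps) noise."""
    i, params, rng_key = state
    rng_key, key_batch, key_noise = random.split(rng_key, 3)

    # Draw a minibatch of model args/kwargs (hypothetical helper).
    batch_args, batch_kwargs = get_batch(key_batch)
    grads = jax.grad(potential_fn)(params, *batch_args, **batch_kwargs)

    # One Gaussian noise key per leaf of the params pytree.
    leaves, treedef = jax.tree_util.tree_flatten(params)
    noise_keys = jax.tree_util.tree_unflatten(
        treedef, list(random.split(key_noise, len(leaves)))
    )
    new_params = jax.tree_util.tree_map(
        lambda p, g, k: p
        - 0.5 * step_size * g
        + jnp.sqrt(step_size) * random.normal(k, jnp.shape(p)),
        params,
        grads,
        noise_keys,
    )
    return SGLDState(i + 1, new_params, rng_key)
```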