How to run a Pyro model in C++?

Sorry, this might be a stupid question.
If I have a model trained in Pyro, how can I use it to make predictions under Caffe?
I can translate ordinary PyTorch code into Caffe, but how do I do that for Pyro?
Does anyone have experience with this?

update:
My basic aim is not really to use Caffe. Our production pipeline is based on C++ for data processing, so my aim is to use the model learned with Pyro from C++.

One way, I think, is to call the Python API from C++, so that I can use the Python code directly.
Another way might be to use the PyTorch C++ API.

I wonder whether these two approaches are OK for Pyro, and which one is better? Thanks a lot!

Pyro inference algorithms / models use PyTorch tensors and operations, so if you are able to translate PyTorch to caffe2, I think it should be straightforward to translate Pyro code, perhaps with some minor modifications. I do not have experience exporting models to caffe2, but this is something we would like to understand better and have future tutorials on. Let me know how your experience goes.

Thanks. In fact, my key problem is how to use Pyro from C++. I notice that PyTorch has a C++ API in 1.0, so maybe I can use Pyro that way?

disclaimer: the following is untested and based on my (shallow) understanding of onnx:
if you implement your model as an nn.Module, you should directly be able to use torch to export your model to onnx and load it from caffe2. unfortunately, everything else is under development, even simple things such as directly migrating tensors from pytorch to caffe.

note that onnx works by tracing your model, so it needs to be static and use the trace-supported operators.
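
to make this concrete, here's a rough, untested sketch of what i mean, assuming the part of your model you need at prediction time can be written as a plain nn.Module. PredictorNet and the tensor shapes below are placeholders; you would load the parameters learned with pyro into it yourself (e.g. from the param store):

```python
import torch
import torch.nn as nn

# placeholder prediction network; substitute the nn.Module from your own model
class PredictorNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(10, 20), nn.ReLU(), nn.Linear(20, 1))

    def forward(self, x):
        return self.net(x)

predictor = PredictorNet()
# ... load the weights learned with pyro into `predictor` here ...

# onnx export works by tracing, so it needs a dummy input of the right shape
example_input = torch.randn(1, 10)
torch.onnx.export(predictor, example_input, "predictor.onnx")
```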

Thanks! However, I'm not familiar with ONNX or the PyTorch C++ API. Is this how the PyTorch C++ API works?

Previously, I used TensorFlow, which has both a C++ API and a Java API, and I have experience with the Java API.

Later, when I wanted to construct a deep generative model, I found tensorflow/probability and Pyro. I am quite interested in Pyro, and its documentation is great. Compared with Pyro, tensorflow/probability seems to lack some features such as the different ELBOs and NUTS sampling (maybe I am wrong).

But when the model is done, I want to use it in production, and this is where I got confused. Should I take the time to port the Pyro model to tensorflow/probability?

Here’s an official PyTorch tutorial on using the PyTorch JIT and the new C++ API to transfer PyTorch models to C++. You’ll need to be more specific about the inference algorithm you’re using and the nature of the predictions you want to make, since the answers to your questions depend on those details. For example, if you’re using SVI, you can probably use torch.jit.trace to trace your guide as in the tutorial and make predictions in C++ using the traced guide; if you’re using HMC, you can use torch.jit.trace with TracePredictive; and if you’re using parallel enumeration, you can use torch.jit.trace to trace Pyro’s infer_discrete utility or TraceEnum_ELBO.compute_marginals applied to your model.
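
For the SVI case, here is a rough, untested sketch of what that tracing might look like. The toy guide and the dummy input shape are placeholders for your own code, and you would run this after training so that the param store already holds the learned values:

```python
import torch
import pyro
import pyro.distributions as dist
from torch.distributions import constraints

def guide(x):
    # toy mean-field guide over a single latent `z`; replace with your own guide
    loc = pyro.param("z_loc", torch.zeros(1))
    scale = pyro.param("z_scale", torch.ones(1), constraint=constraints.positive)
    return pyro.sample("z", dist.Normal(loc, scale))

# dummy input used only to drive the trace; the guide is stochastic, so the
# consistency check is disabled with check_trace=False
example_data = torch.randn(1, 10)
traced_guide = torch.jit.trace(guide, (example_data,), check_trace=False)
traced_guide.save("guide.pt")  # load from C++ with torch::jit::load("guide.pt")
```

The saved file can then be loaded and run from your C++ code following the tutorial linked above.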

Questions about Caffe, the PyTorch JIT, and the PyTorch C++ API that are not specific to Pyro are probably better suited to the PyTorch forum, where there are more people who would be able to help.

Oh, I see. Thanks for your reply; I will keep my questions limited to things directly related to Pyro.