How to feed data to get a prediction from an autoguide

Normally, with a self-defined guide, we can sample from the variational distribution by calling guide(None, None), then feed x into the sampled network to get the output logits and estimate the results. But when using an autoguide, we can only sample a latent tensor. How can we use such a tensor to get the output logits?
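
Roughly, the self-defined-guide workflow I mean looks like this (just a sketch; guide and x are placeholders for my actual code):

sampled_nn = guide(None, None)  # the guide builds a lifted module via pyro.random_module and returns a sampled nn.Module
logits = sampled_nn(x)          # run the sampled network on the data to get the output logits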

You can use a similar technique, except that you’ll be sampling from the autoguide instead.

import pyro.poutine as poutine  # assuming model, autoguide, and data x are already defined

preds = []
for _ in range(1000):
    guide_trace = poutine.trace(autoguide).get_trace(x)
    # assuming that the original model took in data as (x, y) where y is observed
    preds.append(poutine.replay(model, trace=guide_trace)(x, None))
# preds now holds samples from the predictive distribution
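
If your model returns a tensor of logits per call, you could then summarize the collected samples along these lines (a sketch; the shapes are assumptions):

import torch
pred_samples = torch.stack(preds)       # assumed shape: [num_samples, batch_size, num_classes]
mean_logits = pred_samples.mean(dim=0)  # Monte Carlo estimate of the predictive logits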

Sorry, the guide_trace here is a dict. When I run poutine.replay(model, guide_trace)(x, None)
it raises the error "'dict' object has no attribute 'nodes'".

Sorry, that was my mistake! Just corrected it in the snippet above - you need to get the execution trace via poutine.trace(autoguide).get_trace(...). Does that work?

I am using pyro-ppl 0.3.1.post1, and there is no function called get_execution_trace.

Again, my bad - try the snippet above, it should be .get_trace.

I think I can get the guide trace through
poutine.trace(self.guide).get_trace(x)

but poutine.replay(model, guide_trace)(x, None) still returns None. Is it possible the second call is wrong?

I am dealing with MNIST, so at least the output shape should be right. Thank you for your time.

This should return whatever your model returns, so I am guessing that your model does not have a return statement? If you cannot return your model predictions, you will instead have to wrap this inside another poutine.trace(...).get_trace(x, None) to get the execution trace and then analyze that directly (e.g. trace.nodes["y"]["value"]).
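
For example, a minimal sketch of that approach (assuming the observed sample site in your model is named "y"):

guide_trace = poutine.trace(autoguide).get_trace(x)
model_trace = poutine.trace(poutine.replay(model, trace=guide_trace)).get_trace(x, None)
y_pred = model_trace.nodes["y"]["value"]  # the value sampled at the observed site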

Thanks, it's because I returned pyro.random_module.

I find I have to do it this way:

a_trace = poutine.trace(guide).get_trace(x)
pred = poutine.replay(model, trace=a_trace)(x, None).forward(x)

to get the prediction.
I am a little bit confused about the .forward(x) call - it looks redundant, but the good thing is it's working now. Thanks.

No, it seems like you returned a call to random_module, not the function itself, which explains the issue below:

You returned an nn module from your model instead of the output data, so you need to run the resulting nn on the data to get a value. If you modified your model, you wouldn't need the forward call. It would be easier to explain what's happening if you posted your model.
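
For illustration, a rough sketch of such a modified model (the network, priors, and site names here are assumptions, not the actual code from this thread):

import pyro
import pyro.distributions as dist

def model(x, y):
    # net and priors are assumed to be defined elsewhere (an nn.Module and a dict of prior distributions)
    lifted_module = pyro.random_module("module", net, priors)
    sampled_nn = lifted_module()   # sample a concrete nn.Module from the prior
    logits = sampled_nn(x)         # run the sampled network on the data
    pyro.sample("y", dist.Categorical(logits=logits), obs=y)
    return logits                  # return the predictions, so no extra .forward(x) call is needed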