Hi, I'm a new user of Pyro.
I rewrote the Chapter 1 model from Probabilistic-Programming-and-Bayesian-Methods-for-Hackers in Pyro, but the code is not working well.
All I did was replace the TFP code with Pyro:
```python
def model(data):
    alpha = 1. / data.mean()
    lambda1 = pyro.sample("lambda1", dist.Exponential(rate=alpha))
    lambda2 = pyro.sample("lambda2", dist.Exponential(rate=alpha))
    tau = pyro.sample("tau", dist.Uniform(0, 1))
    lambda_ = torch.gather(torch.tensor([lambda1, lambda2]), 0,
                           (tau * data.size(0) <= torch.arange(len(data)).float()).long())
    with pyro.plate("data", data.size(0)):
        pyro.sample("obs", dist.Poisson(lambda_), obs=data)

nuts_kernel = NUTS(model,
                   jit_compile=True,
                   adapt_step_size=True,
                   transforms={
                       "lambda1": torch.distributions.transforms.ExpTransform(),
                       "lambda2": torch.distributions.transforms.ExpTransform(),
                       "tau": torch.distributions.transforms.SigmoidTransform()
                   })
posterior = MCMC(nuts_kernel,
                 num_samples=1000,
                 warmup_steps=300,
                 ).run(count_data)
```
My understanding is that TFP requires writing the log-joint-prob manually, whereas Pyro only needs the generative model and derives the log-joint-prob automatically. So, because `lambda_ = torch.gather(...)` does not come from `pyro.sample()`, the MCMC sampler cannot compute the log-joint-prob correctly, right?
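I also suspect a gradient problem: rebuilding the rates with `torch.tensor([lambda1, lambda2])` seems to copy the sampled values into a fresh leaf tensor, detaching them from the autograd graph that NUTS needs. A minimal torch-only check of my guess (names are just illustrative):

```python
import torch

# Stand-ins for the two sampled rates.
lambda1 = torch.tensor(1.0, requires_grad=True)
lambda2 = torch.tensor(2.0, requires_grad=True)

# torch.tensor([...]) copies the values into a new leaf tensor,
# so the result is cut off from the autograd graph:
rebuilt = torch.tensor([lambda1, lambda2])
print(rebuilt.requires_grad)   # False: gradients w.r.t. lambda1/lambda2 are lost

# torch.stack keeps the computation graph intact:
stacked = torch.stack([lambda1, lambda2])
print(stacked.requires_grad)   # True
```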
How should I modify the code? Any guidance would be appreciated. Thank you.