Is there a get_log_normalizer function for MCMC?

I really appreciate that there’s a get_log_normalizer implemented in pyro.infer.Importance.

I was wondering if there’s a nice way to get the log normalizer (i.e., the model evidence) when I use MCMC.

In the following MWE, mcmc.get_samples() only returns sampled values, not the log weights of traces. Is there a recommended way to recover those log weights of sampled values?

import pyro
import torch
import math
import pyro.distributions as dist
import pyro.poutine as poutine
from pyro.infer import MCMC, NUTS, Importance

num_samples = 10

def model():
    p = pyro.sample("beta", dist.Beta(1, 1))
    pyro.sample("obs", dist.Bernoulli(p), obs=torch.ones(1))

if __name__ == '__main__':

    importance = Importance(model, guide=None, num_samples=num_samples)
    posterior = importance.run()
    me = posterior.get_log_normalizer()
    print(me)               # tensor(-0.6645)
    print(me.exp().item())  # 0.5145261908404379

    nuts_kernel = NUTS(model)
    mcmc = MCMC(nuts_kernel, num_samples=num_samples)
    mcmc.run()
    samples = mcmc.get_samples()
    traces = [
        poutine.trace(poutine.condition(model, {"beta": val})).get_trace()
        for val in samples["beta"]
    ]
    log_weights = [t.log_prob_sum() for t in traces]

    me = torch.logsumexp(torch.stack(log_weights), dim=-1) - math.log(len(log_weights))
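As a side note (my own sanity check, not part of the Pyro API): for this particular Beta(1, 1)-Bernoulli model with a single observation of 1, the evidence is available in closed form, so whatever estimator is used can be compared against it:

```python
import math

# With a Beta(1, 1) (i.e. uniform) prior on p and one Bernoulli observation of 1:
# p(obs = 1) = ∫ p · Beta(p; 1, 1) dp = 1/2
# so the true log evidence is -log 2 ≈ -0.6931, close to the Importance
# estimate of -0.6645 above.
true_log_evidence = math.log(0.5)
print(true_log_evidence)  # -0.6931471805599453
```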

According to GitHub issues 1930 and 1727, it seems that the functionality I need was deprecated. Is there a recommended way to effectively do something like mcmc.log_weights, mcmc.exec_traces, or EmpiricalMarginal(trace_posterior=mcmc)?

Thanks for helping.