Problem in "MCMC" Sampling of Categorical Variable

I’m trying to implement a mixture model with multiple components, so I want to sample a categorical variable to divide the data points into clusters randomly. However, when I sample dist.Categorical through MCMC, even without any observations, the sample shape suddenly changes on the second sample. The following code reproduces it (it uses dist.Bernoulli, but dist.Categorical behaves the same way).

What am I doing wrong? Should I approach this differently?
Note that just using dist.Categorical outside any model works fine (see the check after the repro below).

import torch
import pyro
import pyro.distributions as dist
from pyro.infer.mcmc import HMC, MCMC, NUTS

def model():
    # Eight independent rows of Bernoulli probabilities; no observations.
    probs = torch.tensor([0.3, 0.7]).expand(8, 2)
    classes = pyro.sample("classes", dist.Bernoulli(probs))
    print(classes.shape)

mcmc = MCMC(HMC(model, target_accept_prob=0.8),
            num_samples=10,
            warmup_steps=10,
            num_chains=1)
mcmc.run()
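
For comparison, here is the check mentioned in the note above: outside any model, repeated draws keep a stable shape. (A minimal sketch; the probabilities are just the ones from the repro.)

import torch
import pyro.distributions as dist

# Outside any model, repeated draws keep the same shape.
d = dist.Categorical(torch.tensor([0.3, 0.7]))
print(d.sample((8,)).shape)  # torch.Size([8])
print(d.sample((8,)).shape)  # torch.Size([8]) again, unchanged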

See also the related thread "Sample size changes over NUTS MCMC sampling".

HMC/NUTS require models with continuous parameters. If a model has discrete sites, they are not sampled directly; instead, HMC marginalizes them out by enumerating them in parallel when computing the log density of a model execution trace. This parallel enumeration designates additional batch dimensions (on the left of the shape) so that downstream values in the model are recorded for every possible realization of the discrete variable. That is why the printed shape of classes changes after the first call: it is the enumerated shape, not a new sample size. The discrete site is also absent from mcmc.get_samples(); cluster assignments can be recovered afterwards with infer_discrete, as sketched below.
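
A minimal sketch of that workflow, following the pattern in Pyro's GMM tutorial: write the mixture with a Categorical assignment site, let NUTS marginalize it via enumeration, then draw the assignments post hoc with infer_discrete. The toy data, K = 2, and the priors here are assumptions for illustration only.

import torch
import pyro
import pyro.distributions as dist
from pyro import poutine
from pyro.infer import MCMC, NUTS, config_enumerate, infer_discrete

K = 2                                          # number of components (assumed)
data = torch.tensor([0., 1., 10., 11., 12.])   # toy data (assumed)

@config_enumerate  # enumerate discrete sites in parallel
def mix_model(data):
    weights = pyro.sample("weights", dist.Dirichlet(0.5 * torch.ones(K)))
    with pyro.plate("components", K):
        locs = pyro.sample("locs", dist.Normal(0., 10.))
    with pyro.plate("data", len(data)):
        # Discrete site: marginalized out by NUTS via enumeration,
        # so it never appears in mcmc.get_samples().
        assignment = pyro.sample("assignment", dist.Categorical(weights))
        pyro.sample("obs", dist.Normal(locs[assignment], 1.), obs=data)

mcmc = MCMC(NUTS(mix_model), num_samples=100, warmup_steps=100)
mcmc.run(data)
samples = mcmc.get_samples()   # contains only "weights" and "locs"

# Recover assignments: condition on one posterior draw of the continuous
# sites, then sample the discrete site with infer_discrete.
conditioned = poutine.condition(
    mix_model, data={"weights": samples["weights"][-1], "locs": samples["locs"][-1]}
)
sampled = infer_discrete(conditioned, first_available_dim=-2, temperature=1)
trace = poutine.trace(sampled).get_trace(data)
print(trace.nodes["assignment"]["value"])      # one cluster index per data point

With temperature=0, infer_discrete returns MAP assignments instead of a posterior sample.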