Hi, I’m interested in building a hierarchical Bayesian multi-class logistic regression model.

The problem I’m facing: I have N respondents (people), and each one has M observations (so the total number of data points is N*M), where N >> M.

There are also K classes in the sense of multi-class logistic regression.

Since each respondent behaves differently, the naive approach would be to train a separate regressor for each individual. But the number of individuals is so large, and each individual has so few data points, that this is unrealistic.

I’m thinking of a hierarchical approach: I learn K population-level distributions (e.g. each a normal N(beta_k, sigma)). Then I can sample betas from these distributions, where each beta is a regressor for one individual.

On the other hand, I also want to introduce correlation among the classes, so that the model doesn’t necessarily respect independence of irrelevant alternatives (IIA).

A possible approach would be the latent-variable model described in https://en.wikipedia.org/wiki/Multinomial_probit, but in a hierarchical Bayesian setting.

But I’m new to this kind of model and also new to Pyro. Can anyone give me an example of a similar problem coded in Pyro (variational inference preferred) or provide some guidance?

Thanks