Hierarchical model hyperparameter for sampling the covariance matrix per site from the LKJ distribution

Hi all

I am trying to set up a hierarchical model. The model needs to sample the site-specific mean and the covariance between two variables at N different sites, while the per-site means and covariances are also partially pooled toward a global, across-site pattern.

I am using the LKJ prior to sample the covariance matrix per site. However, the hyperparameter of the LKJ prior, eta, only controls the shape of the marginal distribution of R[i,j], which is always centered at 0 (R is the correlation matrix between my two variables). How can I implement a global hyperparameter, so that the center of the marginal distribution of R[i,j] shifts according to the pattern observed across sites?
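To make the setup concrete, here is a rough sketch of the kind of model I mean (in NumPyro; the priors and variable names are just illustrative). Note how eta only controls how tightly the correlation matrix concentrates around the identity, not where the marginal of R[i,j] sits:

import jax.numpy as jnp
import numpyro
import numpyro.distributions as dist

def model(data, n_sites):
    # global hyperpriors for the site-level means
    mu_global = numpyro.sample("mu_global", dist.Normal(jnp.zeros(2), 10.0).to_event(1))
    tau = numpyro.sample("tau", dist.HalfNormal(1.0))
    with numpyro.plate("sites", n_sites):
        # site means are partially pooled toward the global mean
        mu = numpyro.sample("mu", dist.Normal(mu_global, tau).to_event(1))
        # per-site correlation matrix; eta (concentration) only controls
        # the spread around the identity, not the location of R[i,j]
        corr = numpyro.sample("corr", dist.LKJ(2, concentration=1.0))
        sd = numpyro.sample("sd", dist.HalfNormal(jnp.ones(2)).to_event(1))
        cov = sd[..., :, None] * corr * sd[..., None, :]
        numpyro.sample("obs", dist.MultivariateNormal(mu, covariance_matrix=cov), obs=data)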

@Amav In the cvine method, you can set an arbitrary base distribution for the partial correlations. Currently it is set to Beta(eta', eta'), but you can change it to any Beta(c0, c1) you want. You will lose the nice property that P(corr_matrix) ∝ det(corr_matrix)^(eta - 1), but I believe you don't want that property here anyway.

After choosing the base distribution for the partial correlations, you will need SignedStickBreakingTransform, which is part of CorrCholeskyTransform (you can remove the Tanh part to get it). If you find this useful, you can request or make a PR to refactor CorrCholeskyTransform as ComposeTransform([TanhTransform(), SignedStickBreakingTransform()]). Then you can define a "relaxed" version of LKJ as follows:

base_dist = Beta(..., ...)
lkj_dist = TransformedDistribution(base_dist, SignedStickBreakingTransform())
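To make this concrete: the signed stick-breaking construction consumes partial correlations in (-1, 1), while Beta(c0, c1) lives on (0, 1), so the Beta samples need to be rescaled. One way to center the marginal of the partial correlations at some global value r0 is to pick c0 and c1 so that the rescaled Beta mean, 2*c0/(c0 + c1) - 1, equals r0. A minimal sketch, assuming NumPyro and the hypothetical SignedStickBreakingTransform described above (k and r0 are illustrative names, not library parameters):

import numpyro.distributions as dist
from numpyro.distributions.transforms import AffineTransform

d = 2                      # number of variables per site
k = 4.0                    # concentration: larger k -> tighter around r0
r0 = 0.3                   # desired center for the partial correlations
c0 = k * (1.0 + r0) / 2.0  # the mean of 2*Beta(c0, c1) - 1 is then r0
c1 = k * (1.0 - r0) / 2.0

base_dist = dist.TransformedDistribution(
    dist.Beta(c0, c1).expand([d * (d - 1) // 2]).to_event(1),
    AffineTransform(-1.0, 2.0),  # map (0, 1) -> (-1, 1)
)
# SignedStickBreakingTransform is the hypothetical transform carved out of
# CorrCholeskyTransform (its non-Tanh part), as described above
lkj_dist = dist.TransformedDistribution(base_dist, SignedStickBreakingTransform())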

Thank you very much for your response.

I have decided to follow an approach I found in a different community: Hierarchical prior (for partial pooling) on correlation matrices? - #9 by bgoodri - General - The Stan Forums

Do you think that the method you suggested is going to be significantly faster?


Using LKJ.log_prob directly might be faster than the TransformedDistribution(base_dist, SignedStickBreakingTransform()) version, but I guess the difference is small.
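For reference, the direct evaluation would look roughly like this (a minimal sketch using NumPyro's built-in LKJ; the second call is commented out because it depends on the hypothetical transform above):

import jax.numpy as jnp
import numpyro.distributions as dist

corr = jnp.array([[1.0, 0.3],
                  [0.3, 1.0]])

# direct evaluation with the built-in LKJ density
lp_direct = dist.LKJ(2, concentration=2.0).log_prob(corr)

# the transformed construction would instead score the Cholesky factor,
# paying for the stick-breaking Jacobian on top of the base log density:
# lp_transformed = lkj_dist.log_prob(jnp.linalg.cholesky(corr))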

Good to know. Thanks for your support!