Hi all,
I've been trying to place a kind of "double-layer" prior on GP models. The tutorial makes it clear how to set a prior on the lengthscale parameter (i.e. by calling model.kernel.set_prior('lengthscale', dist.LogNormal(loc, scale))).
The problem I have is placing another distribution on top of the LogNormal's parameters. In math:
sigma_i ~ Normal(1, 1)
l_i ~ LogNormal(1, sigma_i)
where l_i is the lengthscale parameter.
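To make it concrete, here is a rough sketch of what I am trying to express, assuming the PyroSample-style API from recent Pyro versions (the set_prior call above seems to be the older equivalent). The name lengthscale_scale and the HalfNormal hyper-prior are just my guesses, since the LogNormal scale has to be positive:

```python
import torch
import pyro
import pyro.distributions as dist
import pyro.contrib.gp as gp
from pyro.nn import PyroSample

# Toy data just so the model is runnable
X = torch.linspace(0.0, 1.0, 20)
y = torch.sin(6.0 * X) + 0.1 * torch.randn(20)

kernel = gp.kernels.RBF(input_dim=1)

# Hyper-prior on the scale of the lengthscale prior. My math above uses
# Normal(1, 1), but the LogNormal scale must be positive, so I am guessing
# a HalfNormal here instead; "lengthscale_scale" is just a name I made up
# for the extra latent.
kernel.lengthscale_scale = PyroSample(dist.HalfNormal(torch.tensor(1.0)))

# Lengthscale prior whose scale is itself a random variable:
# l ~ LogNormal(1, sigma), with sigma drawn from the hyper-prior above.
kernel.lengthscale = PyroSample(
    lambda self: dist.LogNormal(torch.tensor(1.0), self.lengthscale_scale)
)

gpr = gp.models.GPRegression(X, y, kernel, noise=torch.tensor(0.1))
```

I would then fit this the usual way (e.g. gp.util.train(gpr), or NUTS over gpr.model), but I don't know whether that is the right way to handle the extra latent.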
I saw a somewhat related post about adding a GP to mixed-effects models, but I am not sure whether it is valid to rebuild the kernel at every update step.
Could anyone give me some hints about how to do this? I really appreciate your help and time.