Interpretation of prior's posterior in hierarchical GLM

I am fitting a hierarchical GLM to replicates of the same experiment, as a Bayesian version of a mixed-effects linear model: there are hyperpriors on the “regular” priors, which are in turn shared by the replicate-specific coefficients. It boils down to something like this:

    mu ~ normal(0, 1);            // hyperprior on the group-level mean
    sigma ~ cauchy(0, 5);         // hyperprior on the group-level sd

    beta1 ~ normal(mu, sigma);    // replicate-specific coefficients,
    beta2 ~ normal(mu, sigma);    // sharing the group-level prior
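
To give a bit more context, a stripped-down version of the kind of model I mean would look roughly like the sketch below. The data names (y1, x1, N1, …) are just placeholders, and I am assuming a normal outcome with a shared residual sd purely to keep the sketch short; the point is only that mu and sigma sit one level above the replicate-specific betas.

    data {
      int<lower=1> N1;
      int<lower=1> N2;
      vector[N1] x1;               // predictor, replicate 1
      vector[N1] y1;               // outcome, replicate 1
      vector[N2] x2;               // predictor, replicate 2
      vector[N2] y2;               // outcome, replicate 2
    }
    parameters {
      real mu;                     // group-level mean of the betas
      real<lower=0> sigma;         // group-level sd of the betas
      real alpha1;                 // replicate-specific intercepts
      real alpha2;
      real beta1;                  // replicate-specific slopes
      real beta2;
      real<lower=0> sigma_y;       // residual sd (shared for simplicity)
    }
    model {
      mu ~ normal(0, 1);
      sigma ~ cauchy(0, 5);

      beta1 ~ normal(mu, sigma);
      beta2 ~ normal(mu, sigma);

      alpha1 ~ normal(0, 5);
      alpha2 ~ normal(0, 5);
      sigma_y ~ cauchy(0, 5);

      y1 ~ normal(alpha1 + beta1 * x1, sigma_y);
      y2 ~ normal(alpha2 + beta2 * x2, sigma_y);
    }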

Now \beta_1 and \beta_2 may even have opposite signs. I want to make a statement about the effect of the \beta coefficients in general, not about the replicate-specific \beta_i, and since they are related through the shared prior I am thinking that I can refer to that distribution instead.

Is there a standard interpretation of what the estimates for \mu and \sigma (and their posteriors) mean, and of how they relate to the actual parameters to which they define a prior? Beyond the vast literature on hierarchical models, is there specific literature on the interpretation of those priors?

Thank you.

Yes. Here \mu is the group-level mean and \sigma is the group-level standard deviation. What “group” means will depend on the specifics of your problem. I don’t understand what you mean by “actual” parameters which define the prior. These are hyperparameters that index the probability distribution of random quantities in the model.
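
If it helps to make that concrete: a generated quantities block along these lines (just a sketch, adapt the names to your model) shows what the hyperparameters describe. Here beta_new is the coefficient you would expect for a hypothetical new replicate drawn from the group-level distribution, so its posterior reflects the uncertainty in both \mu and \sigma, while the posterior of \mu alone is the group-level average effect.

    generated quantities {
      // coefficient for a hypothetical new replicate, drawn from the
      // group-level distribution; its posterior mixes over mu and sigma
      real beta_new = normal_rng(mu, sigma);

      // posterior mean of this indicator estimates Pr(mu > 0 | data),
      // i.e. the probability that the group-level effect is positive
      int<lower=0, upper=1> mu_positive = mu > 0;
    }

Summarizing beta_new gives a statement about the betas “in general”, while summarizing mu gives the average effect across replicates.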


Sorry about that sentence; it should be “to the actual parameters to which they define a prior”, not “the define…”. I edited the post because that sentence didn’t make sense as it was.

I understand that \mu and \sigma are the parameters that define the group-level distribution, and I guess you are right that their meaning will always depend on the specific problem. Here, the replicates are a sort of nuisance parameter we’d like to “integrate over” in a sense. For that purpose, I think the statement I want to make is really about the group-level parameters, and the replicate-specific ones can essentially be ignored, but I see how there may be other problems where the individual-level parameters are just as important. Thanks.
