Symmetric weakly informative priors for categorical predictors

One option is to use effects coding rather than dummy coding for the categorical predictors. For a binary predictor, effects coding means using -1/1 instead of 0/1 (which is dummy coding). For a categorical predictor with more than two levels, dummy coding expands to a 0/1 column for every category except the reference category; effects coding keeps those columns but sets the reference category’s rows to -1 (the other zeros stay 0).
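To make the two codings concrete, here is a minimal sketch in Python (pandas/numpy; the example data and level names are mine, not from any particular source):

```python
import numpy as np
import pandas as pd

# Hypothetical four-level categorical predictor; "a" is the reference level.
x = pd.Series(["a", "b", "c", "d", "b", "a"], dtype="category")

# Dummy (treatment) coding: one 0/1 column per non-reference level.
dummy = pd.get_dummies(x, drop_first=True).astype(int)

# Effects (sum-to-zero) coding: same columns, but rows belonging to the
# reference level get -1 in every column; the other zeros stay 0.
effects = dummy.copy()
effects.loc[x == "a", :] = -1

print(dummy)
print(effects)
```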

I’m reasonably certain that Agresti mentions this solution somewhere in his book “Categorical Data Analysis”.

A knock-on benefit of effects coding over dummy coding, not mentioned by Agresti as far as I know: you get the effect of @andrewgelman’s recommendation to standardize continuous covariates by dividing by two standard deviations while doing the “usual” thing of dividing by one standard deviation. The reason is that a roughly balanced binary predictor coded -1/1 has a standard deviation near 1 rather than near 0.5, so continuous covariates scaled by a single standard deviation already sit on a comparable scale.
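A quick numerical illustration of that point (Python sketch; the simulated data and variable names are mine):

```python
import numpy as np

rng = np.random.default_rng(0)

# A roughly balanced binary predictor under the two codings.
dummy01 = rng.integers(0, 2, size=10_000)   # 0/1 dummy coding
effects = 2 * dummy01 - 1                    # -1/1 effects coding
print(dummy01.std(), effects.std())          # ~0.5 vs ~1.0

# A continuous covariate scaled the "usual" way (one standard deviation)
# lands on the same scale as the -1/1 predictor, which is what the
# two-standard-deviation rule achieves for 0/1 predictors.
z = rng.normal(loc=3.0, scale=2.5, size=10_000)
z_scaled = (z - z.mean()) / z.std()
print(z_scaled.std())                        # ~1.0
```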
