Hierarchical ordered multinomial regression

I have a hierarchical ordered multinomial probit regression as one part of my model and I’m looking for a sensible way to deal with the cut points. I have three categories \{1,2,3\}, and thus two (interior) cut points c_{1.5}, c_{2.5}. Section 15.2 in the Gelman/Hill textbook simply lets each cut point vary by group (student in their case), but does not talk about how to ensure c_{j,1.5} < c_{j,2.5} for all groups j.

In my model, I have multiple grouping variables for which I want varying intercepts (cut points), and one grouping variable for which I want varying slopes (with a correlation matrix that includes the intercepts/cut points).

I have come across this great case study about ordinal regression by @betanalpha. I was wondering if the (arbitrary?) “anchor point” \phi, which is set to zero in the final model, could be used for this. Something like \phi_j \sim N(0, \sigma_\phi)?

You have multiple choices, each corresponding to a different model.

If you let the “anchor point” — or, in a more general ordinal regression, a location model like \phi = \beta \cdot x — vary between groups, then you can only shift the offset of the latent logistic distribution relative to the cut points. This forces all of the category probabilities to change in a coherent way.
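For concreteness, a minimal Stan sketch of this first option, using the probit link from the original question; the data layout and names (`group`, `phi`, `sigma_phi`, the priors) are placeholder assumptions, not taken from the case study:

```stan
data {
  int<lower=1> N;                          // number of observations
  int<lower=1> J;                          // number of groups
  array[N] int<lower=1, upper=J> group;    // group membership
  array[N] int<lower=1, upper=3> y;        // ordinal outcome with 3 categories
}
parameters {
  ordered[2] c;                            // shared interior cut points
  vector[J] phi;                           // group-varying anchor / offset
  real<lower=0> sigma_phi;                 // scale of the group offsets
}
model {
  c ~ normal(0, 5);
  sigma_phi ~ normal(0, 1);
  phi ~ normal(0, sigma_phi);              // phi_j ~ N(0, sigma_phi), as in the question
  for (n in 1:N)
    y[n] ~ ordered_probit(phi[group[n]], c);
}
```

Because the cut points `c` are shared across groups, only the location of the latent variable moves by group, which is exactly the coherent shift of all probabilities described above.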

If you allow the cut points themselves to vary by group, then you essentially allow different latent distributions, which permits a richer variation of the probabilities. The issue here is allowing that variation within the ordering constraint, as you note. One option is to parameterize the cut points as a left-most point plus log differences from that point (which is how ordered vectors work behind the scenes) and then allow variation on that unconstrained space, as in the sketch below — but then you have a subtle asymmetry between the models for the different cut points.
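A hedged sketch of that second parameterization, again with hypothetical names and priors: the left-most cut point and the log of the gap to the second cut point each get their own hierarchical model on the unconstrained scale, so c_{j,1.5} < c_{j,2.5} holds by construction.

```stan
data {
  int<lower=1> N;
  int<lower=1> J;                          // groups with their own cut points
  array[N] int<lower=1, upper=J> group;
  array[N] int<lower=1, upper=3> y;
  vector[N] x;                             // one covariate, just for illustration
}
parameters {
  real beta;                               // shared slope
  real c_left;                             // population-level left-most cut point
  real log_gap;                            // population-level log gap to the second cut point
  vector[J] z_left;                        // non-centered group deviations (left point)
  vector[J] z_gap;                         // non-centered group deviations (log gap)
  real<lower=0> sigma_left;
  real<lower=0> sigma_gap;
}
transformed parameters {
  array[J] vector[2] c;                    // ordered cut points for each group
  for (j in 1:J) {
    c[j][1] = c_left + sigma_left * z_left[j];
    c[j][2] = c[j][1] + exp(log_gap + sigma_gap * z_gap[j]);  // positive gap keeps the ordering
  }
}
model {
  beta ~ normal(0, 1);
  c_left ~ normal(0, 5);
  log_gap ~ normal(0, 1);
  z_left ~ std_normal();
  z_gap ~ std_normal();
  sigma_left ~ normal(0, 1);
  sigma_gap ~ normal(0, 1);
  for (n in 1:N)
    y[n] ~ ordered_probit(beta * x[n], c[group[n]]);
}
```

The asymmetry mentioned above shows up directly: the left-most cut point varies additively while the gap varies multiplicatively on the log scale, so the two cut points are not modeled symmetrically.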
