Conditional monotonicity in cumulative probit model

  • Operating System: Windows 11
  • brms Version: 2.20.4

This is my first post on the Stan Forum; I am just discovering all the possibilities of brms and of Bayesian analysis in general.

I am working with an ordinal outcome variable and want to use a cumulative probit model. I have two predictors of interest: one categorical with 4 categories and one ordinal with 4 levels. My question concerns modeling interaction effects between the two using the technique for monotonic effects of ordinal predictors described in Bürkner and Charpentier (2020). Specifically, I want a model that assumes a different monotonic effect of my ordinal predictor in each category of my categorical predictor (i.e., x is conditionally monotonic on z), which is achieved through cell mean coding in brms. According to this post, in a linear model this is simply done with formula = 0 + z + z:mo(x). To my understanding, and based on the error messages I am receiving, this syntax will not work when specifying a cumulative probit model because the intercept cannot be removed.
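
For concreteness, here is a minimal sketch of what I am attempting, with placeholder names (y = ordinal outcome, z = the 4-category factor, x = the 4-level ordinal predictor, d = the data frame):

library(brms)

# Cell-mean-coded conditional monotonicity: this formula works in a
# linear model, but for a cumulative model brms does not allow the
# intercept to be dropped, so this call errors.
fit <- brm(
  y ~ 0 + z + z:mo(x),
  data   = d,
  family = cumulative(link = "probit")
)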

Does anyone know or have any ideas for how to model conditional monotonicity in ordinal models?

Thanks,
Randi

The problem is that you cannot remove the \beta_0 intercept from an ordinal brm() model (for identification reasons), so the y ~ 0 + ... syntax is not available here. In your case, try formula = 1 + z + z:mo(x) instead.
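
For example, a minimal sketch using the same placeholder names as above (y, z, x, d):

library(brms)

# Keeping the intercept lets the cumulative model estimate its thresholds,
# while z:mo(x) still yields a separate monotonic effect of x within each
# level of z.
fit <- brm(
  y ~ 1 + z + z:mo(x),
  data   = d,
  family = cumulative(link = "probit")
)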

Thank you, this seems to give me what I want. For some reason, I had only considered the index-variable or the dummy-variable approach. Your suggestion keeps the b0 intercept, but we still get monotonic effects of x in each category of z (using some data from psych::bfi just to illustrate the output):

 Family: cumulative 
  Links: mu = probit; disc = identity 
Formula: N1 ~ 1 + gender + gender:mo(education) 
   Data: testdf (Number of observations: 2558) 
  Draws: 4 chains, each with iter = 2000; warmup = 1000; thin = 1;
         total post-warmup draws = 4000

Population-Level Effects: 
                    Estimate Est.Error l-95% CI u-95% CI Rhat Bulk_ESS Tail_ESS
Intercept[1]           -0.64      0.08    -0.79    -0.48 1.00     3618     2650
Intercept[2]            0.00      0.08    -0.15     0.17 1.00     3775     2614
Intercept[3]            0.40      0.08     0.25     0.57 1.00     3760     2700
Intercept[4]            0.97      0.08     0.81     1.13 1.00     3711     2790
Intercept[5]            1.57      0.08     1.41     1.74 1.00     3913     2980
gender2                 0.20      0.10     0.02     0.40 1.00     3326     2822
gender1:moeducation    -0.00      0.03    -0.06     0.05 1.00     3676     2708
gender2:moeducation    -0.05      0.02    -0.10    -0.01 1.00     3122     2826

Simplex Parameters: 
                        Estimate Est.Error l-95% CI u-95% CI Rhat Bulk_ESS Tail_ESS
gender1:moeducation1[1]     0.26      0.20     0.01     0.74 1.00     4986     2232
gender1:moeducation1[2]     0.24      0.19     0.01     0.70 1.00     6081     2477
gender1:moeducation1[3]     0.24      0.19     0.01     0.68 1.00     5886     3220
gender1:moeducation1[4]     0.25      0.19     0.01     0.69 1.00     4902     2931
gender2:moeducation1[1]     0.24      0.17     0.01     0.62 1.00     3327     1798
gender2:moeducation1[2]     0.21      0.16     0.01     0.58 1.00     4082     2110
gender2:moeducation1[3]     0.33      0.19     0.03     0.72 1.00     4885     2653
gender2:moeducation1[4]     0.21      0.16     0.01     0.60 1.00     6592     2872

Family Specific Parameters: 
     Estimate Est.Error l-95% CI u-95% CI Rhat Bulk_ESS Tail_ESS
disc     1.00      0.00     1.00     1.00   NA       NA       NA

Draws were sampled using sampling(NUTS). For each parameter, Bulk_ESS
and Tail_ESS are effective sample size measures, and Rhat is the potential
scale reduction factor on split chains (at convergence, Rhat = 1).
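
For reference, a minimal sketch of the kind of call that produces this summary, assuming testdf is the complete-case subset of psych::bfi on these three columns (the exact preprocessing is not shown here):

library(brms)

# Assumed data prep: complete cases of psych::bfi on N1 (6-point ordinal
# outcome), gender (2-level factor), and education (5-level ordinal).
testdf <- na.omit(psych::bfi[, c("N1", "gender", "education")])
testdf$gender <- factor(testdf$gender)

fit <- brm(
  N1 ~ 1 + gender + gender:mo(education),
  data   = testdf,
  family = cumulative(link = "probit"),
  chains = 4, iter = 2000, warmup = 1000
)
summary(fit)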

Unrelated, your blog posts on the cumulative probit model have been so helpful for my understanding of these kinds of models. Such a valuable resource – thank you for making such excellent and accessible posts.
