Modelling a time-varying ordinal response with time-varying ordinal predictors: cross-lagged?

I have an ordinal response, 22 ordinal predictors, one continuous predictor, and 5 time points.
I am struggling to decide which model would be better if I want to know the effect of the predictors on the outcome, given that the predictors can also be correlated among themselves.
I tried to cluster the 22 predictors down to 12 based on biology.
I thought about a cross-lagged model but am not sure it is the right direction to go.
I appreciate your advice on modeling.
Thanks in advance. @paul.buerkner, @avehtari


As the projpred package doesn't yet support ordinal data, and you assume correlated variables, I would do two things:

  • The first reference model is the model with all predictors.
  • Using elpd_diff, compare to the first reference all models that have one predictor dropped at a time. If the elpd_diff is big, then that predictor alone is important.
  • The second reference model is the model without any predictors.
  • Using elpd_diff, compare to the second reference all models that have one predictor at a time. If the elpd_diff is big, then that predictor has predictive information.
  • Comparing these two lists, you can see which predictors have some information but may carry similar information to others, and which predictors have some information that is not shared with others.
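A minimal sketch of this procedure with brms and loo (the outcome `y`, predictor names `x1`–`x3`, and data frame `dat` are hypothetical placeholders; substitute your own 22 predictors and priors):

```r
library(brms)

predictors <- c("x1", "x2", "x3")  # hypothetical; use your own predictor names

# First reference: model with all predictors (monotonic terms, ordinal outcome)
full_formula <- reformulate(paste0("mo(", predictors, ")"), response = "y")
fit_full <- brm(full_formula, data = dat, family = cumulative())

# Drop one predictor at a time and compare to the full reference
for (p in predictors) {
  keep <- setdiff(predictors, p)
  fit_drop <- brm(reformulate(paste0("mo(", keep, ")"), response = "y"),
                  data = dat, family = cumulative())
  # A big elpd_diff here suggests predictor p alone is important
  print(loo_compare(loo(fit_full), loo(fit_drop)))
}

# Second reference: intercept-only model
fit_null <- brm(y ~ 1, data = dat, family = cumulative())

# Add one predictor at a time and compare to the null reference
for (p in predictors) {
  fit_one <- brm(reformulate(paste0("mo(", p, ")"), response = "y"),
                 data = dat, family = cumulative())
  # A big elpd_diff here suggests predictor p carries predictive information
  print(loo_compare(loo(fit_null), loo(fit_one)))
}
```

With 22 predictors this means roughly 45 model fits, so in practice you may want to cache fits or parallelize.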

Aki, just to clarify,

do I understand you correctly that, in the second step, you take out a predictor, measure, and then put it back in before moving on to the next predictor?

And in step four, you add one predictor, measure, remove it, and then add the next predictor?


Thanks for your response.
The data are longitudinal, with repeated measures at at least 6 time points that are not equally spaced. Should I perform your suggestions at each time point?
Also, the predictor at t-1 may be associated with the subsequent response Yt and also with itself. I could still carry out your suggestions manually, repeating them for all lagged predictors as well as the cross-sectional predictors, but that is a lot of manual analyses; is it really the only way to do it? I am a bit confused about how to model this. Thanks again for your guidance.


I would need to read your Stan code carefully to understand the complexities of your model, but I don't have time to do that right now.

time (1 to 6) 
22 predictors: A to V ordinal predictors
BODN : response variable
id: individuals' id 

long_data %<>%
  group_by(id) %>%
  arrange(time, .by_group = TRUE) %>%
  mutate(
    bodn_lag = lag(bodn),
    A_lag = lag(A)
  ) %>%
  ungroup()

f  <- bf(bodn ~ mo(bodn_lag) + mo(A_lag) + ... + mo(V_lag) + (mo(bodn_lag) | id))
fA <- bf(A ~ mo(A_lag) + mo(bodn_lag) + ... + (mo(A_lag) | id))
fV <- bf(V ~ mo(V_lag) + mo(bodn_lag) + ... + (mo(V_lag) | id))

model <- brm(f + fA + ... + fV + set_rescor(TRUE), data = long_data,
             family = cumulative(), prior = prior)

Ideally, I would like to develop an autoregressive and cross-lagged model for my data and then apply your suggestion for feature selection. But I am not sure whether this approach and syntax are correct.
I would appreciate hints and advice on the equations/syntax here.

I tried the cross-lagged model below, but it seems the syntax does not support an ordinal response/predictor.
The error was:

I will move forward with @avehtari's suggestion for now; hopefully we will have more options for cross-lagged and autoregressive ordinal models in the future.

long_data %<>%
  group_by(idno) %>%
  arrange(followup, .by_group = TRUE) %>%
  mutate(
    bodn_lag = lag(rec_n_r_bodn),
    eye_lag = lag(rec_comb_seveye)
  ) %>%
  ungroup()

f  <- bf(rec_n_r_bodn ~ mo(bodn_lag) + mo(eye_lag) + (mo(bodn_lag) | idno))
fA <- bf(rec_comb_seveye ~ mo(eye_lag) + mo(bodn_lag) + (mo(eye_lag) | idno))
model <- brm(f + fA + set_rescor(TRUE), data = long_data,
             family = cumulative(), prior = prior)
Error: Currently, estimating 'rescor' is only possible in multivariate gaussian or student models.

There is actually an issue on GitHub about multivariate ordinal models, which explains why this is currently not possible.
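Until that limitation is lifted, one possible workaround (a sketch, reusing the data and formulas from the post above) is to fit the same bivariate model without estimating residual correlations, which brms does allow for non-Gaussian families such as cumulative():

```r
# Sketch: same bivariate cross-lagged model, but with set_rescor(FALSE)
# (residual correlations are simply not estimated for ordinal families).
# The two outcomes still share the data, and dependence between them is
# partly captured through the lagged cross-predictors.
f  <- bf(rec_n_r_bodn ~ mo(bodn_lag) + mo(eye_lag) + (mo(bodn_lag) | idno))
fA <- bf(rec_comb_seveye ~ mo(eye_lag) + mo(bodn_lag) + (mo(eye_lag) | idno))
model <- brm(f + fA + set_rescor(FALSE), data = long_data,
             family = cumulative(), prior = prior)
```

This drops the residual-correlation parameter rather than modelling it, so it is an approximation of the intended model, not a full substitute.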


@paul.buerkner Thank you for your feedback. It does seem complex, as you mentioned on GitHub. We are thinking about how to approach my questions and data, and when some results are available I will get back here and share them. Thanks again.