brm() function suddenly stopped running and throws a range of different errors

I’m not quite sure I understand - aren’t mixed models appropriate to model both within- and between subjects population level effects?

Yes, they are. But I am referring to a situation where you have something like (X*Y*Z | subject) in the model formula and the interaction term X:Y:Z is not a within-subject variable (i.e., each subject contributes only one level of X:Y:Z). That will cause problems. I have no idea whether your data are like that, but a full four-way interaction term used as group-level effects across two grouping factors (IDs and Subs) in an experimental study suggests it could be the case, which would explain both the slow model fit and the need for a high adapt_delta.
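A quick way to check whether an interaction really varies within subjects is to count the distinct predictor combinations each subject contributes. A minimal base-R sketch with toy data; the names X, Y, Z, and subject are placeholders, not columns from the model discussed here:

```r
# Toy data: two subjects, each observed in only ONE combination of X:Y:Z
d <- data.frame(
  subject = rep(c("s1", "s2"), each = 4),
  X = rep(c("a", "b"), each = 4),
  Y = rep(c("p", "q"), each = 4),
  Z = rep(c("u", "v"), each = 4)
)

# Number of distinct X:Y:Z cells observed per subject
cells_per_subject <- tapply(
  interaction(d$X, d$Y, d$Z, drop = TRUE),
  d$subject,
  function(x) length(unique(x))
)

# If every subject has only one cell, a by-subject slope for the
# interaction cannot be informed by within-subject variation
all(cells_per_subject == 1)  # TRUE for this toy example
```

If this returns TRUE for your data, the corresponding group-level slope is effectively between-subject and is a likely source of sampling trouble.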


Can you show the command you used to run the model?

But I am referring to a situation where you have something like (X*Y*Z | subject) in the model formula and the interaction term X:Y:Z is not a within-subject variable (i.e., for each subject there is only one level of X:Y:Z).

I see. In my case, I have multiple trials per subject in each condition, so within-subject data for all predictors. Theoretically I can expect both by-subject and by-item variation for all of the predictors, including their interactions, and this is reflected in the model output, where all of the terms explain some (if small) variance. So no way around the long sampling times, I'm afraid.

Can you show the command you used to run the model?

Of course:

bayes_conf$dbMent <- brm(formula = ALLAccuracies ~ lg10JerksZ*allConditions*lg10OwnStimDiff_jerkZ +
                             (1 + lg10JerksZ*allConditions*lg10OwnStimDiff_jerkZ || allVidIDs) +
                             (1 + lg10JerksZ*allConditions*lg10OwnStimDiff_jerkZ || allSubs), 
                           data = dbMent, warmup = 1000, iter = 4000, 
                           cores = parallel::detectCores(),
                           chains = 4, control = list(adapt_delta = .90), 
                           prior = prior1, sample_prior = TRUE,
                           save_all_pars = TRUE)

Nothing unusual about that.

Can you do this command:

bayes_conf$dbMent <- brm(formula = ALLAccuracies ~ lg10JerksZ*allConditions*lg10OwnStimDiff_jerkZ +
                             (1 + lg10JerksZ*allConditions*lg10OwnStimDiff_jerkZ || allVidIDs) +
                             (1 + lg10JerksZ*allConditions*lg10OwnStimDiff_jerkZ || allSubs), 
                           data = dbMent, warmup = 1, iter = 1, 
                           cores = 1, chains = 1)

Does it produce the same output? And does it eventually run?

Is that stuff about make the only output brm produced when you ran the command? And does it eventually run?


Actually, simplify the formula too.

bayes_conf$dbMent <- brm(formula = ALLAccuracies ~ lg10JerksZ, 
                           data = dbMent, warmup = 1, iter = 1, 
                           cores = 1, chains = 1)

What are the dimensions of dbMent?

(Edit: Also, my intention was to grab a continuous predictor for lg10JerksZ; if that is a factor, pick something else. I want the model to be super easy to compile regardless of the data.)


I think I know what the problem was now. After looking at str(dbMent) I found that my continuous variable lg10OwnStimDiff_jerkZ had mistakenly been converted to a character vector. I presume the model had difficulty dealing with 896 character levels.

After I specified

dbMent$lg10OwnStimDiff_jerkZ <- as.numeric(dbMent$lg10OwnStimDiff_jerkZ)

the model is running fine.
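One caveat worth adding here, as a general point rather than a claim about this particular dataset: as.numeric() behaves very differently on character vectors and on factors. On a character vector it parses the values, but on a factor it returns the underlying integer level codes, silently corrupting the data. A small illustration:

```r
x_chr <- c("10.5", "2.3", "10.5")
as.numeric(x_chr)                 # 10.5 2.3 10.5 -- parsed correctly

x_fac <- factor(c("10.5", "2.3", "10.5"))
as.numeric(x_fac)                 # 1 2 1 -- level codes, NOT the values!
as.numeric(as.character(x_fac))   # 10.5 2.3 10.5 -- the safe conversion
```

So if a variable like this had been converted to a factor rather than a character vector, the safe idiom is as.numeric(as.character(x)).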

I have to say this is something that has happened to me before: I notice, by accident, that a variable I explicitly specified to be, say, numeric has been converted to factor or character type, or vice versa. I don't know how this happens or how I could avoid it, but at least I'm aware of it now.
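One way to catch this kind of silent type change before it costs a long sampling run is a quick class audit of the model data just before calling brm(). A sketch with a toy stand-in for dbMent (the column values here are invented for illustration; run the same checks on the real data frame):

```r
# Toy stand-in for dbMent with the bug this thread uncovered:
# a continuous predictor that has silently become character
dbMent <- data.frame(
  ALLAccuracies = c(1, 0, 1),
  lg10JerksZ = c(-0.2, 0.1, 0.4),
  lg10OwnStimDiff_jerkZ = c("0.3", "-0.1", "0.2")
)

sapply(dbMent, class)  # reveals the character column at a glance

# Fail fast before fitting if a continuous predictor has the wrong type
stopifnot(is.numeric(dbMent$lg10JerksZ))
is.numeric(dbMent$lg10OwnStimDiff_jerkZ)  # FALSE -- would have caught the bug
```

Wrapping the stopifnot() checks in the analysis script right before the brm() call makes the failure immediate and the error message point at the offending variable.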

Thanks so much @bbbales2 and @andymilne for your continued help, I really appreciate it!
