Hello,
I run a brms model using Stan and got the following warning messages:

“Warning messages:
1: Rows containing NAs were excluded from the model.
2: In system2(CXX, args = ARGS) : error in running command
3: In file.remove(c(unprocessed, processed)) :
cannot remove file ‘/var/folders/_v/_dp_x44j2_gb3fg47_bgdrxw0000gn/T//Rtmpb11Vv7/filec6931a1c65a7.stan’, reason ‘No such file or directory’
4: There were 50 divergent transitions after warmup. Increasing adapt_delta above 0.8 may help. See http://mc-stan.org/misc/warnings.html#divergent-transitions-after-warmup
5: Examine the pairs() plot to diagnose sampling problems”.

I tried raising adapt_delta by running “control = list(adapt_delta = 0.99999)”, but nothing changes; I still get the divergent transitions warning.

Having to raise adapt_delta so high is usually a sign of some model misspecification (see Divergent transitions - a primer for a more in-depth explanation and suggestions). In your case, with logistic regression, a cauchy(0, 10) prior on the intercept definitely has tails that are too wide, so that’s the first thing I’d consider changing.

Hi @Atsev, it would be easier to troubleshoot whether your model is misspecified if you provided the data or explained it in more detail. For example, how many levels of each clustering variable do you have?

Here’s a screenshot of my variables. What I aim to do is explore the effect of all the variables on Drum, keeping Period and FocalWaibira as random effects. I decided to use brms because my glmms had singularity issues, but I’m having even more trouble setting up the model here. Is this of any help?

Try more regularizing priors before touching adapt_delta.

Add a prior to your group-level varying effects (“sd” in brms syntax).

Remember that your linear predictors are all estimated on the logit scale, which, for most intents and purposes, runs from about -4 to 4. You have priors with pretty wide tails. Consider using normal(0, 1) for your fixed and varying effects and normal(0, 4) for your intercept.
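In brms syntax, those suggestions might look something like the sketch below. The formula, predictor names, and data object are placeholders, since the actual model wasn’t posted:

```r
library(brms)

# Hypothetical formula and data; substitute your own predictors.
fit <- brm(
  Drum ~ x1 + x2 + (1 | Period) + (1 | FocalWaibira),
  data   = d,
  family = bernoulli(),
  prior  = c(
    prior(normal(0, 4), class = "Intercept"),  # intercept on the logit scale
    prior(normal(0, 1), class = "b"),          # population-level ("fixed") effects
    prior(normal(0, 1), class = "sd")          # group-level SDs (half-normal, since SDs are positive)
  )
)
```

Note that brms truncates the sd prior at zero automatically, so normal(0, 1) there acts as a half-normal.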

Normal(0, 4) is probably still too wide. When you convert it to the probability scale, it implies that values near 0 and 1 are much more probable than intermediate ones, which probably isn’t true.

For what it’s worth, a normal(0, 1.5) prior is pretty much flat on the probability scale. You might have some other, more specific idea about what to expect though, in which case you can make it even more specific.
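You can check this yourself in base R by pushing draws from each prior through the inverse-logit function and looking at the implied distribution on the probability scale:

```r
# What does normal(0, 1.5) on the logit scale imply on the probability scale?
set.seed(1)
p_flat <- plogis(rnorm(1e5, mean = 0, sd = 1.5))
hist(p_flat, breaks = 50)  # roughly flat across (0, 1)

# Compare normal(0, 4), which piles mass near 0 and 1:
p_wide <- plogis(rnorm(1e5, mean = 0, sd = 4))
hist(p_wide, breaks = 50)  # U-shaped: extreme probabilities favored
```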

Sorry, maybe it wasn’t clear from the screenshot of my data that my outcome is binomial (Drum = 0/1).
I read that cauchy is better for dealing with binomial outcomes. Also, I did run prior(normal(0, 1.5), class = "Intercept") and prior(normal(0, 1), class = "b"), but that doesn’t resolve the divergent transitions warning, so I kept adapt_delta at 0.99 (which now works).

The problem probably has to do with the varying intercepts in your model. Try including prior(exponential(2), class = "sd"). Does that resolve the problem?
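Putting that together with the earlier suggestions, the full prior set might look like this (again a sketch; the formula is a placeholder):

```r
priors <- c(
  prior(normal(0, 1.5), class = "Intercept"),
  prior(normal(0, 1),   class = "b"),
  prior(exponential(2), class = "sd")  # pulls group-level SDs gently toward 0
)

# fit <- brm(Drum ~ ... + (1 | Period) + (1 | FocalWaibira),
#            data = d, family = bernoulli(), prior = priors,
#            control = list(adapt_delta = 0.9))
```

The exponential(2) prior has mean 0.5 on the SD scale, which regularizes the varying intercepts without ruling out larger values; that is often enough to remove divergences without pushing adapt_delta to extremes.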