The message is not good. You should be concerned.
It occurs when the argument of a log is near 0: the value underflows to exactly 0 in floating point, so log(0) gets evaluated and returns negative infinity.
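As a small illustration (a base-R sketch, not taken from the model in question), a tiny probability can underflow to exactly 0 in double precision, at which point its log is -Inf, whereas computing on the log scale directly stays finite:

```r
# A sufficiently small probability underflows to exactly 0 in double precision
p <- exp(-800)   # below .Machine$double.xmin, so it is stored as 0
log(p)           # -Inf: this is the "log(0)" the message refers to

# Working on the log scale avoids the underflow entirely
dnorm(40, mean = 0, sd = 1, log = TRUE)  # finite (about -800.9), no -Inf
```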
This thread should give you hints about how to solve this issue:
In the above model I tried to use weak priors: one prior for the class "Intercept" and one prior for class "b" covering multiple ordinal predictors. Should I set a separate prior for each predictor's beta, or is one prior for class "b" enough, so that I should just try other priors?
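For concreteness, here is a sketch of both options in brms syntax (the predictor names `x1` and `x2` and the normal scales are placeholders, not from the original model):

```r
library(brms)

# Option 1: one weak prior shared by all population-level coefficients
prior_bodn <- c(
  prior(normal(0, 5), class = "Intercept"),
  prior(normal(0, 5), class = "b")
)

# Option 2: a separate prior per coefficient (coef names are hypothetical)
prior_bodn2 <- c(
  prior(normal(0, 5), class = "Intercept"),
  prior(normal(0, 2), class = "b", coef = "x1"),
  prior(normal(0, 2), class = "b", coef = "x2")
)
```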
Thanks for your response.
What happens if you set init_r to a small value, e.g. init_r = 0.05?
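That is, something like the following (a schematic sketch; the formula, data, and family are placeholders for your actual model):

```r
fit <- brm(
  y ~ x1 + x2,
  data   = d,
  family = cumulative(),
  prior  = prior_bodn,
  init_r = 0.05   # draw initial values uniformly from (-0.05, 0.05)
                  # instead of the default (-2, 2), passed through to Stan
)
```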
Do you get the message "Log probability evaluates to log(0), i.e. negative infinity" only sporadically, or
does the whole run stop? (I was assuming the whole run stopped, my bad.)
“The only problem is when it isn’t able to start at all,…” (Paul Buerkner).
I’d use a student_t prior with df = 3 and scale = 10, and one prior for class "b", as you did.
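In brms syntax, that suggestion would look something like this (a sketch; whether the Intercept gets the same prior is up to you):

```r
prior_bodn <- c(
  prior(student_t(3, 0, 10), class = "Intercept"),
  prior(student_t(3, 0, 10), class = "b")   # student_t(df, location, scale)
)
```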
Excuse me for jumping in with one comment. Shouldn’t the assignment of the prior in the call to brm() be
prior = prior_bodn
and not, as it is shown,
prior_bodn = prior
Probably that is just a typo in the post, since I would expect some kind of error otherwise, but I thought I would mention it. I am not on a computer with brms installed, so I cannot test things at the moment.
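In other words, the argument name belongs on the left of the `=` (a schematic sketch; the model details are placeholders):

```r
# Correct: the `prior` argument of brm() receives your prior object
fit <- brm(y ~ x1 + x2, data = d, family = cumulative(), prior = prior_bodn)

# `prior_bodn = prior` would instead pass brms's prior() function itself
# under an unrecognized argument name, which should normally raise an error
```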
Thanks for your reply; that was a typo here in the post. I had it correct (as you mentioned) in my actual syntax.
The message appears at the very beginning, before the first warmup iteration, but sampling continues and produces results. However, I am not sure whether this is OK: with Rhat = 1 and a good effective sample size, can I ignore this message, or should I be concerned?
Many of these issues originate in the data structure. I went through all the discussions on this forum about similar problems, and I realized the first thing to check is the data structure, to make sure the ordinal variables have the correct class.
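For example, in R (a sketch; the data frame `d`, column `x1`, and level names are placeholders):

```r
str(d)          # inspect the class of every column at once
class(d$x1)     # should be c("ordered", "factor") for an ordinal predictor

# Convert if needed, stating the level order explicitly
d$x1 <- factor(d$x1, levels = c("low", "medium", "high"), ordered = TRUE)
is.ordered(d$x1)   # TRUE
```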