Windows 10, brms version 2.15.0
I'm fitting a categorical model with 4 outcome categories, 29 population-level predictors (several of which have multiple df), and 2 group-level effects (varying intercepts), for a total of 150 parameters to estimate.
Brms outputs these annoying little warnings:
Specifying global priors for regression coefficients in categorical models is deprecated. Please specify priors separately for each response category.
From an earlier exchange with @paul.buerkner I assume that this is harmless. The other warning is:
The global prior 'normal(0, 2.5)' will not be used in the model as all related coefficients have individual priors already. If you did not set those priors yourself, then maybe brms has assigned default priors. See ?set_prior and ?get_prior for more details.
I assume that this is also harmless. I've pre-defined my priors by first creating a table using mypriors <- get_prior(modelformula, family = categorical, data = mydata)
and then setting common 'normal(0, 2.5)' priors for everything in the classes 'b' and 'Intercept', as well as common 'exponential(2)' priors for everything in the 'sd' class. Only for two coefficients in the 'b' class did I set individual priors.
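For reference, here is roughly how I built the prior table (a minimal sketch: modelformula and mydata are as above, while the two coefficient names and their individual priors are placeholders, not the real ones):

```r
library(brms)

# Start from the full prior table for the model
mypriors <- get_prior(modelformula, family = categorical, data = mydata)

# Common priors for whole classes
# (note: this fills every row of class 'b', including the coefficient-specific ones)
mypriors$prior[mypriors$class == "b"]         <- "normal(0, 2.5)"
mypriors$prior[mypriors$class == "Intercept"] <- "normal(0, 2.5)"
mypriors$prior[mypriors$class == "sd"]        <- "exponential(2)"

# Individual priors for two coefficients (placeholder names and values)
mypriors$prior[mypriors$class == "b" & mypriors$coef == "x1"] <- "normal(0, 1)"
mypriors$prior[mypriors$class == "b" & mypriors$coef == "x2"] <- "normal(0, 1)"

# Pass the edited table to brm()
fit <- brm(modelformula, family = categorical, data = mydata, prior = mypriors)
```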
Brms is brilliant software, but if both of these warnings are harmless then it would be nicer not to get them; a serious warning (such as one about divergent transitions) could easily be missed in the flood of irrelevant ones.
With the whining out of the way, here's my question: to my delight, there was no notification of divergent transitions. However, one (1) of my 150 parameters has an Rhat of 1.01 (all the rest have 1.00). Are the results therefore unreliable? If so, how might I avoid the problem when refitting? Would increasing adapt_delta help? Its present value is 0.95.
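In case it helps to be concrete, this is the kind of check and refit I have in mind (a sketch: fit stands for the fitted brmsfit object, whose real name I've omitted, and 0.99 is just an example value):

```r
# Locate the parameter(s) with elevated Rhat
rhats <- rhat(fit)
sort(rhats[rhats > 1.005], decreasing = TRUE)

# Possible refit with a stricter sampler setting (currently 0.95)
fit2 <- update(fit, control = list(adapt_delta = 0.99))
```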
The model consists of 2500*4 post-warmup samples, for a total of 10 000, and I'm not keen to increase this because even with just 15 000 the model objects (and loo posterior objects) become too large for my laptop to handle without running out of memory.
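For what it's worth, the loo computation that runs into memory trouble looks roughly like this; pointwise = TRUE is, as far as I understand, meant to trade speed for a much smaller memory footprint (sketch, object name assumed):

```r
# Compute the log-likelihood one observation at a time rather than as one
# big matrix; slower, but needs far less working memory
loo_fit <- loo(fit, pointwise = TRUE)
```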