Hello,
Please excuse my inexperience on this topic.
I am fitting a multivariate mixed model in brms to build a predictive scoring mechanism. I have two negative binomial outcomes and one lognormal outcome, and I used shrinkage priors:
```r
model_mult <-
  brms::bf(x ~ . + (1 | Id), family = negbinomial()) +
  brms::bf(y ~ . + (1 | Id), family = negbinomial()) +
  brms::bf(z ~ . + (1 | Id), family = lognormal())

fit_ml <- brms::brm(
  model_mult,
  data = ...,  # (data omitted)
  chains = 2,
  cores = 2,
  prior = priors
)
```

(I had convergence issues when using the binomial family for the x outcome.)
Did I set the priors correctly? If not, do I need to specify them separately for each response?
The fixed-effect credible intervals are extremely large; is there a specific reason this could be happening?
I tried using the binomial distribution for one of the outcomes and had convergence issues, so I ended up using the negative binomial instead. Any idea why that could be happening?
Thank you so much
I am sorry for asking all these questions
As a convenience feature, brms will automatically pass a global prior on b down to the individual univariate models if you specify it just once. See prior_summary(fit_ml) to verify.
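A minimal sketch of what that could look like, assuming the horseshoe prior mentioned later in this thread (`fit_ml` stands for your fitted model object):

```r
library(brms)

# A single global shrinkage prior on all population-level ("b") coefficients;
# in a multivariate model, brms passes it on to each response.
priors <- set_prior("horseshoe(1, par_ratio = 0.2)", class = "b")

# After fitting, verify which prior ended up on which parameter:
# prior_summary(fit_ml)
```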
What is “extremely large” for you? It could be because you don’t have enough data or because the shrinkage was not strong enough.
There are a lot of reasons why convergence issues can occur. Please keep in mind that binomial and negbinomial don’t work on the same type of counts (bounded vs. unbounded). Choose the one of the two that matches your understanding of the generative model of your data (and specify the additional argument trials when fitting binomial models).
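If you do want the binomial family, brms needs the number of trials via the `trials()` addition term; a sketch (here `n_trials` is a hypothetical column giving the number of trials per observation):

```r
library(brms)

# Bounded counts: x successes out of n_trials attempts per row.
bf_x <- bf(x | trials(n_trials) ~ . + (1 | Id), family = binomial())
```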
So all I need is to set the priors this way: `priors <- c(set_prior("horseshoe(1, par_ratio = 0.2)"))`, and brms automatically applies it to the b’s?
How about the intercepts? I have been reading papers advising against using Cauchy priors; any thoughts on that?
You are right. How could I make the shrinkage strong enough? I have over 20 predictors. I am assuming by using a larger par_ratio value?
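For what it’s worth, in brms’s `horseshoe()` prior the `par_ratio` argument encodes the expected ratio of non-zero to zero coefficients, so stronger shrinkage comes from a smaller value, not a larger one; a sketch (the “2 out of 20” split is only an assumption for illustration):

```r
library(brms)

# With ~20 predictors and, say, 2 expected to be truly non-zero:
# par_ratio = 2 / 18 gives a smaller global scale than 0.2, i.e. stronger
# shrinkage on the population-level coefficients.
priors_strong <- set_prior("horseshoe(1, par_ratio = 2 / 18)", class = "b")
```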