I am currently modelling a data set of around 6,000 people and need to compare two models via Bayes factors.
prior <- c(prior(normal(0.6, 0.2), class = "Intercept"),
           prior(normal(0.0, 0.3), class = "b"))
fit0 <- brm(
  bf(score ~ group * test_id * gender + (1 | school_id) + (1 | teacher_id),
     sigma ~ (1 | student_ID)),
  family = skew_normal(),
  data = data,
  prior = prior,
  warmup = 1000,
  iter = 5000,
  chains = 4,
  cores = parallel::detectCores(),
  save_pars = save_pars(all = TRUE),
  control = list(adapt_delta = 0.95, max_treedepth = 15)
)
fit1 <- brm(
  bf(score ~ group * test_id + gender * test_id + gender * group + (1 | school_id) + (1 | teacher_id),
     sigma ~ (1 | student_ID)),
  family = skew_normal(),
  data = data,
  prior = prior,
  warmup = 1000,
  iter = 5000,
  chains = 4,
  cores = parallel::detectCores(),
  save_pars = save_pars(all = TRUE),
  control = list(adapt_delta = 0.95, max_treedepth = 15)
)
bayes_factor(fit0, fit1)
It takes quite a while to run, and both models themselves turn out fine. The problem is that no Bayes factor is computed, because the maximum number of iterations is exceeded. If I run the above code but drop the heterogeneous assumption on sigma (i.e., model sigma as constant), I get a perfectly stable Bayes factor within seconds. I have already redone the computation with 20000 and 2000 iterations, but with no improvement whatsoever.
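For reference, the homogeneous-sigma variant that does produce a stable Bayes factor is the same call with only the sigma formula dropped (a sketch of what I ran; `fit0_hom` is just my name for it):

```r
# Same model as fit0, but with a constant residual sd
# (no sigma ~ (1 | student_ID) part); bayes_factor() on
# this pair of models converges within seconds.
fit0_hom <- brm(
  score ~ group * test_id * gender + (1 | school_id) + (1 | teacher_id),
  family = skew_normal(),
  data = data,
  prior = prior,
  warmup = 1000,
  iter = 5000,
  chains = 4,
  cores = parallel::detectCores(),
  save_pars = save_pars(all = TRUE),
  control = list(adapt_delta = 0.95, max_treedepth = 15)
)
```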
I also looked at the bayes_factor documentation online, but I did not find any parameter I could usefully change. Do you have any suggestion for computing the Bayes factor with the desired distribution on sigma? Could it be because I did not specify any prior distribution for sigma?
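In case it is relevant, the default priors brms would assign (including any on the sigma-level terms) can be listed with `get_prior()`; a sketch using the same formula as `fit0`:

```r
# List every parameter class and the default prior brms assigns,
# including the sd terms introduced by sigma ~ (1 | student_ID)
get_prior(
  bf(score ~ group * test_id * gender + (1 | school_id) + (1 | teacher_id),
     sigma ~ (1 | student_ID)),
  family = skew_normal(),
  data = data
)
```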
Thanks a lot!