I am very new to Bayesian linear mixed models (LMMs) and have been experimenting with brms for the past month or so.
I am running the following model:
brm(lnTL ~ ZTD_within_player + ZTD_between_player +
      I(ZTD_within_player^2) + I(ZTD_between_player^2) +
      (1 + ZTD_within_player | PLAYER) + (1 | session),
    data = Dataset, prior = prior1, iter = 10000,
    control = list(adapt_delta = 0.95, max_treedepth = 15))
First, I compute the global grand mean of the expected value of lnTL, ignoring the PLAYER and session effects:
grand_mean_ZTD_within_player_dist <- LMM.1bayes %>%
  epred_draws(
    newdata = expand_grid(
      ZTD_between_player = 0,
      ZTD_within_player  = seq(-2, 2, by = 0.01)
    ),
    re_formula = NA
  )
plot_grand_mean_ZTD_within_player <- ggplot(
    grand_mean_ZTD_within_player_dist,
    aes(x = ZTD_within_player, y = .epred)
  ) +
  stat_lineribbon() +
  scale_fill_brewer(palette = "Reds") +
  labs(x = "ZTD_within_player", y = "Predicted lnTL", fill = "Credible interval") +
  theme_clean() +
  theme(legend.position = "bottom")
This works fine. Next, I try to incorporate the PLAYER and session effects into the predictions, so as to obtain conditional effects for the specific PLAYERs and sessions that already exist in the data, including their group-specific deviations in intercept and slope:
all_PLAYER_session_ZTD_within_player_dist <- LMM.1bayes %>%
  epred_draws(
    newdata = expand_grid(
      ZTD_between_player = 0,
      ZTD_within_player  = seq(-2, 2, by = 0.01),
      PLAYER  = levels(Dataset$PLAYER),
      session = levels(Dataset$session)
    ),
    re_formula = NULL
  )
This is where I get "cannot allocate vector of size 35.5 Gb" and the analysis stops. I re-ran with iter reduced to 3000, only to get the same message, this time for 11.8 Gb.
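If my arithmetic is right (assuming the brms defaults of 4 chains and warmup = iter/2), the sheer size of that request would explain the error, since epred_draws returns one row per posterior draw per row of newdata:

n_draws <- 4 * (10000 / 2)                          # post-warmup draws across 4 chains
n_grid  <- length(seq(-2, 2, by = 0.01)) * 14 * 47  # 401 ZTD values x 14 PLAYERs x 47 sessions
n_rows  <- n_draws * n_grid                         # ~5.3 billion rows in the long output
n_rows * 8 / 2^30                                   # ~39 Gb for the .epred column alone

which is in the same ballpark as the 35.5 Gb message (and the iter = 3000 run leaves 6000 draws, i.e. roughly 11.8 Gb, matching the second message).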
I searched online, but the suggested fixes for increasing the memory limit all failed.
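For reference, the kind of fix I mean is the Windows-only memory.limit() call (a sketch of what I tried, assuming that is what those answers refer to):

memory.limit()              # report the current limit in MB (Windows only)
memory.limit(size = 56000)  # attempt to raise the limit; this did not help here

With only 8 GB of physical RAM, I presume this cannot accommodate an allocation of that size anyway.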
I am running a 64-bit Intel(R) Core™ i5-3470 CPU @ 3.20 GHz with 8 GB of RAM and R v4.0.0.
The PLAYER factor has 14 levels, the session factor has 47 levels, and there are 632 level-1 observations.
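One workaround I am considering, though I am unsure whether it is statistically defensible, is to subsample the posterior draws via the ndraws argument of epred_draws and/or coarsen the prediction grid, e.g.:

all_PLAYER_session_ZTD_within_player_dist <- LMM.1bayes %>%
  epred_draws(
    newdata = expand_grid(
      ZTD_between_player = 0,
      ZTD_within_player  = seq(-2, 2, by = 0.1),  # coarser grid: 41 values instead of 401
      PLAYER  = levels(Dataset$PLAYER),
      session = levels(Dataset$session)
    ),
    re_formula = NULL,
    ndraws = 500                                  # random subsample of the posterior draws
  )

which should shrink the output to roughly 500 x 41 x 14 x 47 ≈ 13.5 million rows.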
Any suggestions please?