Memory problem in brms hierarchical model

Hi everyone

I am very, very new to Bayesian LMMs and I have been experimenting with brms for the past month or so.
I am running the following model:

LMM.1bayes <- brm(
  lnTL ~ ZTD_within_player + ZTD_between_player +
    I(ZTD_within_player^2) + I(ZTD_between_player^2) +
    (1 + ZTD_within_player | PLAYER) + (1 | session),
  iter = 10000,
  prior = prior1,
  control = list(adapt_delta = 0.95, max_treedepth = 15),
  data = Dataset
)

I am computing the global grand mean of the expected value of lnTL whilst ignoring the PLAYER and session effects:

grand_mean_ZTD_within_player_dist <- LMM.1bayes %>%
  epred_draws(
    newdata = expand_grid(
      ZTD_between_player = 0,
      ZTD_within_player = seq(-2, 2, by = 0.01)
    ),
    re_formula = NA
  )

plot_grand_mean_ZTD_within_player <- ggplot(
  grand_mean_ZTD_within_player_dist,
  aes(x = ZTD_within_player, y = .epred)
) +
  stat_lineribbon() +
  scale_fill_brewer(palette = "Reds") +
  labs(x = "ZTD_within_player", y = "Predicted lnTL", fill = "Credible interval") +
  theme_clean() +
  theme(legend.position = "bottom")

plot_grand_mean_ZTD_within_player

This works fine.

I then incorporate the PLAYER and session effects into the predictions, so as to create conditional effects for specific PLAYERs and sessions that already exist in the data, including their PLAYER- and session-specific deviations in slope and intercept:

all_PLAYER_session_ZTD_within_player_dist <- LMM.1bayes %>%
  epred_draws(
    newdata = expand_grid(
      ZTD_between_player = 0,
      ZTD_within_player = seq(-2, 2, by = 0.01),
      PLAYER = levels(Dataset$PLAYER),
      session = levels(Dataset$session)
    ),
    re_formula = NULL
  )

This is where I get "cannot allocate vector of size 35.5 Gb" and the analysis stops.

I re-ran it with iter reduced to 3000, only to get the same message, this time for 11.8 Gb.

I googled the error, but the suggested fixes for increasing the memory limit did not work.
I am running a 64-bit Intel(R) Core™ i5-3470 CPU @ 3.20 GHz with 8 GB RAM and R v4.0.0.
The PLAYER factor has n = 14 levels, the session factor has n = 47 levels, and there are 632 level-1 observations.
Any suggestions please?

Hello, what is the result of this command:

newdata = expand_grid(ZTD_between_player = 0, ZTD_within_player = seq(-2, 2, by = 0.01), PLAYER = levels(Dataset$PLAYER), session = levels(Dataset$session))

Does this make the newdata object very large? There are 14 * 47 = 658 PLAYER-session combinations, and seq(-2, 2, by = 0.01) has 401 values, so newdata has 658 * 401 = 263,858 rows. Every one of those rows then gets a prediction for each post-warmup draw (with iter = 10000 that is around 20,000 draws under the brms defaults of 4 chains and 50% warmup), which is where the memory goes. If this is the cause of the error, you could coarsen the ZTD_within_player grid (a step of 0.1 instead of 0.01, for example), reduce the number of draws if you don't need them all, or only predict a subset of PLAYER-session combinations. A quick size check is sketched below.
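For example, something along these lines is a rough back-of-the-envelope sketch (ndraws() works on brmsfit objects in recent brms versions, older versions call it nsamples(), and 8 bytes per double is an approximation) to see the scale before you try to build the full object:

# Build only the prediction grid, without calling epred_draws() yet
nd <- expand_grid(
  ZTD_between_player = 0,
  ZTD_within_player  = seq(-2, 2, by = 0.01),
  PLAYER             = levels(Dataset$PLAYER),
  session            = levels(Dataset$session)
)

nrow(nd)              # 263858 rows
ndraws(LMM.1bayes)    # number of post-warmup draws in the fit

# Rough memory needed just for the .epred column (doubles, 8 bytes each):
nrow(nd) * ndraws(LMM.1bayes) * 8 / 1024^3   # size in GB

If you only need a few thousand draws for plotting, epred_draws() also has an ndraws argument (in recent tidybayes versions) that subsamples the posterior and shrinks the result proportionally.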

Hi AWoodward, thanks for the reply.
It is definitely the

newdata = expand_grid(ZTD_between_player = 0, ZTD_within_player = seq(-2, 2, by = 0.01), PLAYER = levels(Dataset$PLAYER), session = levels(Dataset$session))

call that causes the error. I took iter down to 2000 and coarsened ZTD_within_player to seq(-2, 2, by = 0.5), but kept both grouping factors (PLAYER and session), and in a fresh R session it produced the graph (which I did not save at the time). When I tried seq(-2, 2, by = 0.25) to get a smoother graph, the same type of error appeared again, although now it is "cannot allocate vector of size 411.7 Mb" instead of the ~11 Gb from before. The error also now appears later on, when I call the plot of all_PLAYER_session_ZTD_within_player_dist.

I did some googling on RAM use in this kind of R task, but none of the solutions helped.

PS: Ideally I'd like both the PLAYER and session random effects to be present, but if I can't work around the memory issue I'd rather show posterior distributions for one specific PLAYER, to reduce the number of predictions. How would I then change PLAYER = levels(Dataset$PLAYER) in

newdata = expand_grid(ZTD_between_player = 0, ZTD_within_player = seq(-2, 2, by = 0.01), PLAYER = levels(Dataset$PLAYER), session = levels(Dataset$session))

to specify one particular PLAYER (say, for example, "4")?
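My guess would be something like the following (just replacing the levels() call with a single existing level and keeping re_formula = NULL), but I am not sure it is the right way to do it:

one_PLAYER_ZTD_within_player_dist <- LMM.1bayes %>%
  epred_draws(
    newdata = expand_grid(
      ZTD_between_player = 0,
      ZTD_within_player  = seq(-2, 2, by = 0.01),
      PLAYER             = "4",                    # one existing PLAYER level
      session            = levels(Dataset$session)
    ),
    re_formula = NULL
  )

That grid would be 401 * 47 = 18,847 rows instead of 263,858, which I hope is small enough.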