Brms memory (RAM) overload

Hello All,

It’s been a while since I last used brms, so I’m not sure whether this is a new or an old issue, but I’ve noticed that brms is using all the available RAM: after a few hundred iterations, memory is fully occupied. I tried running the program on my laptop, which has 32 GB of RAM, and it filled up after approximately 300 iterations. I also tried my office computer, which has 254 GB of RAM; it filled up as well after a while, forcing me to stop the sampling. This issue occurred with different models, including the simple example below.

Here are the specs of my laptop:
Platform: x86_64-pc-linux-gnu (64-bit)
Ubuntu 22.04.2 LTS
RStudio 2023.06.0, Build 421
R version 4.3.1
rstan 2.21.8
brms 2.19.0

Thank you in advance for any input or advice you may have.

# brms: An R Package for Bayesian Multilevel Models using Stan
# 4.1. A worked example
library(brms)
head(kidney, n = 3)

# 4.2. Fitting models with brms
fit1 <- brm(formula = time | cens(censored) ~ age * sex + disease + (1 + age|patient),
            data = kidney, family = lognormal(),
            prior = c(set_prior("normal(0,5)", class = "b"),
                      set_prior("cauchy(0,2)", class = "sd"),
                      set_prior("lkj(2)", class = "cor")),
            warmup = 1000, iter = 2000, chains = 4,
            control = list(adapt_delta = 0.95))

It looks like when using backend = "cmdstanr", it works fine with no memory issues. It seems to be a bug specific to the default rstan backend.
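For reference, switching the backend only requires the `backend` argument to `brm()`; a sketch based on the example model above (this assumes the cmdstanr package and CmdStan itself are installed, e.g. via `cmdstanr::install_cmdstan()`):

```r
library(brms)

# Same model as in the original post, but compiled and sampled
# via CmdStan instead of rstan
fit1_cmdstan <- brm(
  formula = time | cens(censored) ~ age * sex + disease + (1 + age | patient),
  data = kidney, family = lognormal(),
  prior = c(set_prior("normal(0,5)", class = "b"),
            set_prior("cauchy(0,2)", class = "sd"),
            set_prior("lkj(2)", class = "cor")),
  warmup = 1000, iter = 2000, chains = 4,
  control = list(adapt_delta = 0.95),
  backend = "cmdstanr"
)
```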

Same issue here. Models that used to run on the same computer a few months ago now fill up the RAM and swap completely, then crash the session (with rstan).

Downgrading brms to 2.17 or 2.18 (from 2.19) does not solve the problem, so I think it may lie elsewhere.

And as reported above, using cmdstanr as a backend solves it.

FYI, I had this same issue but it seems to be fixed with the latest rstan development version (2.26). Installation instructions in link below.

I still have this issue with rstan version 2.26.1. It might be due to the very limited memory of my laptop (8 GB of RAM). Any advice on running these models with this little memory, or is my best option just to switch to a better machine?

Two things you may quickly want to try before switching machines: use the backend = "cmdstanr" argument, or, if you are running a complex model, a large dataset, or a high number of iterations, first try something simple like estimating a mean (x ~ 1) from a small number of observations.
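As a sketch, such a minimal sanity check could look like this (the dataset is simulated here purely for illustration):

```r
library(brms)

# Tiny simulated dataset: 50 draws from a normal distribution
d <- data.frame(x = rnorm(50, mean = 2, sd = 1))

# Intercept-only model: estimates just the mean of x.
# If even this exhausts RAM, the problem is in the backend or
# installation, not in your model.
fit0 <- brm(x ~ 1, data = d,
            warmup = 500, iter = 1000, chains = 2)
summary(fit0)
```

If this runs fine with the default backend but your real model does not, the memory use scales with model size; if even this fails, try the same call with backend = "cmdstanr".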

I haven’t really paid attention to the memory footprint of Stan programs, but I would be very surprised if a simple model caused trouble with your 8 GB.