Stanfit object size too big for memory

Hello everyone,

I have a model with 3 individual-level parameters for each of 60,000 observations (~200,000 parameters in total for the full model). The resulting stanfit object ends up being about 7 GB (warmup = 1000, iter = 1500, chains = 4). I then save the object with saveRDS(). When I reload it, however, I get a memory allocation error and R crashes. Is there any way I can reduce the size of the object so I don’t run into this problem? I’ve already prevented all of the parameters that aren’t of interest from being saved in the object (include = FALSE, pars = c(…)), but I’m out of ideas otherwise.
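For context, the fitting and saving steps look roughly like this (a minimal sketch, assuming the rstan workflow; the model file, data list, and parameter names are placeholders, not the actual ones):

```r
library(rstan)

# 'model.stan' and 'stan_data' are placeholder names.
mod <- stan_model("model.stan")

fit <- sampling(
  mod, data = stan_data,
  chains = 4, warmup = 1000, iter = 1500,
  pars = c("z_raw", "theta_raw"),  # placeholder names for nuisance parameters
  include = FALSE                  # exclude these from the stanfit object
)

saveRDS(fit, "fit.rds")  # this is the ~7 GB object that fails to reload
```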

Thanks!

Just this morning I shared some code on GitHub to strip extra parameters out of a stanfit object.

Maybe you can use that to split it up into two or three stanfit objects?
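If editing the stanfit internals feels fragile, another way to shrink what you save (a sketch, assuming rstan; the parameter names are placeholders) is to extract only the draws you actually need and save those instead of the full object:

```r
library(rstan)

# 'fit' is the stanfit object; "alpha" and "beta" are placeholder
# names for the parameters of interest.
draws <- extract(fit, pars = c("alpha", "beta"), permuted = FALSE)

# A plain array of post-warmup draws is typically far smaller than the
# stanfit object, at the cost of losing diagnostics and other metadata.
saveRDS(draws, "draws.rds")
```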

Thanks for this Aaron.

I’ve already stripped out the extra parameters using the include = FALSE and pars = c(…) arguments when calling sampling(). I wonder if there’s a way to remove the warmup iterations as well?

save_warmup = FALSE
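Concretely, this just gets passed along in the sampling() call (a sketch; 'mod' and 'stan_data' are placeholder names):

```r
# With save_warmup = FALSE, the 1000 warmup draws per chain are not
# stored in the stanfit object, which can shrink it substantially
# (here warmup is 1000 of 1500 total iterations per chain).
fit <- sampling(
  mod, data = stan_data,
  chains = 4, warmup = 1000, iter = 1500,
  save_warmup = FALSE
)
```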

Wow. That’s easy. When I looked through the help file for sampling(), it didn’t show a save_warmup argument.

I see now that it’s included in the R documentation for stan(). I’ve been using the stan_model() to sampling() route. Thanks a lot Ben. This is perfect.