I’ve been using brms to run a multivariate model. It’s a complex model fit to a large dataset (c. 80k respondents with repeated observations, nested and crossed factors, and four separate response variables). It takes a long time to run, but it gets there eventually.

Last year (2020) I ran this model with earlier versions of cmdstan and brms; the model worked and I was able to view the results with the summary command. I re-ran it this year with the latest versions of the packages installed in R (brms with the cmdstan backend). The modelling completes, but this time I run into a vector memory limit error when calling summary on the fitted model (see below). I’m not sure why. The model is the same, so I suspect that the way the summary command operates has changed.

Does anyone have similar experience or advice? Is there any way to limit what summary is trying to do? I guess I could reduce the model object through some sort of post-modelling thinning of the posterior draws, but I’m not sure how to set this up. I’d also appreciate any insights into how to do this prior to summary(mod).
> summary(mod)
Error in paste(calltext, collapse = " ") :
result would exceed 2^31-1 bytes
Error in paste(deparse(object, width.cutoff = 500L), collapse = " ") :
result would exceed 2^31-1 bytes
Error in paste(deparse(object, width.cutoff = 500L), collapse = " ") :
result would exceed 2^31-1 bytes
Error in paste(deparse(object, width.cutoff = 500L), collapse = " ") :
result would exceed 2^31-1 bytes
Error in paste(calltext, collapse = " ") :
result would exceed 2^31-1 bytes
Error in paste(calltext, collapse = " ") :
result would exceed 2^31-1 bytes
Error in paste(deparse(object, width.cutoff = 500L), collapse = " ") :
result would exceed 2^31-1 bytes
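For context, the kind of post-hoc thinning I have in mind is something like the following sketch (assuming the posterior package’s `thin_draws()` can be applied to draws extracted from a brmsfit; `mod` is the fitted model):

```r
# Sketch: extract the posterior draws and thin them, then summarise the
# thinned draws directly rather than calling summary() on the full brmsfit.
library(posterior)

draws <- as_draws_df(mod)                # all posterior draws as a data frame
thinned <- thin_draws(draws, thin = 10)  # keep every 10th draw per chain
summarise_draws(thinned)                 # means, quantiles, rhat, ess per parameter
```

I don’t know whether this sidesteps the error, since the traceback suggests the failure is in deparsing the call/formula rather than in the draws themselves.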
Operating System: Ubuntu
brms Version: 2.13/2.14 (2020 model); 2.16 (latest model)