I fit a model using `brms`. If I set `iter` high, running `loo` fills memory and ultimately fails with the error `Error: cannot allocate vector of size 5.6 Gb`. I have 32 GB of physical RAM and 24 GB of cache.
`loo` completes just fine if `iter` is lower, but then the effective sample size is too low. Running `loo(fit, pointwise = TRUE)` never completes; I stopped it after 8 hours on four 3.2 GHz CPUs.
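For reference, here is a minimal sketch of the calls involved (the formula, data, and iteration counts below are placeholders, not my actual model):

```r
library(brms)

# placeholder model; the real one is considerably larger
fit <- brm(
  y ~ x + (1 | group),
  data   = my_data,
  iter   = 20000,
  warmup = 2000,
  chains = 4,
  cores  = 4
)

loo(fit)                     # fails: cannot allocate vector of size 5.6 Gb
loo(fit, pointwise = TRUE)   # runs, but I stopped it after 8 hours
```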
The effective sample size for some parameters is about 15 times lower than the total number of samples, so I wonder whether the memory problem could be avoided by removing the “ineffective samples”. A simple solution would be thinning (see the sketch below), but perhaps there are cleverer solutions that keep more of the effective samples.
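By thinning I mean something along these lines (again just a sketch; the thinning factor of 5 is arbitrary):

```r
# Refit with the same settings but store only every 5th post-warmup draw,
# so the log-likelihood matrix that loo() works with is about 5x smaller.
fit_thin <- update(fit, thin = 5)   # re-runs sampling
loo(fit_thin)
```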
So my question is: is there a way to run `loo` here? Thanks in advance!