OK, it's clear that memory should increase with the data size, but I was asking why consumption is so large in this specific example. From these posts (Dealing with memory issues in Markov chain style model - #2 by Bob_Carpenter) and (Memory issues with custom model - #4 by bbbales2) I understand that computations connecting the data and parameters consume memory due to autodiff. If this is about 40 bytes per elementary computation, and 5000 exponentiations of a 5 x 5 matrix use 1 GB, then each of those matrix exponentiations involves a few thousand elementary computations. I can believe this.
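To spell out that arithmetic (a rough sketch in R; the 40-bytes-per-operation figure is just the ballpark from those posts):

```r
# Back-of-envelope estimate of autodiff operations per matrix exponentiation.
bytes_total  <- 1e9   # ~1 GB observed in this example
n_expm       <- 5000  # number of 5 x 5 matrix exponentiations
bytes_per_op <- 40    # rough autodiff cost per elementary operation

bytes_per_expm <- bytes_total / n_expm           # ~200 kB per exponentiation
ops_per_expm   <- bytes_per_expm / bytes_per_op  # ~5000 elementary operations
ops_per_expm
```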
Even so, it's not clear to me why the memory usage doesn't appear to go down when rstan::sampling() returns.
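For concreteness, a minimal version of what I mean (sm and stan_data are placeholders for my compiled model and data):

```r
# sm: a compiled stanmodel; stan_data: the data list (both placeholders).
fit <- rstan::sampling(sm, data = stan_data)
print(gc())   # R heap usage right after sampling returns

rm(fit)
print(gc())   # heap after dropping the fit object and collecting garbage
```

I'd expect memory to come back down at that point, but it doesn't seem to.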
Thanks,
Chris