Memory usage details

I’m working with a Stan model, and I changed my computation workflow. I didn’t add any new data or parameters, but I did replace a simple computation in the transformed parameters block with a more complex one. This caused Stan to start using several times more memory; in fact, it now grabs all available memory, starts swapping, and freezes.

Is there a description somewhere of how Stan uses memory, so that I can figure out what specifically causes this?

What was your memory consumption before this change?

Maybe you can post the model or at least the respective pieces. It is very unusual for Stan to use a lot of memory (at least in my work so far).

It’s a model with a large dataset, so it was taking about 6 GB before the change. Now it takes several times as much (more than 18 GB, but I don’t know how much more).

As for the change, to summarize: each data point belongs to a group, and most parameters are estimated per group (with hyperpriors and some shared parameters).

So I have a transformed parameter:

for (i in 1:n) lambda[i] = param1[group[i]] * get_coefficient(param2[group[i]], param3[group[i]], param4);

value ~ poisson(lambda);

I changed the way lambda is computed: it still uses the same parameters, but now the computation is more complex.
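For context, here is a minimal sketch of how those pieces could fit into a full Stan program. Everything beyond the two lines above (the block structure, the data declarations, the parameter shapes, and the body of get_coefficient) is my assumption, not the actual model:

```stan
functions {
  // Placeholder for the per-observation coefficient; the real body is
  // whatever more complex computation replaced the original simple one.
  real get_coefficient(real a, real b, vector shared) {
    return exp(a + b * sum(shared));  // assumed form, kept positive
  }
}
data {
  int<lower=1> n;                                // number of observations
  int<lower=1> n_groups;                         // number of groups
  array[n] int<lower=1, upper=n_groups> group;   // group membership
  array[n] int<lower=0> value;                   // observed counts
}
parameters {
  vector<lower=0>[n_groups] param1;              // per-group parameters
  vector[n_groups] param2;
  vector[n_groups] param3;
  vector[3] param4;                              // shared parameters
}
transformed parameters {
  vector[n] lambda;
  for (i in 1:n)
    lambda[i] = param1[group[i]]
                * get_coefficient(param2[group[i]], param3[group[i]], param4);
}
model {
  value ~ poisson(lambda);
}
```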

You could always read the paper on autodiff: https://arxiv.org/abs/1509.07164. This is probably the closest thing we have to documentation about that.
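The short version of why this matters for memory: Stan’s reverse-mode autodiff records a node for every operation that involves parameters during each evaluation of the log density, so memory grows roughly with the number of operations performed in transformed parameters and the model block, and a more complex per-observation computation means a much larger expression graph. As a purely illustrative sketch (not something suggested in this thread), if get_coefficient really depends only on per-group parameters, computing it once per group instead of once per observation would shrink that part of the graph from order n to order n_groups nodes:

```stan
transformed parameters {
  vector[n] lambda;
  {
    // Local block so group_rate is not saved as a transformed parameter.
    vector[n_groups] group_rate;
    for (g in 1:n_groups)
      group_rate[g] = param1[g] * get_coefficient(param2[g], param3[g], param4);
    lambda = group_rate[group];  // gather the per-group rate for each observation
  }
}
```

Whether that rewrite applies depends on whether get_coefficient really is constant within a group; if it also uses per-observation data, the graph necessarily scales with n.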