I’m running my script on an HPC cluster, and I only have a limited amount of disk space available to me. When I sample from my posterior using PyStan, it seems that Stan is caching the results/model, and the resulting file is somehow over 50 GB. This causes the job to be killed by the HPC system.
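For context, my script is essentially doing the following (a simplified sketch; the model and data here are placeholders, not my actual code):

```python
import stan  # PyStan 3

# Placeholder model: the real one is larger, but the workflow is the same.
program_code = """
data {
  int<lower=0> N;
  vector[N] y;
}
parameters {
  real mu;
}
model {
  y ~ normal(mu, 1);
}
"""
data = {"N": 3, "y": [1.0, 2.0, 3.0]}

# build() compiles the model via httpstan (which caches it on disk),
# and sample() stores the draws before they are returned.
posterior = stan.build(program_code, data=data)
fit = posterior.sample(num_chains=4, num_samples=1000)
```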
Is there a way to stop Stan from caching the samples/models?
I’m using the most up-to-date versions of PyStan and httpstan, running on Ubuntu 20.
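The only workaround I can think of is deleting httpstan’s cache between runs, something like the sketch below (this assumes the cache lives under `~/.cache/httpstan`, which I believe is the default location on Linux, and that it is safe to remove between runs):

```python
import shutil
from pathlib import Path

# Assumed default httpstan cache location on Linux (XDG cache dir);
# adjust if XDG_CACHE_HOME points somewhere else.
cache_dir = Path.home() / ".cache" / "httpstan"

if cache_dir.exists():
    # Removes cached compiled models and stored fits.
    # Note: this forces the model to be recompiled on the next run.
    shutil.rmtree(cache_dir)
```

But ideally I’d like to stop the huge file from being written in the first place, rather than cleaning it up afterwards.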