Potential reasons for Stan slowing down RStudio server other than CPU/RAM availability?

Hi guys,

I simulated a rather simple Stan model using 6 cores, which used around 6GB of RAM. The machine used for the simulation has 126GB of RAM and 16 cores. Colleagues contacted me to say that my simulations slowed down their RStudio Server connections so significantly that loading the start pages took minutes rather than seconds. After I killed the jobs, everything worked as usual again.

Now my question: given that the machine still had 10 idle cores and around 120GB of RAM free, what could have slowed down the system so significantly? Is there any significant I/O going on during sampling?


No, unless perhaps you specify `sample_file` or `diagnostic_file`, but even then I would not expect RStudio to suffer. Then again, I usually run long jobs outside of RStudio anyway.

Memory, disk, or network contention. Idle cores and spare RAM don't help with those: what's limited is the bandwidth from RAM to CPU, from disk to memory, and from network to memory, and each of those channels can saturate even when the CPUs themselves are mostly idle.
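One way to check whether disk contention is the culprit: watch how much CPU time the machine spends in iowait while the chains run. Below is a minimal sketch (Linux-only, reading `/proc/stat` directly; field positions per the `proc(5)` man page) rather than a polished tool — `vmstat` or `iostat` would tell you the same thing:

```python
# Minimal sketch (Linux-only): sample the aggregate "cpu" row of /proc/stat
# twice and report the fraction of CPU time spent in iowait. A high value
# while Stan is sampling suggests disk contention, not CPU starvation.
# Fields after "cpu": user nice system idle iowait irq softirq ...
import time

def cpu_times():
    with open("/proc/stat") as f:
        fields = f.readline().split()[1:]  # first line is the aggregate row
    return [int(x) for x in fields]

def iowait_fraction(interval=1.0):
    """Fraction of total CPU time spent waiting on I/O over `interval` seconds."""
    before = cpu_times()
    time.sleep(interval)
    after = cpu_times()
    delta = [b - a for a, b in zip(before, after)]
    total = sum(delta)
    return delta[4] / total if total else 0.0  # index 4 = iowait

if __name__ == "__main__":
    print(f"iowait: {iowait_fraction(0.5):.1%}")
```

If iowait stays low during the slowdown, disk is probably not the bottleneck and memory or network bandwidth becomes the more likely suspect.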
