Hi,
is there currently any way to specify a maximum run time for sampling, instead of only a number of iterations? I use Stan on an HPC system and would like to sample for as long as possible given a certain wall time. Since estimating run time from a small pre-sample is quite shaky, it would be extremely nice to be able to specify a maximum run time, but I couldn't find anything in that direction.
How do you implement sampling time constraints in practice? (I'm mainly working with brms at the moment.)
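For reference, the only workaround I can think of so far is capping the job externally at the shell level. A minimal sketch, assuming GNU coreutils `timeout` and a hypothetical `fit_model.R` script that runs the brms fit and saves the result:

```shell
# Hypothetical sketch: stop the fit shortly before the scheduler's wall
# time limit so there is still time to write any partial output.
# Sends SIGINT first (so R can run finalizers / save state), then
# SIGKILL after a 60-second grace period if the process hasn't exited.
timeout --signal=INT --kill-after=60 3500 Rscript fit_model.R
```

This is obviously crude: chains that get interrupted are simply lost rather than returned as a shorter sample, which is exactly why a native maximum-run-time option would be so useful.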
Best,
Kevin