Right now the ESS and \hat{R} codes treat chains where a parameter value is approximately constant differently. The ESS code naively computes the autocorrelation estimators and then an asymptotic variance/ESS estimator, while the \hat{R} code reduces to 1 before any other calculation.
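To see why the naive path misbehaves, here is a minimal sketch (not Stan's actual implementation) of what happens when an autocorrelation estimator is evaluated on an exactly constant trace: the sample variance is zero, so the lag-1 autocorrelation is 0/0 and the NaN propagates into any downstream ESS calculation.

```python
import numpy as np

# A parameter whose trace never moves, e.g. a constant generated quantity.
chain = np.full(1000, 2.5)

var = np.var(chain)  # exactly 0.0 for a constant trace

# Naive lag-1 autocorrelation estimator: numerator and denominator are
# both zero, so the ratio is NaN and contaminates the ESS estimate.
with np.errstate(invalid="ignore"):
    rho1 = np.mean((chain[:-1] - chain.mean()) * (chain[1:] - chain.mean())) / var

print(var, rho1)  # 0.0 nan
```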

We should make this behavior consistent, but the question is what the ideal behavior would be.

If the function whose expectation is being taken is truly constant then \mathbb{E}_{\pi}[f] = f exactly and the MCMC standard error should be zero. In that case we could report \hat{R} = 1 and \mathrm{ESS} = \infty.

On the other hand the trace might be numerically consistent with a constant because the chain was just dreadfully slow, in which case the autocorrelation/asymptotic variance/ESS estimators, as well as \hat{R}, shouldn't be trusted.

Without external information there's no way to distinguish between these two possibilities. We know that we have actual constants in practice, coming from constrained types or generated quantities, and dynamic HMC *should* be fast enough that we shouldn't see a trace with zero variation when it should be varying, which would suggest going with the former behavior. That said, my preference is to be conservative and not make a claim either way, instead throwing a `NaN` and letting the user apply their domain expertise as to which possibility it might be.
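The conservative option could be sketched as a guard in front of the usual estimator. Everything below is hypothetical, not Stan's code: `ess_sketch` and its `tol` threshold are made-up names, and the lag-1 formula is a crude stand-in for the full autocorrelation-based ESS.

```python
import numpy as np

def ess_sketch(chain, tol=1e-12):
    # Hypothetical guard: a numerically constant trace gets NaN rather
    # than a claim of ESS = inf or a garbage finite estimate, leaving
    # the truly-constant vs. dreadfully-slow call to the user.
    x = np.asarray(chain, dtype=float)
    if np.var(x) < tol:
        return float("nan")
    # Crude lag-1 stand-in for the real autocorrelation-based estimator.
    rho1 = np.corrcoef(x[:-1], x[1:])[0, 1]
    return len(x) * (1.0 - rho1) / (1.0 + rho1)

print(ess_sketch(np.full(500, 1.0)))  # nan
```

The same guard would apply to \hat{R}: return `NaN` instead of silently reducing to 1.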

Any thoughts on this matter? @avehtari? @Daniel_Simpson?