N_eff BDA3 vs. Stan

Yes. Relevant keywords are super-efficient sampling, antithetic sampling, and over-relaxation. Usually it's really difficult to get n_eff > N, although some toy examples can be found in the literature (it's too late in Finland; I'll find some references tomorrow). Mike says that in NUTS "sampling from the trajectory also biases to states away from the initial point". This is similar to over-relaxation (see, e.g., the abstract of "Suppressing Random Walks in Markov Chain Monte Carlo Using Ordered Overrelaxation"), which seems to lead to super-efficient behavior in many cases in Stan.
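
To make the antithetic idea concrete, here is a minimal NumPy sketch (not Stan or its ESS estimator; the `ess` function and the `phi` parameter are just my illustration). An AR(1) chain with a negative coefficient still has N(0,1) as its stationary distribution, but consecutive draws are negatively correlated, so the sum of autocorrelations in n_eff = N / (1 + 2 * sum_t rho_t) is negative and n_eff > N. With phi = -0.5 you should see n_eff roughly 3 times N:

```python
import numpy as np

def ess(x):
    """Rough ESS estimate using Geyer-style truncation: keep pairs of
    autocorrelations (rho[2k] + rho[2k+1]) while they stay positive."""
    x = np.asarray(x, dtype=float)
    n = x.size
    xc = x - x.mean()
    # autocovariance via FFT, zero-padded to avoid circular wrap-around
    f = np.fft.rfft(xc, n=2 * n)
    acov = np.fft.irfft(f * np.conj(f))[:n] / n
    rho = acov / acov[0]
    tau = -1.0  # accumulates 1 + 2 * sum of retained autocorrelations
    for k in range(n // 2 - 1):
        pair = rho[2 * k] + rho[2 * k + 1]
        if pair <= 0.0:
            break
        tau += 2.0 * pair
    return n / tau

rng = np.random.default_rng(1)
n, phi = 100_000, -0.5              # phi < 0 makes the chain antithetic
x = np.empty(n)
x[0] = rng.normal()
for t in range(1, n):               # AR(1) with N(0,1) stationary distribution
    x[t] = phi * x[t - 1] + np.sqrt(1 - phi**2) * rng.normal()

print(f"N = {n}, n_eff is about {ess(x):.0f}")   # roughly 3 * N for phi = -0.5
```

This is of course a toy Markov chain, not something NUTS produces, but it shows why negative autocorrelation (which over-relaxation and the trajectory-sampling bias push towards) makes n_eff > N possible rather than an estimator bug.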
