Does this community have an opinion on pseudo-extended MCMC? It seems like a simple tool that could be of use to this community, but I have yet to see anyone mention it — the paper’s examples are even written in Stan.
The results are quite encouraging, with the sampler navigating seemingly degenerate geometries with ease. Would this also help with geometries that induce divergences and the like?
If I had to voice some concerns, the main one would be the increased number of parameters, but also the amount of bookkeeping required; it's fairly straightforward, but it would be nice if it could somehow be automated as part of the fitting process instead of bloating the Stan file.
Sampling from the posterior distribution using Markov chain Monte Carlo (MCMC) methods can require an exhaustive number of iterations to fully explore the correct posterior. This is often the case when the posterior of interest is multi-modal, as the MCMC sampler can become trapped in a local mode for a large number of iterations. In this paper, we introduce the pseudo-extended MCMC method as an approach for improving the mixing of the MCMC sampler in complex posterior distributions. The pseudo-extended method augments the state-space of the posterior using pseudo-samples as auxiliary variables, where on the extended space, the MCMC sampler is able to easily move between the well-separated modes of the posterior.
We apply the pseudo-extended method within a Hamiltonian Monte Carlo sampler and show that by using the No-U-Turn algorithm (Hoffman and Gelman, 2014), our proposed sampler is completely tuning free. We compare the pseudo-extended method against well-known tempered MCMC algorithms and show the advantages of the new sampler on a number of challenging examples from the statistics literature.
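For concreteness, here is a toy sketch of how I understand the extended target: run a sampler over N pseudo-samples jointly, targeting (prod_i q(x_i)) * (1/N) * sum_j pi(x_j)/q(x_j), where q is an instrumental distribution that is easier to traverse. This is not the paper's HMC/NUTS implementation — just plain random-walk Metropolis on a bimodal example, with my own choices of a tempered instrumental q = pi^beta, beta = 0.1, and N = 2 pseudo-samples:

```python
import numpy as np
from scipy.special import logsumexp

rng = np.random.default_rng(0)

def log_pi(x):
    # well-separated bimodal target: 0.5*N(-5,1) + 0.5*N(5,1)
    comps = np.stack([-0.5 * (x + 5) ** 2, -0.5 * (x - 5) ** 2])
    return logsumexp(comps, axis=0) + np.log(0.5) - 0.5 * np.log(2 * np.pi)

beta = 0.1  # my choice; tempering flattens the barrier between modes

def log_q(x):
    # instrumental distribution q ∝ pi^beta (unnormalized is fine for MH)
    return beta * log_pi(x)

N = 2  # number of pseudo-samples (my choice)

def log_pi_ext(X):
    # extended target: (prod_i q(x_i)) * (1/N) * sum_j pi(x_j)/q(x_j)
    return np.sum(log_q(X)) + logsumexp(log_pi(X) - log_q(X)) - np.log(N)

# random-walk Metropolis on the extended space
X = rng.normal(size=N)
samples = []
for _ in range(20000):
    Xp = X + rng.normal(scale=2.0, size=N)
    if np.log(rng.uniform()) < log_pi_ext(Xp) - log_pi_ext(X):
        X = Xp
    samples.append(X.copy())
samples = np.asarray(samples)
```

Because q is nearly flat between the modes, the pseudo-samples can drift across the barrier while the sum term keeps mass on the true modes; samples of pi itself are then recovered by importance-weighting the components with weights proportional to pi(x_j)/q(x_j). In the real method this extended density would just be written in the Stan model block and sampled with NUTS, which is exactly the bookkeeping I'd love to see automated.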