Diagnosing posterior geometry with divergences

Yes, that does sound plausible to me.

It’s a Markov chain: the next transition starts from the last sample. If you pick a divergent sample instead of a proper one, the chain must continue from the difficult region and is more likely to get stuck. I think the divergent samples should go into a separate diagnostic output stream instead.
The other question is: what is a good “fixed smallish number of leapfrog steps”? The divergence-detection heuristic is somewhat arbitrary, and even in theory a divergence does not have a precise location. I guess the easiest solution would be to pick a random sample from the rejected half of the trajectory. The sampler already records candidate samples as it builds the trajectory; when the trajectory blows up, you could emit the latest candidate instead of just discarding it.
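To make the idea concrete, here is a minimal sketch of what “record candidates and report the last one when the trajectory blows up” could look like. This is not Stan’s implementation: the target (a standard normal), the function names, and the energy-error threshold of 1000 (the default Stan uses to flag a divergence) are all assumptions for illustration.

```python
import numpy as np

def neg_log_prob(q):
    # Standard-normal target, assumed purely for illustration.
    return 0.5 * np.dot(q, q)

def grad_neg_log_prob(q):
    return q

def run_trajectory(q0, p0, step_size, n_steps, max_energy_error=1000.0):
    """Run a fixed number of leapfrog steps, recording each candidate.

    Returns (candidates, diverged, last_candidate). A divergence is
    flagged when the Hamiltonian drifts more than max_energy_error
    above its initial value; instead of discarding the trajectory,
    the latest candidate is returned for diagnostics.
    """
    q, p = q0.copy(), p0.copy()
    h0 = neg_log_prob(q) + 0.5 * np.dot(p, p)
    candidates = []
    for _ in range(n_steps):
        # One leapfrog step (half-step momentum, full-step position).
        p = p - 0.5 * step_size * grad_neg_log_prob(q)
        q = q + step_size * p
        p = p - 0.5 * step_size * grad_neg_log_prob(q)
        candidates.append(q.copy())
        h = neg_log_prob(q) + 0.5 * np.dot(p, p)
        if h - h0 > max_energy_error:
            # Trajectory blew up: surface the latest candidate
            # rather than silently throwing it away.
            return candidates, True, candidates[-1]
    return candidates, False, candidates[-1]

rng = np.random.default_rng(0)
q0 = rng.normal(size=2)
p0 = rng.normal(size=2)

# A sane step size integrates the trajectory without divergence...
_, diverged_ok, _ = run_trajectory(q0, p0, step_size=0.1, n_steps=20)
print(diverged_ok)  # False

# ...while a huge step size makes the energy error explode, and the
# last candidate marks roughly where the integrator broke down.
_, diverged_bad, last = run_trajectory(q0, p0, step_size=50.0, n_steps=20)
print(diverged_bad)  # True
```

The last candidate only marks roughly where integration failed, which matches the point above that a divergence has no precise location.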

These sorts of ideas have been discussed before:

It’s a very interesting problem.
