Riemann Manifold HMC / MALA? (for multivariate probit model)

I was talking to @charlesm93 and he mentioned that there is some prototype C++ code for Riemann Manifold Monte Carlo somewhere. I was wondering if someone could link to this or post any examples on the forums?


If you mean the methods described in

Girolami, M. and Calderhead, B., 2011. Riemann manifold Langevin and Hamiltonian Monte Carlo methods. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 73(2), pp.123-214.

then I believe the relevant parts of the Stan code base implementing some of the key components specific to these methods are the softabs_metric.hpp, softabs_point.hpp, and impl_leapfrog.hpp files described below.

Specifically, the softabs_metric.hpp file implements a position-dependent metric computed from an eigendecomposition of the Hessian of the negative logarithm of the target density, as described in

Betancourt, M., 2013. A general metric for Riemannian manifold Hamiltonian Monte Carlo. In Geometric Science of Information: First International Conference, GSI 2013, Paris, France, August 28-30, 2013. Proceedings (pp. 327-334). Springer Berlin Heidelberg. (arXiv:1212.4693)
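As a rough illustration of what that metric does, here is a minimal NumPy sketch of the SoftAbs map applied to the eigendecomposition of a Hessian (the function and variable names are mine, not Stan's, and the actual softabs_metric.hpp implementation is structured differently and also computes the derivatives needed for the dynamics):

```python
import numpy as np

def softabs_metric(hessian, alpha=1.0):
    """SoftAbs metric of Betancourt (2013): eigenvalues of the Hessian are
    mapped through lambda -> lambda / tanh(alpha * lambda), a smooth
    approximation to |lambda| that keeps the metric positive definite."""
    eigval, eigvec = np.linalg.eigh(hessian)            # Hessian is symmetric
    softabs_eigval = eigval / np.tanh(alpha * eigval)   # smooth absolute value
    return (eigvec * softabs_eigval) @ eigvec.T

# Example: an indefinite Hessian still yields a positive-definite metric
hess = np.array([[1.0, 2.0], [2.0, -3.0]])
metric = softabs_metric(hess, alpha=1.0)
print(np.linalg.eigvalsh(metric))  # all eigenvalues strictly positive
```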

The softabs_point.hpp file implements a data structure to record the state of a Riemannian manifold HMC chain. It caches some of the quantities derived from the position and momentum coordinates when computing the metric, the target density, and their derivatives, to avoid redundant recalculation.
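For intuition, such a state object might look roughly like the following hypothetical Python analogue (not the actual C++ class), where the expensive eigendecomposition is computed once per position update and then reused:

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class SoftAbsState:
    """Hypothetical analogue of a Riemannian-HMC state with cached quantities."""
    position: np.ndarray
    momentum: np.ndarray
    # Quantities derived from the position, cached to avoid recomputation
    hessian_eigval: np.ndarray = None
    hessian_eigvec: np.ndarray = None

    def update_cache(self, hessian):
        # Recompute the eigendecomposition only when the position changes
        self.hessian_eigval, self.hessian_eigvec = np.linalg.eigh(hessian)
```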

The impl_leapfrog.hpp file implements the generalized leapfrog method used to numerically integrate the dynamics associated with the non-separable Hamiltonian arising in Riemannian manifold HMC methods; see for example the two papers above for more details, or

Leimkuhler, B. and Reich, S., 2004. Simulating Hamiltonian Dynamics (No. 14). Cambridge University Press.
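For a non-separable Hamiltonian H(q, p), the leapfrog updates become implicit equations, which is why a fixed-point solver is needed at all. A schematic version of one generalized leapfrog step, in my own notation (dH_dq and dH_dp are placeholders for the partial derivatives of the Hamiltonian, and q, p are NumPy arrays), looks roughly like:

```python
def generalized_leapfrog_step(q, p, dH_dq, dH_dp, eps, n_fixed_point=6):
    """One generalized leapfrog step for a non-separable Hamiltonian H(q, p).

    The first two updates are implicit and are approximated here by a fixed
    number of fixed-point iterations, with no convergence checking, purely
    for illustration."""
    # Half momentum step: p_half solves p_half = p - (eps / 2) * dH_dq(q, p_half)
    p_half = p.copy()
    for _ in range(n_fixed_point):
        p_half = p - 0.5 * eps * dH_dq(q, p_half)
    # Full position step: q_new solves
    # q_new = q + (eps / 2) * (dH_dp(q, p_half) + dH_dp(q_new, p_half))
    q_new = q.copy()
    for _ in range(n_fixed_point):
        q_new = q + 0.5 * eps * (dH_dp(q, p_half) + dH_dp(q_new, p_half))
    # Final half momentum step is explicit
    p_new = p_half - 0.5 * eps * dH_dq(q_new, p_half)
    return q_new, p_new
```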

The fixed-point solver used for the implicit steps in this implementation uses a fixed number of steps, with convergence checking only for early termination but no explicit error handling when convergence is not reached within the maximum number of fixed-point steps, and the resulting integrator steps are not guaranteed to be time-reversible. Ideally a robust implementation should use convergence checks plus an explicit check for time reversibility, as described for example in

Zappa, E., Holmes-Cerfon, M. and Goodman, J., 2018. Monte Carlo on manifolds: sampling densities and integrating functions. Communications on Pure and Applied Mathematics, 71(12), pp.2609-2647.
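For concreteness, the kind of safeguard described there can be sketched as follows (my own simplified version, not Stan's or Zappa et al.'s code: solve the implicit update to a tolerance, then re-run the solve from the proposed point in the reverse direction and only treat the step as reversible if it maps back onto the starting point):

```python
import numpy as np

def implicit_solve(x0, residual, tol=1e-10, max_iters=50):
    """Fixed-point solve of x = residual(x) with an explicit convergence check."""
    x = x0.copy()
    for _ in range(max_iters):
        x_new = residual(x)
        if np.max(np.abs(x_new - x)) < tol:
            return x_new, True
        x = x_new
    return x, False  # signal non-convergence instead of silently continuing

def reversibility_check(x_start, x_prop, reverse_residual, tol=1e-8, **kwargs):
    """Re-solve from the proposal in the reverse direction; the step is only
    accepted as reversible if it maps back onto the starting point."""
    x_back, converged = implicit_solve(x_prop, reverse_residual, **kwargs)
    return converged and np.max(np.abs(x_back - x_start)) < tol
```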

While the code implementing the key components for running a Riemannian manifold HMC method is present, I don't believe it is exposed in any of the Stan interfaces, and I am also not sure whether Stan Math's automatic differentiation currently supports computing the third-order derivatives needed for propagating derivatives through the SoftAbs metric.

In case you are interested in implementations of these methods more generally, there are working implementations of Riemannian manifold HMC methods in Mici (GitHub - matt-graham/mici: Manifold Markov chain Monte Carlo methods in Python), a Python package I develop, including both the general scheme described in Girolami and Calderhead (2011) and the specific SoftAbs metric described by Betancourt (2013). There is a Jupyter notebook showing how to run a Riemannian manifold HMC method on a toy example at manifold_lifting/Two-dimensional_example.ipynb at master · thiery-lab/manifold_lifting · GitHub
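As a very rough sketch of what setting this up can look like, something along the following lines should be close (the class and argument names are from my reading of the Mici documentation and may not match the current release exactly; neg_log_dens is a placeholder standard-normal target standing in for your model):

```python
import numpy as np
import mici

# Placeholder target: standard-normal negative log density
def neg_log_dens(q):
    return 0.5 * np.sum(q**2)

# SoftAbs-metric Hamiltonian system, implicit (generalized) leapfrog integrator
# and a dynamic HMC sampler; names as I recall them from the Mici docs and
# possibly version-dependent.
system = mici.systems.SoftAbsRiemannianMetricSystem(neg_log_dens)
integrator = mici.integrators.ImplicitLeapfrogIntegrator(system)
sampler = mici.samplers.DynamicMultinomialHMC(
    system, integrator, rng=np.random.default_rng(1234))

final_states, traces, stats = sampler.sample_chains(
    n_warm_up_iter=100, n_main_iter=500, init_states=[np.zeros(2)])
```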


Thanks for sharing this @matt-graham, it is nice to see progress on this topic, which piqued my interest with the original paper but has not been explored much since. I will take a look at Mici and your notebook and might come back to you with some questions if I may!


Yep, very happy to answer any questions about Mici generally or about the notebook I posted a link to!

Thanks for this!

I’m currently looking at using the Monge metric and implementing Lagrangian Monte Carlo (LMC), as this doesn’t require implicit numerical integration: the updates can be computed explicitly, so it is more efficient (see https://proceedings.mlr.press/v151/hartmann22a/hartmann22a.pdf)
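As I understand the paper, the appeal is that the Monge metric is a rank-one update of the identity, so its determinant and inverse have closed forms via the matrix determinant lemma and Sherman-Morrison. A small sketch under that reading, with grad_log_dens a placeholder for your model's gradient:

```python
import numpy as np

def monge_metric_quantities(grad_log_dens, alpha=1.0):
    """Monge-patch metric G = I + alpha^2 * g g^T (g = gradient of log density).

    Being a rank-one update of the identity, its log-determinant and inverse
    are available in closed form, which is what makes explicit updates cheap."""
    g = np.asarray(grad_log_dens)
    scale = 1.0 + alpha**2 * g @ g  # = det(G) by the matrix determinant lemma
    metric = np.eye(g.size) + alpha**2 * np.outer(g, g)
    log_det = np.log(scale)
    inverse = np.eye(g.size) - (alpha**2 / scale) * np.outer(g, g)  # Sherman-Morrison
    return metric, log_det, inverse

# Quick consistency check with an arbitrary gradient vector
g = np.array([0.3, -1.2, 0.5])
G, log_det, G_inv = monge_metric_quantities(g, alpha=0.8)
assert np.allclose(G @ G_inv, np.eye(3))
assert np.isclose(np.linalg.slogdet(G)[1], log_det)
```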

There’s also an explicit integrator for R-HMC here:

I’m computing all the derivatives for my model manually. I got a huge speed-up from doing this compared to using the Stan Math library's autodiff (even just for gradients).