Perhaps this is of interest. There's a detailed comparison to ADVI, and the paper claims convergence roughly 10x faster.
Link: https://arxiv.org/abs/1706.02375v2

Here are the details:
Fast Black-box Variational Inference through Stochastic Trust-Region Optimization
Jeffrey Regier, Michael I. Jordan, Jon McAuliffe
(Submitted on 7 Jun 2017 (v1), last revised 5 Nov 2017 (this version, v2))
We introduce TrustVI, a fast second-order algorithm for black-box variational inference based on trust-region optimization and the reparameterization trick. At each iteration, TrustVI proposes and assesses a step based on minibatches of draws from the variational distribution. The algorithm provably converges to a stationary point. We implemented TrustVI in the Stan framework and compared it to two alternatives: Automatic Differentiation Variational Inference (ADVI) and Hessian-free Stochastic Gradient Variational Inference (HFSGVI). The former is based on stochastic first-order optimization. The latter uses second-order information, but lacks convergence guarantees. TrustVI typically converged at least one order of magnitude faster than ADVI, demonstrating the value of stochastic second-order information. TrustVI often found substantially better variational distributions than HFSGVI, demonstrating that our convergence theory can matter in practice.
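The core loop described in the abstract — propose a step, assess it on a minibatch of reparameterized draws, then accept or reject and adjust the trust region — can be sketched in a few lines. This is not the paper's implementation (TrustVI builds a second-order model and runs inside Stan); it is a simplified first-order stand-in on a toy Gaussian target I made up, just to illustrate the stochastic trust-region mechanics with the reparameterization trick:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy target: an unnormalized diagonal-Gaussian log-density.
mu0 = np.array([1.0, -2.0])
sd0 = np.array([1.0, 0.5])

def log_p(z):
    return -0.5 * np.sum(((z - mu0) / sd0) ** 2, axis=-1)

def grad_log_p(z):
    return -(z - mu0) / sd0 ** 2

def elbo_and_grad(m, s, eps):
    """Reparameterized ELBO estimate (up to a constant) and its gradient for
    a diagonal Gaussian q = N(m, exp(2s)), using fixed standard-normal draws eps."""
    z = m + np.exp(s) * eps                       # reparameterization trick
    elbo = log_p(z).mean() + s.sum()              # E_q[log p] + entropy of q
    g = grad_log_p(z)
    grad_m = g.mean(axis=0)
    grad_s = (g * eps * np.exp(s)).mean(axis=0) + 1.0
    return elbo, np.concatenate([grad_m, grad_s])

# Stochastic trust-region loop. TrustVI uses a second-order model; here the
# model is just the linear (gradient) term, to keep the sketch short.
theta = np.zeros(4)                               # packed [m, s]
radius = 1.0
for _ in range(400):
    eps = rng.standard_normal((256, 2))           # minibatch of draws from q
    m, s = theta[:2], theta[2:]
    f0, g = elbo_and_grad(m, s, eps)
    step = radius * g / (np.linalg.norm(g) + 1e-12)   # ascent step to the boundary
    predicted = g @ step                          # model-predicted ELBO gain
    # Assess the step on the same draws (common random numbers).
    f1, _ = elbo_and_grad(m + step[:2], s + step[2:], eps)
    rho = (f1 - f0) / (predicted + 1e-12)         # actual vs. predicted improvement
    if rho > 0.1:                                 # accept; maybe grow the region
        theta = theta + step
        if rho > 0.75:
            radius = min(radius * 2.0, 2.0)
    else:                                         # reject; shrink the region
        radius *= 0.5

m, s = theta[:2], theta[2:]                       # m -> mu0, exp(s) -> sd0
```

The accept/reject test is what distinguishes this from plain stochastic gradient ascent: a step is only taken when the minibatch estimate of the ELBO improvement agrees with the model's prediction, which is the mechanism behind the convergence guarantee the abstract highlights.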