I understand that the KL divergence is used in Stan’s VI implementation. Is it possible to add a custom divergence measure instead?
Unfortunately, I don’t think that’s possible, but I would double-check with @jonah.
Yes, it is possible in the C++ part of the implementation. If you want to do that, I can point you to the relevant files to edit.
Alternatively, you can use any external VI implementation that accepts custom log density and gradient functions, and let RStan or PyStan compute just those for your model. If you are interested in this option, I can point you to example code that does this.
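To illustrate that second option, here is a minimal, self-contained sketch of mean-field Gaussian VI on a 1-D target using the reparameterization trick. The `log_prob_grad` callback is a stand-in for the model gradient that RStan's `grad_log_prob()` or PyStan's `fit.grad_log_prob()` would supply for a compiled Stan model; a closed-form toy target is used here so the example runs on its own, and the optimizer, step size, and iteration counts are illustrative choices, not anything from Stan itself.

```python
import numpy as np

def advi_meanfield(log_prob_grad, n_iter=5000, lr=0.05, n_mc=10, seed=0):
    """Fit q(z) = Normal(mu, sigma) to a 1-D target by stochastic ascent
    on the ELBO, i.e. minimizing KL(q || p), via the reparameterization
    trick. `log_prob_grad(z)` returns d/dz log p(z) -- the quantity you
    would obtain from RStan/PyStan for a real Stan model."""
    rng = np.random.default_rng(seed)
    mu, log_sigma = 0.0, 0.0
    for _ in range(n_iter):
        eps = rng.standard_normal(n_mc)
        z = mu + np.exp(log_sigma) * eps      # reparameterized draws from q
        g = log_prob_grad(z)                  # model gradient at the draws
        grad_mu = g.mean()
        # the entropy of q contributes +1 to the log_sigma gradient
        grad_ls = (g * np.exp(log_sigma) * eps).mean() + 1.0
        mu += lr * grad_mu
        log_sigma += lr * grad_ls
    return mu, np.exp(log_sigma)

# Toy target: p(z) = Normal(3, 2), so d/dz log p(z) = -(z - 3) / 4.
# In practice this lambda would wrap the gradient exposed by RStan/PyStan.
mu, sigma = advi_meanfield(lambda z: -(z - 3.0) / 4.0)
```

Swapping in a different divergence then amounts to replacing the ELBO gradient estimator in the loop, rather than editing Stan's C++ internals.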