Jacobian of the softmax transformation of an unbounded parameter (n-1 degrees of freedom)

Late to the party, but I’ll add a specific answer that avoids calling log_determinant (which is costly and introduces numerical problems).

If we have \mathbf{v} \in \mathbb{R}^{K-1} and prepend 0 as the first element, we obtain a point on the K-simplex, \mathbf{x} \in [0,1]^K with \sum_{i=1}^K x_i = 1, via softmax as

s = 1 + \sum_{i=1}^{K-1} \exp (v_i), \quad x_1 = \frac{1}{s}, \quad x_k = \frac{\exp (v_{k - 1})}{s} \ \text{ for } k = 2, \dots, K
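As a sanity check (not from the original post), here is a minimal NumPy sketch of this construction; the function name is just illustrative:

```python
import numpy as np

def simplex_from_unconstrained(v):
    """Map v in R^{K-1} to a point on the K-simplex by
    prepending a fixed 0 and applying softmax."""
    v0 = np.concatenate(([0.0], v))   # prepend the pinned first element
    e = np.exp(v0 - v0.max())         # subtract max for numerical stability
    return e / e.sum()

v = np.array([0.5, -1.2, 2.0])        # K - 1 = 3 free parameters
x = simplex_from_unconstrained(v)     # K = 4 components in (0, 1), summing to 1
```

Note that x[0] equals 1/s with s = 1 + sum(exp(v)), matching the formula above.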

We can then show that the Jacobian determinant is:

\frac{\exp \sum_{i=1}^{K - 1} v_i}{s^{K}}.

and so the log determinant is

\sum_{i=1}^{K - 1} v_i - K \log s
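To convince yourself of this formula, you can compare it against a finite-difference estimate of the Jacobian of the map v -> (x_2, ..., x_K) (x_1 is determined by the sum-to-one constraint). A sketch, with illustrative function names:

```python
import numpy as np

def softmax0(v):
    """Softmax of v with a 0 prepended."""
    v0 = np.concatenate(([0.0], v))
    e = np.exp(v0 - v0.max())
    return e / e.sum()

def log_det_jacobian_numeric(v, eps=1e-6):
    """Central-difference log |det J| of v -> (x_2, ..., x_K)."""
    n = len(v)
    J = np.zeros((n, n))
    for j in range(n):
        dv = np.zeros(n)
        dv[j] = eps
        # drop x_1 (index 0); it is fixed by the constraint
        J[:, j] = (softmax0(v + dv)[1:] - softmax0(v - dv)[1:]) / (2 * eps)
    return np.log(abs(np.linalg.det(J)))

def log_det_jacobian_closed(v):
    """Closed form: sum(v) - K * log(s) with s = 1 + sum(exp(v))."""
    s = 1.0 + np.exp(v).sum()
    K = len(v) + 1
    return v.sum() - K * np.log(s)

v = np.array([0.3, -0.7, 1.5])
```

The two values agree to within finite-difference error.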

A full derivation can be found in Section 5 of https://arxiv.org/pdf/2211.02383.pdf, and
an implementation of the transform as a Stan function is:

functions {
  vector simplex_constrain_softmax_lp(vector v) {
    int K = size(v) + 1;
    vector[K] v0 = append_row(0, v);
    // increment target with the log Jacobian determinant;
    // log_sum_exp(v0) = log(1 + sum(exp(v))) = log s
    target += sum(v) - K * log_sum_exp(v0);
    return softmax(v0);
  }
}