I don’t know if your question was directed towards me, but the Householder parameterization in Rotation Invariant Householder Parameterization for Bayesian PCA is essentially the same approach (though a numerically stable implementation requires LAPACK-like subroutines). It has a unit Jacobian determinant, so geometrically, sampling from the invariant distribution of a semi-orthogonal matrix is no different from sampling IID standard normals. But it does have the usual singularity at 0 that occurs when sampling a vector and normalizing it, and when N=K, the smallest such vector has length 1, so the divergence problems for low-dimensional unit vectors raised in A better unit vector could be quite bad. There’s a relatively straightforward modification in this case that improves the geometry.
In terms of number of operations, generating a dense semi-orthogonal matrix from this representation would require, I think, \mathcal{O}(K^2 N) operations, which is the same complexity as QR decomposition itself, but with a lower constant. In principle one never needs to form the dense matrix, though, since LAPACK-like routines can efficiently operate on the factorized representation, which can be generated from the unconstrained parameterization in \mathcal{O}(K N) operations.
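To make the construction concrete, here’s a minimal numpy sketch of building a dense N×K semi-orthogonal matrix from K unconstrained vectors via Householder reflectors. This is my own illustrative code, not the paper’s exact sign/measure convention, and the function name and vector layout (vector k has length N−k) are assumptions:

```python
import numpy as np

def householder_semi_orthogonal(vs, N):
    """Build an N x K semi-orthogonal matrix from K unconstrained vectors.

    vs[k] has length N - k. Each is normalized to a unit vector u_k (note the
    singularity at 0 mentioned above), and the corresponding Householder
    reflector acts on the trailing N - k coordinates. Applying all K reflectors
    to the first K columns of the identity costs O(K^2 N) overall.
    Illustrative sketch only; not the paper's exact convention.
    """
    K = len(vs)
    Q = np.eye(N)[:, :K].copy()  # dense for clarity
    for k in reversed(range(K)):
        u = np.asarray(vs[k], dtype=float)
        u = u / np.linalg.norm(u)  # normalization step: singular at 0
        # Reflector H_k = I - 2 u u^T applied to rows k..N-1 of Q
        Q[k:, :] -= 2.0 * np.outer(u, u @ Q[k:, :])
    return Q

# Usage: columns are orthonormal regardless of the input vectors
rng = np.random.default_rng(0)
vs = [rng.standard_normal(5 - k) for k in range(3)]
Q = householder_semi_orthogonal(vs, 5)
assert np.allclose(Q.T @ Q, np.eye(3))
```

Since each reflector is orthogonal, the product applied to orthonormal columns stays orthonormal, which is why the check Q^T Q = I_K passes for any nonzero input vectors.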
To my knowledge, no empirical comparison of this method’s performance against other parameterizations has been done. I’m working on this.