Suppose I have a full k * k covariance matrix Sigma. I can use the singular value decomposition (SVD) or the eigendecomposition to express this as U * Lambda * U', where U has size k * k and Lambda is the k * k diagonal matrix of eigenvalues.
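For concreteness, here is a small NumPy sketch of the decomposition (the matrix and sizes are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
k = 5

# Build a random symmetric positive-definite covariance matrix Sigma.
A = rng.standard_normal((k, k))
Sigma = A @ A.T + k * np.eye(k)

# Eigendecomposition: Sigma = U @ diag(lam) @ U.T, with U a k-by-k
# orthogonal matrix (for symmetric Sigma, the SVD coincides with this).
lam, U = np.linalg.eigh(Sigma)

# The factors reconstruct Sigma exactly (up to floating-point error).
assert np.allclose(U @ np.diag(lam) @ U.T, Sigma)
```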
In many applications (e.g. multivariate normal distributions, Gaussian processes, etc.), large k makes this unwieldy, because I need to invert the matrix to get the precision matrix, which has cost O(k^3).
Now, suppose I want to construct a low-rank approximation to this matrix, corresponding to a truncated SVD: Sigma_alt = V * Lambda_p * V', where V has dimension [k, p], Lambda_p is the p * p diagonal matrix of the p largest eigenvalues, and p << k.
This can be obtained by truncating the SVD, but that requires calculating the whole SVD first, which has complexity O(k^3).
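The "compute everything, then truncate" route looks like this in NumPy (sizes are illustrative; `Lambda_p` is my name for the truncated diagonal block):

```python
import numpy as np

rng = np.random.default_rng(1)
k, p = 6, 2

A = rng.standard_normal((k, k))
Sigma = A @ A.T + np.eye(k)

# Full eigendecomposition: O(k^3) even though we only want p eigenpairs.
lam, U = np.linalg.eigh(Sigma)   # eigenvalues come back in ascending order

V = U[:, -p:]                    # k-by-p: eigenvectors of the p largest eigenvalues
Lambda_p = np.diag(lam[-p:])     # p-by-p truncated diagonal block
Sigma_alt = V @ Lambda_p @ V.T   # rank-p approximation of Sigma
```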
Is there some fast way of calculating this low-rank approximation which doesn’t have O(k^3) complexity? (I imagine it would be something like O(k^2 * p).)
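To make the question concrete, here is a sketch of the kind of algorithm I have in mind, using SciPy's Lanczos-based `scipy.sparse.linalg.eigsh` (I am not claiming this is the only or best option; it is one example that touches Sigma only through matrix-vector products, so p eigenpairs cost roughly O(k^2 * p) rather than O(k^3)):

```python
import numpy as np
from scipy.sparse.linalg import eigsh

rng = np.random.default_rng(2)
k, p = 200, 5

A = rng.standard_normal((k, k))
Sigma = A @ A.T + np.eye(k)

# Lanczos iteration for the p largest-magnitude eigenpairs.
# Each iteration needs only a matrix-vector product with Sigma (O(k^2)),
# so the full k-by-k decomposition is never formed.
lam_p, V = eigsh(Sigma, k=p, which='LM')

Sigma_alt = V @ np.diag(lam_p) @ V.T   # rank-p approximation
```

Randomized SVD (e.g. `sklearn.utils.extmath.randomized_svd`) has a similar flavor and cost.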
Edit: I suppose that one way to do this would be to declare the low-rank matrices as parameters in Stan and then fit them to the data at hand. Nonetheless, I’m still curious as to whether such an algorithm exists.