# Constructing a low-rank approximation to a covariance matrix

#1

Suppose I have a full `k * k` covariance matrix `Sigma`. I can use the singular value decomposition (SVD) or the eigendecomposition to express this as `Sigma = U * Lambda * U'`, where `U` has size `k * k` and `Lambda` is diagonal.
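
For concreteness, here's a small numpy sketch of what I mean (the example `Sigma` is made up):

```python
import numpy as np

k = 5
# Made-up covariance matrix just for illustration: A*A' plus a jitter on the diagonal
A = np.random.default_rng(0).normal(size=(k, k))
Sigma = A @ A.T + 0.1 * np.eye(k)

# Eigendecomposition of the symmetric matrix: Sigma = U * diag(lam) * U'
lam, U = np.linalg.eigh(Sigma)   # U is k x k, lam holds the k eigenvalues (ascending)
assert np.allclose(U @ np.diag(lam) @ U.T, Sigma)
```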

In many applications (e.g. multivariate normal distributions, Gaussian processes, etc.), a large `k` makes this unwieldy because I need to invert the matrix to get the precision matrix, which costs O(`k^3`).

Now, suppose I want to construct a low-rank approximation to this matrix, which corresponds to a truncated SVD:
`Sigma_alt = V * Lambda * V'`, where `V` has dimension `[k, p]`, `Lambda` is now the `p * p` diagonal block of leading eigenvalues, and `p < k`.

This can be obtained by computing the full SVD and then truncating it, but that has complexity O(`k^3`).
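
Spelled out (continuing the numpy sketch above), that naive route is just:

```python
p = 2
# np.linalg.eigh sorts eigenvalues in ascending order, so the leading eigenpairs are the last p
V = U[:, -p:]                    # k x p
Lambda_p = np.diag(lam[-p:])     # p x p diagonal of the p largest eigenvalues
Sigma_alt = V @ Lambda_p @ V.T   # rank-p approximation of Sigma
```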

Is there some fast way of calculating this low-rank approximation that doesn't cost O(`k^3`)? (I imagine it would be O(`kp^2`).)
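
To make the question concrete, I imagine such a routine would look something like a randomized range-finder, which only touches `Sigma` through a few dense matrix products instead of a full decomposition. A rough numpy sketch (this is just my illustration of the idea, not anything that currently exists in Stan, and the oversampling constant is arbitrary):

```python
import numpy as np

def low_rank_sketch(Sigma, p, oversample=10, seed=0):
    """Rough sketch of a randomized rank-p approximation Sigma ~ V * Lambda_p * V'."""
    rng = np.random.default_rng(seed)
    k = Sigma.shape[0]
    # Probe the range of Sigma with a thin random matrix; the dominant cost is
    # a couple of k x k times k x (p + oversample) products, not a full factorization.
    Omega = rng.normal(size=(k, p + oversample))
    Q, _ = np.linalg.qr(Sigma @ Omega)     # orthonormal basis for an approximate range of Sigma
    B = Q.T @ Sigma @ Q                    # small (p + oversample) x (p + oversample) projection
    lam, W = np.linalg.eigh(B)             # eigendecompose the small matrix instead of Sigma
    V = (Q @ W)[:, -p:]                    # k x p (eigh sorts eigenvalues ascending)
    Lambda_p = np.diag(lam[-p:])           # p x p
    return V, Lambda_p
```

(This is only to illustrate the kind of cost I have in mind; I don't know whether anything along these lines is exposed anywhere usable from Stan.)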

Thanks!

Edit: I suppose that one way to do this would be to specify the low-rank matrices as parameters in Stan and then fit them to the data at hand. Nonetheless, I’m still curious as to whether such an algorithm exists.

#2

Not in Stan, nor in Eigen. Also, if you pass a low-rank covariance matrix to `multi_normal_lpdf` or similar functions, it will throw an error because the matrix is not positive definite. The best we have is Cholesky factors and `multi_normal_cholesky_lpdf`. But the Cholesky factorization algorithm is better than it used to be, and we might be able to do it on a GPU in the next release or two.

#3

Thanks! Appreciate the fast response.