Hello!
I have a financial time series problem where I assume that the covariance matrix at any point is given by a combination of the asset volatilities and the correlation matrix, i.e.
Sigma[t] = quad_form_diag(correl_matrix, vol[t])
where correl_matrix is an [n, n] correlation matrix and vol[t] is a length-n vector of volatilities at time t.
I later have a likelihood at each point that is multivariate Student-t.
The likelihood function involves the term
(r[t] - mu)' * inverse(Sigma[t]) * (r[t] - mu)
= (r[t] - mu)' * inverse(quad_form_diag(correl_matrix, vol[t])) * (r[t] - mu)
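For concreteness, here is a small NumPy sketch (not Stan) of the setup, with made-up dimensions and data; quad_form_diag(C, v) is just diag(v) * C * diag(v):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Build an arbitrary valid correlation matrix C (unit diagonal, positive definite).
A = rng.standard_normal((n, n))
C = A @ A.T
d = np.sqrt(np.diag(C))
C = C / np.outer(d, d)

vol_t = rng.uniform(0.5, 2.0, size=n)   # vol[t] at one time step (made up)
r_t = rng.standard_normal(n)            # r[t] (made up)
mu = np.zeros(n)

# Sigma[t] = quad_form_diag(C, vol[t]) = diag(vol[t]) * C * diag(vol[t])
Sigma_t = np.diag(vol_t) @ C @ np.diag(vol_t)

# The quadratic form (r[t] - mu)' * inverse(Sigma[t]) * (r[t] - mu),
# computed via a solve rather than an explicit inverse.
resid = r_t - mu
qf = resid @ np.linalg.solve(Sigma_t, resid)
print(qf)
```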
Obviously I don’t want to re-calculate inverse(Sigma[t]) (either implicitly or explicitly) at each time step due to the O(n^3) complexity.
Is there a recommended way to repeatedly calculate likelihoods with these quadratic forms involved?
The algebraic way would be to keep the inverse of the correlation matrix and re-use it repeatedly, using the elementwise reciprocal of the volatilities, e.g.
(r[t] - mu)' * quad_form_diag(inverse(correl_matrix), 1 ./ vol[t]) * (r[t] - mu)
…but I can appreciate that using the direct inverse might not be the most stable or efficient way to do it.
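A quick NumPy check (made-up data) of that identity, inverse(quad_form_diag(C, v)) = quad_form_diag(inverse(C), 1 ./ v), i.e. inv(D C D) = D^-1 inv(C) D^-1:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5

# Arbitrary valid correlation matrix C.
A = rng.standard_normal((n, n))
C = A @ A.T
d = np.sqrt(np.diag(C))
C = C / np.outer(d, d)

v = rng.uniform(0.5, 2.0, size=n)  # vol[t] (made up)

Sigma = np.diag(v) @ C @ np.diag(v)

# Direct inverse vs. re-using inv(C) with the elementwise reciprocal 1/v.
# Note it is 1/v elementwise, not a matrix inverse of vol[t].
Sigma_inv_direct = np.linalg.inv(Sigma)
Sigma_inv_reused = np.diag(1 / v) @ np.linalg.inv(C) @ np.diag(1 / v)

print(np.max(np.abs(Sigma_inv_direct - Sigma_inv_reused)))
```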
I know about the mdivide functions and could work with the lower-triangular Cholesky factor of correl_matrix, but I’m not sure whether repeatedly calling e.g. mdivide_left_tri_low(diag_pre_multiply(vol[t], correl_matrix_L), r[t] - mu)
will incur a much higher operation count.
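For what it's worth, the Cholesky route seems to stay O(n^2) per step: if L is the (precomputed) Cholesky factor of correl_matrix, then diag(vol[t]) * L is a lower-triangular Cholesky factor of Sigma[t], so the quadratic form is the squared norm of one triangular solve. A NumPy sketch of the equivalence (np.linalg.solve stands in for a dedicated triangular solve such as Stan's mdivide_left_tri_low):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5

# Arbitrary valid correlation matrix C.
A = rng.standard_normal((n, n))
C = A @ A.T
d = np.sqrt(np.diag(C))
C = C / np.outer(d, d)

L = np.linalg.cholesky(C)          # precomputed once, O(n^3)
v = rng.uniform(0.5, 2.0, size=n)  # vol[t] (made up)
resid = rng.standard_normal(n)     # r[t] - mu (made up)

# diag(v) @ L is lower triangular and (DL)(DL)' = D L L' D = Sigma[t],
# i.e. diag_pre_multiply(vol[t], L) in Stan terms; forming it is O(n^2).
DL = v[:, None] * L

# z = (DL)^-1 (r[t] - mu); then z'z = (r[t]-mu)' inv(Sigma[t]) (r[t]-mu).
# A true lower-triangular solve here is O(n^2) by forward substitution.
z = np.linalg.solve(DL, resid)
qf_chol = z @ z

# Reference: direct computation with the full Sigma[t].
Sigma = np.diag(v) @ C @ np.diag(v)
qf_direct = resid @ np.linalg.solve(Sigma, resid)

print(qf_chol, qf_direct)
```

So per time step the cost is one diagonal scaling plus one forward substitution, versus the O(n^3) of factoring or inverting Sigma[t] from scratch.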
Thanks!