# Efficient calculation of a determinant

Hi everyone,

I’m trying to obtain an efficient implementation of Bayesian logistic regression with the Jeffreys prior. My understanding is that the computational bottleneck is the calculation of the log-determinant log(|X’SX|), where X is the design matrix and S is a diagonal matrix, and I’d like to know whether it’s possible to speed up the computation of this log-determinant, or the sampling of this model in general. I tried applying a Cholesky decomposition to the matrix X’SX, but it doesn’t seem to help.

Thanks!

My model code:

```stan
data {
  int<lower=0> N;              // number of observations
  int<lower=0> p;              // number of covariates (including intercept)
  int<lower=0,upper=1> y[N];   // binary dependent variable (vote)
  matrix[N,p] X;               // design matrix
}

parameters {
  vector[p] beta;
}

model {
  // diagonal of the Cholesky factor L of X'SX, with S = diag(pi .* (1 - pi))
  vector[p] tmp = diagonal(cholesky_decompose(X' * diag_matrix(exp(X*beta) ./ square(1 + exp(X*beta))) * X));
  y ~ bernoulli_logit(X*beta);  // likelihood
  // Jeffreys prior: 0.5 * log|X'SX| = sum(log(diag(L))), since log|A| = 2 * sum(log(diag(L)))
  target += sum(log(tmp));
}
```
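For reference, the Cholesky identity the `target` line relies on, log|A| = 2 · sum(log(diag(L))) for A = LL', can be checked outside Stan. A NumPy sketch (the random X and beta here are just placeholders for a real design matrix and coefficients):

```python
import numpy as np

rng = np.random.default_rng(0)
N, p = 50, 3
X = rng.normal(size=(N, p))               # stand-in design matrix
beta = rng.normal(size=p)                 # stand-in coefficients
eta = X @ beta
s = np.exp(eta) / (1 + np.exp(eta))**2    # diagonal of S: pi * (1 - pi)

A = X.T @ (s[:, None] * X)                # X'SX without materialising diag(S)
L = np.linalg.cholesky(A)
logdet_chol = 2 * np.sum(np.log(np.diag(L)))   # log|A| via the Cholesky factor
sign, logdet_np = np.linalg.slogdet(A)         # direct log|A| for comparison
```

Scaling the rows of X by s (rather than forming diag(S) and multiplying) is also how the N×N diagonal matrix is best avoided in practice.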

Stan has a built-in `log_determinant`. The determinant of a product is the product of the determinants, so the log-determinant of a product is the sum of the log-determinants. I think you want something like this:

```stan
model {
  y ~ bernoulli_logit(X*beta);  // likelihood

  real log_det_X = log_determinant(X);
  // log|S| = sum_i (eta_i - 2*log(1 + exp(eta_i))), where eta = X*beta
  real log_det_S = sum(X*beta - 2*log1p_exp(X*beta));
  target += 0.5*(log_det_X + log_det_S + log_det_X);
}
```
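For square X, this factorisation, log|X'SX| = log|X'| + log|S| + log|X|, can be sanity-checked numerically. A NumPy sketch (random placeholder data; note `slogdet` returns the log of the absolute determinant, matching Stan's `log_determinant`):

```python
import numpy as np

rng = np.random.default_rng(1)
p = 4
X = rng.normal(size=(p, p))               # square design matrix for this check
beta = rng.normal(size=p)
eta = X @ beta
s = np.exp(eta) / (1 + np.exp(eta))**2    # diagonal of S

# direct log-determinant of X'SX
_, logdet_direct = np.linalg.slogdet(X.T @ (s[:, None] * X))

# factored form: log|X'| + log|S| + log|X|, with
# log|S| = sum(eta - 2*log(1 + exp(eta)))
_, logdet_X = np.linalg.slogdet(X)
logdet_S = np.sum(eta - 2 * np.log1p(np.exp(eta)))
logdet_factored = logdet_X + logdet_S + logdet_X
```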

Thanks for the response! One follow-up question: usually X is a non-square matrix; is it still possible to calculate its determinant with `log_determinant`?

Sorry, I missed that point. No, a non-square matrix has no determinant, but I think you can use `quad_form` for more efficiency:

```stan
log_determinant(quad_form(diag_matrix(..), X))
```
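`quad_form(D, X)` computes X'DX, which is p×p and square even when X is not, so its log-determinant is well defined. A NumPy sketch of the same computation (random placeholder data standing in for the design matrix and weights):

```python
import numpy as np

rng = np.random.default_rng(2)
N, p = 100, 3
X = rng.normal(size=(N, p))           # tall, non-square design matrix
s = rng.uniform(0.1, 0.25, size=N)    # stand-in for the diagonal weights

# quad_form(diag_matrix(s), X) in Stan corresponds to X' diag(s) X, a p x p matrix
A = X.T @ (s[:, None] * X)
sign, logdet = np.linalg.slogdet(A)   # well defined even though det(X) is not
```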