Hi everyone,
I’m trying to obtain an efficient implementation of Bayesian logistic regression with the Jeffreys prior. My understanding is that the computational bottleneck is evaluating the log-determinant log|X'SX|, where X is the design matrix and S is the diagonal matrix of fitted variances p_i(1 - p_i). I would like to know whether it’s possible to speed up the computation of this log-determinant, or the sampling of this model in general. I tried applying a Cholesky decomposition to the matrix X'SX, but it does not seem to help.
Thanks!
My model code:
data {
  int<lower=0> N;             // number of observations
  int<lower=0> p;             // number of covariates (including intercept)
  int<lower=0,upper=1> y[N];  // binary dependent variable (vote)
  matrix[N,p] X;              // design matrix
}
parameters {
  vector[p] beta;
}
model {
  vector[N] prob = inv_logit(X * beta);  // fitted probabilities; avoids overflow in exp(X*beta)./square(1+exp(X*beta))
  // diagonal of the Cholesky factor L of the Fisher information X'SX, S = diag(prob .* (1 - prob))
  vector[p] tmp = diagonal(cholesky_decompose(X' * diag_matrix(prob .* (1 - prob)) * X));
  y ~ bernoulli_logit(X * beta);  // likelihood
  // Jeffreys prior: 0.5 * log|X'SX| = sum(log(diag(L))), since log|X'SX| = 2 * sum(log(diag(L)))
  target += sum(log(tmp));
}
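For context, the identity I’m relying on is that for the Cholesky factor L of X'SX, log|X'SX| = 2·Σ log L_ii, and the same value can be obtained from the R factor of a QR decomposition of sqrt(S)·X without forming X'SX explicitly. A small numpy sketch checking this (the dimensions and random data here are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
N, p = 200, 5
X = rng.standard_normal((N, p))   # toy design matrix
beta = rng.standard_normal(p)     # toy coefficient vector

# S = diag(p_i * (1 - p_i)) with p_i = inv_logit(x_i' beta)
prob = 1.0 / (1.0 + np.exp(-X @ beta))
s = prob * (1.0 - prob)

# direct route: form X'SX, take its Cholesky factor L
A = X.T @ (s[:, None] * X)
L = np.linalg.cholesky(A)
logdet_chol = 2.0 * np.sum(np.log(np.diag(L)))

# factored route: X'SX = (sqrt(S) X)'(sqrt(S) X), so the R factor of a QR
# decomposition of sqrt(S) X yields the same log-determinant
R = np.linalg.qr(np.sqrt(s)[:, None] * X, mode='r')
logdet_qr = 2.0 * np.sum(np.log(np.abs(np.diag(R))))

print(np.isclose(logdet_chol, logdet_qr))  # prints True
```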