Hi everyone,
I am writing a Stan model for a Bayesian hierarchical mixed-effects model. I tried a Cholesky decomposition of the group-level correlation matrix (L_u), but the results for L_u look weird (attached below, marked in blue). I am wondering whether I did it correctly? Thank you for your help!
The code I wrote is below:
data {
  int<lower=1> N;               // number of data points
  real y[N];                    // time points, TX or T0
  real x[N];                    // log count
  int<lower=1> J;               // number of groups
  int<lower=1, upper=J> id[N];  // vector of group indices
}
parameters {
  vector[2] beta;               // fixed-effect intercept and slope
  real<lower=0> sigma;          // residual sd of y
  vector<lower=0>[2] sigma_u;   // sds of the random intercepts and slopes
  cholesky_factor_corr[2] L_u;  // Cholesky factor of the group correlation matrix (L_u * L_u' = correlation matrix)
  matrix[2,J] z_u;              // standardized (unit-normal) random intercepts and slopes
}
transformed parameters {
  matrix[2,J] u;                               // intercepts and slopes of random effects of J groups
  u = diag_pre_multiply(sigma_u, L_u) * z_u;   // generates varying intercepts and slopes from the joint probability distribution
}
model {
  real mu;
  // priors
  L_u ~ lkj_corr_cholesky(2);    // eta = 2 weakly favors correlations near zero (eta = 1 would be uniform)
  to_vector(z_u) ~ normal(0, 1); // flattens z_u (column-major) so each element gets a standard normal prior
  sigma_u ~ normal(0, 1);
  sigma ~ exponential(2);
  beta[1] ~ normal(0, 5.1);
  beta[2] ~ normal(0, 0.85);
  // likelihood
  for (i in 1:N) {
    mu = beta[1] + u[1, id[i]] + (beta[2] + u[2, id[i]]) * x[i];
    y[i] ~ normal(mu, sigma);
  }
}
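In case it is relevant: since L_u itself is only the Cholesky factor (its entries are not the correlations), I was thinking of adding a generated quantities block to recover the actual correlation matrix, along the lines of the sketch below (using multiply_lower_tri_self_transpose, i.e. Omega = L_u * L_u'):

generated quantities {
  corr_matrix[2] Omega;                            // group-level correlation matrix
  Omega = multiply_lower_tri_self_transpose(L_u);  // Omega = L_u * L_u'
}

I would then read the intercept-slope correlation off Omega[1,2] rather than off L_u directly. Does that look right?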