Hi, I’ve been trying to track down an error in my model for a few days, and I’m finally at the point where I need some help from the community. I’m trying to fit a simple multilevel model with a single group-level predictor Z and a single individual-level predictor X. This is how I specify the model:
bayes_model = "
data {
  int<lower=1> J;                 // number of clusters
  int<lower=1> N;                 // number of individuals
  int<lower=1,upper=J> clust[N];  // cluster membership for each individual
  vector[N] y;                    // vector of responses
  vector[N] X;                    // individual-level covariate
  vector[J] Z;                    // cluster-level covariate
}
parameters {
  real<lower=0> sigma_clust;      // cluster-level SD
  real<lower=0> sigma_ind;        // individual-level SD
  real gamma0;                    // group-level intercept
  real a[J];                      // random effects
  real gamma1;                    // group-level slope
  real beta;                      // individual-level slope
}
transformed parameters {
  vector[J] alpha;
  vector[N] y_hat;
  for (i in 1:J) {
    alpha[i] = a[i] + Z[i] * gamma1;
  }
  for (i in 1:N) {
    y_hat[i] = alpha[clust[i]] + X[i] * beta;
  }
}
model {
  sigma_clust ~ inv_gamma(0.001, 0.001);  // priors
  sigma_ind ~ inv_gamma(0.001, 0.001);
  gamma0 ~ normal(0, 25);
  gamma1 ~ normal(0, 25);
  beta ~ normal(0, 25);
  a ~ normal(gamma0, sigma_clust);
  y ~ normal(y_hat, sigma_ind);
}
"
When I fit this to simulated data with true parameter values gamma0 = 0.0, gamma1 = 0.1, and beta = 0.2, I get completely wrong estimates for gamma0 (far from 0) and for gamma1, while the lmer function in R recovers them much more accurately. I’ve tried different settings for the number of iterations, the maximum tree depth, and adapt_delta, and I get the same results every time. Is there some kind of obvious mistake, or maybe a way to reparameterize the model? Thanks a lot.
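For reference, my data-generating process looks roughly like the sketch below. Only the true values of gamma0, gamma1, and beta are the ones quoted above; the number of clusters, the cluster sizes, the two SDs, and the standard-normal covariates are illustrative assumptions, not necessarily the exact values I used:

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative sizes and SDs (assumptions); true fixed effects as quoted above.
J, n_per = 20, 30                  # clusters, individuals per cluster
gamma0, gamma1, beta = 0.0, 0.1, 0.2
sigma_clust, sigma_ind = 0.5, 1.0

N = J * n_per
clust = np.repeat(np.arange(J), n_per)       # 0-based here; Stan expects 1-based
Z = rng.normal(size=J)                       # cluster-level covariate
X = rng.normal(size=N)                       # individual-level covariate
a = rng.normal(gamma0, sigma_clust, size=J)  # random intercepts around gamma0
alpha = a + gamma1 * Z                       # cluster intercepts, matching the model
y = rng.normal(alpha[clust] + beta * X, sigma_ind)

stan_data = dict(J=J, N=N, clust=clust + 1, y=y, X=X, Z=Z)
```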