I have the model below, which I run with 4 chains. It's a logit-normal regression with a multivariate normal prior on the coefficients (non-centered, via beta_z and prior_L) and a uniform prior on the scale y_sigma. Even after 4000 iterations per chain, 1000 of which are warmup, the chains don't converge at all. First I get an error message saying
Log probability evaluates to log(0), i.e. negative infinity.
Then I noticed in the diagnostic plots that while the beta_z parameters move around somewhat, y_sigma never moves: it is stuck at 5998 in every sample of every chain. Finally I get the error that
The largest R-hat is Inf, indicating chains have not mixed.
and that every transition is divergent.
What is wrong?
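For reference, the per-observation log density I am trying to accumulate in target is the logit-normal likelihood with location mu_n = row(X, n) * beta and scale sigma = y_sigma:

$$
\log f(y_n \mid \mu_n, \sigma) \;=\; -\frac{\big(\operatorname{logit}(y_n) - \mu_n\big)^2}{2\sigma^2} \;-\; \log\big(y_n(1 - y_n)\big) \;-\; \log\big(\sigma\sqrt{2\pi}\big)
$$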
data {
  int<lower=1> N;                    // number of training observations
  int<lower=1> M;                    // number of predictors
  vector[N] y;                       // responses, strictly inside (0, 1)
  matrix[N, M] X;                    // training design matrix
  cholesky_factor_cov[M, M] prior_L; // Cholesky factor of the prior covariance of beta
  vector[M] prior_betas;             // prior means of beta
  int<lower=1> test_N;               // number of test observations
  matrix[test_N, M] test_X;          // test design matrix
}
parameters {
  vector[M] beta_z;
  real<lower=0> y_sigma;
}
model {
  // non-centered parameterization: beta ~ MVN(prior_betas, prior_L * prior_L')
  vector[M] beta = prior_betas + prior_L * beta_z;
  beta_z ~ normal(0, 1);
  y_sigma ~ uniform(0.5, 2.5);
  // log likelihood accumulated term by term
  for (n in 1:N) {
    target += -pow((logit(y[n]) - (row(X, n) * beta)) / (sqrt2() * pi()), 2.0);
    target += -log(y[n] * (1 - y[n]));
    target += -log(y_sigma * sqrt(2 * pi()));
  }
}
generated quantities {
  vector[N] pred;
  vector[test_N] test_pred;
  vector[M] beta = prior_betas + prior_L * beta_z;
  // posterior predictive draws mapped back to the (0, 1) scale
  for (n in 1:N) {
    pred[n] = inv_logit(normal_rng(row(X, n) * beta, y_sigma));
  }
  for (n in 1:test_N) {
    test_pred[n] = inv_logit(normal_rng(row(test_X, n) * beta, y_sigma));
  }
}
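For completeness, here is a minimal sketch of how the model is compiled and sampled, assuming CmdStanPy as the interface; the file name and the data below are illustrative placeholders rather than my real inputs.

import numpy as np
from cmdstanpy import CmdStanModel

# Placeholder data with the same shapes and types the model expects.
N, M, test_N = 200, 3, 50
rng = np.random.default_rng(1)
data = {
    "N": N,
    "M": M,
    "y": rng.uniform(0.01, 0.99, size=N),      # responses strictly inside (0, 1)
    "X": rng.normal(size=(N, M)),
    "prior_L": np.linalg.cholesky(np.eye(M)),  # Cholesky factor of the prior covariance
    "prior_betas": np.zeros(M),
    "test_N": test_N,
    "test_X": rng.normal(size=(test_N, M)),
}

model = CmdStanModel(stan_file="logit_normal.stan")  # placeholder file name
# 4 chains; 1000 warmup + 3000 sampling iterations = 4000 total per chain
fit = model.sample(data=data, chains=4, iter_warmup=1000, iter_sampling=3000)
print(fit.diagnose())
print(fit.summary().head())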