Hi,
I’m having a very difficult time getting a rather simple model to fit: the traces wander, the chains don’t mix, and there’s no “fuzzy caterpillar” of samples. No matter what I try (centered, non-centered, diffuse priors, informative priors), I can’t get good posterior samples.
I eventually want to add many more things to this model, but I need to get the basic version working first.
Here is the base model:
data {
  int<lower=0> N;
  vector[N] y;
  int<lower=0> J; // Number of groups
  array[N] int group;
}
parameters {
  real<lower=0> a0;
  real<lower=0> sg;
  real<lower=0> a_group_tau;
  vector<lower=0>[J] a_group;
}
model {
  a0 ~ normal(15, 0.1); // Informative prior taken from data
  sg ~ normal(0, 0.5); // Informative prior taken from data
  a_group_tau ~ normal(0, 0.1); // Informative
  a_group ~ normal(0, a_group_tau);
  for (n in 1:N) {
    real mu = a0 + a_group[group[n]];
    y[n] ~ normal(mu, sg);
  }
}
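Written out (with $\sigma$ for sg, $\tau$ for a_group_tau, and $\mathrm{Normal}^{+}$ meaning the distribution truncated at zero by the lower=0 constraints), this is the model I intend:

$$
\begin{aligned}
y_n &\sim \mathrm{Normal}\big(a_0 + a_{\mathrm{group}[n]},\ \sigma\big) \\
a_{\mathrm{group},j} &\sim \mathrm{Normal}^{+}(0,\ \tau) \\
a_0 &\sim \mathrm{Normal}^{+}(15,\ 0.1), \qquad \sigma \sim \mathrm{Normal}^{+}(0,\ 0.5), \qquad \tau \sim \mathrm{Normal}^{+}(0,\ 0.1)
\end{aligned}
$$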
Here is the non-centered model:
// non-centered version
// informative priors
data {
  int<lower=0> N;
  vector[N] y;
  int<lower=0> J; // Number of groups
  array[N] int group;
}
parameters {
  real<lower=0> a0;
  real<lower=0> sg;
  real<lower=0> group_scale;
  vector<lower=0>[J] group_eta;
}
transformed parameters {
  vector[J] a_group = group_scale * group_eta;
}
model {
  a0 ~ normal(15, 0.1); // Informative prior taken from data
  sg ~ normal(0, 0.5); // Informative prior taken from data
  group_scale ~ normal(0, 0.5);
  group_eta ~ normal(0, 1);
  for (n in 1:N) {
    real mu = a0 + a_group[group[n]];
    y[n] ~ normal(mu, sg);
  }
}
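The intent is that this matches the base model: writing $s$ for group_scale and $\eta_j$ for group_eta[j], scaling a standard half-normal gives

$$
a_{\mathrm{group},j} = s\,\eta_j, \qquad \eta_j \sim \mathrm{Normal}^{+}(0,\ 1)
\quad\Longrightarrow\quad a_{\mathrm{group},j} \mid s \sim \mathrm{Normal}^{+}(0,\ s),
$$

so $s$ plays the same role as a_group_tau in the centered version.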
The posterior samples I get look “reasonable”, in the sense that they’re in the ranges I would expect, but R-hat is bad and the trace plots look terrible.
My guess is that the sampler is somehow trading weight back and forth between the population intercept a0 and the group effects a_group, since only their sum enters the likelihood.
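Concretely, for any constant $c$ (ignoring the lower=0 constraints for a moment),

$$
\mu_n = a_0 + a_{\mathrm{group}[n]} = (a_0 + c) + \big(a_{\mathrm{group}[n]} - c\big),
$$

so the likelihood is unchanged when the intercept and the group effects trade off against each other; only the priors and the positivity constraints determine how that total gets split.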
What I’m looking for is some guidance on how to resolve this: what steps can I take to find better priors, or a better way to parameterize my model? I plan to add a few more random effects, and I would love to have a principled workflow for doing that well.
Any and all suggestions would be appreciated!!