I’ve been trying to calibrate a complex model and have narrowed the problem down to a piece I’m not sure I understand. This may be a trivial misunderstanding - I’m certainly new to hierarchical models - but it looks like strange sampler behavior, so I wanted to ask about it here.

I’m comparing two ways of drawing from a (hierarchical) prior: sampling it directly with HMC/NUTS, and generating draws with normal_rng in the generated quantities block. What I’ve found is that for the scale parameter of the normal distribution, the two disagree - and for some reason the NUTS-sampled prior places almost no mass near zero.

Here’s the minimal working example:

```stan
parameters {
  real<lower=0> sigma_bar;
  real<lower=0> sigma_sig;
  real<lower=0> sigma;
}
model {
  sigma_bar ~ normal(0.5, 0.5);
  sigma_sig ~ normal(0.5, 0.5);
  sigma ~ normal(sigma_bar, sigma_sig);
}
generated quantities {
  real<lower=0> s_bar;
  real<lower=0> s_sig;
  real<lower=0> s;
  s_bar = -1;
  s_sig = -1;
  s = -1;
  // rejection-sample each variable until it lands in [0, inf)
  while (s_bar < 0)
    s_bar = normal_rng(0.5, 0.5);
  while (s_sig < 0)
    s_sig = normal_rng(0.5, 0.5);
  while (s < 0)
    s = normal_rng(s_bar, s_sig);
}
```
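In case it helps to reproduce the comparison outside Stan, here is a minimal NumPy sketch of what the generated quantities block is doing (the function name, seed, and draw count are my own choices, not part of the original model):

```python
import numpy as np

rng = np.random.default_rng(0)

def trunc_normal_rejection(mu, sigma, rng):
    """Draw one normal(mu, sigma) variate, rejecting until it is
    non-negative - the same loop as in generated quantities."""
    x = rng.normal(mu, sigma)
    while x < 0:
        x = rng.normal(mu, sigma)
    return x

n = 100_000
s_bar = np.array([trunc_normal_rejection(0.5, 0.5, rng) for _ in range(n)])
s_sig = np.array([trunc_normal_rejection(0.5, 0.5, rng) for _ in range(n)])
s = np.array([trunc_normal_rejection(m, sd, rng)
              for m, sd in zip(s_bar, s_sig)])

# all draws are non-negative by construction
print(s_bar.min() >= 0, s_sig.min() >= 0, s.min() >= 0)
```

These draws do put appreciable mass near zero, which is what I expected from the Stan model as well.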

I’d appreciate any help anyone can give on this. Is this sampler behavior expected? If it is a problem, is there a way to avoid it? And am I correct in assuming that the two methods should produce the same distribution?

Thanks!

(I’m running 4 chains of 1e7 iterations each to produce these results.)