I am building a three-level multilevel model for panel data. Since my dependent variable is a count, I want to fit a negative binomial multilevel model with three levels. My model looks like the following:

```
functions {
  /*
   * Alternative to neg_binomial_2_log_rng() that
   * avoids potential numerical problems during warmup
   */
  int neg_binomial_2_log_safe_rng(real eta, real phi) {
    real gamma_rate = gamma_rng(phi, phi / exp(eta));
    if (gamma_rate >= exp(20.79))
      return -9;
    return poisson_rng(gamma_rate);
  }
}
data {
  // Define variables in data
  // Number of level-1 observations (an integer)
  int<lower=0> Ni;
  // Number of level-2 clusters
  int<lower=0> Nj;
  // Number of level-3 clusters
  int<lower=0> Nk;
  // Number of models x countries
  int<lower=0> Njk;
  // Number of fixed-effect parameters
  int<lower=0> p;
  // Number of random-effect parameters
  int<lower=0> Npars;
  // Variables with fixed coefficients
  matrix[Ni, p] fixedEffectVars;
  // Cluster IDs
  int<lower=1> modelid[Ni];
  int<lower=1> countryid[Ni];
  // Level-3 lookup vector for level 2
  int<lower=1> countryLookup[Npars];
  int<lower=1> countryMonthLookup[Njk];
  // Count outcome
  int activations[Ni];
}
parameters {
  // Define parameters to estimate
  // Population intercept (a real number)
  real beta_0;
  real<lower=0> inv_phi;
  // Fixed effects
  vector[p] beta;
  // Level-1 errors (only used in the normal-likelihood variant below)
  real<lower=0> sigma_e0;
  // Level-2 random effects
  real u_0jk[Npars];
  real<lower=0> sigma_u0jk;
  // Level-3 random effects
  real u_0k[Nk];
  real<lower=0> sigma_u0k;
}
transformed parameters {
  // Varying intercepts
  real beta_0jk[Npars];
  real beta_0k[Nk];
  real phi = inv(inv_phi);
  // Varying intercepts definition
  // Level-3 (10 level-3 random intercepts)
  for (k in 1:Nk) {
    beta_0k[k] = beta_0 + u_0k[k];
  }
  // Level-2 (100 level-2 random intercepts)
  for (j in 1:Npars) {
    beta_0jk[j] = beta_0k[countryLookup[j]] + u_0jk[j];
  }
}
model {
  // Prior part of Bayesian inference
  // Random-effects distributions
  u_0k ~ normal(0, sigma_u0k);
  u_0jk ~ normal(0, sigma_u0jk);
  beta[1] ~ normal(-0.25, 1);
  beta[2] ~ normal(0, 1);
  beta[3] ~ normal(0, 1);
  beta[4] ~ normal(0.25, 1);
  beta[5] ~ normal(-0.25, 1);
  beta[6] ~ normal(0.25, 1);
  sigma_u0jk ~ normal(0, 1);
  sigma_u0k ~ normal(0, 1);
  inv_phi ~ normal(0, 1);
  // Likelihood part of Bayesian inference
  for (i in 1:Ni) {
    activations[i] ~ neg_binomial_2_log(beta_0jk[countryMonthLookup[i]] + fixedEffectVars[i, ] * beta, phi);
  }
}
generated quantities {
  real y_rep[Ni];
  for (i in 1:Ni) {
    real eta_n = beta_0jk[countryMonthLookup[i]] + fixedEffectVars[i, ] * beta;
    y_rep[i] = neg_binomial_2_log_safe_rng(eta_n, phi);
  }
}
```

When I try to estimate the model, the performance (fit of the model in terms of in-sample predictions) is extremely bad, and I get the following error message: `Exception: normal_lpdf: Scale parameter is 0, but must be > 0!`

at this line:

`activations[i] ~ neg_binomial_2_log(beta_0jk[countryMonthLookup[i]] + fixedEffectVars[i,]*beta, phi);`
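For reference, the only `normal` sampling statements in my model whose scale is itself a parameter are the random-effect priors (the priors on `beta` all have a fixed scale of 1):

```stan
// From the model block above: the hyperparameters sigma_u0k and
// sigma_u0jk are the only scale parameters that are estimated.
u_0k ~ normal(0, sigma_u0k);
u_0jk ~ normal(0, sigma_u0jk);
```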

The strange thing is that if I assume a normal distribution for y_{ijt}, i.e. replace

`neg_binomial_2_log(beta_0jk[countryMonthLookup[i]] + fixedEffectVars[i,]*beta, phi)`

with

`normal(beta_0jk[countryMonthLookup[i]] + fixedEffectVars[i,]*beta, sigma_e0)`,

the predictions are great and I don't get any error message.
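Concretely, the likelihood loop in the working normal variant reads (everything else in the model is unchanged):

```stan
// Normal-likelihood variant: same linear predictor, but with the
// level-1 residual scale sigma_e0 instead of the NB dispersion phi.
for (i in 1:Ni) {
  activations[i] ~ normal(beta_0jk[countryMonthLookup[i]]
                          + fixedEffectVars[i, ] * beta,
                          sigma_e0);
}
```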

Does anyone know what is going wrong here? Unfortunately, I cannot post any data for my model, but I checked it and there are no missing values or anything like that.

Any help is really appreciated!