Hierarchical model with Gaussian mixture parameters

I have a hierarchical model with the following likelihood function:

y[i] ~ normal(alpha[i]*x[i]+beta[i], theta[i]*x[i]+gamma[i])

My previous model used normal priors for all the parameters, but now I have one parameter, alpha, which is a Gaussian mixture.

The problem is that I can’t manage to write the model properly.

For example, I tried to write the model like this:

data {
    //The model's data
    int<lower=0> N; //number of observations
    int<lower=0> hosts; //number of different hosts
    matrix[hosts,N] x;
    matrix<lower=0>[hosts,N] y;
}
parameters {
    real<lower=0, upper=1> delta;

    //The raw parameters
    vector[hosts] alpha_raw;
    vector[hosts] beta_raw;
    vector<lower=0>[hosts] theta_raw;
    vector<lower=0>[hosts] gamma_raw;

    //The hyper raw parameters
    ordered[2] hyper_alpha_raw;
    real hyper_beta_raw;
    real<lower=0> hyper_theta_raw;
    real<lower=0> hyper_gamma_raw;

    //The hyper parameters sigma

    vector<lower=0>[2] sigma_alpha;
    real<lower=0> sigma_beta;
    real<lower=0> sigma_theta;
    real<lower=0> sigma_gamma;
}
transformed parameters {
    //The real parameters
    vector[hosts] alpha;
    vector[hosts] beta;
    vector<lower=0>[hosts] theta;
    vector<lower=0>[hosts] gamma;

//The hyper parameters mu
    ordered[2] mu_alpha;
    real mu_beta;
    real<lower=0> mu_theta;
    real<lower=0> mu_gamma;

    //The priors for the hyper parameters mu
    mu_alpha[1] = 5e-10 + hyper_alpha_raw[1] * 2e-10;
    mu_alpha[2] = 7e-10 + hyper_alpha_raw[2] * 2e-10;
    mu_beta = 1e-05 + hyper_beta_raw * 2e-05;
    mu_theta = 1e-10 + hyper_theta_raw * 2e-10;
    mu_gamma = 1e-05 + hyper_gamma_raw * 2e-05;

    //The priors for the parameters
    beta = mu_beta + beta_raw * sigma_beta;
    theta = mu_theta + theta_raw * sigma_theta;
    gamma = mu_gamma + gamma_raw * sigma_gamma;
}
model {
    delta ~ beta(5,5);

    //The priors for the raw parameters
    beta_raw ~ normal(0,1);
    theta_raw ~ normal(0,1);
    gamma_raw ~ normal(0,1);

//The priors for the hyper raw parameters
    hyper_alpha_raw ~ normal(0,1);
    hyper_beta_raw ~ normal(0,1);
    hyper_theta_raw ~ normal(0,1);
    hyper_gamma_raw ~ normal(0,1);

    //The priors for the hyper parameters sigma
    sigma_alpha ~ normal(mu_alpha,1e-10);
    sigma_beta ~ normal(mu_beta,1e-05);
    sigma_theta ~ normal(mu_theta,1e-10);
    sigma_gamma ~ normal(mu_gamma,1e-05);

    //The likelihood
    for(i in 1:hosts){
      target += log_mix(delta, normal_lpdf(alpha[i] | mu_alpha[1], sigma_alpha[1]), normal_lpdf(alpha[i] | mu_alpha[2], sigma_alpha[2]));
    }
    for(i in 1:hosts){
      y[i] ~ normal(alpha[i]*x[i]+beta[i], theta[i]*x[i]+gamma[i]);
    }
}

but when I make the fit, it gives me this error message:

“Exception: normal_lpdf: Random variable is nan, but must not be nan! (in ‘model3a0d7af1978a_424a8dc62d8e4dc5659881bfd96ade10’ at line 78)”

I figure I should give alpha a non-centered definition, something like alpha = mu_alpha[1 or 2] + alpha_raw * sigma_alpha[1 or 2], but I don’t know how to write in Stan that there’s a probability delta of picking mu_alpha[1] and sigma_alpha[1] and a probability 1-delta of picking mu_alpha[2] and sigma_alpha[2].

Or perhaps I’m wrong and this isn’t how I should write that alpha is a Gaussian mixture. Could you please help me with this?

The problem is probably that you never assign a value to alpha; it is only declared in transformed parameters, so it stays NaN. If alpha really is a parameter, it should be declared in the parameters block, though I would find the model resulting from that a bit weird.
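To make that concrete, here is a minimal sketch (assuming the rest of your blocks stay as they are) that declares alpha in parameters and reuses your own log_mix statement as its marginalized two-component prior:

```stan
parameters {
  // alpha is now a genuine parameter, so Stan samples it directly
  vector[hosts] alpha;
  // ... the other parameters as before
}
model {
  // marginalized mixture prior on alpha: with probability delta the
  // first component applies, with probability 1 - delta the second
  for (i in 1:hosts) {
    target += log_mix(delta,
                      normal_lpdf(alpha[i] | mu_alpha[1], sigma_alpha[1]),
                      normal_lpdf(alpha[i] | mu_alpha[2], sigma_alpha[2]));
  }
}
```

Since the mixture is marginalized, there is no discrete indicator variable to sample, which is what HMC requires anyway.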

There are many fishy things about the model (e.g. why do you have the intermediate mu_XXX parameters, why are the sigma_XXX priors bound to them, and why with such tight scales?), so I am not sure what you want to achieve.

In any case, I would expect you to get a lot of mileage from the brms package, which allows mixture distributions as a response and lets you specify the model as an R formula. The advantage is that brms has well-tested and tuned Stan implementations of these models, so you could save yourself some pain struggling with Stan syntax and philosophy. See the examples at https://rdrr.io/cran/brms/man/mixture.html (though they are not completely illuminating).

Hope that helps and best of luck!