Hello all, I am trying to fit a stochastic frontier model (structurally close to a linear regression, except that an inefficiency term exp(u) is subtracted from the frontier) to balanced panel data. I allow the inefficiency u to follow an AR(1) process, i.e. to be correlated with its previous lag. With the code below I get very few effective samples (n_eff) and divergent transitions.

Could someone please help?
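For context, the process I have in mind for the inefficiency is u[t] = delta + rho * u[t-1] + N(0, phi), so at TIME == 1 I initialize u from its stationary distribution. A quick NumPy simulation of the stationary moments (the values of delta, rho, phi here are arbitrary, just for illustration):

```python
import numpy as np

# Simulate the AR(1) inefficiency process
#   u[t] = delta + rho * u[t-1] + N(0, phi)
# to check the stationary moments used to initialize u at TIME == 1.
rng = np.random.default_rng(0)
delta, rho, phi = 0.5, 0.6, 0.3

T = 200_000
u = np.empty(T)
u[0] = delta / (1 - rho)   # start at the stationary mean
for t in range(1, T):
    u[t] = delta + rho * u[t - 1] + rng.normal(0.0, phi)

burn = 1_000
print(u[burn:].mean())     # ~ delta / (1 - rho)       = 1.25
print(u[burn:].std())      # ~ phi / sqrt(1 - rho**2)  ≈ 0.375
```

The simulated stationary standard deviation is phi / sqrt(1 - rho^2), not phi / (1 - rho^2), which is what I am trying to encode in the TIME == 1 branch of the model.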

data {
  int<lower=1> N;                   // number of observations
  int<lower=1> J;                   // number of groups
  int<lower=1> P;                   // number of predictors
  int<lower=1> T;                   // number of time periods
  real Y[N];                        // dependent variable
  matrix[N, P] X;                   // matrix of independent variables
  int<lower=1, upper=J> dhb_id[N];  // group index
  int<lower=1, upper=T> TIME[N];    // time index
}

parameters {
  real alpha;
  real delta;
  vector[P] beta;
  real<lower=0, upper=1> rho;
  real<lower=0> sigma;
  real<lower=0> phi;
  vector[N] u;
}

model {
  alpha ~ normal(0, 1);
  beta ~ normal(0, 1);
  sigma ~ gamma(1, 1);
  delta ~ normal(0, 1);
  phi ~ gamma(1, 1);
  rho ~ beta(3, 4);

  // AR(1) prior on the inefficiency; rows are assumed sorted by group,
  // then by time, so u[n-1] is the same group's previous period.
  for (n in 1:N) {
    if (TIME[n] == 1) {
      // impose stationarity and initialize: normal() takes a standard
      // deviation, so the stationary sd is phi / sqrt(1 - rho^2)
      u[n] ~ normal(delta / (1 - rho), phi / sqrt(1 - rho * rho));
    } else {
      u[n] ~ normal(delta + rho * u[n - 1], phi);
    }
  }

  Y ~ normal(alpha + X * beta - exp(u), sigma);
}
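One assumption in the model is that u[n-1] is the same group's previous period, which only holds if the rows are sorted by group and then by time. This is a minimal pre-flight check I run on the data before passing it to Stan (the dhb_id / TIME arrays below are made-up examples, not my real data):

```python
import numpy as np

# The statement u[n] ~ normal(delta + rho * u[n-1], phi) assumes that
# row n-1 is the same group observed at TIME[n] - 1. Verify that every
# non-initial row continues its predecessor's group and time sequence.
dhb_id = np.array([1, 1, 1, 2, 2, 2])  # example: sorted by group, then time
TIME   = np.array([1, 2, 3, 1, 2, 3])

ok = all(
    (TIME[n] == 1)
    or (dhb_id[n] == dhb_id[n - 1] and TIME[n] == TIME[n - 1] + 1)
    for n in range(len(TIME))
)
print(ok)  # True for this ordering
```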

Any help would be greatly appreciated.