How do I code an offset into my Poisson GLM?

I have an insurance data set and want to predict claim frequency. Using the Python statsmodels package, I set the risk units (named eunits) as an offset in the model object. In Stan, however, I'm not sure how to code it. Do I just subtract the eunit at the end, as in the model block below?

liability_code = """
data {
    int<lower=0> N;
    vector[N] driver_age;
    vector[N] BIlmt;
    vector[N] eunit;   // exposure (risk units); needed in data for the model block to reference it
    int<lower=0> y1[N];
}
parameters {            
    real driver_age_coeff; // coefficient for driver_age (what we want to infer)
    real BIlmt_coeff;      // coefficient for the BI limit (what we want to infer)
    real intercept;        // intercept
}
model {
    driver_age_coeff ~ normal(0, 1);  
   BIlmt_coeff ~ norm(0, 1);
    intercept ~ normal(0,5);
    y1 ~ poisson(driver_age_coeff*driver_age + BIlmt_coeff*BIlmt + intercept - eunit);   
}
"""

norm(0, 1) should be normal(0, 1).

If you subtract eunit (i.e. - eunit), you set a negative offset, so + eunit should be the right thing.
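To see why the sign matters: under a log link (as in the statsmodels fit), the offset acts multiplicatively on the mean, so subtracting it divides by the exposure instead of multiplying. A minimal sketch with hypothetical numbers:

```python
import math

eta, offset = 1.3, 0.7  # hypothetical linear predictor and (log-scale) offset

mean_plus = math.exp(eta + offset)   # offset added: mean scaled up by exp(offset)
mean_minus = math.exp(eta - offset)  # offset subtracted: mean divided by exp(offset)

print(mean_plus, mean_minus)
```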

Thank you. And if I wanted to code a specific coefficient as an offset in the model (to account for a discount), would the correct way be:

liability_code = """
data {
    int<lower=0> N;
    vector[N] driver_age;
    vector[N] BIlmt;
    vector[N] discount;
    vector[N] eunit;   // exposure (risk units)
    int<lower=0> y1[N];
}
parameters {            
    real driver_age_coeff; // coefficient for driver_age (what we want to infer)
    real BIlmt_coeff;      // coefficient for the BI limit (what we want to infer)
    real intercept;        // intercept
}
model {
    driver_age_coeff ~ normal(0, 1);
    BIlmt_coeff ~ normal(0, 1);
    intercept ~ normal(0, 5);
    y1 ~ poisson(driver_age_coeff*driver_age + BIlmt_coeff*BIlmt + discount*.9 +  intercept + eunit);   
}
"""

Even better, you could use std_normal(), which is slightly more efficient.

Yes, thank you. Is the discount coded correctly?