I have noticed that my linear regression model gives very accurate fitted values, except when the true parameter values are very small (on the order of 1e-7 and below). Is this loss of precision expected, and how can I get an accurate fit at these scales?
I use the following model:
data {
  int<lower=0> N;
  vector[N] x;
  vector[N] y;
}
parameters {
  real intercept;
  real first_degree;
  real<lower=0> sigma;
}
model {
  intercept ~ normal(7.7e-07, 7.7e-08);
  first_degree ~ normal(6.9e-11, 6.9e-12);
  sigma ~ normal(1.9e-12, 1.9e-13);
  y ~ normal(first_degree * x + intercept, sigma);
}
with the following data, simulated in R:
N = 500
intercept = 7.661272e-07
first_degree = 6.862693e-11
sigma = 1.895320e-12
x = sample(0:500, N, replace=TRUE)
y = first_degree * x + intercept + rnorm(N, sd=sigma)
df = data.frame(x=x, y=y)
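Would rescaling the data before fitting be the right approach here? This is a sketch of what I have in mind (the factor scale_y is my own arbitrary choice, picked to bring y up to order 1):

# Sketch of a possible workaround, not something I have verified:
# multiply y by a large constant before fitting, then divide the
# estimated intercept, slope, and sigma by the same constant to
# recover them on the original scale.
scale_y = 1e7                               # y is ~1e-7, so y*scale_y is ~1
df_scaled = data.frame(x = df$x, y = df$y * scale_y)
# Fit the Stan model on df_scaled, then transform back, e.g.:
#   intercept_hat    = intercept_hat_scaled    / scale_y
#   first_degree_hat = first_degree_hat_scaled / scale_y
#   sigma_hat        = sigma_hat_scaled        / scale_y
# (The priors in the Stan model would also need to be rescaled to match.)

Or is there a more standard way to handle this in Stan?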