Hi all,

I have reaction time data, and comparing models with different distributions suggests that the Weibull (with the identity link) is the best fit. The data have a multi-level structure (different participants and different conditions), but I can only get relatively simple models to fit. The reason is that the sampler has trouble finding initial values for the scale parameter and throws the following error multiple times:

```
Chain 1: Rejecting initial value:
Chain 1: Error evaluating the log probability at the initial value.
Chain 1: Exception: weibull_lpdf: Scale parameter[1] is -1.16905, but must be > 0! (in 'model2124403820f7_fedaec10e52be962e84af56da16ca07c' at line 40)
```

For simple models it manages to initialise eventually, but as the number of parameters increases it becomes more problematic.

The following is the Stan code from `brms::make_stancode` for a very simple `y ~ x` model.

```
functions {
}
data {
  int<lower=1> N;  // number of observations
  vector[N] Y;  // response variable
  int<lower=1> K;  // number of population-level effects
  matrix[N, K] X;  // population-level design matrix
  int prior_only;  // should the likelihood be ignored?
}
transformed data {
  int Kc = K - 1;
  matrix[N, Kc] Xc;  // centered version of X
  vector[Kc] means_X;  // column means of X before centering
  for (i in 2:K) {
    means_X[i - 1] = mean(X[, i]);
    Xc[, i - 1] = X[, i] - means_X[i - 1];
  }
}
parameters {
  vector[Kc] b;  // population-level effects
  real temp_Intercept;  // temporary intercept
  real<lower=0> shape;  // shape parameter
}
transformed parameters {
}
model {
  vector[N] mu = temp_Intercept + Xc * b;
  for (n in 1:N) {
    mu[n] = (mu[n]) / tgamma(1 + 1 / shape);
  }
  // priors including all constants
  target += student_t_lpdf(temp_Intercept | 3, 3, 10);
  target += gamma_lpdf(shape | 0.01, 0.01);
  // likelihood including all constants
  if (!prior_only) {
    target += weibull_lpdf(Y | shape, mu);
  }
}
generated quantities {
  // actual population-level intercept
  real b_Intercept = temp_Intercept - dot_product(means_X, b);
}
```

I can see that the issue is in the line `mu[n] = (mu[n]) / tgamma(1 + 1 / shape);`, which then gets used as the scale parameter when calculating the log likelihood. Without `exp(mu[n])` (which is present when I use a log link), the scale parameter can be negative, hence the error. This makes me wonder whether it is sensible to use the identity link with the Weibull at all.
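To illustrate the point numerically, here is a small Python sketch (not part of the brms code, just a toy reproduction) of how the linear predictor is converted to the Weibull scale via the mean parameterization, mean = scale × Γ(1 + 1/shape), under each link:

```python
import math

def weibull_scale(eta, shape, link="identity"):
    """Convert a linear predictor eta to the Weibull scale parameter,
    mirroring the brms mean parameterization: mean = scale * gamma(1 + 1/shape)."""
    mu = eta if link == "identity" else math.exp(eta)  # log link applies exp() first
    return mu / math.gamma(1 + 1 / shape)

# Identity link: a negative linear predictor yields a negative scale,
# which weibull_lpdf rejects.
print(weibull_scale(-1.0, shape=2.0, link="identity"))  # negative

# Log link: exp() guarantees a positive scale for any eta.
print(weibull_scale(-1.0, shape=2.0, link="log"))  # positive
```

So with the identity link, nothing in the model prevents `temp_Intercept + Xc * b` from going negative at the randomly chosen initial values, which is exactly what the error message reports.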

I'm relatively new to Stan, so I'm not sure of the best way to solve this. I've tried setting priors that constrain the intercept and coefficient to be positive, and also setting initial values on my parameters (as per the advice for similar issues to mine), but neither appears to solve the problem. Perhaps I also need to turn centering off (as well as using sensible priors and initial values), but when I try that the model either runs very slowly or doesn't converge.

Any advice appreciated.