I’m trying to fit a zero-truncated Poisson model to about 200,000 data points, and fitting takes on the order of an hour. This surprised me because a lognormal model fit to a similar amount of data took under a minute.

Is there anything I can do to speed this process up? I can’t share my data, but I will share Python code to simulate the problem.

```
from scipy.stats import poisson
import pystan

l = 1.5
poisson_draws = poisson.rvs(l, size=150000)
poisson_draws = poisson_draws[poisson_draws > 0]  # drop zeros to get truncated data

stan_data = {
    "n": len(poisson_draws),
    "y": poisson_draws,
}

model = pystan.StanModel(file="truncated_poisson.stan")  # the Stan program below
fit = model.sampling(data=stan_data, iter=4000, warmup=1000)
```
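As a sanity check on the simulation (separate from the Stan fit), the empirical mean of the truncated draws should match the zero-truncated Poisson mean computed in the generated quantities block, lambda * exp(lambda) / (exp(lambda) - 1). A minimal sketch (the `random_state` seed is just for reproducibility):

```python
import numpy as np
from scipy.stats import poisson

l = 1.5
draws = poisson.rvs(l, size=150000, random_state=0)
draws = draws[draws > 0]  # same truncation as the model data

empirical = draws.mean()
# zero-truncated Poisson mean: lambda * e^lambda / (e^lambda - 1)
theoretical = l * np.exp(l) / (np.exp(l) - 1)
print(empirical, theoretical)
```

With 150,000 draws the two agree to about two decimal places, so the data really do follow the truncated distribution the model assumes.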

```
data {
  int<lower=1> n;
  int<lower=1> y[n];
}
parameters {
  real<lower=0> lambda;
}
model {
  lambda ~ gamma(20, 20);
  for (a in 1:n) {
    y[a] ~ poisson(lambda) T[1, ];  // truncated Poisson
  }
}
generated quantities {
  real zero_truncated_mean;
  zero_truncated_mean = (lambda * exp(lambda)) / (exp(lambda) - 1);
}
```