Hi,

I’m trying to model a generative process with two independent Poisson random variables, such that:

X_1 \sim Poisson(\lambda_1)

X_2 \sim Poisson(\lambda_2)

Y = a_1X_1 + a_2X_2

I would like to build a model that estimates \lambda_1 and \lambda_2 from the observed outcomes Y and the observed coefficient vectors a_1 and a_2.

R code to simulate the generative process would look like this:

```r
library(rstan)
set.seed(12345)
N <- 2000        # number of simulated samples
a1 <- rnorm(N)   # observed coefficient vectors (real-valued)
a2 <- rnorm(N)
lambda1 <- 3     # the parameters we will try to recover
lambda2 <- 10
# simulated y: each observation is a linear combination of independent
# draws from the two Poisson distributions
y <- a1 * rpois(N, lambda1) + a2 * rpois(N, lambda2)
```
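As a quick sanity check that the lambdas are at least identifiable from data like this: since E[y_i | a_i] = a1_i \lambda_1 + a2_i \lambda_2 with mean-zero (heteroskedastic) residuals, an ordinary least-squares fit of y on a1 and a2 with no intercept should roughly recover them (re-running the simulation above, standalone):

```r
set.seed(12345)
N <- 2000
a1 <- rnorm(N)
a2 <- rnorm(N)
y <- a1 * rpois(N, 3) + a2 * rpois(N, 10)
# E[y_i | a_i] = 3*a1_i + 10*a2_i, so OLS without an intercept is consistent
fit <- lm(y ~ a1 + a2 - 1)
coef(fit)  # should come back near lambda1 = 3 and lambda2 = 10
```

So the parameters do seem recoverable from the first moment alone; the question is how to write down a proper likelihood.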

In short, I’m a little confused about where to start. I don’t think this is a mixture model (log_mix()…), since a mixture would draw each observation from one of the two distributions, not from a combination of both.

I *think* solving this would involve deriving the PDF of a new “linear combination of Poissons” distribution and specifying it as a user-defined PDF in Stan?
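In case it helps frame the question, the closest thing I can imagine writing down is an approximation rather than the exact distribution: match the first two moments of y_i given (a1_i, a2_i) and use a normal likelihood. A rough sketch (the priors here are just placeholders I made up):

```stan
// Moment-matched normal approximation (NOT the exact distribution):
//   E[y_i]   = a1_i * lambda1   + a2_i * lambda2
//   Var[y_i] = a1_i^2 * lambda1 + a2_i^2 * lambda2
data {
  int<lower=1> N;
  vector[N] y;
  vector[N] a1;
  vector[N] a2;
}
parameters {
  real<lower=0> lambda1;
  real<lower=0> lambda2;
}
model {
  vector[N] mu = a1 * lambda1 + a2 * lambda2;
  vector[N] v  = square(a1) * lambda1 + square(a2) * lambda2;
  // placeholder priors, just to keep the lambdas weakly regularized
  lambda1 ~ exponential(0.1);
  lambda2 ~ exponential(0.1);
  y ~ normal(mu, sqrt(v));
}
```

But that’s only an approximation, not the “correct” linear-combination likelihood I asked about above.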

Does this sound correct, or is there something I’m missing? With sufficient data I think \lambda_1 and \lambda_2 should be estimable, but I’m not sure there’s a straightforward way to implement the correct “linear combination” model…

(Still new to Stan)