Help implementing Marginalized Distribution


Hello Everyone,

I am implementing a marginalized Poisson-gamma hierarchical model in PyStan and I can never get it to initialize properly. Any help would be appreciated.



       n ~ poisson(p)
       c ~ gamma(n, m)


Marginalizing out n gives the density of c (I_1 is the modified Bessel function of the first kind):

      p(c) = exp(-p)                                                        if c == 0
      p(c) = sqrt(p / (m * c)) * exp(-c/m - p) * I_1(2 * sqrt(c * p / m))   if c > 0
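
As a quick sanity check (my addition, not from the thread), the closed form above can be compared numerically against the direct marginalization over n. Here m is treated as the gamma *scale*, consistent with the exp(-c/m) term, and I_1 is computed from its power series:

```python
import math

def marginal_pdf(c, p, m):
    # Closed form: sqrt(p/(m*c)) * exp(-c/m - p) * I_1(2*sqrt(c*p/m)),
    # with I_1 (modified Bessel, first kind) computed by its power series.
    z = 2.0 * math.sqrt(c * p / m)
    i1 = sum((z / 2.0) ** (2 * k + 1) / (math.factorial(k) * math.factorial(k + 1))
             for k in range(60))
    return math.sqrt(p / (m * c)) * math.exp(-c / m - p) * i1

def marginal_by_summation(c, p, m, n_max=60):
    # Direct marginalization: sum over n >= 1 of
    # Poisson(n | p) * Gamma(c | shape n, scale m), in log space for stability.
    total = 0.0
    for n in range(1, n_max + 1):
        log_pois = -p + n * math.log(p) - math.lgamma(n + 1)
        log_gam = (n - 1) * math.log(c) - c / m - n * math.log(m) - math.lgamma(n)
        total += math.exp(log_pois + log_gam)
    return total
```

For c > 0 the two functions should agree to machine precision for moderate p.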

Stan Code:

model_1 = """
functions {
  real p_lpdf(real y, real p, real m) {
    if (y == 0) {
      return -p;
    } else {
      return 0.5 * log(p / (y * m)) - y / m - p
             + log(modified_bessel_first_kind(1, 2 * sqrt(y * p / m)));
    }
  }
}
data {
  int<lower=0> N;
  vector[N] y;
}
parameters {
  real<lower=0.1> lam;
  real<lower=0.1> alpha;
}
model {
  for (n in 1:N) {
    target += p_lpdf(y[n] | lam, alpha);
  }
}
"""

pooled_data_dict = {'N': 1000, 'y': y}
pooled_fit = pystan.stan(model_code=model_1, data=pooled_data_dict, iter=1000, chains=1)


I don’t see how they are connected. If you say n ~ poisson(c), then n is marginally negative binomial.
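
(That marginal fact can be checked numerically; this sketch and the shape/rate values in it are my illustration, not from the thread. If c ~ Gamma(a, rate b) and n | c ~ Poisson(c), then marginally n ~ NegBinomial(a, b/(b+1)).)

```python
import math
import random

random.seed(1)

def sample_poisson(lam):
    # Knuth's multiplication method; adequate for small lam
    limit = math.exp(-lam)
    k, prod = 0, 1.0
    while True:
        prod *= random.random()
        if prod <= limit:
            return k
        k += 1

a, b = 3.0, 2.0   # arbitrary gamma shape and rate, for illustration only
# Draw c ~ Gamma(a, rate b) (gammavariate takes shape, scale), then n | c ~ Poisson(c)
n_draws = [sample_poisson(random.gammavariate(a, 1.0 / b)) for _ in range(100_000)]
emp_p0 = n_draws.count(0) / len(n_draws)
nb_p0 = (b / (b + 1.0)) ** a   # NegBinomial(a, b/(b+1)) mass at zero
```

The empirical frequency of n == 0 should land within Monte Carlo error of the negative binomial mass at zero.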


I have just modified the post to represent what I meant. The actual model is:


n ~ poisson(p)
c ~ gamma(n, m)

Thanks for pointing out my mistake.


Could you provide the derivation of the math behind this? Is there a paper?


What is the story behind having a Poisson produce an integer parameter for a gamma?

Your model could run into problems when n == 0, since gamma(c | 0, m) isn’t a proper density.

You can’t implement your model in Stan unless you can marginalize out n, i.e.,

p(c | m, p) = SUM_{n=0}^inf  poisson(n | p) * gamma(c | n, m)

Sometimes you can get away with restricting n to a range that’s tractable if p is small and all the probability mass lies near p.
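
To make that concrete, here is a small sketch (the cutoff rule p + 10 * sqrt(p) is just an illustrative choice, not a recommendation from the thread) showing how little Poisson mass a generous truncation discards:

```python
import math

def poisson_cdf(k, p):
    # P(N <= k) for N ~ Poisson(p), summing the pmf in log space
    return sum(math.exp(-p + n * math.log(p) - math.lgamma(n + 1))
               for n in range(k + 1))

p = 5.0
n_max = int(p + 10 * math.sqrt(p))   # generous upper cutoff, illustrative only
tail = 1.0 - poisson_cdf(n_max, p)   # probability mass the truncation discards
```

For p = 5 the discarded tail mass is far below 1e-8, so a truncated sum over n in 0..n_max loses essentially nothing.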

Alternatively, you can use a continuous approximation:

n ~ ... some positive continuous distribution
        with mean p and variance p ...
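
One concrete candidate (my suggestion; the post deliberately leaves the choice open) is a gamma with shape p and rate 1, which has mean p and variance p, matching the first two moments of Poisson(p):

```python
import math
import random

random.seed(0)
p = 4.0
# Gamma(shape=p, rate=1) has mean p and variance p; random.gammavariate
# takes (shape, scale), and scale = 1/rate = 1 here.
draws = [random.gammavariate(p, 1.0) for _ in range(200_000)]
mean = sum(draws) / len(draws)
var = sum((x - mean) ** 2 for x in draws) / len(draws)
```

Being continuous and positive, this relaxation can stand in for n directly as a Stan parameter, at the cost of no longer being exactly the Poisson model.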