Help with specifying a hierarchical Poisson-binomial-Gaussian mixture

It sounds to me like what you're describing is a variant of the N-mixture model, or something close to it, but where the binomial success/failure observation is also latent because all you observe is the score of the audio clip. I have some general advice for you, but it's difficult to say exactly where to go with this without a reproducible example.

There are a few threads on the forum addressing the N-mixture model, and some experts who haunt these forums, like @jsocolar, may be able to provide you with some additional insight. See:

Also check out the ubms package, which implements the Royle-Nichols (2003) model as the occuRN function: Bayesian Models for Data from Unmarked Animals using Stan • ubms

I'm seeing a few things in your code that look like likely sources of sampling problems. First is the uniform prior on lambda: a hard-bounded prior like that causes trouble when the parameter isn't declared with matching bounds, and there doesn't seem to be a theoretical reason why lambda should only go as high as 15 anyway; declaring lambda with a lower bound of zero and a weakly informative prior is usually safer (see the sketch further down). Second, it looks to me like the order of the elements in the normal mixture should be reversed.
This line:
if (i == 0){target += poisson_lpmf(i | lambda) + normal_lpdf(y[site, j] | mu1, sigma[1]);}
implies that p_det = 1 when there are zero animals, but shouldn't p_det = 0, since 1 - (1 - p)^0 = 0? I think the statement below that should read:

target += poisson_lpmf(i | lambda) + log_mix(p_det[i], normal_lpdf(y[site, j] | mu2, sigma[2]), normal_lpdf(y[site, j] | mu1, sigma[1]));

(Aside: the if/else statement is unnecessary; run the loop from 1:N_max and handle the i == 0 term on its own line outside the loop, as in the sketch below.)
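To make that concrete, here is a minimal sketch of how I'd write the marginalized likelihood. I'm assuming mu1/sigma[1] is the "no detection" score component, mu2/sigma[2] the "detection" component, and p a per-individual detection probability; the data names (N_sites, J, N_max) and the priors are placeholders, not your actual code:

```stan
// Minimal sketch only -- adapt names, dimensions, and priors to your data.
data {
  int<lower=1> N_sites;
  int<lower=1> J;                       // audio clips per site
  int<lower=1> N_max;                   // truncation point; set well above plausible abundance
  matrix[N_sites, J] y;                 // clip scores
}
parameters {
  real<lower=0> lambda;                 // declare the lower bound instead of a uniform(0, 15) prior
  real<lower=0, upper=1> p;             // per-individual detection probability
  real mu1;                             // "no detection" score mean (assumed)
  real mu2;                             // "detection" score mean (assumed)
  vector<lower=0>[2] sigma;
}
model {
  lambda ~ lognormal(0, 1);             // weakly informative; adjust to your system
  p ~ beta(1, 1);
  // ... priors for mu1, mu2, sigma ...

  for (site in 1:N_sites) {
    for (j in 1:J) {
      vector[N_max + 1] lp;
      // i = 0 animals: p_det = 1 - (1 - p)^0 = 0, so the score can only come
      // from the "no detection" component.
      lp[1] = poisson_lpmf(0 | lambda)
              + normal_lpdf(y[site, j] | mu1, sigma[1]);
      for (i in 1:N_max) {
        real p_det = 1 - (1 - p)^i;     // Royle-Nichols detection probability
        lp[i + 1] = poisson_lpmf(i | lambda)
                    + log_mix(p_det,
                              normal_lpdf(y[site, j] | mu2, sigma[2]),
                              normal_lpdf(y[site, j] | mu1, sigma[1]));
      }
      // marginalize the latent abundance on the probability scale
      target += log_sum_exp(lp);
    }
  }
}
```

The log_sum_exp over lp is what does the marginalization over the latent abundance; each element of lp holds the joint log probability for one value of the latent count. Depending on how well separated the two score distributions are, you might also declare the means as an ordered vector to avoid label switching in the mixture.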

Other than that, at first glance it looks to me like this should work, but the best way to proceed is to simulate the data generating process first and then fit the model to the simulated data to see whether you can recover the parameters. Have you done that?
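If not, a rough Stan sketch of a simulator for this generative story might look like the following (run it with the fixed_param algorithm; the *_sim "true" values and the names N_sites and J are made up for illustration):

```stan
// Rough fake-data generator for the same generative story.
data {
  int<lower=1> N_sites;
  int<lower=1> J;
}
generated quantities {
  real lambda_sim = 3;                  // made-up "true" values
  real p_sim = 0.4;
  real mu1_sim = -1;                    // "no detection" score mean
  real mu2_sim = 2;                     // "detection" score mean
  real sigma1_sim = 1;
  real sigma2_sim = 1;
  array[N_sites] int N_true;            // latent abundance per site
  matrix[N_sites, J] y_sim;             // simulated clip scores
  for (site in 1:N_sites) {
    N_true[site] = poisson_rng(lambda_sim);
    for (j in 1:J) {
      real p_det = 1 - (1 - p_sim)^N_true[site];
      if (bernoulli_rng(p_det) == 1) {
        y_sim[site, j] = normal_rng(mu2_sim, sigma2_sim);
      } else {
        y_sim[site, j] = normal_rng(mu1_sim, sigma1_sim);
      }
    }
  }
}
```

If the fitted model recovers those values reasonably well across a few simulated data sets, that's good evidence the model is specified the way you intend; if it can't, you've found the problem cheaply.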
