Maybe you could unpack that notation. What is the three-argument “Bin” function doing and is “nBin” a different function? If it’s negative binomial, which of the bajillion parameterizations is “nBin” assuming? Is v_i^{+} etc. some kind of array or function?
This is going to be a problem. The exp() is likely to underflow. It’s better to do something like the following, which won’t underflow.
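Here is a rough sketch (the data, parameters, and distributions are placeholders standing in for your actual likelihood terms): keep each component on the log scale and combine them with log_sum_exp() rather than summing exp() terms.

```stan
data {
  int<lower=1> N;                 // number of observations (placeholder)
  int<lower=1> Z;                 // upper bound on the auxiliary variable z
  array[N] int<lower=0> y;        // observed counts (placeholder)
}
parameters {
  real<lower=0, upper=1> theta;   // binomial probability (placeholder)
  real<lower=0> mu;               // negative binomial mean (placeholder)
  real<lower=0> phi;              // overdispersion (placeholder)
}
model {
  for (n in 1:N) {
    vector[Z + 1] lp;             // one log-probability term per value of z
    for (z in 0:Z)
      // each term stays on the log scale; nothing gets exp()-ed directly
      lp[z + 1] = neg_binomial_2_lpmf(z | mu, phi)
                  + binomial_lpmf(y[n] | y[n] + z, theta);
    target += log_sum_exp(lp);    // stable equivalent of log(sum(exp(lp)))
  }
}
```

The point is that log_sum_exp() subtracts the maximum term before exponentiating, so the intermediate values can’t underflow to zero the way a direct sum of exp() terms can.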
Also, “poisson” is misspelled, and I’d highly recommend using _ to separate words in variable names.
You also don’t need to set l[n][z] = 0 if you’re just going to overwrite it.
Fudging things to positive with +0.00001 is almost never a good idea. If you’re having problems where things underflow, there are other issues with your model that this kind of thing won’t solve.
Thank you for your response; it has been quite helpful. However, I’ve noticed that the computation of l[n] is quite time-consuming. Is there any algorithm to accelerate it? In the context of the negative binomial and binomial distributions, z is an auxiliary variable used to discretize the likelihood function. The terms v_i^+, v_i, and Z_i represent the data, ζ_i represents the model output, and ψ denotes the overdispersion parameter.
I don’t see a way to make this a lot faster. The best I can suggest is to avoid repeating calculations like sqrt(m_ILIw[n] * rho_a + (m_ILIw[n] * rho_a)^2 / phi_ILI), because they’re relatively expensive. You can also vectorize some of these operations.
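For example, something like this (just a sketch; foo is a placeholder name, and I’m assuming m_ILIw is declared as a vector and rho_a, phi_ILI as scalars):

```stan
// compute the repeated expression once, elementwise over all n
vector[N] foo = sqrt(m_ILIw * rho_a + square(m_ILIw * rho_a) / phi_ILI);
```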
Then you just access foo[n] inside the loop. It should be a bit faster in vectorized form.
Then you have terms like log_cdf_upper that can be pulled up out of the z loop because they don’t depend on z.
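Roughly like this (the right-hand side here is a placeholder expression; the point is just that the assignment moves outside the z loop):

```stan
// compute the z-independent piece once per observation n
real log_cdf_upper = normal_lcdf(cutoff[n] | mu[n], sigma);   // placeholder expression
for (z in 0:Z)
  lp[z + 1] = log_cdf_upper                                    // reused, not recomputed
              + neg_binomial_2_lpmf(z | mu[n], phi);           // only z-dependent work in the loop
```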
Hacks like z * 0.9999 are almost always a bad idea if they were introduced to patch a problem with a model hitting a boundary. You want to use a prior or a better model and avoid boundaries of parameter space in estimation.
Thank you for your patience! I have rescaled the size of Z_ILI[n], and now the code runs much faster, just as I had hoped. Additionally, your suggestion on vectorization was incredibly insightful and helpful—many thanks!