I’m tagging people who I’ve seen answer similar topics before: @bbbales2, @paul.buerkner.
Let’s suppose I have negative-binomial data of this kind, from three operators.
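For concreteness, here is a small sketch of how such data could be simulated (the per-operator means and the shape value are made up for illustration):

library(brms)

set.seed(1)
n_per_op <- 300                  # 3 operators x 300 rows = 900 observations, matching the fit below
op_mu    <- c(20, 50, 110)       # hypothetical per-operator means
input_dat <- data.frame(
  operator = rep(c("A", "B", "C"), each = n_per_op),
  bright   = rnbinom(3 * n_per_op,
                     mu   = rep(op_mu, each = n_per_op),
                     size = 30)  # size is the NB shape parameter (made up)
)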
I would like to find the grand-mean posterior predictive distribution.
Let’s start from the wrong approach, which I would like to improve.
For comparison, this is the wrong way to do it, ignoring the operator grouping:
brm(bright ~ 1, data = input_dat, family = negbinomial())
[Plot: posterior predictive distribution from the intercept-only model]
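For reference, a plot like the one above can be produced along these lines (fit0 is just my name for the fit returned by the brm() call above; pp_check() is the brms helper):

fit0 <- brm(bright ~ 1, data = input_dat, family = negbinomial())
pp_check(fit0, ndraws = 50)  # overlay 50 sets of replicated data on the observed distribution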
Now let’s try to do things right:
brm(bright ~ (1 | operator), data = input_dat, family = negbinomial())
Family: negbinomial
Links: mu = log; shape = identity
Formula: bright ~ (1 | operator)
Data: input_dat (Number of observations: 900)
Draws: 4 chains, each with iter = 2000; warmup = 1000; thin = 1;
total post-warmup draws = 4000
Group-Level Effects:
~operator (Number of levels: 3)
Estimate Est.Error l-95% CI u-95% CI Rhat Bulk_ESS Tail_ESS
sd(Intercept) 1.18 0.59 0.45 2.74 1.01 675 820
Population-Level Effects:
Estimate Est.Error l-95% CI u-95% CI Rhat Bulk_ESS Tail_ESS
Intercept 3.89 0.54 2.82 4.99 1.01 767 976
Family Specific Parameters:
Estimate Est.Error l-95% CI u-95% CI Rhat Bulk_ESS Tail_ESS
shape 29.17 2.17 25.08 33.76 1.00 1586 1733
Draws were sampled using sampling(NUTS). For each parameter, Bulk_ESS
and Tail_ESS are effective sample size measures, and Rhat is the potential
scale reduction factor on split chains (at convergence, Rhat = 1).
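For reference, the posterior draws behind this summary can be extracted like this (fit1 is my name for the multilevel fit above; the column names are the ones brms assigns for this model):

fit1  <- brm(bright ~ (1 | operator), data = input_dat, family = negbinomial())
draws <- as_draws_df(fit1)
# relevant columns:
#   b_Intercept             population-level (grand-mean) intercept, on the log scale
#   sd_operator__Intercept  between-operator SD of the intercepts
#   shape                   negative-binomial shape parameter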
My question is: how do I generate the negative-binomial posterior predictive distribution of the grand mean?
(pseudocode)
(1) rnbinom(n, mu = exp(rnorm(n, Intercept, sd_Intercept)), size = shape)
(2) rnbinom(n, mu = exp(Intercept), size = shape)
I would say (1) makes sense, but I get way too much overdispersion.
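For concreteness, this is how I implement the two options from the draws extracted above (my own sketch; note rnbinom() is parameterised via mu and size, and each posterior draw yields one predictive draw):

n_draws <- nrow(draws)

# (1) draw a fresh operator intercept per posterior draw, then an NB observation
new_op <- rnorm(n_draws, mean = draws$b_Intercept, sd = draws$sd_operator__Intercept)
pred1  <- rnbinom(n_draws, mu = exp(new_op), size = draws$shape)

# (2) condition on the grand mean only (an "average" operator)
pred2  <- rnbinom(n_draws, mu = exp(draws$b_Intercept), size = draws$shape)

I suspect the wide posterior on sd(Intercept) (95% CI 0.45 to 2.74, from only 3 operators) is what inflates the spread in (1), but I’d like to confirm.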
Thanks!