Strange behavior of BRMS during meta-analysis

I’m performing a simple meta-analysis with brms, following some suggestions (here and here). This is my dataset:

> meta_data %>% 
+   select(study, eff_size, eff_size_variance_pool)
# A tibble: 16 x 3
   study eff_size eff_size_variance_pool
   <dbl>    <dbl>                  <dbl>
 1     1    1.52                  0.307 
 2     1    1.23                  0.189 
 3     1    1.05                  0.131 
 4     1    0.99                  0.166 
 5     2    0.543                 0.0478
 6     2    0.714                 0.0589
 7     3    1.74                  0.0495
 8     4    1.15                  0.0737
 9     4    1.7                   0.129 
10     5    0.997                 0.0488
11     5    1.10                  0.0408
12     6    0.568                 0.0774
13     6    0.786                 0.0818
14     7    0.369                 0.0325
15     7    0.369                 0.0325
16     9    0.947                 0.0154

This is my model with brms:

brm_fit <- brm(
  eff_size | se(eff_size_variance_pool) ~ 1 + (1 | study),
  data = meta_data
)

The model runs, but the diagnostics are bad and strange. In particular, Rhat and ESS look fine in the summary, yet the warnings report a too-low ESS and an NA Rhat.

Warning messages:
1: There were 47 divergent transitions after warmup. Increasing adapt_delta above 0.8 may help. See
http://mc-stan.org/misc/warnings.html#divergent-transitions-after-warmup 
2: Examine the pairs() plot to diagnose sampling problems
3: The largest R-hat is NA, indicating chains have not mixed.
Running the chains for more iterations may help. See
http://mc-stan.org/misc/warnings.html#r-hat 
4: Bulk Effective Samples Size (ESS) is too low, indicating posterior means and medians may be unreliable.
Running the chains for more iterations may help. See
http://mc-stan.org/misc/warnings.html#bulk-ess 
5: Tail Effective Samples Size (ESS) is too low, indicating posterior variances and tail quantiles may be unreliable.
Running the chains for more iterations may help. See
http://mc-stan.org/misc/warnings.html#tail-ess 
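As a side note, the first warning points at adapt_delta. If I understand the brms docs correctly, that is passed through the control argument, roughly like this (just an untested sketch; 0.99 is an arbitrary choice, the rest mirrors the model above):

# Sketch: re-fit with a higher adapt_delta, as the divergent-transitions warning suggests
brm_fit_strict <- brm(
  eff_size | se(eff_size_variance_pool) ~ 1 + (1 | study),
  data = meta_data,
  control = list(adapt_delta = 0.99)
)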

This is the model summary:

> summary(brm_fit)
 Family: gaussian 
  Links: mu = identity; sigma = identity 
Formula: eff_size | se(eff_size_variance_pool) ~ 1 + (1 | study) 
   Data: meta_data (Number of observations: 16) 
Samples: 4 chains, each with iter = 10000; warmup = 5000; thin = 1;
         total post-warmup samples = 20000

Group-Level Effects: 
~study (Number of levels: 8) 
              Estimate Est.Error l-95% CI u-95% CI Rhat Bulk_ESS Tail_ESS
sd(Intercept)     0.53      0.18     0.29     1.02 1.00     2319     1502

Population-Level Effects: 
          Estimate Est.Error l-95% CI u-95% CI Rhat Bulk_ESS Tail_ESS
Intercept     0.96      0.20     0.56     1.34 1.00     1813     1299

Family Specific Parameters: 
      Estimate Est.Error l-95% CI u-95% CI Rhat Bulk_ESS Tail_ESS
sigma     0.00      0.00     0.00     0.00 1.00    20000    20000

Samples were drawn using sampling(NUTS). For each parameter, Bulk_ESS
and Tail_ESS are effective sample size measures, and Rhat is the potential
scale reduction factor on split chains (at convergence, Rhat = 1).
Warning message:
There were 47 divergent transitions after warmup. Increasing adapt_delta above 0.8 may help. See http://mc-stan.org/misc/warnings.html#divergent-transitions-after-warmup 

I’ve noticed that the model is trying to fit a family-specific parameter (sigma) that is absent in the linked examples. Maybe the fitting problems are related to this parameter? Is the model correctly specified?
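One way I could check this, I guess, is to look at the Stan code and the priors that brms generates (a sketch based on my reading of the brms helpers):

# Sketch: inspect whether sigma is actually a free parameter in this model
make_stancode(
  eff_size | se(eff_size_variance_pool) ~ 1 + (1 | study),
  data = meta_data
)
# and see which parameters get priors in the fitted model
prior_summary(brm_fit)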


PS: I’ve also reproduced the example from Solomon Kurtz (link here), and the result is similar in terms of the family-specific parameter estimate and the warning messages (which are absent in the original post).

Hi @filippogambarota

Could you share the data in a way that’s easier to load into R, so that I can look into what’s going on? Right now I’d have to manually type in all those values.

It should use se(..., sigma = FALSE) by default, but apparently it doesn’t. Try setting it to FALSE manually. What versions of brms & R are you on?
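I.e. something like this (untested sketch of your model with the argument spelled out):

brm_fit <- brm(
  eff_size | se(eff_size_variance_pool, sigma = FALSE) ~ 1 + (1 | study),
  data = meta_data
)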

Hi @matti, thanks for your response. I’ve just tried setting the value to FALSE (even though it should be the default), but the result is the same. Setting it to TRUE gives me an estimate of the family-specific parameter, with the same error message.
However, I’ve tried the development version of brms (2.13.9) and the error disappears (but the family-specific parameter is still in the model summary, even with sigma = FALSE).
My R version is 4.0.2.

PS: I’ve uploaded the dataset: reprex_dati.csv (1.1 KB)
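It should load with something like this (sketch; the path is just wherever the attachment is saved):

library(readr)
meta_data <- read_csv("reprex_dati.csv")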

What is the estimated sigma, then? If it is actually being estimated, I’d encourage you to open an issue on the brms GitHub page, because the sigma = FALSE argument doesn’t seem to work.

Sigma is fixed to zero (and appears in the output as fixed to zero). So no issue here.
