Hi all,
I’ve been posting here a lot lately, so apologies for yet another question!
Thanks to the community, I was able to get my model running, but I noticed something strange in the output. I was running 4 chains with 2000 iterations each, and chains 2, 3, and 4 progressed roughly in sync while chain 1 lagged behind. When the run finished, I got the following warnings:
Warning messages:
1: There were 8 divergent transitions after warmup. Increasing adapt_delta above 0.8 may help. See
http://mc-stan.org/misc/warnings.html#divergent-transitions-after-warmup
2: There were 3963 transitions after warmup that exceeded the maximum treedepth. Increase max_treedepth above 10. See
http://mc-stan.org/misc/warnings.html#maximum-treedepth-exceeded
3: There were 1 chains where the estimated Bayesian Fraction of Missing Information was low. See
http://mc-stan.org/misc/warnings.html#bfmi-low
4: Examine the pairs() plot to diagnose sampling problems
5: The largest R-hat is 5.43, indicating chains have not mixed.
Running the chains for more iterations may help. See
http://mc-stan.org/misc/warnings.html#r-hat
6: Bulk Effective Samples Size (ESS) is too low, indicating posterior means and medians may be unreliable.
Running the chains for more iterations may help. See
http://mc-stan.org/misc/warnings.html#bulk-ess
7: Tail Effective Samples Size (ESS) is too low, indicating posterior variances and tail quantiles may be unreliable.
Running the chains for more iterations may help. See
http://mc-stan.org/misc/warnings.html#tail-ess
In addition, each chain printed a warning from writable_sample_file(diagnostic_file):
"/Users/mjarviscross/Desktop/GitHub/FcnalRespProj/data/Type II/FcnalRespProj" is not writable; use "/var/folders/l_/xb7lfbls12nb0dwxdrww8gn40000gn/T//RtmpDSeoCR" instead
(and likewise for RtmpIJCP4C, Rtmp7jEFf6, and RtmphaWosk for the other chains)
And when I looked at the trace plots, only the first chain seemed to be sampling properly.
This looks to me more like a bug or setup issue than a conceptual problem with the model, so I was wondering whether anyone has seen something like this before and knows how to fix it.
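In case it’s useful, the trace plots I’m describing came from standard rstan calls along these lines (parameter names as in the model below):

traceplot(fit2, pars = c("r","O","h","b","c","u"), inc_warmup = TRUE)
pairs(fit2, pars = c("r","O","h"))   # warning 4 suggests checking the pairs() plot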
Here’s my model (in rstan):
library(rstan)
library(gdata)
library(bayesplot)
library(tidyverse)
library(brms)
write("// Stan Model
functions{
  real[] dZ_dt(
    real t,          // Time
    real[] Z,        // System state {Parasite, Host}
    real[] theta,    // Parameters
    real[] x_r,      // Real data; below is integer data
    int[] x_i){
    real P = Z[1];   // System state coded as an array, such that Z = (P,H)
    real H = Z[2];
    real r = theta[1];   // Parameters of the system, in the order they appear
    real O = theta[2];
    real h = theta[3];
    real b = theta[4];
    real c = theta[5];
    real u = theta[6];
    real dP_dt = P*r - H*(O*P/1 + O*P*h);         // Mechanistic model
    real dH_dt = b + H*(c*(O*P/1 + O*P*h)-u);
    return({dP_dt,dH_dt});   // Return the system state
  }
}
data{
  int<lower=0> N;        // Number of observations (non-negative integer)
  real ts[N];            // Observation time points
  int y_init[2];         // Initial conditions for the ODE
  int<lower=0> y[N,2];   // Observed counts (non-negative integers)
}
parameters{
  real<lower=0> r;
  real<lower=0,upper=1> O;
  real<lower=0> h;
  real<lower=0> b;
  real<lower=0> c;
  real<lower=0,upper=1> u;
  real<lower=0> Z_init[2];   // Initial population sizes, non-negative
}
transformed parameters{
  // Stiff solver for the ODE
  real Z[N,2]
    = integrate_ode_bdf(dZ_dt, Z_init, 0, ts, {r, O, h, b, c, u},
                        rep_array(0.0,2), rep_array(0,2), 1e-10, 1e-10, 2e4);
  for (i in 1:N) {
    Z[i,1] = Z[i,1] + 1e-6;
    Z[i,2] = Z[i,2] + 1e-6;
  }
}
model{   // Ignore the means/variances for now...
  r ~ normal(2.5,1);    // r
  O ~ beta(2,2);        // O; bounded between 0 and 1
  h ~ normal(0.06,1);   // h
  b ~ normal(35,1);     // b
  c ~ normal(0.2,1);    // c
  u ~ beta(2,2);        // u; bounded between 0 and 1
  Z_init ~ uniform(0,400);
  for (k in 1:2){
    y_init[k] ~ poisson(Z_init[k]);
    y[ ,k] ~ poisson(Z[ ,k]);
  }
}",
"Stan_Model_TypeII.stan")
stanc("Stan_Model_TypeII.stan") # To check that we wrote a file (I think we did?)
Stan_Model_TypeII <- stan_model("Stan_Model_TypeII.stan")
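# Sanity check (not part of the fitting pipeline): before handing the ODE to Stan, I
# solved the same right-hand side in plain R with deSolve, using the initial states from
# below and rough parameter guesses, just to confirm the trajectories look sensible.
# deSolve is an extra package, and the parameter values here are made up.
library(deSolve)
dZ_dt_R <- function(t, Z, parms){
  P <- Z[1]; H <- Z[2]
  r <- parms["r"]; O <- parms["O"]; h <- parms["h"]
  b <- parms["b"]; cc <- parms["c"]; u <- parms["u"]
  dP <- P*r - H*(O*P/1 + O*P*h)            # same expressions as in the Stan model
  dH <- b + H*(cc*(O*P/1 + O*P*h) - u)
  list(c(dP, dH))
}
test_traj <- ode(y = c(P = 10, H = 350), times = 0:100, func = dZ_dt_R,
                 parms = c(r = 2.5, O = 0.5, h = 0.06, b = 35, c = 0.2, u = 0.5))
head(test_traj)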
# Getting the data into the form the Stan data block expects
Stoch_Data_TypeII = read.csv('/Users/mjarviscross/Desktop/GitHub/FcnalRespProj/data/Type II/FcnalRespProj_Data/Stoch_Data_TypeII.csv')
N <- length(Stoch_Data_TypeII$t)-1 # df is 2923; N is 2922
ts <- 1:N
y_init <- c(10,350) # Initial states, P = 10; H = 350
y <- as.matrix(Stoch_Data_TypeII[2:(N+1),3:4])
y <- cbind(y[ ,2],y[ ,1]) # Swap the two columns so the order matches y_init and Z (P first, then H)
Stan_StochData_TypeII <- list(N=N,ts=ts,y_init=y_init,y=y)
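# Quick check that the list matches what the data block expects
# (N scalar, ts of length N, y_init of length 2, y an N x 2 matrix of counts):
str(Stan_StochData_TypeII)
stopifnot(length(ts) == N, length(y_init) == 2, nrow(y) == N, ncol(y) == 2)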
# Note to self on parameter bounds: alpha[{1,3,4,5}] should be non-negative (real<lower=0>);
# alpha[{2,6}] are proportions bounded between 0 and 1 (real<lower=0,upper=1>)
# Fitting the data to the model
Stan_Model_TypeII <- stan_model("Stan_Model_TypeII.stan")
fit2 <- sampling(Stan_Model_TypeII,
data = Stan_StochData_TypeII,
warmup = 1000,
iter = 2000,
chains = 4,
cores = 4,
thin = 1,
check_data = TRUE,
diagnostic_file = "/Users/mjarviscross/Desktop/GitHub/FcnalRespProj/data/Type II/FcnalRespProj/TypeII_Fitting_Output2.R",
seed = 124,
show_messages = TRUE,
verbose = TRUE)
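# To see which chain is the odd one out, I check the sampler diagnostics, the per-chain
# timings, and the worst R-hat values straight after the fit
# (check_hmc_diagnostics and get_elapsed_time are standard rstan helpers):
check_hmc_diagnostics(fit2)   # prints divergence, treedepth, and E-BFMI diagnostics for the fit
get_elapsed_time(fit2)        # warmup / sampling seconds for each chain
fit2_summary <- summary(fit2)$summary
head(fit2_summary[order(-fit2_summary[, "Rhat"]), c("n_eff", "Rhat")], 10)   # worst-mixing parameters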
#### Tracking progress ####
# 1st run (1 chain, 500 iter)
#### 806.348 seconds (Warm-up)--13.5 min
#### 2514.75 seconds (Sampling)--42 min
#### Rhat 1.71
#### 233 transitions after warmup that exceeded the maximum treedepth; increase max_treedepth above 10
#### Bulk Effective Samples Size (ESS) is too low
OutputSum_1 <- fit   # fit object from the 1st run (1 chain, 500 iter) above
fit1_plot <- plot(fit)
fit1_trace <- traceplot(fit)
# 2nd run (4 chains, 2000 iter)
#### Chain 1, 173 min warm/357 min samp; Chain 2, 92 min warm/230 min samp; Chain 3, 88 min warm/242 min samp; Chain 4, 87 min warm/253 min samp
#### Rhat 5.43
#### 8 divergent transitions after warmup; increasing adapt_delta above 0.8 may help
#### 3963 transitions after warmup that exceeded the maximum treedepth; increase max_treedepth above 10
#### Bulk Effective Samples Size (ESS) is too low; tail Effective Samples Size (ESS) is too low--more iterations!
#### 1 chains where the estimated Bayesian Fraction of Missing Information was low
fit2_plot <- plot(fit2)
fit2_trace <- traceplot(fit2)
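# Per-chain traces via bayesplot (already loaded above); as.array() keeps the chains
# separate, which makes it easier to compare chain 1 against the other three:
fit2_array <- as.array(fit2)
mcmc_trace(fit2_array, pars = c("r","O","h","b","c","u"))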
I would appreciate any advice people have! Thank you all so much for putting up with me! :)