I am sorry, but the model is complicated and I could not post it in its entirety. Instead, I have made the program produce more detailed output, shown below; I hope this helps.
Basically, my model has two parameters, logb and lognu. Both are 3\times 2 matrices whose entries are independent standard normal. I also have a likelihood function p(x \mid \log b, \log\nu), coded as Modelpdf below, where x is an ordered pair (n, t) with n \in \{1, 2, 3\} and t > 0.
The data, parameters and model blocks of my code are as follows, with intensive print statements added:
data {
  int<lower=0> N;   // number of samples
  matrix[N, 2] y;   // each row is an observation (n, t)
}
parameters {
  matrix[3, 2] logb;
  matrix[3, 2] lognu;
}
model {
  print("lpstart=", target());
  to_vector(lognu) ~ normal(0, 1);
  to_vector(logb) ~ normal(0, 1);
  print("lp = ", target());
  print("b = ", exp(logb));
  print("nu = ", exp(lognu));
  for (n in 1:N) {
    // y[n,1] holds an integer code 1..3 stored as a real; +0.1 guards against rounding
    target += log(Modelpdf(to_int(floor(y[n, 1] + 0.1)), y[n, 2], logb, lognu));
  }
  print("lp = ", target());
  print("b = ", exp(logb));
  print("nu = ", exp(lognu));
}
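One thing worth noting about this model block: target += log(Modelpdf(...)) underflows to -inf whenever Modelpdf returns a double-precision zero, which extreme values of b and nu will certainly trigger. A small Python sketch of this failure mode (a generic illustration of floating-point underflow, not the actual Modelpdf):

```python
import math

# A perfectly finite log density whose exponential cannot be represented:
# the smallest positive normal double is about 2.2e-308, so exp(-800)
# underflows to exactly 0.0, and log(0.0) is -inf.
log_density = -800.0
density = math.exp(log_density)
print(density)  # 0.0

# log of an underflowed density poisons the accumulated log probability,
# forcing the sampler to reject the proposal.
log_of_density = math.log(density) if density > 0 else float("-inf")
print(log_of_density)  # -inf
```

Computing the likelihood directly on the log scale (e.g. a hypothetical Modelpdf_log so that one can write target += Modelpdf_log(...)) avoids the round trip through a possibly-unrepresentable density.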
I am confident that Modelpdf is correct: I also implemented it in Matlab and verified that the Stan and Matlab versions agree.
The rstan code I used to produce the fit object is
test <- list(N = 100, y = y)
fit1 <- stan(
  file = "model2.stan",
  data = test,
  algorithm = "HMC",
  chains = 4,
  warmup = 1000,
  iter = 2000,
  # test_grad = TRUE,
  cores = 1,
  refresh = 1
)
and, with the above print statements in the model block, the output was
SAMPLING FOR MODEL 'anon_model' NOW (CHAIN 1).
Chain 1: lpstart=0
lp = -20.5882
b = [[1.28101,1.11858],[0.342194,0.219518],[4.84288,1.28024]]
nu = [[0.138681,4.57043],[0.187903,6.15098],[0.747808,2.24988]]
lp = -744.816
b = [[1.28101,1.11858],[0.342194,0.219518],[4.84288,1.28024]]
nu = [[0.138681,4.57043],[0.187903,6.15098],[0.747808,2.24988]]
Chain 1: lpstart=0
lp = -9.56093
b = [[1.28101,1.11858],[0.342194,0.219518],[4.84288,1.28024]]
nu = [[0.138681,4.57043],[0.187903,6.15098],[0.747808,2.24988]]
lp = -733.788
b = [[1.28101,1.11858],[0.342194,0.219518],[4.84288,1.28024]]
nu = [[0.138681,4.57043],[0.187903,6.15098],[0.747808,2.24988]]
Chain 1:
Chain 1: Gradient evaluation took 0.29386 seconds
Chain 1: 1000 transitions using 10 leapfrog steps per transition would take 2938.6 seconds.
Chain 1: Adjust your expectations accordingly!
Chain 1:
Chain 1:
Chain 1: lpstart=0
lp = -9.56093
b = [[1.28101,1.11858],[0.342194,0.219518],[4.84288,1.28024]]
nu = [[0.138681,4.57043],[0.187903,6.15098],[0.747808,2.24988]]
lp = -733.788
b = [[1.28101,1.11858],[0.342194,0.219518],[4.84288,1.28024]]
nu = [[0.138681,4.57043],[0.187903,6.15098],[0.747808,2.24988]]
Chain 1: lpstart=0
lp = -101664
b = [[0.258684,1.40969e+35],[3.32381e+11,7.5972e+20],[1.58222,2.09183e-16]]
nu = [[0.663955,9.22383e-66],[0.802902,8.30835e-180],[0.660117,1.57487]]
Chain 1: lpstart=0
lp = -9.56093
b = [[1.28101,1.11858],[0.342194,0.219518],[4.84288,1.28024]]
nu = [[0.138681,4.57043],[0.187903,6.15098],[0.747808,2.24988]]
lp = -733.788
b = [[1.28101,1.11858],[0.342194,0.219518],[4.84288,1.28024]]
nu = [[0.138681,4.57043],[0.187903,6.15098],[0.747808,2.24988]]
Chain 1: lpstart=0
lp = -102123
b = [[0.0649057,1.23137e+35],[1.05606e+11,2.18032e+19],[2.40351,9.31028e-17]]
nu = [[0.0928038,3.89123e-66],[0.136592,2.49945e-180],[2.4847,4.22241]]
Chain 1: lpstart=0
lp = -9.56093
b = [[1.28101,1.11858],[0.342194,0.219518],[4.84288,1.28024]]
nu = [[0.138681,4.57043],[0.187903,6.15098],[0.747808,2.24988]]
lp = -733.788
b = [[1.28101,1.11858],[0.342194,0.219518],[4.84288,1.28024]]
nu = [[0.138681,4.57043],[0.187903,6.15098],[0.747808,2.24988]]
Chain 1: lpstart=0
lp = -6195.87
b = [[0.610048,1.07364e+09],[293.233,33501.6],[3.74232,0.000100333]]
nu = [[0.411694,1.79249e-16],[0.203059,4.90851e-45],[0.84552,8.5348]]
Chain 1: lpstart=0
lp = -9.56093
b = [[1.28101,1.11858],[0.342194,0.219518],[4.84288,1.28024]]
nu = [[0.138681,4.57043],[0.187903,6.15098],[0.747808,2.24988]]
lp = -733.788
b = [[1.28101,1.11858],[0.342194,0.219518],[4.84288,1.28024]]
nu = [[0.138681,4.57043],[0.187903,6.15098],[0.747808,2.24988]]
Chain 1: lpstart=0
lp = -347.995
b = [[0.88663,152.109],[2.62581,5.21633],[2.05429,0.0914965]]
nu = [[0.183155,0.000223793],[0.19592,3.19137e-11],[0.86744,2.0867]]
Chain 1: lpstart=0
lp = -9.56093
b = [[1.28101,1.11858],[0.342194,0.219518],[4.84288,1.28024]]
nu = [[0.138681,4.57043],[0.187903,6.15098],[0.747808,2.24988]]
lp = -733.788
b = [[1.28101,1.11858],[0.342194,0.219518],[4.84288,1.28024]]
nu = [[0.138681,4.57043],[0.187903,6.15098],[0.747808,2.24988]]
Chain 1: lpstart=0
lp = -17.0888
b = [[1.13499,3.59844],[0.56805,0.465291],[4.95263,0.656462]]
nu = [[0.172349,0.454657],[0.186865,0.00977451],[0.742156,2.50926]]
lp = -303
b = [[1.13499,3.59844],[0.56805,0.465291],[4.95263,0.656462]]
nu = [[0.172349,0.454657],[0.186865,0.00977451],[0.742156,2.50926]]
Chain 1: lpstart=0
lp = -9.56093
b = [[1.28101,1.11858],[0.342194,0.219518],[4.84288,1.28024]]
nu = [[0.138681,4.57043],[0.187903,6.15098],[0.747808,2.24988]]
lp = -733.788
b = [[1.28101,1.11858],[0.342194,0.219518],[4.84288,1.28024]]
nu = [[0.138681,4.57043],[0.187903,6.15098],[0.747808,2.24988]]
Chain 1: lpstart=0
lp = -6.96571
b = [[1.1569,1.51236],[0.356334,0.291846],[4.6475,1.20018]]
nu = [[0.131149,2.60687],[0.186482,1.19334],[0.67535,2.32727]]
lp = -181.823
b = [[1.1569,1.51236],[0.356334,0.291846],[4.6475,1.20018]]
nu = [[0.131149,2.60687],[0.186482,1.19334],[0.67535,2.32727]]
Chain 1: Iteration: 1 / 2000 [ 0%] (Warmup)
Chain 1: lpstart=0
lp = -9.56093
b = [[1.28101,1.11858],[0.342194,0.219518],[4.84288,1.28024]]
nu = [[0.138681,4.57043],[0.187903,6.15098],[0.747808,2.24988]]
lp = -733.788
b = [[1.28101,1.11858],[0.342194,0.219518],[4.84288,1.28024]]
nu = [[0.138681,4.57043],[0.187903,6.15098],[0.747808,2.24988]]
Chain 1: lpstart=0
lp = -6.62342
b = [[1.35014,1.58544],[0.355369,0.315736],[4.82317,1.16233]]
nu = [[0.143983,2.4156],[0.191915,1.22097],[0.778794,2.28856]]
lp = -176.541
b = [[1.35014,1.58544],[0.355369,0.315736],[4.82317,1.16233]]
nu = [[0.143983,2.4156],[0.191915,1.22097],[0.778794,2.28856]]
Chain 1: lpstart=0
lp = -6.98545
b = [[1.47061,2.37171],[0.486639,0.504675],[4.77374,0.78277]]
nu = [[0.150132,1.31862],[0.19451,0.207015],[0.811868,2.3221]]
lp = -172.839
b = [[1.47061,2.37171],[0.486639,0.504675],[4.77374,0.78277]]
nu = [[0.150132,1.31862],[0.19451,0.207015],[0.811868,2.3221]]
Chain 1: lpstart=0
lp = -10.7785
b = [[1.67228,2.40545],[0.843087,0.738026],[4.69247,0.550689]]
nu = [[0.157202,0.861084],[0.198997,0.0354028],[0.847158,1.90158]]
lp = -187.928
b = [[1.67228,2.40545],[0.843087,0.738026],[4.69247,0.550689]]
nu = [[0.157202,0.861084],[0.198997,0.0354028],[0.847158,1.90158]]
Chain 1: lpstart=0
lp = -17.9047
b = [[1.94346,1.53232],[1.49993,0.802787],[4.56981,0.410436]]
nu = [[0.165466,0.645804],[0.208855,0.00615597],[0.885124,1.28946]]
lp = -169.085
b = [[1.94346,1.53232],[1.49993,0.802787],[4.56981,0.410436]]
nu = [[0.165466,0.645804],[0.208855,0.00615597],[0.885124,1.28946]]
Chain 1: lpstart=0
lp = -28.7183
b = [[2.2922,0.630646],[2.18052,0.621359],[4.40839,0.314732]]
nu = [[0.175195,0.555789],[0.224462,0.00109276],[0.925936,0.803149]]
lp = -146.783
b = [[2.2922,0.630646],[2.18052,0.621359],[4.40839,0.314732]]
nu = [[0.175195,0.555789],[0.224462,0.00109276],[0.925936,0.803149]]
Chain 1: lpstart=0
lp = -43.3923
b = [[2.71318,0.291384],[3.08589,0.446546],[4.21095,0.257959]]
nu = [[0.186695,0.498598],[0.242963,0.00019923],[0.96985,0.481822]]
lp = -200.806
b = [[2.71318,0.291384],[3.08589,0.446546],[4.21095,0.257959]]
nu = [[0.186695,0.498598],[0.242963,0.00019923],[0.96985,0.481822]]
Chain 1: Iteration: 2 / 2000 [ 0%] (Warmup)
Chain 1: lpstart=0
lp = -43.3923
b = [[2.71318,0.291384],[3.08589,0.446546],[4.21095,0.257959]]
nu = [[0.186695,0.498598],[0.242963,0.00019923],[0.96985,0.481822]]
lp = -200.806
b = [[2.71318,0.291384],[3.08589,0.446546],[4.21095,0.257959]]
nu = [[0.186695,0.498598],[0.242963,0.00019923],[0.96985,0.481822]]
Chain 1: lpstart=0
lp = -2.66669e+07
b = [[4.38545e-24,inf],[2.38016e-44,inf],[2.4329e-152,inf]]
nu = [[2.72374e+66,4.58482e-67],[6.01771e+71,inf],[3.18612e+20,9.90321e-195]]
SAMPLING FOR MODEL 'anon_model' NOW (CHAIN 2).
Chain 2: lpstart=0
lp = -15.3714
b = [[0.256505,0.268295],[1.0411,0.493499],[0.743356,0.619902]]
nu = [[0.309038,1.76606],[4.21477,0.772639],[1.94766,0.924812]]
lp = -452.584
b = [[0.256505,0.268295],[1.0411,0.493499],[0.743356,0.619902]]
nu = [[0.309038,1.76606],[4.21477,0.772639],[1.94766,0.924812]]
Chain 2: lpstart=0
lp = -4.34415
b = [[0.256505,0.268295],[1.0411,0.493499],[0.743356,0.619902]]
nu = [[0.309038,1.76606],[4.21477,0.772639],[1.94766,0.924812]]
lp = -441.557
b = [[0.256505,0.268295],[1.0411,0.493499],[0.743356,0.619902]]
nu = [[0.309038,1.76606],[4.21477,0.772639],[1.94766,0.924812]]
Chain 2:
Chain 2: Gradient evaluation took 0.316785 seconds
Chain 2: 1000 transitions using 10 leapfrog steps per transition would take 3167.85 seconds.
Chain 2: Adjust your expectations accordingly!
Chain 2:
Chain 2:
Chain 2: lpstart=0
lp = -4.34415
b = [[0.256505,0.268295],[1.0411,0.493499],[0.743356,0.619902]]
nu = [[0.309038,1.76606],[4.21477,0.772639],[1.94766,0.924812]]
lp = -441.557
b = [[0.256505,0.268295],[1.0411,0.493499],[0.743356,0.619902]]
nu = [[0.309038,1.76606],[4.21477,0.772639],[1.94766,0.924812]]
Chain 2: lpstart=0
lp = -20968.8
b = [[1.05277e+17,1.71171e+11],[2.31555e+44,2436.64],[1.10887e+29,2.96038e+12]]
nu = [[0.00210357,4.01181e-17],[2.57692e-59,0.00234713],[3.53367e-28,1.66718e-07]]
SAMPLING FOR MODEL 'anon_model' NOW (CHAIN 3).
Chain 3: lpstart=0
lp = -21.123
b = [[0.22941,0.226122],[0.899819,6.68798],[2.61169,1.753]]
nu = [[1.32054,1.42598],[6.61398,4.62156],[6.84219,2.91001]]
lp = -1009.9
b = [[0.22941,0.226122],[0.899819,6.68798],[2.61169,1.753]]
nu = [[1.32054,1.42598],[6.61398,4.62156],[6.84219,2.91001]]
Chain 3: lpstart=0
lp = -10.0957
b = [[0.22941,0.226122],[0.899819,6.68798],[2.61169,1.753]]
nu = [[1.32054,1.42598],[6.61398,4.62156],[6.84219,2.91001]]
lp = -998.877
b = [[0.22941,0.226122],[0.899819,6.68798],[2.61169,1.753]]
nu = [[1.32054,1.42598],[6.61398,4.62156],[6.84219,2.91001]]
Chain 3:
Chain 3: Gradient evaluation took 0.305883 seconds
Chain 3: 1000 transitions using 10 leapfrog steps per transition would take 3058.83 seconds.
Chain 3: Adjust your expectations accordingly!
Chain 3:
Chain 3:
Chain 3: lpstart=0
lp = -10.0957
b = [[0.22941,0.226122],[0.899819,6.68798],[2.61169,1.753]]
nu = [[1.32054,1.42598],[6.61398,4.62156],[6.84219,2.91001]]
lp = -998.877
b = [[0.22941,0.226122],[0.899819,6.68798],[2.61169,1.753]]
nu = [[1.32054,1.42598],[6.61398,4.62156],[6.84219,2.91001]]
Chain 3: lpstart=0
lp = -414028
b = [[3.60783e+17,0.333854],[5.96048e+104,0.736455],[3.84746e+105,1.62398e+25]]
nu = [[0.000421326,0.0001302],[0,0.0761439],[1.2502e-160,7.40588e-25]]
SAMPLING FOR MODEL 'anon_model' NOW (CHAIN 4).
Chain 4: lpstart=0
lp = -19.9926
b = [[1.25489,5.86404],[6.19436,0.831945],[0.284353,2.42538]]
nu = [[5.13431,1.48601],[3.02993,0.404332],[0.15197,2.15936]]
lp = -1453.3
b = [[1.25489,5.86404],[6.19436,0.831945],[0.284353,2.42538]]
nu = [[5.13431,1.48601],[3.02993,0.404332],[0.15197,2.15936]]
Chain 4: lpstart=0
lp = -8.96531
b = [[1.25489,5.86404],[6.19436,0.831945],[0.284353,2.42538]]
nu = [[5.13431,1.48601],[3.02993,0.404332],[0.15197,2.15936]]
lp = -1442.27
b = [[1.25489,5.86404],[6.19436,0.831945],[0.284353,2.42538]]
nu = [[5.13431,1.48601],[3.02993,0.404332],[0.15197,2.15936]]
Chain 4:
Chain 4: Gradient evaluation took 0.240945 seconds
Chain 4: 1000 transitions using 10 leapfrog steps per transition would take 2409.45 seconds.
Chain 4: Adjust your expectations accordingly!
Chain 4:
Chain 4:
Chain 4: lpstart=0
lp = -8.96531
b = [[1.25489,5.86404],[6.19436,0.831945],[0.284353,2.42538]]
nu = [[5.13431,1.48601],[3.02993,0.404332],[0.15197,2.15936]]
lp = -1442.27
b = [[1.25489,5.86404],[6.19436,0.831945],[0.284353,2.42538]]
nu = [[5.13431,1.48601],[3.02993,0.404332],[0.15197,2.15936]]
Chain 4: lpstart=0
lp = -1.08691e+06
b = [[1.19098e+63,3.07881],[0,8.71634e+07],[9.24704e+19,668397]]
nu = [[1.62845e-148,9.38263],[9.98345e+140,0.0108514],[0.00685156,0.000196538]]
Here the data y were generated by simulation with ground truth \log b = \log\begin{pmatrix} 1 & 1.1 \\ 1.2 & 1.32 \\ 1.3 & 1.43 \end{pmatrix} and \log\nu(i,j) = 0 (i.e. \nu(i,j) = 1 for all i, j). In the last output of each chain, however, b and nu are ridiculously large or small. This might be why sampling terminated unexpectedly. How can I solve this?
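For scale: under the standard normal priors, entries of logb and lognu should almost never be more than a few units in absolute value, yet the final outputs above imply log-parameters many tens of prior standard deviations out. A quick check, using two values taken from the Chain 1 output:

```python
import math

# b entry 1.40969e+35 from the Chain 1 output corresponds to a logb
# entry of about 80, i.e. roughly 80 standard deviations from the
# N(0, 1) prior -- far beyond anything a well-behaved trajectory visits.
print(math.log(1.40969e35))   # ~ 80.9

# the matching nu entry 9.22383e-66 corresponds to lognu about -150
print(math.log(9.22383e-66))  # ~ -149.7
```

Values this extreme are a strong hint that the leapfrog integrator is numerically diverging rather than exploring a plausible region of the posterior.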