I am trying to evaluate whether my estimates tend to the true parameter as the sample size tends to infinity.

For a given true parameter \theta^{\text{true}}, I generated datasets D_k, k = 1, ..., K, where D_k has sample size N_k. From each dataset I computed an estimate \theta(D_k), k = 1, ..., K, and then calculated the estimation error:

\theta(D_k) - \theta^{\text{true}}

which is shown in the following plot:

The x-axis denotes the sample size N_k and the y-axis shows the estimation error defined above.

The error decreases monotonically with the sample size N_k.

However, at the largest sample sizes N_k, the error stops decreasing.

It is difficult to determine whether this is caused by the model itself or by numerical precision.

My program is quite complicated, but I suspect the plateau is caused by floating-point issues.
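For reference, the statistical error of a proportion estimate should shrink roughly like 1/sqrt(N_k), so an apparent plateau can also come from ordinary Monte Carlo noise rather than floating-point limits. A small NumPy sketch of the expected decay (this is an illustration, not my actual program; `p = 0.3` and the sample sizes are arbitrary):

```python
import numpy as np

def mean_abs_error(n, reps=200, p=0.3, seed=0):
    """Mean absolute error of the MLE p_hat = x/n over many simulated datasets."""
    rng = np.random.default_rng(seed)
    draws = rng.binomial(n, p, size=reps)  # total successes in n trials, repeated
    return float(np.mean(np.abs(draws / n - p)))

# The error shrinks like 1/sqrt(n): each 100x increase in n
# cuts the average error by about 10x.
for n in (10**2, 10**4, 10**6):
    print(n, mean_abs_error(n))
```

If a single dataset is used per N_k, the error curve will fluctuate around this 1/sqrt(N_k) trend, which can look like a plateau at the largest N_k.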

The sample size N_k is passed to the Stan file as the number of Bernoulli trials, namely `~ Binomial( , N_k)`.

What is the maximal number that can be passed as the number of Bernoulli trials in the model block of a Stan file? Is the computation still correct when the trial count is large, such as `~ Binomial( success rate , 111111111)`?
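As a sanity check outside Stan: Stan's `int` type is a signed 32-bit integer, so the largest admissible trial count is 2^31 - 1 = 2147483647, and 111111111 fits comfortably. Numerically, binomial log-densities are evaluated through log-gamma functions rather than raw factorials, so large trial counts remain stable. A quick Python illustration of the same log-gamma computation (`binomial_logpmf` is my own sketch, not Stan code):

```python
import math

# Stan's `int` is a signed 32-bit integer, so trial counts must not exceed this.
STAN_INT_MAX = 2**31 - 1  # 2147483647

def binomial_logpmf(k, n, p):
    # log C(n, k) via log-gamma; a direct factorial would overflow long before n ~ 1e8
    log_coef = math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
    return log_coef + k * math.log(p) + (n - k) * math.log1p(-p)

n = 111_111_111
k = round(0.3 * n)  # a hypothetical observed success count near the mean
lp = binomial_logpmf(k, n, 0.3)
print(lp)  # a finite, well-behaved log-density even at this scale
```

The log-density stays finite and well-scaled, so the trial count itself should not be the source of numerical trouble as long as it stays below the 32-bit limit.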

I would appreciate any opinions.