Hi,
I am running a Stan model with rstan. When I tested the program with iter = 6000, warmup = 3000, it ran properly. Then I wanted to see whether the results would change if I increased the number of iterations, so I set iter = 15000, warmup = 5000, but this time the following error message showed up:
```
Error: cannot allocate vector of size 5.6 Gb
11. unlist(sss2, use.names = FALSE)
10. .local(object, ...)
9.  extract(x, permuted = FALSE, inc_warmup = FALSE, ...)
8.  extract(x, permuted = FALSE, inc_warmup = FALSE, ...)
7.  as.array.stanfit(object)
6.  as.array(object)
5.  throw_sampler_warnings(nfits)
4.  .local(object, ...)
3.  sampling(sm, data, pars, chains, iter, warmup, thin, seed, init, check_data = TRUE, sample_file = sample_file, diagnostic_file = diagnostic_file, verbose = verbose, algorithm = match.arg(algorithm), control = control, check_unknown_args = FALSE, cores = cores, open_progress = open_progress, ...
2.  sampling(sm, data, pars, chains, iter, warmup, thin, seed, init, check_data = TRUE, sample_file = sample_file, diagnostic_file = diagnostic_file, verbose = verbose, algorithm = match.arg(algorithm), control = control, check_unknown_args = FALSE, cores = cores, open_progress = open_progress, ...
1.  stan(file = "simulation.stan", data = time_data, refresh = 0, cores = 8, iter = 15000, warmup = 5000)
```
I am wondering whether this is a memory limitation of my laptop?
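For what it's worth, here is a rough back-of-envelope I tried for the size of one in-memory copy of the draws. The parameter count below is a guess on my part (I don't know the exact number for my model), chosen so the total matches the 5.6 Gb in the error; R's "Gb" here means 1024^3 bytes:

```r
chains   <- 8
iter     <- 15000
warmup   <- 5000
n_params <- 9400                      # hypothetical parameter count, not from my actual model
draws    <- chains * (iter - warmup)  # 80000 saved draws in total
bytes    <- draws * n_params * 8      # each draw of each parameter is one 8-byte double
round(bytes / 2^30, 1)                # ~5.6, on the order of the failed allocation
```

So even one extra copy of the draws array (which extract()/as.array() seem to make, per the traceback) could plausibly exceed my RAM.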