Memory error in R Markdown when using RStan

I have recently been running several long Stan programs, saving the output of each Stan fit as .RData. Loading each saved .RData file into an interactive R session works fine, but when I try to knit the R Markdown file the session runs out of memory and crashes.

Is there any general advice on how to knit an R Markdown file when you are running several long Stan programs that produce very large outputs (e.g., ~4 GB per Stan fit, with 8 such fits)?

Use the `cache = TRUE` option (with a chunk label) in the R chunk header rather than trying to save and load the fits yourself. But if you are going to save and reload, it is better to save CSV files (see the `sample_file` argument) and load them with `read_stan_csv`.
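
A cached chunk looks like the sketch below. The chunk label `fit1`, the file `model1.stan`, and the data list `stan_data` are placeholders for your own names. With `cache = TRUE`, knitr stores the chunk's objects on disk after the first knit and reloads them on later knits instead of re-running Stan:

````
```{r fit1, cache=TRUE}
library(rstan)

# Runs only on the first knit; subsequent knits reload the cached `fit1`
fit1 <- stan("model1.stan", data = stan_data, chains = 4, iter = 2000)
```
````

And a sketch of the CSV route, using the same placeholder names. Note that rstan appends a chain identifier to the name you pass to `sample_file`, so it is safest to list the files it actually wrote rather than hard-coding them:

```r
library(rstan)

# Stream each chain's draws to CSV files on disk as sampling runs
fit1 <- stan("model1.stan", data = stan_data, chains = 4, iter = 2000,
             sample_file = "model1_samples.csv")

# Later (e.g., in a fresh session): rebuild a stanfit object from the CSVs
csv_files <- list.files(pattern = "^model1_samples.*\\.csv$")
fit1 <- read_stan_csv(csv_files)
```

This keeps the draws on disk and lets you reconstruct one fit at a time, instead of holding several multi-gigabyte .RData objects in RAM at once.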