Accessing Stan Object After Memory Overload

After running a somewhat complex Stan model that takes a long time to run (>2 days), I receive an error:

Error: cannot allocate vector of size 250 Kb

and when I call the fit object I get the message:

Stan model 'exemplar_validate' does not contain samples.

Is there any way to salvage the results of the model fitting? The sampler ran through all the iterations, but the draws were not saved in the fit object.

It would really be a bummer to lose all that data. Are there any suggestions for avoiding this kind of problem in the future? Thanks so much.

If you think you might run into memory problems, it is a good idea to write the draws out to a CSV file as they go. This is done via the sample_file argument to sampling and stan in RStan and PyStan. However, there is a good chance you will run out of RAM reading those draws back into an interactive session.

Often in situations like this, people are defining things in transformed parameters that they do not really need to store draws of, and would be better off defining them in the model block. Or they are defining a bunch of stuff in the generated quantities block that would be better handled as a standalone generated quantities run after fitting. Or they are not making use of the pars and include arguments to drop quantities they are not interested in making inferences about beyond perhaps glancing at the posterior means.
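As a minimal sketch in RStan, combining both suggestions (the model object, data list, and parameter names here are placeholders, not from the original post):

```r
library(rstan)

# Stream draws to disk while sampling, and only keep draws for the
# parameters we actually want inferences about. With multiple chains,
# rstan appends an underscore and chain number to the file name
# (samples_1.csv, samples_2.csv, ...).
fit <- sampling(
  stanmodel,                        # a compiled stanmodel object (placeholder)
  data = stan_data,                 # your data list (placeholder)
  pars = c("beta", "sigma"),        # hypothetical parameter names to keep
  include = TRUE,                   # keep only `pars` rather than exclude them
  sample_file = "samples.csv"       # draws are written here as sampling runs
)
```

Even if the session later dies with an allocation error, the CSV files on disk still contain the draws up to that point.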


I tried this answer with the following code to create the file on the Desktop:

sample_file = paste0(file.path(Sys.getenv("USERPROFILE"), "Desktop"), "\\samples"),

But the resulting file is difficult for me to understand.
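The sample files are Stan's plain CSV output: lines beginning with # are comments (adaptation and timing information), followed by a header row of column names (sampler diagnostics such as lp__ and accept_stat__, then your parameters) and one row per saved iteration, warmup included if warmup draws were saved. A sketch of reading one such file back in base R (the file name and iteration count are assumptions for illustration):

```r
# '#' lines are comments, so read.csv can skip them directly
draws <- read.csv("samples_1.csv", comment.char = "#")

# If warmup draws were saved, drop them before summarising;
# here we assume 1000 post-warmup iterations were kept
post <- tail(draws, 1000)

mean(post$lp__)   # e.g. posterior mean of the log-posterior
```

Each parameter column can then be summarised the same way, which avoids loading everything back into a single large fit object.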