Error in unserialize(socklist[[n]]) : error reading from connection Error in serialize(data, node$con, xdr = FALSE) : error writing to connection

Hello, I have a problem that I have not encountered before. I had been running some code in RStan without any problems. Then, yesterday, it stopped running that code and gave me the error above:

```
Error in unserialize(socklist[[n]]) : error reading from connection
Error in serialize(data, node$con, xdr = FALSE) :
  error writing to connection
```

I then updated to the newest R version (thinking that might have been the issue) and got the same error. I then thought that maybe something went wrong with my update of R, so I uninstalled and reinstalled everything (R, RStudio, Rtools), to no avail. I keep getting the same error when I reinstall and run the code. RStan does run other code, just not this one, which made me think I had accidentally altered something in my code. But when I run my code on another computer, it runs fine. So it isn't the code; it is something about my computer or how it is interfacing with RStan.

I have seen some threads on this topic, but maybe I don't quite understand the bigger picture of what this error means and how I should troubleshoot it.

Thanks for any suggestions.

The bigger picture is that some of the Stan threads crashed without producing a nice error message - this is typical of crashes within the C++ code. This might even be a bug in Stan, but it is often something more innocuous (e.g. running out of memory). To troubleshoot, you should definitely run with chains = 1 and possibly also verbose = TRUE; a sketch of such a call is below. Also, if you have custom C++ code, the crash could be occurring there.
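For example, a minimal sketch (the file name "my_model.stan" and the data list stan_data are placeholders for your own model and data):

```r
library(rstan)

# Run a single chain in the main R process so a C++ crash is not
# hidden behind a parallel worker, and print verbose output.
fit <- stan(
  file = "my_model.stan",  # placeholder: your Stan model file
  data = stan_data,        # placeholder: your data list
  chains = 1,              # no parallel workers
  verbose = TRUE           # show compiler and sampler output
)
```

With chains = 1 the sampler runs in your current R session, so any crash message should appear directly in the console rather than as the generic serialization error from the parallel workers' sockets.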

Martin,

Thank you for your reply. I switched to another workstation where everything runs fine for now. I tried running with verbose = TRUE and it spat out a lot of output, but then R crashed immediately and I couldn't see any of it.

If it is a memory issue, how could I deal with that? Also, I have reinstalled Rtools and R several times. Shouldn't that fix any bugs in the C++ code?

Thanks again,

Jonathan

Sorry for the late reply; I was out of the office.

Use less memory :-) A simple way is to run fewer chains in parallel. Another is to store less data per iteration - note that by default all members of generated quantities and transformed parameters are stored for each iteration, and this can get big. You can either use the pars argument in the call to sampling or stan to avoid storing some of those, or you can move some of the transformed parameters to the model block (variables declared in the model block are not stored). A sketch of the first option follows.
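A minimal sketch, assuming a compiled model object stan_model_obj, a data list stan_data, and parameters named beta and sigma (all placeholders for your own names):

```r
library(rstan)

# Keep only the named quantities in the fit object; everything else
# (e.g. large generated quantities) is not retained.
fit <- sampling(
  stan_model_obj,              # placeholder: result of stan_model("my_model.stan")
  data = stan_data,            # placeholder: your data list
  chains = 2,                  # fewer parallel chains -> lower peak memory
  pars = c("beta", "sigma"),   # placeholders: the quantities you actually need
  include = TRUE               # TRUE keeps the listed pars; FALSE drops them instead
)
```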