Problem with Gaussian Process in R

Hi everyone,
I have a problem with a Gaussian process model.
The model is very simple (the same one that appears in the Stan manual), but when I run it in R with 3 chains, I get the following error:

Error in unserialize(socklist[[n]]) : error reading from connection
Error in serialize(data, node$con, xdr = FALSE) : error writing to connection

I tried the same model with 1 chain, but the R session crashes.

I know that other people have had the same problem, but they did not find a solution.


This has happened to me at times as well. One of the causes I have identified is the process running out of memory. Maybe your dataset is too large: rstan will generally require at least n_iterations * n_stored_values * 8 bytes of memory. And if you have the covariance matrix in transformed parameters, it gets stored, so your memory requirements become quadratic in the number of data points. E.g. when your GP has 1000 points and you store the covariance matrix, you already allocate roughly 16 GB of RAM per chain (with the default 2000 iterations).
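The back-of-the-envelope estimate above can be sketched as follows (the helper function is purely illustrative, not part of rstan):

```python
def stored_draws_bytes(n_iterations, n_stored_values):
    """Rough lower bound on memory for stored draws: each stored
    quantity is a double (8 bytes) per iteration."""
    return n_iterations * n_stored_values * 8

# A GP with 1000 points whose 1000 x 1000 covariance matrix sits in
# transformed parameters stores 1e6 values per iteration.
n_points = 1000
stored_values = n_points * n_points

per_chain = stored_draws_bytes(2000, stored_values)  # default 2000 iterations
print(per_chain / 1e9)  # roughly 16 GB per chain
```

Moving the covariance matrix into a local variable in the model block avoids storing it in the output, which removes the quadratic term entirely.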


Thanks @martinmodrak!

My laptop has 16 GB of RAM and 4 cores, so with a Gaussian process simulated with 500 observations and a fit of 1 chain and 300 iterations, I don't think memory should be a problem.
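Plugging these numbers into the n_iterations * n_stored_values * 8 bytes estimate is a quick sanity check (a sketch with illustrative variable names, assuming the 500 x 500 covariance matrix is the dominant stored quantity):

```python
n_points = 500
n_iterations = 300

# 500 x 500 covariance matrix, if stored in transformed parameters
stored_values = n_points * n_points

bytes_per_chain = n_iterations * stored_values * 8
print(bytes_per_chain / 1e6)  # 600.0 MB, well within 16 GB
```

So this case indeed looks too small for memory to be the culprit, suggesting the crash has another cause.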

I solved this problem by replicating the case study by Michael Betancourt (with a great explanation for this purpose!), but without specifying a prior for the hyperparameters.

I will keep trying.