I had a similar issue, but the CSV file was on disk. I posted a similar R function for hacking large Stan CSV files into R in another question, but googling the error lands you on this page, so I'm posting it again here.
## file:    CSV file with the Stan output (with or without the ".csv" extension).
## vars:    variable names as they appear in the Stan output header.
## newfile: optionally write the result to a new CSV file instead of returning it.
my_read_stan_vars <- function(file, vars, newfile = NULL) {
  ## Strip the extension so we can build both the input and the temp file name.
  file <- gsub("\\.csv$", "", file)
  uncommented <- paste0(file, "_uncommented.csv")
  ## Stan prefixes comment lines with "#"; those lines are what trip up most
  ## CSV readers, so delete them with sed (requires a Unix-like system).
  system(paste0("sed -e '/^#/d' ", file, ".csv > ", uncommented))
  ## Read only the requested columns.
  post <- data.table::fread(uncommented, select = vars)
  file.remove(uncommented)
  if (is.null(newfile)) {
    return(as.data.frame(post))
  }
  vroom::vroom_write(post, paste0(newfile, ".csv"), delim = ",")
}
It did the trick for me.
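For example, assuming a CmdStan output file called output_1.csv containing a parameter vector beta (both names are hypothetical; CmdStan labels vector elements beta.1, beta.2, and so on):

## Hypothetical file and column names, for illustration only.
post <- my_read_stan_vars("output_1.csv", vars = c("beta.1", "beta.2"))
head(post)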
A related hack is helpful for finding out the names of the variables in the Stan output, especially for indexed individual-level parameters:
colnames_stan_output <- function(file) {
  ufile <- paste0(gsub("\\.csv$", "", file), "_uncommented.csv")
  ## Remove Stan's comment lines, then grab the header (first) line.
  system(paste0("sed -e '/^#/d' ", file, " > ", ufile))
  header <- system(paste0("head -n 1 ", ufile), intern = TRUE)
  file.remove(ufile)
  ## The header is a single comma-separated line of column names.
  unlist(strsplit(header, ","))
}
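For example, again with the hypothetical output_1.csv, you can list the columns first and then pass only the ones you need to the reader above:

vars  <- colnames_stan_output("output_1.csv")
betas <- grep("^beta", vars, value = TRUE)  ## e.g. "beta.1", "beta.2", ...
post  <- my_read_stan_vars("output_1.csv", vars = betas)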