Memory not mapped

Hello, I’m fitting a Markov model to a large data set using RStan. The server (512 GB of RAM) runs out of memory before the log-likelihood and LOO can be calculated, so I save the fit and load the .rds file in a separate script that only calculates the log-likelihood and LOO. However, that script fails with the following error:

```
*** caught segfault ***
address 0x7fc3c5345b40, cause 'memory not mapped'

Traceback:
1: vapply(x, FUN = match.arg(fun), FUN.VALUE = fun_val, item)
2: psis_apply(lw_list, "log_weights", fun_val = numeric(S))
3: do_importance_sampling(log_ratios, r_eff = r_eff, cores = cores, method = method)
4: importance_sampling.array(log_ratios = -x, r_eff = r_eff, cores = cores, method = is_method)
5: loo.array(log_lik, r_eff = r_eff, cores = 1)
6: loo(log_lik, r_eff = r_eff, cores = 1)
An irrecoverable exception occurred. R is aborting now ...
```

The script is as follows:

```r
library(rstan)
library(loo)

rm(list = ls())
gc()

# Run different chains in parallel on different cores
rstan_options(auto_write = TRUE)
options(mc.cores = parallel::detectCores())
# options(mc.cores = 1)

# For reproducibility
set.seed(1111)
modelName <- 'FourState_model2'

# Load the saved model fit object
fit <- readRDS("Data/FourState_model2.rds")

# Extract the log-likelihood
log_lik <- extract_log_lik(fit, parameter_name = "log_lik", merge_chains = FALSE)

# View warnings
warnings()

# Calculate and print LOO (leave-one-out cross-validation)
# r_eff <- relative_eff(exp(log_lik), chain_id = rep(1:4, each = 1500))
r_eff <- relative_eff(exp(log_lik), chain_id = rep(1:ncol(log_lik), each = nrow(log_lik)))
loo <- loo(log_lik, r_eff = r_eff, cores = 2)
print(loo)
```
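
For reference, the fit object is written out at the end of the fitting script along these lines (the exact call may differ slightly):

```r
# In the fitting script: save the stanfit object so the log-likelihood
# and LOO can be computed later in a separate R session.
saveRDS(fit, file = "Data/FourState_model2.rds")
```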

I’m running on Ubuntu, and the RStan version is 2.32.6.

Thank you for your help.

Hi, @icstan: sorry this didn’t get answered sooner.

I’m guessing this is from using too much memory for a big model. If so, check out this post, which has some hints on how to do LOO on larger data sets.
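
For anyone who finds this later: the main memory saver there is the `loo.function` method, which computes the pointwise log-likelihood one observation at a time instead of holding the full draws-by-observations `log_lik` array in memory. A minimal sketch, assuming a simple normal model with parameters `mu` and `sigma`; `llfun` and `my_data` are placeholders, not the actual four-state Markov likelihood:

```r
library(rstan)
library(loo)

fit <- readRDS("Data/FourState_model2.rds")

# Posterior draws as an S x P matrix (parameter names are placeholders)
draws <- as.matrix(fit, pars = c("mu", "sigma"))

# Hypothetical pointwise log-likelihood: takes one row of the data and the
# draws matrix, returns the S log-likelihood values for that observation.
llfun <- function(data_i, draws) {
  dnorm(data_i$y, mean = draws[, "mu"], sd = draws[, "sigma"], log = TRUE)
}

# my_data is a placeholder data frame with one row per observation
chain_id <- rep(1:4, each = nrow(draws) / 4)  # assumes 4 chains of equal length
r_eff <- relative_eff(llfun, chain_id = chain_id,
                      data = my_data, draws = draws, cores = 1)

loo_result <- loo(llfun, data = my_data, draws = draws,
                  r_eff = r_eff, cores = 1)
print(loo_result)
```

This only ever evaluates one observation’s log-likelihood at a time, so peak memory scales with the number of draws rather than draws times observations.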

Hi Bob, sorry for the slow reply on my end as well. Thank you for the recommendation!