I am running an HMC model that samples a number of parameters from a normal distribution with a predicted mean. The main prediction is done by the `disp_eq_root_par` function, which works fine and has been tested against known values to return correct results.

The model runs fine without any initial values; however, as soon as I set some, I immediately get a segmentation fault, right after the gradient evaluation has taken place.

The initial values are sensible for the regression model given the data points: they are material properties, so their magnitudes can be estimated in advance.

The code is written so that it can be parallelised using `map_rect`.
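For context, `map_rect` requires the mapped function to have the signature `(vector, vector, real[], int[]) -> vector`, so the external function is presumably declared along these lines in the `functions` block (a sketch only; the exact declaration and compile flags depend on how the external C++ header is hooked in):

```stan
functions {
  // Body supplied externally by dispersion_eigs.hpp via USER_HEADER;
  // this bodyless declaration is what --allow-undefined permits.
  vector disp_eq_root_par(vector phi, vector theta, real[] x_r, int[] x_i);
}
```

A sharded call would then look like `map_rect(disp_eq_root_par, phi, thetas, xs, xis)`, although the model block below currently calls the function directly on the full data.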

```stan
data {
  int<lower=0> N;
  real<lower=0> x[N];
  real<lower=0> y[N];
}
transformed data {
  // non-parallel
  int xi[0];
  vector[0] theta;
}
parameters {
  real<lower=0> log_C[4];
  real<lower=0> log_rho;
  real<lower=0> log_sigma;
}
transformed parameters {
  real<lower=0> C[4];
  real<lower=0> rho;
  real<lower=0> sigma;
  for (i in 1:4) {
    C[i] = exp(log_C[i]);
  }
  rho = exp(log_rho);
  sigma = exp(log_sigma);
}
model {
  // Set parameters
  real h = 1e-3;
  // MCMC model
  vector[N] ypred;
  print(C);
  print(rho);
  print(sigma);
  real a0_C = 4.9;
  real b0_C = 0.07;
  C[1] ~ gamma(a0_C, b0_C);
  C[2] ~ gamma(a0_C, b0_C);
  C[3] ~ gamma(a0_C, b0_C);
  C[4] ~ gamma(12.5, 2.5);
  rho ~ normal(1600.0, 300.0);
  sigma ~ gamma(2.0, 2.5e-5);
  vector[6] phi = [C[1], C[2], C[3], C[4], rho, h]';
  ypred = disp_eq_root_par(phi, theta, x, xi);
  y ~ normal(ypred, sigma);
}
```

From the CmdStan folder, the code is compiled with:

```
make ~/dispersion_stan/dispersion USER_HEADER=~/dispersion_stan/dispersion_eigs.hpp
```

And then from the `~/dispersion_stan/` folder, the compiled binary is run with:

```
./dispersion sample num_samples=50 num_warmup=20 init=init_vals_log.json data file=dispersion.data_0deg.json output file=dispersion_output.csv
```

Where `init=init_vals_log.json` is or isn't included.
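For reference, the init file supplies values for the names declared in the `parameters` block (`log_C`, `log_rho`, `log_sigma`); the numbers below are illustrative only, chosen to match the stated magnitudes, not the actual file contents:

```json
{
  "log_C": [24.8, 24.8, 24.8, 22.3],
  "log_rho": 7.4,
  "log_sigma": 11.3
}
```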

Here is the output from the terminal when initial values are included:

```
method = sample (Default)
  sample
    num_samples = 50
    num_warmup = 20
    save_warmup = 0 (Default)
    thin = 1 (Default)
    adapt
      engaged = 1 (Default)
      gamma = 0.050000000000000003 (Default)
      delta = 0.80000000000000004 (Default)
      kappa = 0.75 (Default)
      t0 = 10 (Default)
      init_buffer = 75 (Default)
      term_buffer = 50 (Default)
      window = 25 (Default)
    algorithm = hmc (Default)
      hmc
        engine = nuts (Default)
          nuts
            max_depth = 10 (Default)
        metric = diag_e (Default)
        metric_file = (Default)
        stepsize = 1 (Default)
        stepsize_jitter = 0 (Default)
id = 0 (Default)
data
  file = dispersion.data_0deg.json
init = init_vals_log.json
random
  seed = 598831642 (Default)
output
  file = dispersion_output.csv
  diagnostic_file = (Default)
  refresh = 100 (Default)
  sig_figs = -1 (Default)
  profile_file = profile.csv (Default)

Gradient evaluation took 0.088989 seconds
1000 transitions using 10 leapfrog steps per transition would take 889.89 seconds.
Adjust your expectations accordingly!

WARNING: There aren't enough warmup iterations to fit the
         three stages of adaptation as currently configured.
         Reducing each adaptation stage to 15%/75%/10% of
         the given number of warmup iterations:
           init_buffer = 3
           adapt_window = 15
           term_buffer = 2

Segmentation fault
```

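To make the problem reproducible without the real dataset, here is a sketch that writes a simulated data file in the expected JSON shape. Since the real dispersion solver lives in external C++, `fake_dispersion` below is a purely hypothetical stand-in, and the stiffness, density, and thickness values are assumed magnitudes, not the actual material properties:

```python
import json
import math
import random

def fake_dispersion(x, C1, rho, h):
    # Hypothetical stand-in for disp_eq_root_par: a smooth positive curve
    # with a plate-wave-like sqrt(C/rho) scaling, used only to generate
    # data of plausible shape and magnitude.
    return math.sqrt(C1 / rho) * h / (1.0 + 1e-4 * x)

random.seed(0)
N = 50
x = sorted(random.uniform(1e3, 1e5) for _ in range(N))  # e.g. frequencies
C1, rho, h = 60e9, 1600.0, 1e-3  # assumed stiffness, density, thickness
sigma = 1e-2                     # observation noise scale (assumed)

# abs() keeps y positive, matching the real<lower=0> y[N] declaration.
y = [abs(fake_dispersion(xi, C1, rho, h) + random.gauss(0.0, sigma))
     for xi in x]

with open("dispersion.data_sim.json", "w") as f:
    json.dump({"N": N, "x": x, "y": y}, f)
```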

Any help on this would be much appreciated!

- Operating System: WSL Ubuntu 20.04
- CmdStan Version: 2.27
- Compiler/Toolkit: g++ and GNU Make