# Generating values from multivariate normal distribution

Dear Stan users,

We are running a correlation model to compare assessment results from two cognitive tests.

The correlation model runs fine and without errors. We are now trying to run posterior predictive checks on the model and are running into a great deal of trouble simulating data in the generated quantities block. Ultimately, we would like to generate simulated datasets of the same size as the observed dataset (47 patients, 2 observations per patient) at each iteration.

Then, we could compare the observed data against the simulated datasets and/or derive statistics from the simulated datasets (e.g., the correlation) and compare them to the correlation based on the observed data.

Currently, we are able to simulate a single vector of simulated test scores per iteration. We could hard-code this 47 times, but we were wondering if there is a more efficient way of doing it.

```r
model <- "
// Pearson correlation
data {
  int<lower=0> n;
  vector[2] x[n];
}
parameters {
  vector[2] mu;
  real<lower=0> sigma;
  real<lower=-1,upper=1> r;
}
transformed parameters {
  cov_matrix[2] T;
  // Reparameterization
  T[1,1] = square(sigma);
  T[1,2] = r * sigma * sigma;
  T[2,1] = r * sigma * sigma;
  T[2,2] = square(sigma);
}
model {
  // Priors
  mu ~ normal(0, 100);
  sigma ~ cauchy(0, 3);
  r ~ uniform(-1, 1);
  // Data
  x ~ multi_normal(mu, T);
}
generated quantities {
  vector[2] PPC1;
  vector[2] PPC2;
  vector[2] PPC3;
  PPC1 = multi_normal_rng(mu, T);
  PPC2 = multi_normal_rng(mu, T);
  PPC3 = multi_normal_rng(mu, T);
}"

samples <- stan(model_code = model,
                data = data,
                # init = myinits,  # If not specified, gives random inits
                # pars = parameters,
                iter = 500,
                warmup = 10,
                chains = 2,
                thin = 1,
                cores = 4,
                seed = 100)
data2 <- extract(samples)
```
```stan
generated quantities {
  vector[2] PPC1[n];
  vector[2] PPC2[n];
  vector[2] PPC3[n];
  for(i 1:n) {
    PPC1[i] = multi_normal_rng(mu, T);
    PPC2[i] = multi_normal_rng(mu, T);
    PPC3[i] = multi_normal_rng(mu, T);
  }
}
```
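Once the fit runs, the per-draw correlations from the simulated datasets can be compared to the observed correlation. A minimal R sketch of that comparison (in practice `ppc` would come from `rstan::extract(samples)$PPC1`; here it is mocked with random draws purely for illustration):

```r
# Sketch: compare the correlation in each posterior predictive draw
# to the correlation in the observed data.
set.seed(1)
n_iter <- 1000
n <- 47

# Mock posterior predictive draws: iterations x patients x 2 scores.
# With a real fit this would be rstan::extract(samples)$PPC1.
ppc <- array(rnorm(n_iter * n * 2), dim = c(n_iter, n, 2))

# Mock observed 47 x 2 score matrix (replace with the real data).
x <- matrix(rnorm(n * 2), n, 2)

# One correlation per posterior predictive draw
r_rep <- apply(ppc, 1, function(d) cor(d[, 1], d[, 2]))

# Observed correlation and a posterior predictive p-value
r_obs <- cor(x[, 1], x[, 2])
ppp <- mean(r_rep > r_obs)
```

A histogram of `r_rep` with a vertical line at `r_obs` (e.g., `hist(r_rep); abline(v = r_obs)`) gives a quick visual check of whether the model reproduces the observed correlation.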

I hope I understood you right.

That was a typo; it should be `for(i in 1:n)`.

ha! goes to show where I am at ;)

Thank you! That worked.