# Generating a correlation matrix where off-diagonals are not extremely low

I am trying to estimate the covariance between some data that are functions of estimated parameters and other data that are input into the model. I'm having problems, probably because every method I've found for proposing correlation matrices gives me very low off-diagonal values.

The covariance matrix I'm estimating captures the covariance between six columns of a matrix: four columns are estimated and two are fixed. All the columns are normalized, so covariance and correlation are interchangeable.

The two fixed columns are highly correlated, and the model recovers the correct correlation between them every time.

However, the model forces the correlation between those two fixed columns and all other columns down to zero.

This is unsurprising once I look at the priors on the matrices: every method I've tried for generating correlation matrices consistently produces very low off-diagonal values.

In the following model:

```
data {
  int<lower=0> N;    // number of observations
  int<lower=0> x[N]; // outcome variable
  int R;             // this is 5
}
parameters {
  real lambda;
}
model {
  x ~ poisson_log(lambda);
}
generated quantities {
  corr_matrix[R] Omega01;
  corr_matrix[R] Omega0;
  corr_matrix[R] Omega1;
  corr_matrix[R] Omega2;

  Omega01 = lkj_corr_rng(R, 0.5);
  Omega0  = lkj_corr_rng(R, 0.9);
  Omega1  = lkj_corr_rng(R, 1);
  Omega2  = lkj_corr_rng(R, 2);
}
```

no matter what value I pass for the `eta` argument of `lkj_corr_rng`, the off-diagonal values come out consistently low, generally between -0.1 and 0.1.
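This matches a known property of the LKJ distribution (not stated in the post, but easy to check numerically): each off-diagonal element of an LKJ(eta) correlation matrix in dimension d marginally follows 2·Beta(a, a) − 1 with a = eta + (d − 2)/2. That marginal is symmetric about zero, so the per-element median is ~0 for *every* eta, and its standard deviation is 1/sqrt(2·eta + d − 1), which shrinks only slowly as eta grows. A quick numpy sketch:

```python
import numpy as np

def lkj_offdiag_sd(eta, d):
    """Closed-form sd of one off-diagonal under LKJ(eta) in dimension d.
    Marginally r = 2*Beta(a, a) - 1 with a = eta + (d - 2) / 2,
    which gives sd(r) = 1 / sqrt(2*eta + d - 1)."""
    return 1.0 / np.sqrt(2.0 * eta + d - 1.0)

rng = np.random.default_rng(0)
d = 5
for eta in [0.5, 0.9, 1.0, 2.0]:
    a = eta + (d - 2) / 2
    # Sample the marginal directly via its Beta representation.
    draws = 2.0 * rng.beta(a, a, size=200_000) - 1.0
    print(f"eta={eta}: median={np.median(draws):.3f}, "
          f"sd={draws.std():.3f}, theory sd={lkj_offdiag_sd(eta, d):.3f}")
```

So for d = 5, even eta = 0.5 only spreads a single off-diagonal out to an sd of about 0.45, centered on zero; changing eta mostly changes how tightly the mass hugs zero, never where it is centered.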

The same occurs when I try a couple of other methods to generate correlation matrices.

For the following model, the off-diagonals of both Omega and Sigma have medians between -0.05 and 0.05:

```
data {
  int<lower=0> N;    // number of observations
  int<lower=0> x[N]; // outcome variable
  int R;             // this is 5
}
parameters {
  real lambda;
}
model {
  x ~ poisson_log(lambda);
}
generated quantities {
  cov_matrix[R] Sigma;
  corr_matrix[R] Omega;
  vector<lower=0>[R] sigma;

  for (i in 1:R) {
    sigma[i] = fabs(normal_rng(0, 3));
  }

  Omega = lkj_corr_rng(R, 1);
  Sigma = quad_form_diag(Omega, sigma); // scale correlations into a covariance matrix
}
```

And the same for various degrees of freedom using an inverse Wishart:

```
data {
  int<lower=0> N;    // number of observations
  int<lower=0> x[N]; // outcome variable
  int R;             // this is 5
}
transformed data {
  matrix[R, R] identity;
  identity = diag_matrix(rep_vector(1, R));
}
parameters {
  real lambda;
}
model {
  x ~ poisson_log(lambda);
}
generated quantities {
  cov_matrix[R] inv_wishart3;
  cov_matrix[R] inv_wishart4;
  cov_matrix[R] inv_wishart5;
  cov_matrix[R] inv_wishart6;

  inv_wishart3 = inv_wishart_rng(3.0, identity);
  inv_wishart4 = inv_wishart_rng(4.0, identity);
  inv_wishart5 = inv_wishart_rng(5.0, identity);
  inv_wishart6 = inv_wishart_rng(6.0, identity);
}
```

I have even tried 'brute-forcing' my way to a correlation matrix; however, this does not produce a positive definite matrix:

```
parameters {
  matrix[TD_N, TD_N] Sigma_est;
}
transformed parameters {
  matrix[TD_N, TD_N] Sigma;
  for (i in 1:TD_N) {
    for (j in 1:TD_N) {
      if (i != j) {
        Sigma[i, j] = Sigma_est[i, j];
        Sigma[j, i] = Sigma[i, j]; // ensures the matrix is symmetric
      } else {
        Sigma[i, j] = 1;           // ones on the diagonal
      }
    }
  }
}
model {
  for (i in 1:TD_N) {
    for (j in 1:TD_N) {
      if (i != j) {
        Sigma_est[i, j] ~ uniform(-1, 1);
      }
    }
  }
}
```

Any ideas how I can create the sort of correlation matrix I’m looking for? Or am I taking the wrong approach to the original problem somehow?
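(For what it's worth, the standard fix for the brute-force attempt above is to parameterize a Cholesky factor rather than the matrix itself, which is what Stan's `cholesky_factor_corr` type does: any lower-triangular matrix whose rows are rescaled to unit length yields a valid correlation matrix by construction. A numpy sketch of that idea, with made-up inputs:)

```python
import numpy as np

def corr_from_unconstrained(L_raw):
    """Map an arbitrary square matrix to a valid correlation matrix:
    take its lower triangle, normalize each row to unit length, and
    form C = L @ L.T, which then has ones on the diagonal and is
    positive semidefinite by construction."""
    L = np.tril(L_raw)
    L = L / np.linalg.norm(L, axis=1, keepdims=True)
    return L @ L.T

rng = np.random.default_rng(1)
# Adding the identity keeps every row of the lower triangle nonzero.
C = corr_from_unconstrained(rng.normal(size=(5, 5)) + np.eye(5))
print(np.diag(C))                   # unit diagonal
print(np.linalg.eigvalsh(C).min())  # no negative eigenvalues
```

Symmetry and the unit diagonal come for free, and positive definiteness never needs to be checked, unlike when filling in entries of `Sigma` one by one.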

The expectation of an LKJ prior is the identity matrix regardless of the value of its shape parameter. To get a lot of high correlations, you need to use a kernel function.