Sampling getting stuck with latent factor regression

Dear All,

I am trying to implement the Bayesian latent factor regression model from Montagna et al. (2018) in Stan for neuroimaging meta-analysis. Instead of the multiplicative gamma process prior on the factor loadings used there, I adopted the row-wise Dirichlet-Laplace (DL) shrinkage prior suggested by Ferrari and Dunson (I tried the horseshoe, but it did not work well with factor models).
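For reference, the row-wise DL construction I am using (this is what the transformed parameters and model blocks below implement, with concentration a = 0.5) is the normal-exponential mixture representation

    lambda[p, k]       = tau[p] * phi[p, k] * sqrt(psi2[p, k]) * lambda_tilde[p, k]
    lambda_tilde[p, k] ~ Normal(0, 1)
    psi2[p, k]         ~ Exponential(1/2)
    phi[p]             ~ Dirichlet(a, ..., a)   (a K-simplex for each row p)
    tau[p]             ~ Gamma(K * a, 1/2)      (shape and rate)

so each row p of the P x K loadings matrix gets its own global scale tau[p] and local simplex phi[p].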

The DL prior seemed to work well in a simple sparse factor model. However, sampling from the model I implemented struggles: when I watch the iterations with refresh=1, sampling looks fast at first during warmup, but it gradually slows down and eventually gets stuck at some iteration.
I suspect there is a parameterisation issue, but I haven't found a good solution yet.
Do you have any recommendations?

I attached the code below.

Thank you in advance for the comments.


functions {
  
  // stack an array of vectors into a matrix, one vector per row
  matrix vector_array_to_matrix(vector[] x) {
    matrix[size(x), rows(x[1])] y;
    for (m in 1:size(x))
      y[m] = x[m]';
    return y;
  }
  
}

data {
  
  int<lower=0> S; // Number of studies
  int<lower=0> R; // dimension of Z
  int<lower=0> V; // Number of Voxels

  // for Factor-loadings Matrix
  int<lower=1> P;  // number of population-level effects, in other words, number of basis
  int<lower=1> K;  // number of maximum expected factors
  
  real A;
  matrix[R, S] Z;
  matrix[V, P] Bpred;
  matrix[S, P] sum_B;
  
}

transformed data{
  // Dirichlet-Laplace concentration hyperparameter
  real a = 0.5;
}

parameters {

  // DL prior with row-wise shrinkage
  vector<lower=0>[P] tau;
  simplex[K] phi[P];
  matrix<lower=0>[P, K] psi2;
  matrix[P, K] lambda_tilde;
  
  vector<lower=0>[P] sigma;  // residual SD
  
  // residuals
  matrix[K, S] delta; // for eta
  matrix<lower=0>[P, S] zeta; // for theta
  
  // for beta
  matrix[K, R] betaT;
  
}

transformed parameters {

  matrix[K, S] eta; 
  matrix[P, S] theta; // P x S theta_i, i = 1, ..., number of studies S.
  matrix[P, K] lambda;
  {
    
    // row-wise global scale tau[p] times local scales phi[p, k] * sqrt(psi2[p, k])
    matrix[P, K] global_local_sh = (tau * rep_row_vector(1, K)) .* (sqrt(psi2) .* vector_array_to_matrix(phi));
    lambda = global_local_sh .* lambda_tilde;
  }
  
  eta = betaT*Z + delta;
  theta = lambda*eta + zeta;
  
  
}

model {
  
  to_vector(psi2) ~ exponential(0.5);
  for (p in 1:P) {
    phi[p] ~ dirichlet(rep_vector(a, K));
  }
  
  tau ~ gamma(K*a, 0.5);
  
  to_vector(lambda_tilde) ~ normal(0, 1);
  sigma ~ normal(0, 1);

  // prior for the regression coefficients betaT (eta = betaT * Z + delta)
  to_vector(betaT) ~ cauchy(0,5);
  
  // residuals
  for (s in 1:S)  zeta[:,s] ~ normal(0, sigma);
  for (s in 1:S)  delta[:,s] ~ normal(0, 1);

  // log-likelihood log p(x | theta): per study s,
  // -A * sum over voxels of exp(Bpred * theta[, s]) + sum_B[s] * theta[, s]
  target += - A*(rep_row_vector(1,V)*exp(Bpred * theta))' + diagonal(sum_B * theta);
  
}

generated quantities {
  matrix[V, K] basis = Bpred*lambda;
}

Sorry, can’t help now, but maybe @bbbales2 has some time to spare?

Were you able to get a basic model working in Stan?

I assume the stepsize is going down and the treedepth is going up, so things are just getting super slow? Is the model stopping absolutely, or just becoming super duper slow?

One thing you could try is to run Stan for very short chains and see if any of the parameters are going off to infinity or doing something wild. You can also extract the stepsize (check the manuals – how to do this differs by interface).

Another thing you could do is generate data using known parameters and then infer only subsets of the parameters, to see which ones are easy to infer and which ones are hard. For instance, first infer only betaT, and then go from there.
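For instance, a minimal sketch of that first check, assuming you simulate eta outside Stan from a known betaT and pass it in as data (eta_sim, the unit noise scale, and the normal(0, 5) prior are just placeholders for illustration):

// Sketch: recover only betaT from simulated latent factors.
data {
  int<lower=0> S;        // number of studies
  int<lower=0> R;        // dimension of Z
  int<lower=1> K;        // number of factors
  matrix[R, S] Z;        // covariates
  matrix[K, S] eta_sim;  // eta simulated with a known betaT, treated as observed
}
parameters {
  matrix[K, R] betaT;
}
model {
  to_vector(betaT) ~ normal(0, 5);                       // placeholder prior for the sketch
  to_vector(eta_sim) ~ normal(to_vector(betaT * Z), 1);  // eta = betaT * Z + noise
}

If betaT recovers cleanly there, add lambda and theta back in one piece at a time.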

You could also go through and tighten up the priors. The Cauchy priors on betaT seem a bit wide. If you decide you need Cauchys, you can reparameterize them and they'll sample more efficiently. Search for "Reparameterizing the Cauchy" in the Stan User's Guide. Also check out: Fitting The Cauchy Distribution.
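For reference, a minimal sketch of that reparameterization applied to betaT (the scale 5 matches the cauchy(0, 5) prior above; these two blocks would replace the current betaT declaration and its prior):

parameters {
  // uniform variates; the declared bounds give them a uniform(0, 1) distribution
  matrix<lower=0, upper=1>[K, R] betaT_unif;
}
transformed parameters {
  // inverse-CDF transform, so each betaT[k, r] is implicitly cauchy(0, 5);
  // drop the explicit cauchy prior on betaT from the model block
  matrix[K, R] betaT = 5 * tan(pi() * (betaT_unif - 0.5));
}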

Check around the forums and see if you find any other factor stuff that seems like what you want. These models can be hard.


Hi, @bbbales2. Thanks for the comment!
Also, @martinmodrak Thank you for tagging @bbbales2.

I will keep you posted with updates over the next few days!

Thanks again.
Minho