Mixed centered/non-centered parameterisation

Is there any reason you couldn’t use a weighted mixture of a centered and a non-centered parameterisation of the same parameter to try to improve mixing?

E.g.:

data {
  int<lower=1> n;
}
parameters {
  vector[n] theta;                // Centered version
  vector[n] theta_norms;          // Standard-normal draws for the non-centered version
  real theta_mu;
  real<lower=0> theta_sigma;
  vector<lower=0, upper=1>[n] p; // Mixture weights, one per element
}
transformed parameters {
  vector[n] theta_nc;
  vector[n] theta_use;

  theta_nc = theta_mu + theta_sigma * theta_norms;               // Non-centered version
  theta_use = p .* theta + (rep_vector(1.0, n) - p) .* theta_nc; // Weighted mixture
}
model {
  theta ~ normal(theta_mu, theta_sigma); // Centered
  theta_norms ~ std_normal();            // Non-centered
  // (likelihood on theta_use omitted in this sketch)
}

It’s not a perfect fix: you’ll still inherit some of the poor sampling behaviour from whichever parameterisation mixes worse.

But, to use a very loose analogy, being 50% bad may still give a much higher effective sample size than committing to the 100% bad option.

See Partial non-centered parametrizations in Stan
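For reference, the partial non-centering described in that thread interpolates within a single parameter rather than mixing two copies of theta: the latent vector is drawn with scale theta_sigma^w and location w * theta_mu, so w = 1 recovers the centered form and w = 0 the non-centered form, while the implied prior on theta stays normal(theta_mu, theta_sigma). A minimal sketch of that idea (the names are illustrative, and the weight w is passed in as data here rather than sampled or set per element):

data {
  int<lower=1> n;
  real<lower=0, upper=1> w;   // interpolation weight: 1 = centered, 0 = non-centered
}
parameters {
  vector[n] theta_tilde;      // partially non-centered latent vector
  real theta_mu;
  real<lower=0> theta_sigma;
}
transformed parameters {
  // Implies theta ~ normal(theta_mu, theta_sigma) for any w in [0, 1]
  vector[n] theta = theta_mu
    + pow(theta_sigma, 1 - w) * (theta_tilde - w * theta_mu);
}
model {
  theta_tilde ~ normal(w * theta_mu, pow(theta_sigma, w));
  // (priors on theta_mu, theta_sigma and the likelihood omitted in this sketch)
}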


Maybe take a look at @mgorinova & @matthewdhoffman’s work on automatic reparameterization, which seems similar but treats the choice of parameterization as a pre-sampling step rather than allowing for uncertainty about it during sampling itself.

Great - thanks for the references!