How to perform a regression analysis when assuming partially ordered coefficients?

Doing a meta-analysis with regression on three covariates x_1, x_2, x_3, I want to include the prior information we have about the ordering of the covariates' coefficients.

Model
My model is a regression model with a random intercept:
k_i \sim Bin(n_i,p_i) (data)
p_i = \text{logit}^{-1}(\alpha_i + \beta_1 \cdot x_{1,i}+\beta_2 \cdot x_{2,i}+\beta_3 \cdot x_{3,i})
with \alpha_i \sim \mathcal{N}(\alpha,\sigma^2) (random intercept)
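
For reference, here is a minimal Stan sketch of this base model without any ordering constraint (the data names I, n, k, x1, x2, x3 are placeholders I made up, and priors are omitted):

data {
  int<lower=1> I;              // number of studies
  array[I] int<lower=0> n;     // trials per study
  array[I] int<lower=0> k;     // successes per study
  vector[I] x1;
  vector[I] x2;
  vector[I] x3;
}
parameters {
  real alpha;                  // population intercept
  real<lower=0> sigma;         // between-study SD
  vector[I] alpha_i;           // study-specific intercepts
  real beta_1;
  real beta_2;
  real beta_3;
}
model {
  alpha_i ~ normal(alpha, sigma);
  k ~ binomial_logit(n, alpha_i + beta_1 * x1 + beta_2 * x2 + beta_3 * x3);
  // priors on alpha, sigma, and the betas omitted
}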

We know that \beta_1>\beta_3 and \beta_2>\beta_3. Since this is only a partial ordering, I cannot declare the vector \beta := (\beta_1,\beta_2,\beta_3) as an ordered vector. I was therefore thinking of declaring the coefficients \beta as

parameters {
  real beta_1;
  real beta_2;
  real<upper=min_beta_1_2> beta_3;
}
transformed parameters {
  vector[2] beta_1_2;
  beta_1_2[1] = beta_1;
  beta_1_2[2] = beta_2;
  real min_beta_1_2 = min(beta_1_2);
}

Q1: Is it a correct formulation?

Then, I thought I could write it in another way, putting the constraint on \beta_1 and \beta_2:

parameters {
  real beta_3;
  real<lower=beta_3> beta_1;
  real<lower=beta_3> beta_2;
}

Q2: Is there any difference between the two models in terms of results, computation time, and “convergence”? If yes, which one should I use?

You should use the second one, with beta_3 unconstrained. The difference is that the min() function in the first formulation has a discontinuous derivative, which will likely slow down computation.
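
For concreteness, here is a sketch of the parameters and model blocks using the second formulation (with the same placeholder data block as sketched in the question, and priors still omitted). Stan allows a bound to depend on a parameter declared earlier in the same block and handles the change of variables for the constraint automatically:

parameters {
  real alpha;                  // population intercept
  real<lower=0> sigma;         // between-study SD
  vector[I] alpha_i;           // study-specific intercepts
  real beta_3;
  real<lower=beta_3> beta_1;   // enforces beta_1 > beta_3
  real<lower=beta_3> beta_2;   // enforces beta_2 > beta_3
}
model {
  alpha_i ~ normal(alpha, sigma);
  // priors on alpha, sigma, and the betas omitted
  k ~ binomial_logit(n, alpha_i + beta_1 * x1 + beta_2 * x2 + beta_3 * x3);
}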
