I’d try the first approach you had of just making it an additive model. You can basically turn anything that looks like a constant into a regression. The key fact here is that if y[n] ~ gamma(a[n], 1), then y / sum(y) ~ Dirichlet(a).
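Just to spell out the fact being relied on (the standard gamma construction of the Dirichlet, where the y[n] are independent and the shared rate can be anything, not just 1):

$$
y_n \sim \textrm{gamma}(a_n, 1),\ n = 1, \ldots, N
\qquad \Longrightarrow \qquad
\frac{(y_1, \ldots, y_N)}{\sum_{n=1}^N y_n} \sim \textrm{Dirichlet}(a_1, \ldots, a_N),
$$

with the normalized vector independent of the total $\sum_n y_n \sim \textrm{gamma}\!\left(\sum_n a_n, 1\right)$.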
In this case, just replace the a[n] with the result of a regression. Say you have a matrix of predictors x of size N x K (i.e., K covariates per entry). Then you can set up the regression as follows:
data {
  int<lower=0> N;              // number of entries
  int<lower=0> K;              // number of covariates per entry
  matrix[N, K] x;              // predictors
}
parameters {
  vector[K] beta;              // regression coefficients
  vector<lower=0>[N] phi;      // unnormalized gamma variates
}
transformed parameters {
  simplex[N] theta = phi / sum(phi);
}
model {
  vector[N] a = exp(x * beta);
  phi ~ gamma(a, 1);
  // implies theta ~ Dirichlet(exp(x * beta))
}
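If you want theta to inform some observed outcome, the same construction just drops into a larger program: put a prior on beta and add a likelihood that uses theta. Here's a minimal sketch, assuming the outcome is a vector of counts y modeled as multinomial with probabilities theta (that likelihood and the normal(0, 1) prior are placeholders, not part of your model):

data {
  int<lower=0> N;
  int<lower=0> K;
  matrix[N, K] x;
  array[N] int<lower=0> y;         // hypothetical observed counts, one per entry
}
parameters {
  vector[K] beta;
  vector<lower=0>[N] phi;
}
transformed parameters {
  simplex[N] theta = phi / sum(phi);
}
model {
  beta ~ normal(0, 1);             // placeholder weakly informative prior
  phi ~ gamma(exp(x * beta), 1);   // implies theta ~ Dirichlet(exp(x * beta))
  y ~ multinomial(theta);          // placeholder likelihood; swap in whatever you observe
}

Any prior on beta that keeps exp(x * beta) from blowing up should be fine; the normal(0, 1) is just there so the sketch runs as written.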