# Propagating uncertainty across models

Hi,

I am trying to fit the model below, in which I propagate uncertainty about the predictors through the model. I have a working version where I fit the models separately and plug in the central tendency for xbar, but that is obviously unsatisfactory. Do you have any advice on how to speed this up?

```stan
data {
  int J;
  int G;
  matrix[J, G] xbar1;
  matrix[J, 3] ybar;
}
parameters {
  matrix<lower = 0, upper = 1>[G, 3] beta;
  simplex[G] xbar2[J];
  simplex[G] xbar3[J];
  vector<lower = 0>[3] sigma;
}
model {
  matrix[J, G] xbar2_mat;
  matrix[J, G] xbar3_mat;
  for (i in 1:3) target += normal_lpdf(beta[:, i] | 0.5, 0.25);
  for (j in 1:J) {
    xbar2[j] ~ dirichlet(softmax(xbar1[j]' .* beta[:, 1]));
    xbar3[j] ~ dirichlet(softmax(xbar2[j] .* beta[:, 2]));
    xbar2_mat[j] = xbar2[j]';
    xbar3_mat[j] = xbar3[j]';
  }
  target += normal_lpdf(sigma | 0, 2);
  target += normal_lpdf(ybar[:, 1] | xbar1 * beta[:, 1], sigma[1]);
  target += normal_lpdf(ybar[:, 2] | xbar2_mat * beta[:, 2], sigma[2]);
  target += normal_lpdf(ybar[:, 3] | xbar3_mat * beta[:, 3], sigma[3]);
}
generated quantities {
  matrix[J, 3] y_bar_hat;
  y_bar_hat[:, 1] = to_vector(normal_rng(xbar1 * beta[:, 1], sigma[1]));
  for (j in 1:J) {
    y_bar_hat[j, 2] = normal_rng(xbar2[j]' * beta[:, 2], sigma[2]);
    y_bar_hat[j, 3] = normal_rng(xbar3[j]' * beta[:, 3], sigma[3]);
  }
}
```
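For anyone trying to follow the structure, here is a rough NumPy sketch (my own reading of the Stan code above, not the author's simulator) of the generative process the model encodes: each layer's composition is drawn from a Dirichlet centered on a softmax-scaled version of the previous layer, and each outcome column is a noisy regression on its layer.

```python
import numpy as np

rng = np.random.default_rng(0)
J, G = 5, 3  # small illustrative sizes

# Composition of each unit (rows sum to 1), analogous to xbar1.
xbar1 = rng.dirichlet(np.ones(G), size=J)
beta = rng.uniform(0, 1, size=(G, 3))
sigma = np.array([0.1, 0.1, 0.1])

def softmax(v):
    e = np.exp(v - v.max())
    return e / e.sum()

# Layer 2: uncertain composition drawn around a softmax-scaled center.
xbar2 = np.stack([rng.dirichlet(softmax(xbar1[j] * beta[:, 0])) for j in range(J)])
# Layer 3 builds on layer 2 in the same way.
xbar3 = np.stack([rng.dirichlet(softmax(xbar2[j] * beta[:, 1])) for j in range(J)])

# Observed outcomes: one noisy regression per layer.
ybar = np.column_stack([
    xbar1 @ beta[:, 0] + sigma[0] * rng.normal(size=J),
    xbar2 @ beta[:, 1] + sigma[1] * rng.normal(size=J),
    xbar3 @ beta[:, 2] + sigma[2] * rng.normal(size=J),
])
```

Each intermediate layer stays on the simplex (rows of `xbar2` and `xbar3` sum to one), which is what the `simplex[G]` declarations enforce in the Stan version.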

Ended up figuring this out myself. The code below runs reasonably fast (about a minute per chain on my machine with about 3k observations).

```stan
data {
  int J;
  int G;
  matrix[J, G] xbar1;
  matrix[J, 3] ybar;
}
transformed data {
  vector[G] ones_column = rep_vector(1, G);
  row_vector[G] ones_row = rep_vector(1, G)';
}
parameters {
  matrix<lower = 0, upper = 1>[G, 3] beta;
  vector<lower = 0>[3] sigma;
}
model {
  matrix[J, G] xbar2;
  matrix[J, G] xbar3;
  xbar2 = xbar1 .* rep_matrix(beta[:, 1]', J);
  xbar2 = xbar2 ./ (xbar2 * ones_column * ones_row);
  xbar3 = xbar2 .* rep_matrix(beta[:, 2]', J);
  xbar3 = xbar3 ./ (xbar3 * ones_column * ones_row);
  for (i in 1:2) target += normal_lpdf(beta[:, i] | 0.5, 0.25);
  target += normal_lpdf(sigma | 0, 0.5);
  target += normal_lpdf(ybar[:, 1] | xbar1 * beta[:, 1], sigma[1]);
  target += normal_lpdf(ybar[:, 2] | xbar2 * beta[:, 2], sigma[2]);
  target += normal_lpdf(ybar[:, 3] | xbar3 * beta[:, 3], sigma[3]);
}
```
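As an aside for anyone skimming, the `xbar2 ./ (xbar2 * ones_column * ones_row)` lines are a vectorized row normalization: `xbar2 * ones_column` is the column vector of row sums, and multiplying by `ones_row` broadcasts those sums across the columns. A quick NumPy check of the same identity (my own illustration, not from the thread):

```python
import numpy as np

rng = np.random.default_rng(1)
J, G = 4, 3  # small illustrative sizes
X = rng.uniform(0.1, 1.0, size=(J, G))

ones_col = np.ones((G, 1))  # ones_column in the Stan model
ones_row = np.ones((1, G))  # ones_row in the Stan model

# X @ ones_col gives the row sums; the outer product with ones_row
# repeats each row sum across all G columns.
row_sums_broadcast = X @ ones_col @ ones_row
normalized = X / row_sums_broadcast  # the ./ step in the Stan model

# Same result as the direct keepdims-broadcast normalization.
assert np.allclose(row_sums_broadcast, X.sum(axis=1, keepdims=True) * np.ones((1, G)))
assert np.allclose(normalized.sum(axis=1), 1.0)  # each row now sums to one
```

The matrix form is convenient in Stan because it replaces a row-by-row loop with two matrix products, which vectorizes well.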

I set up a hierarchical version, which is again terribly slow. If anyone has any input on how to speed up the model below, I'd be grateful.

```stan
data {
  int J;
  int G;
  int S;
  int s[J];
  matrix[J, G] xbar1;
  matrix[J, 3] ybar;
}
transformed data {
  vector[G] ones_column = rep_vector(1, G);
  row_vector[G] ones_row = rep_vector(1, G)';
}
parameters {
  matrix<lower = 0, upper = 1>[G, 3] beta_mean;
  matrix<lower = 0>[G, 3] beta_sigma;
  matrix[S, G] raw_beta[3];
  vector<lower = 0>[3] sigma;
}
transformed parameters {
  matrix[S, G] beta[3];
  for (g in 1:G) {
    for (j in 1:3) {
      beta[j, :, g] = beta_mean[g, j] + raw_beta[j, :, g] * beta_sigma[g, j];
    }
  }
}
model {
  matrix[J, G] xbar2;
  matrix[J, G] xbar3;
  xbar2 = xbar1 .* beta[1, s];
  xbar2 = xbar2 ./ (xbar2 * ones_column * ones_row);
  xbar3 = xbar2 .* beta[2, s];
  xbar3 = xbar3 ./ (xbar3 * ones_column * ones_row);
  for (j in 1:3) to_vector(raw_beta[j]) ~ std_normal();
  to_vector(beta_mean) ~ normal(0, 0.5);
  to_vector(beta_sigma) ~ normal(0.25, 0.25);
  target += normal_lpdf(ybar[:, 1] | rows_dot_product(xbar1, beta[1, s]), sigma[1]);
  target += normal_lpdf(ybar[:, 2] | rows_dot_product(xbar2, beta[2, s]), sigma[2]);
  target += normal_lpdf(ybar[:, 3] | rows_dot_product(xbar3, beta[3, s]), sigma[3]);
}
```
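For readers unfamiliar with the `raw_beta` construction: the `transformed parameters` block above is the standard non-centered parameterization, where `beta = beta_mean + raw_beta * beta_sigma` with `raw_beta ~ std_normal()` is distributionally equivalent to drawing `beta` directly from `normal(beta_mean, beta_sigma)`. A small NumPy sketch of that equivalence (my own illustration, with made-up parameter values):

```python
import numpy as np

rng = np.random.default_rng(2)
n_draws = 100_000
beta_mean, beta_sigma = 0.5, 0.25  # illustrative values, not fitted ones

# Non-centered: sample on the unit scale, then shift and scale.
raw_beta = rng.standard_normal(n_draws)   # raw_beta ~ std_normal()
beta = beta_mean + raw_beta * beta_sigma  # deterministic transform

# The implied draws match the centered normal(beta_mean, beta_sigma)
# distribution in mean and standard deviation.
assert abs(beta.mean() - beta_mean) < 0.01
assert abs(beta.std() - beta_sigma) < 0.01
```

The payoff in a hierarchical model is that the sampler explores the unit-scale `raw_beta` space, which tends to avoid the funnel geometry that centered hierarchical parameters create when the group-level scale is poorly identified.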

Sorry we took so long to respond.

That is a bit hard to say in general. Maybe it is just difficult - how big are `J, G, S` in your case? I don’t think I understand what the model is intended to do, but I have some minor notes:

I am not sure I understand what you are trying to achieve here — would that just be `sum(xbar2)` (or `sum(to_vector(xbar2))` if `sum` is not available for matrices)?

I think this form of assignment, where the target of the assignment is also on the right-hand side, can be slow-ish (but not extremely so). Using a new variable could help.

I would check the `pairs` plot for some subsets of the parameters; it is possible you are introducing some weird correlations in your parameters (the centering is somewhat suspicious, but I don't really understand what's going on, so I might be wrong).

Best of luck with your model!

Hi Martin, thanks for having a look. This is supposed to be a multilayer ecological regression model. Imagine wanting to know whom specific groups vote for while only having the composition of eligible voters in some geographical unit plus the vote share of the winner. Essentially, it goes: predict the individual probability of turning out; infer what the group of people who turned out looks like; then use that distribution of the group's composition to estimate how they likely voted. I use it to estimate absentee ballot rejection rates by ethnicity.

The `(xbar2 * ones_column * ones_row)` step presses each row into a sum-to-one constraint. Surprisingly enough, the model can recover the parameters from fake data relatively easily.
