Hi all,

I’m experimenting with the `reduce_sum` function to take advantage of within-chain parallelization for our hierarchical IRT model. Our ultimate goal is to get the model up and running with MPI, but I figured this would be a good place to start. Here’s the model:

```
data{
  int N;
  int I;
  int J;
  int K;
  int T;
  int<lower=0,upper=1> judg[N];
  int asp[N];
  int asp_par[J];
  int country[N];
  int year[N];
  int comp[N];
}
parameters{
  real theta[I,J,T];
  matrix<lower=0>[J,K] beta;
  matrix[J,K] alpha;
  vector<lower=0>[K] sigma_alpha;
  vector<lower=0>[K] sigma_beta;
}
model{
  // Likelihood
  for(n in 1:N){
    judg[n] ~ bernoulli_logit(beta[asp[n], comp[n]] *
      (theta[country[n], asp[n], year[n]] - alpha[asp[n], comp[n]]));
  }
  // Priors
  // theta: a draw for the first year for each aspect
  for(j in 1:J){
    theta[,j,1] ~ std_normal();
  }
  // Loop over aspects
  for(j in 1:J){
    // This year's ideal point is drawn from a distribution centered on last year's
    for(t in 2:T){
      theta[,j,t] ~ normal(theta[,j,t-1], 1);
    }
  }
  // beta
  // Loop over composers
  for(k in 1:K){
    beta[1,k] ~ normal(0, sigma_beta[k]); // Draw for first node
    for(j in 2:J){
      beta[j,k] ~ normal(beta[asp_par[j],k], sigma_beta[k]); // Draw from dist. centered on parent
    }
  }
  // alpha
  for(k in 1:K){
    alpha[1,k] ~ normal(0, sigma_alpha[k]); // Draw for first node
    for(j in 2:J){
      alpha[j,k] ~ normal(alpha[asp_par[j],k], sigma_alpha[k]); // Draw from dist. centered on parent
    }
  }
  sigma_alpha ~ normal(0, 5);
  sigma_beta ~ normal(0, 5);
}
```
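
(As an aside, one speedup that’s independent of `reduce_sum` is to vectorize the likelihood by precomputing the linear predictor and making a single vectorized call. A sketch using the same data and parameters as above — I haven’t benchmarked this on our data:)

```
model{
  // Precompute the linear predictor once, then make one vectorized call
  vector[N] eta;
  for(n in 1:N){
    eta[n] = beta[asp[n], comp[n]] *
      (theta[country[n], asp[n], year[n]] - alpha[asp[n], comp[n]]);
  }
  judg ~ bernoulli_logit(eta);
  // ... priors as above
}
```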

This model runs well, but we’re trying to speed it up, so I tried to use `reduce_sum`. Here are the functions and model blocks from my (probably poor) attempt after following the red card example:

```
functions{
  real partial_sum(int[] slice_judg,
                   int start, int end,
                   int[] asp,
                   int[] asp_par,
                   int[] country,
                   int[] year,
                   int[] comp,
                   real[,,] theta,
                   real[,] beta,
                   real[,] alpha) {
    return binomial_logit_lpmf(slice_judg | beta[asp[start:end], comp[start:end]] *
      (theta[country[start:end], asp[start:end], year[start:end]] -
       alpha[asp[start:end], comp[start:end]]));
  }
}
```

…

```
model{
  target += reduce_sum(partial_sum, judg, grainsize,
                       asp, asp_par, country, year, comp,
                       theta, beta, alpha);
  // priors as in the original model
}
```

The model is failing to compile, and I’m fairly certain that the most pressing issue is in the functions block, where I declare the arguments that go into the `bernoulli_logit_lpmf`. Can anyone point me in the right direction? Let me know if you need any other info from me!
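
Edit: in case it helps clarify what I’m aiming for — my reading of the user’s guide is that multi-array indexing like `beta[asp[start:end], comp[start:end]]` selects a whole submatrix rather than paired elements, so one option may be to loop over the slice inside `partial_sum` instead. Here’s an untested sketch of what I mean (I’ve switched `binomial_logit_lpmf` to `bernoulli_logit_lpmf` to match the original model, and I’m assuming `grainsize` is declared in `data`) — does this look right?

```
functions{
  real partial_sum(int[] slice_judg,
                   int start, int end,
                   int[] asp,
                   int[] country,
                   int[] year,
                   int[] comp,
                   real[,,] theta,
                   matrix beta,
                   matrix alpha) {
    int M = end - start + 1;
    vector[M] eta;
    for(m in 1:M){
      int n = start + m - 1; // position in the full data arrays
      eta[m] = beta[asp[n], comp[n]] *
        (theta[country[n], asp[n], year[n]] - alpha[asp[n], comp[n]]);
    }
    return bernoulli_logit_lpmf(slice_judg | eta);
  }
}
model{
  // asp_par is only used in the priors, so it shouldn't need to be passed here
  target += reduce_sum(partial_sum, judg, grainsize,
                       asp, country, year, comp,
                       theta, beta, alpha);
  // priors as in the original model
}
```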