Efficient way to fill up matrix of log-likelihoods

I'm filling a matrix M of log-likelihoods with two nested loops as follows, where Y and X are both vectors and r is a real:

``````
matrix[N, J] M;
for (i in 1:N) {
  for (j in 1:J) {
    M[i, j] = neg_binomial_2_lpmf(Y[i] | X[j], r);
  }
}
``````

How could this be made more efficient by vectorising?

``````
transformed data {
  array[N*J] int y_indices;
  array[N*J] int x_indices;
  int k = 0;
  for (i in 1:N) {
    for (j in 1:J) {
      k += 1;
      y_indices[k] = i;
      x_indices[k] = j;
    }
  }
}
...
model {
  matrix[N, J] M = to_matrix(neg_binomial_2_lpmf(Y[y_indices] | X[x_indices], r), N, J);
}
``````

Would that work? Wouldn't:

``````
neg_binomial_2_lpmf(Y[y_indices] | X[x_indices], r)
``````

Just return a single scalar?

Bah, yes it would. 🤦‍♂️ Forgot the lpmfs do that.
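For anyone following along, a minimal sketch of that behaviour (names here are illustrative; `Y` is assumed to be an `array[N] int` of counts and `mu` a shared mean):

``````
model {
  // Vectorised call: returns ONE scalar, the sum of all N log-probabilities...
  target += neg_binomial_2_lpmf(Y | mu, r);
  // ...so it contributes the same total to the target as the explicit loop:
  //   for (n in 1:N) target += neg_binomial_2_lpmf(Y[n] | mu, r);
}
``````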

Can I clarify how the `_lpmf` functions work, as I can't find any documentation on this. When the observations are given as a vector, does the function return the sum of the log probabilities?

Yep, the `lpmf` and `lupmf` functions always return a single scalar, summing over elements if any of the inputs are vectorised.
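Given that, one way to get the original N×J matrix without the double loop is to write the elementwise log-pmf out by hand from the neg_binomial_2 parameterisation (mean `mu`, dispersion `r`), which vectorises over the N observations and leaves only the loop over J. A sketch, assuming `Y` is an `array[N] int`, `X` is a `vector[J]` of means, and that `lchoose` vectorises elementwise over containers in your Stan version:

``````
model {
  matrix[N, J] M;
  vector[N] yv = to_vector(Y);                 // counts as reals
  vector[N] binom = lchoose(yv + r - 1, yv);   // term constant across j
  for (j in 1:J) {
    real mu = X[j];
    // log pmf: lchoose(y + r - 1, y) + y*log(mu/(mu+r)) + r*log(r/(mu+r))
    M[, j] = binom + yv * log(mu / (mu + r)) + r * log(r / (mu + r));
  }
}
``````

This trades the N×J calls to `neg_binomial_2_lpmf` for J vectorised column fills; worth profiling against the loop before committing to it.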