Reduce Sum with Multivariate Likelihood not updating variables

Continuing the discussion from Using _lupmf for multivariate likelihood in reduce_sum:

Hello Stanimals! When attempting to use reduce_sum with a multivariate likelihood, it appears that if I pass in a dummy variable as the first argument, the parameters are never updated.

I will post a full example below (in the model block, I have commented out the loop that the model should run and that the reduce_sum call is attempting to parallelize).

The full run with some data takes a few minutes, but when I compile the reduce_sum version and run it, it finishes almost instantly (and, as I said, weights never seems to update).

Can anyone see anything egregiously wrong with what I am doing?

Full model:

functions {

  real partial_sum_likelihood_lpmf(
      data int [] dummy_slice,
      int start, 
      int end,
      vector omega,
      data int [,] selected_indicies,
      data int [] num_selected_events_this_block,
      data int [,] available_sku_indicies_this_block,
      data int [] number_available_skus_this_block
    ) 
  {
    real ret_val = 0.;
    for (n in start:end) {
        ret_val += categorical_logit_lupmf(
          selected_indicies[n,1:num_selected_events_this_block[n]] | 
          omega[available_sku_indicies_this_block[n,1:number_available_skus_this_block[n]]]
        );
    }
    return ret_val;
  }
  
}

data {
  int<lower=1> num_blocks;
  int<lower=1> num_skus;
  int<lower=1> max_number_selections_per_block;
  
  int<lower=0, upper=num_skus> available_sku_indicies_this_block[num_blocks, num_skus]; // padded with zeros
  int<lower=1> number_available_skus_this_block[num_blocks];
  int<lower=1> total_selections_this_block[num_blocks];
  int<lower=0,upper=num_skus> selected_indicies[num_blocks, max_number_selections_per_block]; //padded with zeros
}

transformed data {
  int<lower=1> dummy_slice[0];
  int grainsize=50;
}

parameters {
  vector[num_skus] log_weights;
}

transformed parameters {
  simplex[num_skus] weights = softmax(log_weights);
}

model {
  log_weights ~ std_normal();

  target += reduce_sum(
      partial_sum_likelihood_lpmf,
      dummy_slice,
      grainsize,
      weights,
      selected_indicies,
      total_selections_this_block,
      available_sku_indicies_this_block,
      number_available_skus_this_block
  );
  

  /*for (n in 1:num_blocks) {
      target += categorical_logit_lupmf(
          selected_indicies[n,1:total_selections_this_block[n]] | 
          weights[available_sku_indicies_this_block[n,1:number_available_skus_this_block[n]]]
      );
  }*/

}

dummy_slice has length 0, and thus reduce_sum will not sum over anything. You need the length of dummy_slice to match the range of interest (one element per term you want summed) to make this work.
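For example, a minimal sketch (assuming the rest of the program stays as posted) that gives dummy_slice one entry per block:

transformed data {
  int dummy_slice[num_blocks];
  int grainsize = 50;
  // The values are never read inside the partial-sum function;
  // only the length matters, so reduce_sum has num_blocks elements to slice.
  for (n in 1:num_blocks) {
    dummy_slice[n] = n;
  }
}

reduce_sum partitions 1:size(first argument) into the start:end ranges it passes to the partial-sum function, so with a zero-length first argument the loop over start:end never executes.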


So dummy_slice needs to have size num_blocks? Okay! (This was not at all apparent, by the way!)

Thanks, I will give it a shot!
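A possible alternative (my own sketch, not from the thread): instead of a zero-length dummy, slice one of the per-block data arrays directly, for example total_selections_this_block, so the first argument automatically has length num_blocks:

functions {

  real partial_sum_likelihood_lpmf(
      data int [] selections_slice,   // slice of total_selections_this_block
      int start,
      int end,
      vector omega,
      data int [,] selected_indicies,
      data int [,] available_sku_indicies_this_block,
      data int [] number_available_skus_this_block
    )
  {
    real ret_val = 0.;
    for (n in start:end) {
      // the sliced argument is indexed relative to the slice (n - start + 1);
      // the shared arguments keep the absolute block index n
      ret_val += categorical_logit_lupmf(
        selected_indicies[n, 1:selections_slice[n - start + 1]] |
        omega[available_sku_indicies_this_block[n, 1:number_available_skus_this_block[n]]]
      );
    }
    return ret_val;
  }

}

and in the model block:

target += reduce_sum(
    partial_sum_likelihood_lpmf,
    total_selections_this_block,
    grainsize,
    weights,
    selected_indicies,
    available_sku_indicies_this_block,
    number_available_skus_this_block
);

This drops the dummy entirely and removes the risk of a mismatched length.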