Missing data in a 2PL (IRT) model

It seems that we have resolved our issue. The code that actually works is the following:

functions {
  real rsm(int y, real theta, real beta, vector kappa) {
    vector[rows(kappa) + 1] unsummed;
    vector[rows(kappa) + 1] probs;
    unsummed = append_row(rep_vector(0, 1), theta - beta - kappa);
    probs = softmax(cumulative_sum(unsummed));
    return categorical_lpmf(y + 1 | probs);
  }
  real rsm_rng(real theta, real beta, vector kappa) {
    vector[rows(kappa) + 1] unsummed;
    vector[rows(kappa) + 1] probs;
    unsummed = append_row(rep_vector(0, 1), theta - beta - kappa);
    probs = softmax(cumulative_sum(unsummed));
    return categorical_rng(probs);
  }
}
data {
  int<lower=1> I;               // # items
  int<lower=1> J;               // # persons
  int<lower=1> N;               // # observations
  int<lower=1> N_mis;           // # missing observations
  int<lower=1, upper=I> ii[N];  // item for n
  int<lower=1, upper=J> jj[N];  // person for n
  int<lower=0, upper=1> y[N];   // correctness for n
}
transformed data {
  int m;                        // # steps
  m = max(y);
}
parameters {
  vector[I] beta;
  vector[m-1] kappa_free;
  vector[J] theta;
  real<lower=0> sigma;
}
transformed parameters {
  vector[m] kappa;
  kappa[1:(m-1)] = kappa_free;
  kappa[m] = -1*sum(kappa_free);
}
model {
  beta ~ normal(0, 3);
  target += normal_lpdf(kappa | 0, 3);
  theta ~ normal(0, sigma);
  sigma ~ exponential(.1);
  for (n in 1:N)
    target += rsm(y[n], theta[jj[n]], beta[ii[n]], kappa);
}
generated quantities {
  int y_mis[N_mis];             // imputed responses (categorical draws are integers)
  for (n in 1:N_mis)
    y_mis[n] = rsm_rng(theta[jj[n]], beta[ii[n]], kappa);
}
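As a sanity check outside Stan, the category probabilities that rsm builds via append_row, cumulative_sum, and softmax can be reproduced numerically. This is a sketch in NumPy; the function and variable names here are my own, not part of the Stan program:

```python
import numpy as np

def rsm_probs(theta, beta, kappa):
    # Category 0 gets an unsummed term of 0 (rep_vector(0, 1));
    # category k gets theta - beta - kappa[k-1]
    unsummed = np.concatenate(([0.0], theta - beta - kappa))
    z = np.cumsum(unsummed)        # cumulative_sum
    e = np.exp(z - z.max())        # softmax, shifted for numerical stability
    return e / e.sum()

probs = rsm_probs(theta=0.5, beta=-0.2, kappa=np.array([-1.0, 1.0]))
```

With two thresholds in kappa this yields three category probabilities that are positive and sum to one, matching what categorical_lpmf expects.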

The problem was that we were returning the wrong argument from rsm_rng: the probability vector probs computed inside the function should have been passed to the generator, i.e. return categorical_rng(probs). Also, rsm_rng is now defined as rsm_rng(real theta, real beta, vector kappa).

All in all, we had to figure out how the random number generating function would work in this context.
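Conceptually, the generator just computes the same category probabilities as the lpmf and takes a single categorical draw. A minimal NumPy analogue (names are mine; note that Stan's categorical_rng returns a 1-based category index, while this sketch returns 0-based indices):

```python
import numpy as np

rng = np.random.default_rng(0)

def rsm_rng(theta, beta, kappa):
    # Same probabilities as the lpmf: append a zero, cumulative-sum, softmax
    unsummed = np.concatenate(([0.0], theta - beta - kappa))
    z = np.cumsum(unsummed)
    probs = np.exp(z - z.max())
    probs /= probs.sum()
    # One categorical draw over the rows(kappa) + 1 categories
    return rng.choice(len(probs), p=probs)

draws = [rsm_rng(0.0, 0.0, np.array([-0.5, 0.5])) for _ in range(500)]
```

Each draw lands in {0, 1, 2} here, corresponding to the three response categories implied by two thresholds.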

Many thanks for your comments. This issue has now been resolved.
