I’m trying to write a multivariate-outcome, AR(1), missing-data time-series model. An additional challenge is that in my data only one component of the outcome is observed at any given time point. The User’s Guide section “Missing Data and Partially Known Parameters” offers some useful ideas, but I don’t know how to handle the only-one-component-at-a-time issue. The best I could come up with is the following (bivariate only), expecting it not to work:
data {
  int<lower=0> N;                 // number of observations
  real y_obs[N];                  // observed data, one component per time point
  int<lower=1, upper=2> obs[N];   // indicates which component is observed
}
parameters {
  real alpha;
  real beta;
  cov_matrix[2] SIG;
}
transformed parameters {
  real y_mis[N];
  vector[2] y[N];
  for (i in 1:N) {
    if (obs[i] == 1)
      y[i] = [y_obs[i], y_mis[i]]';
    else
      y[i] = [y_mis[i], y_obs[i]]';
  }
}
model {
  y[1] ~ normal(0, 100);
  for (n in 2:N)
    y[n] ~ multi_normal(rep_vector(alpha, 2)
                        + rep_vector(beta, 2) .* y[n-1], SIG);
}
And indeed it doesn’t work; I get “Exception: normal_lpdf: Random variable[2] is nan, but must not be nan!”, presumably because Stan reaches the first line of the model block and finds that one component of y[1] is nan — y_mis is declared in transformed parameters but never assigned.
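If I understand the missing-data pattern in the User’s Guide correctly, the missing values would need to be declared in the parameters block so that Stan samples them, rather than sitting unassigned in transformed parameters. A sketch of what I mean (same data block as above; I haven’t verified that this samples well):

```stan
parameters {
  real alpha;
  real beta;
  cov_matrix[2] SIG;
  real y_mis[N];   // missing component at each time point, sampled by Stan
}
transformed parameters {
  vector[2] y[N];  // observed and missing components interleaved
  for (i in 1:N) {
    if (obs[i] == 1)
      y[i] = [y_obs[i], y_mis[i]]';
    else
      y[i] = [y_mis[i], y_obs[i]]';
  }
}
```

With that change every y[n] has a value, so the model block above would at least evaluate without the nan exception.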
The manual section “Missing Multivariate Data” has the following code snippet:
for (n in 1:N) {
  if (y_observed[n, 1] && y_observed[n, 2])
    y[n] ~ multi_normal(mu, Sigma);
  else if (y_observed[n, 1])
    y[n, 1] ~ normal(mu[1], sqrt(Sigma[1, 1]));
  else if (y_observed[n, 2])
    y[n, 2] ~ normal(mu[2], sqrt(Sigma[2, 2]));
}
I’ve tried to think of a way to adapt that to my model, but since in my data y[n,1] and y[n,2] are never observed simultaneously, I’m guessing the multi_normal branch would never be reached; the two components would in effect be modeled independently (i.e., no correlation would be estimated).
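Concretely, the adaptation I have in mind would collapse to univariate marginals at every step, something like the following sketch (assuming y[n-1] were somehow fully filled in); note that the off-diagonal SIG[1,2] never appears, which is exactly what worries me:

```stan
for (n in 2:N) {
  if (obs[n] == 1)
    y[n, 1] ~ normal(alpha + beta * y[n-1, 1], sqrt(SIG[1, 1]));
  else
    y[n, 2] ~ normal(alpha + beta * y[n-1, 2], sqrt(SIG[2, 2]));
}
```

Since only the diagonal of SIG enters these statements, the correlation between the two components seems unidentified from this formulation alone.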
Perhaps this means it can’t be done, but my intuition says the time-series structure should make it possible, since the data are correlated across time.
Anyway, here’s some sample data:
temp.data.R (3.4 KB)
Any hints, comments, insights, etc. are appreciated.
Thanks