This relates to a previous thread, but it’s a separate question, so I wanted to post it separately.

The original model I have is below. In this model I treat X as exogenous data. However, it’s not really exogenous: the mean and covariance are exogenous, and I simulate X from a multivariate normal outside of Stan and pass that X in as data. This isn’t much of an issue when T is small, but for larger T Stan tends to get slower.

Naively, I think I want something like the second model, where X is a parameter drawn from a multivariate normal with the given mu and sigma. The problem with this is that X is not identified.

Is the original model already effectively marginalizing out X? Do I have any other options to remove X when T is large?

Original Model

```
data {
  int<lower=0> N;
  int<lower=0> T;
  matrix[T, N] X;
}
transformed data {
  vector[T] ones_T = rep_vector(1, T);
}
parameters {
  simplex[N] w;
  real<lower=0> sigma;
}
model {
  ones_T ~ normal(X * w, sigma);
}
```

The naive model I would ideally want

```
data {
  int<lower=0> N;
  vector[N] mu_X;
  cov_matrix[N] sigma_X;
}
parameters {
  vector[N] X;
  simplex[N] w;
  real<lower=0> sigma;
}
model {
  X ~ multi_normal(mu_X, sigma_X);
  // X is a vector here, so X * w is invalid; use dot_product instead
  1 ~ normal(dot_product(X, w), sigma);
}
```
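
If I have the math right, X can be integrated out analytically here: with X ~ multi_normal(mu_X, sigma_X), the linear combination dot_product(X, w) is normal with mean dot_product(mu_X, w) and variance quad_form(sigma_X, w) (i.e. w' * sigma_X * w), so the observation noise just adds sigma^2 to that variance. An X-free sketch of the single-observation case (untested, and only valid if the likelihood really is normal in X):

```
data {
  int<lower=0> N;
  vector[N] mu_X;
  cov_matrix[N] sigma_X;
}
parameters {
  simplex[N] w;
  real<lower=0> sigma;
}
model {
  // marginal of dot_product(X, w) under X ~ multi_normal(mu_X, sigma_X),
  // with the normal observation noise folded into the variance
  1 ~ normal(dot_product(mu_X, w),
             sqrt(quad_form(sigma_X, w) + square(sigma)));
}
```

This avoids sampling X entirely, so the cost no longer grows with T when the rows of X are i.i.d. draws from the same multivariate normal.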