Adding noise in a model

Hello everybody.

I’m trying to introduce some noise in a choice model. A classical model would look like:

    EV[t] = ... ;  // some function of the data and parameters
    choice[t] ~ categorical_logit(EV[t]);

What I'm looking for would be something like:

    mean_EV[t]  = ... ;  // some function of the data and parameters
    sigma_EV[t] = ... ;  // some function of the data and parameters
    EV[t] ~ normal(mean_EV[t], sigma_EV[t]);
    choice[t] ~ categorical_logit(EV[t]);

This, however, does not work. I realise that the problem is that choice[t] (data) is observed, while EV[t] is not (it's a latent variable), and therefore that the above "model" is inappropriate. But I cannot figure out how I should code this. The only workaround I found was to generate noise in the transformed data block and then inject it into the model. But this is not a good solution, because the noise is then "frozen" for the whole sampling process (instead of changing at each sampling iteration, as it should).

Any help would be great.

It is not possible to add real noise (i.e. random numbers) to the model in the model or transformed parameters blocks. What you can do is add parameters that act as latent variables and estimate their values together with the other parameters.
This may be tricky and drastically increase the number of parameters, or generate other kinds of problems, but for some models it is a pretty standard approach (e.g. "Gaussian Process Classification", which is actually GPs with non-Gaussian observations).
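For concreteness, a sketch of that approach for the model in the question might look like the following (the `...` stand in for whatever functions of the data and parameters you already have; the `array` syntax assumes a recent Stan version):

```stan
data {
  int<lower=1> T;                          // number of trials
  int<lower=2> K;                          // number of choice options
  array[T] int<lower=1, upper=K> choice;   // observed choices
  // ... whatever predictors enter mean_EV and sigma_EV
}
parameters {
  array[T] vector[K] EV;                   // latent utilities, one vector per trial
  // ... the other model parameters
}
model {
  for (t in 1:T) {
    vector[K] mean_EV = ... ;              // some function of data and parameters
    real sigma_EV = ... ;                  // some function of data and parameters
    EV[t] ~ normal(mean_EV, sigma_EV);     // latent utilities vary around their mean
    choice[t] ~ categorical_logit(EV[t]);  // observed choice given latent utilities
  }
}
```

Because `EV` is declared in the parameters block, it is sampled jointly with everything else, so the "noise" is fresh at every iteration rather than frozen in transformed data.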

Thank you very much for the hint.

Why do you say that this doesn't work? If you declare EV[t] in the parameters block then this is a well-defined Stan program that allows the latent utilities to vary from trial to trial around mean_EV[t] instead of being fixed to mean_EV[t].

In practice there may be degeneracy/uncertainty issues from introducing that latent parameter, but there's nothing wrong with it in theory.
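If that degeneracy does bite in practice (e.g. funnel-shaped posteriors when sigma_EV is small), one standard mitigation is a non-centered parameterization of the latent utilities. A sketch, assuming mean_EV[t] and sigma_EV[t] are computed earlier in transformed parameters from your data and parameters:

```stan
parameters {
  array[T] vector[K] EV_raw;   // standardized innovations
  // ... the other model parameters
}
transformed parameters {
  array[T] vector[K] EV;
  for (t in 1:T)
    // shift and scale standard-normal draws instead of sampling EV directly
    EV[t] = mean_EV[t] + sigma_EV[t] * EV_raw[t];
}
model {
  for (t in 1:T) {
    EV_raw[t] ~ std_normal();              // implies EV[t] ~ normal(mean_EV[t], sigma_EV[t])
    choice[t] ~ categorical_logit(EV[t]);
  }
}
```

This is equivalent to the centered version but often gives HMC a much easier geometry to explore when the latent noise scale is weakly identified.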