Multiple priors for the same parameter

Hello:

I have multiple sources available to construct informative priors for a single parameter. I was wondering whether it is logically correct to directly specify multiple priors in a Stan model.

I tried specifying multiple priors in Stan and got no errors during fitting. To check, I ran the model three times: with prior 1 alone, with prior 2 alone, and with both priors together. The results from the combined model fall between those from the two individual priors. Does Stan implicitly form a weighted mixture of the multiple priors on the same parameter, say 1:1, here?

To be more specific, I have multiple priors for a parameter of interest (e.g., a mean) from different information sources, and I implemented this in Stan as:

model {
  mu ~ normal(mu0, sigma0);   // prior from information source 1
  mu ~ uniform(a, b);         // prior from information source 2
  y ~ normal(mu, sigma);      // likelihood
}
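
If I understand Stan's sampling statements correctly, each ~ statement just adds a term to the target log density, so (up to constant terms) the model block above should be equivalent to:

model {
  target += normal_lpdf(mu | mu0, sigma0);  // log prior from source 1
  target += uniform_lpdf(mu | a, b);        // log prior from source 2
  target += normal_lpdf(y | mu, sigma);     // log likelihood
}

That is, the two prior log densities add, so the effective prior would be proportional to the product normal(mu | mu0, sigma0) * uniform(mu | a, b), not a 1:1 mixture. Is that reading right?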

Could anyone tell me whether it is correct to do it like this?

Many thanks.

Hi! :)

I don’t know if what you are doing is “correct” - I would need more context and domain expertise for that. But you might want to check out this blog post.


Hi Max,

Thank you very much for your prompt reply. I have edited my post to give more specifics.

It appears that the post by Gelman does, to some extent, apply multiple priors to the same parameters, namely an extra prior on a combination of the parameters of interest (see my sketch below). This seems relevant to my problem, but I am not sure whether what I am doing here, putting multiple priors on the exact same parameter rather than on individual parameters and a combination of them, is appropriate.
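
For concreteness, my reading of that approach (with made-up numbers and parameter names) is something like:

model {
  theta1 ~ normal(0, 1);             // individual prior on theta1
  theta2 ~ normal(0, 1);             // individual prior on theta2
  theta1 + theta2 ~ normal(0, 0.5);  // extra prior on a linear combination
                                     // (linear, so no Jacobian adjustment)
}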

Thanks again!

Hi, I would wait for others to chime in, but logarithmic pooling is a good way to combine multiple priors into one.
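
As a rough sketch for your example (not a definitive recipe): the log-pooled prior is p(mu) proportional to normal(mu | mu0, sigma0)^w * uniform(mu | a, b)^(1 - w) for a weight 0 <= w <= 1, and its normalizing constant depends only on w, so for a fixed w you could write the model block as:

model {
  // logarithmic pooling: weighted sum of the component log densities
  // (w is a fixed weight passed in as data; the name is just illustrative)
  target += w * normal_lpdf(mu | mu0, sigma0);
  target += (1 - w) * uniform_lpdf(mu | a, b);
  y ~ normal(mu, sigma);
}

Note that your original code corresponds to weights (1, 1), i.e., a straight product of the two densities; constraining the weights to sum to 1 is what makes this a pooled prior rather than double-counting the information.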

It seems the main papers on logarithmic pooling are paywalled, so if you're interested, drop me a line via private message and I can share the material.