Hello, I have a theoretical question in Bayesian analysis I was hoping to get help with.

To the best of my understanding, the prior has a weight equivalent to that of a single sample in determining the posterior in the simple Bayes formula. That is, if the likelihood is a product of N i.i.d. samples, the numerator of Bayes' formula is a product of N+1 factors: the prior and N additional probability values.

My question is whether I am correct, and moreover whether this holds more generally in Bayesian statistics, i.e., whether the prior is as important as a single sample in determining the posterior.

Thank you!

Amir

It is true that for many models, the likelihood factorizes as \mathcal{L} = \prod l_i, where l_i is the pointwise likelihood.

But that doesn't lead to a natural notion of "importance". For example, I can write that the posterior is the (normalized) product of the prior and the likelihood \mathcal{L}, yielding an expression for the (unnormalized) posterior with just two factors, one of which is the prior. So now by your reasoning it looks like the prior is as important as all of the pointwise likelihoods put together.

The flaw in this reasoning is that your intuitive notion of "importance" probably depends not only on how many terms get multiplied together, but also on how informative each of those terms is. So for example if the prior is vague and the likelihood is very informative, then the likelihood dominates. If the prior is very strong and the likelihood is very diffuse, then the prior dominates.

Typically, the pointwise likelihoods are very diffuse, often much more diffuse than the prior. In such a setting, it would be natural to say that the prior is more important than any single sample. It is also frequently the case that the prior is more diffuse than the product \mathcal{L} of the pointwise likelihoods, and so it would be natural to say that the likelihood is more important than the prior.
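This is easy to see concretely in a conjugate model. Here is a minimal sketch (my own illustrative example, not from the thread) using a normal likelihood with known noise, where the posterior mean is a precision-weighted average of the prior mean and the sample mean, so "importance" is literally measured by precision:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup: normal data with known noise sd, normal prior on the mean.
true_mu, noise_sd = 3.0, 1.0
data = rng.normal(true_mu, noise_sd, size=50)

def posterior(prior_mu, prior_sd, data, noise_sd):
    """Conjugate normal-normal update: precisions add, and the posterior
    mean is the precision-weighted average of prior mean and sample mean."""
    prior_prec = 1.0 / prior_sd**2
    data_prec = len(data) / noise_sd**2
    post_prec = prior_prec + data_prec
    post_mu = (prior_prec * prior_mu + data_prec * data.mean()) / post_prec
    return post_mu, post_prec**-0.5

# Vague prior: the likelihood dominates, posterior mean ~ sample mean.
print(posterior(0.0, 100.0, data, noise_sd))
# Very strong prior centered away from the truth: the prior dominates.
print(posterior(0.0, 0.05, data, noise_sd))
```

With prior_sd = 100 the prior precision is negligible next to the data precision of N/noise_sd^2 = 50, so the posterior mean sits essentially at the sample mean; with prior_sd = 0.05 the prior precision (400) outweighs the data and pulls the posterior mean toward 0.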
