Extracting likelihood

When learning Bayesian statistics, I feel like it's common to see some version of the plot that shows a prior density, a likelihood density, and a posterior density somewhere in between (e.g. https://goo.gl/images/Ksg7uR).

I was trying to build a version of this graph for one of the parameters in my model, using RStan. Getting the prior curve is quite easy (since the prior is just a normal with a specific mean/sd, I can plot it with stat_function), and the posterior is pretty easy (just extract the sample and plot it with stat_density) – but is there a way to get the likelihood density? Is there something I'd need to put into generated quantities, or something like that?
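For reference, the prior/posterior part I have in mind looks roughly like this – a minimal sketch where `fit` and the parameter name `intercept` are just placeholders for my actual model:

```r
# Rough sketch with placeholder names: a normal(0, 1) prior on the intercept
# and a fitted rstan model `fit` containing a parameter called "intercept".
library(rstan)
library(ggplot2)

draws <- as.data.frame(rstan::extract(fit, pars = "intercept"))

ggplot(draws, aes(x = intercept)) +
  # posterior: kernel density of the MCMC draws
  stat_density(geom = "line", colour = "blue") +
  # prior: the analytic normal(0, 1) density
  stat_function(fun = dnorm, args = list(mean = 0, sd = 1), colour = "red") +
  labs(x = "intercept", y = "density")
```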

My first attempt was to just run a new model with an extremely uninformative prior (like normal(0, 100)) and assume that the posterior of that model represents the likelihood of the data, but I'm getting a nonsensical result, where the posterior is higher than either the prior or the likelihood.

Is there a way to do this? Thanks!

Yes; it is the same as what you would do if you were using LOOIC, but summed over the observations. However, in a multidimensional model, your plot against the lambda margin is not necessarily going to line up the way it does in a unidimensional model.
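A rough sketch of what I mean, assuming your Stan program already computes a pointwise `log_lik` vector in generated quantities (as you would for loo/LOOIC) and that the parameter of interest is called `intercept` – both names are placeholders:

```r
# Sketch with placeholder names: `fit` is an rstan fit whose generated
# quantities block fills an N-vector log_lik (one lpdf term per observation).
library(rstan)

draws    <- rstan::extract(fit)      # list of posterior draws
total_ll <- rowSums(draws$log_lik)   # sum over observations -> one value per draw

# Relate the summed log likelihood to one parameter's draws:
plot(draws$intercept, total_ll,
     xlab = "intercept (posterior draw)", ylab = "total log likelihood")
```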


Thanks!

So just to understand – I would make an accumulating variable with something like log_lik = log_lik + normal_lpdf(Y[i] | mu[i], sigma)…and then try to find a relationship between the sample intercept and sample log_lik?

Something like that, but I don’t think it would be that illuminating.

Yeah – I tried it, and it is not illuminating :P

Thanks for your help!

I am sorry to bump such an old thread, but I would like to visualise the likelihood as well. I don’t understand the solution offered before:

I would make an accumulating variable with something like log_lik = log_lik + normal_lpdf(Y[i] | mu[i], sigma) …and then try to find a relationship between the sample intercept and sample log_lik?

I am also not sure why it wouldn’t be illuminating.

Could you help me? I use brms to fit a binomial model with a grouping variable.

brms has a log_lik function that returns the pointwise log likelihood contribution for each data point (for each posterior draw). You could take the row sums of that matrix to get the total log_lik (one per posterior draw).
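A minimal sketch, assuming `fit` is your brmsfit object (the parameter name `b_Intercept` below is just an example):

```r
# Minimal sketch, assuming `fit` is a fitted brmsfit object; the parameter
# name b_Intercept is just an example of a parameter you might plot against.
library(brms)

ll    <- log_lik(fit)     # matrix: posterior draws x observations
total <- rowSums(ll)      # total log likelihood, one value per draw

draws <- as.data.frame(fit)   # posterior draws of the model parameters
plot(draws$b_Intercept, total,
     xlab = "b_Intercept", ylab = "total log likelihood")
```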