# Extracting likelihood

#1

When learning Bayesian statistics, I feel like it's common to see some version of the plot that shows a prior density, a density of the likelihood, and a posterior density somewhere in between (e.g. https://goo.gl/images/Ksg7uR).

I was trying to build a version of this graph for one of the parameters in my model, using RStan. Getting the prior curve is quite easy (since the prior is just normal with a specific mean/sd, I can plot it with `stat_function`), and the posterior is pretty easy (just extract the samples and plot them with `stat_density`), but is there a way to get the likelihood density? Is there something I'd need to put into `generated quantities`, or something like that?
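For concreteness, here is a minimal sketch (in Python rather than R/Stan, purely as an illustration) of what those three curves look like for a conjugate normal model with known sigma; the data and all prior values are hypothetical:

```python
import numpy as np

# Hypothetical data and prior (all values made up for illustration)
y = np.array([1.8, 2.2, 2.5, 1.9, 2.6])  # observed data
sigma = 1.0                               # known sd of the data
mu0, tau0 = 0.0, 1.0                      # prior mean/sd for mu

n, ybar = len(y), y.mean()

# As a function of mu, the likelihood is proportional to
# N(ybar, sigma^2 / n); normalizing over mu lets it plot as a density.
lik_mean, lik_sd = ybar, sigma / np.sqrt(n)

# Conjugate posterior: precision-weighted combination of prior and likelihood
post_prec = 1 / tau0**2 + n / sigma**2
post_sd = np.sqrt(1 / post_prec)
post_mean = (mu0 / tau0**2 + n * ybar / sigma**2) / post_prec

def normal_pdf(x, m, s):
    return np.exp(-0.5 * ((x - m) / s) ** 2) / (s * np.sqrt(2 * np.pi))

grid = np.linspace(-2, 4, 401)
prior_curve = normal_pdf(grid, mu0, tau0)
lik_curve = normal_pdf(grid, lik_mean, lik_sd)
post_curve = normal_pdf(grid, post_mean, post_sd)
# post_mean lies between mu0 and ybar, so the posterior curve sits
# "between" the prior and the (normalized) likelihood curve
```

Plotting `prior_curve`, `lik_curve`, and `post_curve` against `grid` reproduces the classic three-curve picture for this toy model.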

My first attempt was to just run a new model with an extremely uninformative prior (like `normal(0, 100)`) and assume that its posterior represents the likelihood of the data, but I'm getting a nonsensical result, where the posterior is higher than either the prior or the likelihood.

Is there a way to do this? Thanks!

#2

Yes; it is the same thing you would do if you were using LOOIC, but summed over the observations. However, in a multidimensional model, your plot against the lambda margin is not necessarily going to line up the way it does in a unidimensional model.

#3

Thanks!

So just to understand: I would make an accumulating variable with something like `log_lik = log_lik + normal_lpdf(Y[i] | mu[i], sigma)`, and then try to find a relationship between the sampled intercept and the sampled `log_lik`?
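In Stan that accumulator would live in `generated quantities`; a rough Python analogue of what it computes per posterior draw, using hypothetical data and stand-in draws (both made up for illustration), would be:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data and posterior draws (made up for illustration)
y = np.array([1.8, 2.2, 2.5, 1.9, 2.6])
sigma = 1.0
# stand-in for samples extracted from a fitted intercept-only model
intercept_draws = rng.normal(2.2, 0.3, size=1000)

def normal_lpdf(x, mu, sd):
    return -0.5 * np.log(2 * np.pi * sd**2) - 0.5 * ((x - mu) / sd) ** 2

# For each draw, accumulate log_lik over the observations,
# mirroring log_lik = log_lik + normal_lpdf(Y[i] | mu[i], sigma)
log_lik = np.array([normal_lpdf(y, b, sigma).sum() for b in intercept_draws])

# One could then plot log_lik (or exp(log_lik - log_lik.max()))
# against intercept_draws to look for a relationship
```

For this toy model the total log-likelihood is maximized at the draw nearest the sample mean, which is the "relationship" the plot would show.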

#4

Something like that, but I don’t think it would be that illuminating.

#5

Yeah – I tried it; it is not illuminating :P