Difference between `_log` and `_lp` customized Stan functions

I have a customized log likelihood function I want to define in Stan. Since my data come from a recurrent event model, I can't use _lpdf or _lpmf to define the likelihood function: there are multiple time-to-event observations that do not have independent probability distributions. Nonetheless, I can write out the joint likelihood function for these multiple events.

I read through the Stan manual, and it seems to me that there are two ways I can achieve this.

real likelihood_log(...) {
  ...;
  return loglikelihood;
}

and

real likelihood_lp(...) {
  ...;
  return loglikelihood;
}

Then I can do

x_vector ~ likelihood(...);

or

target += likelihood_log(x_vector, ...);
target += likelihood_lp(x_vector, ...);

But I am wondering: what is the difference between defining the customized likelihood function with the _lp suffix and with the _log suffix?

Thank you!


I think the _lp postfix is preferred these days because _log is too ambiguous. Only a function whose name ends in _lp can increment target internally. If you just return a scalar and use that to increment target in the model block, then it does not matter what you name your function.
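
For concreteness, here is a minimal sketch of the return-a-scalar route. Everything in it is made up for illustration (the function and variable names, and the exponential-style placeholder standing in for your actual recurrent-event joint likelihood):

functions {
  // Plain helper that just returns the joint log likelihood. No special
  // suffix is needed because it neither touches target nor appears in a
  // ~ statement.
  real recurrent_joint_ll(vector t, real theta) {
    // Placeholder exponential-style term; the real joint likelihood of
    // the recurrent event times would go here.
    return num_elements(t) * log(theta) - theta * sum(t);
  }
}
data {
  int<lower=1> N;
  vector<lower=0>[N] event_times;
}
parameters {
  real<lower=0> theta;
}
model {
  theta ~ gamma(2, 1);
  target += recurrent_joint_ll(event_times, theta);
}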

Thank you, Ben! I don't see why _log is ambiguous; could you explain that a little bit?

In the Stan reference, it says:

Functions that include sampling statements or log probability increment statements must have a name that ends in _lp

I am a little bit worried about the term "log probability increment statement" in the manual, since in my recurrent event case this is not really a probability; it is actually a joint likelihood.

I think you are fine. The main ambiguity is that log is also the name of a link function, so we used to have (now deprecated) things like poisson_log_log to be the log PMF of a Poisson random variable parameterized in terms of the log expectation. So, I would use the _lp postfix now, but anything you can do as a prior you can also do as a likelihood, even if it is unnormalized and/or you are thinking about it as a function of the unknowns.
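
And if you would rather have the function bump the log density itself, the _lp suffix is what gives it access to target. Again a made-up sketch with the same placeholder term standing in for your joint likelihood:

functions {
  // The _lp suffix is required here because the function increments target
  // directly; _lp functions can only be called where target is in scope
  // (the transformed parameters and model blocks).
  void add_recurrent_joint_ll_lp(vector t, real theta) {
    // Same placeholder term as above, standing in for the real joint
    // likelihood of the recurrent event times.
    target += num_elements(t) * log(theta) - theta * sum(t);
  }
}
data {
  int<lower=1> N;
  vector<lower=0>[N] event_times;
}
parameters {
  real<lower=0> theta;
}
model {
  theta ~ gamma(2, 1);
  add_recurrent_joint_ll_lp(event_times, theta);
}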


Got it. Thank you Ben!