Hi there,
Could you please help me better understand Section 11.1 of the user guide? It says: “Without the Jacobian adjustment, optimization returns the (regularized) maximum likelihood estimate (MLE), argmax_{θ} p(y | θ), the value which maximizes the likelihood of the data given the parameters (including prior terms).” However, there is no prior term in the expression argmax_{θ} p(y | θ).
From what I understand, the MLE assumes a uniform prior, which is why 11.1 also says: “Applying the Jacobian adjustment produces the maximum a posteriori estimate (MAP), the maximum value of the posterior distribution, argmax_{θ} p(y | θ) p(θ).” When p(θ) is a uniform distribution, the estimate is the MLE. So what does “(including prior terms)” mean? Does it mean that only the bounds of the prior are considered, so that the MLE is regularized (i.e., constrained within the bounds specified for the prior)? In other words, even if I specify a bounded Gaussian or lognormal prior in the model, does this MLE still treat the prior as uniform, but with the same constraints/bounds as the specified prior (e.g., the bounds of the Gaussian prior, or > 0 for the lognormal prior)? I am not sure this understanding is correct. As explained in @avehtari’s case study here, when the Jacobian adjustment is FALSE, the optimization returns the MAP in the unconstrained space, so the bounds of the prior are not considered: the non-linear transformation required for the bounds is not adjusted for.
So if the Jacobian adjustment is applied, the optimization returns the mode of the blue PDF (in the last figure of @avehtari’s case study here), and if not, the mode of the red PDF? Since the red PDF is constrained to the prior’s support ([0, 1]), is it a regularized MLE on [0, 1]? And does the red PDF treat the prior as uniform on [0, 1], or as the prior specified in the model, i.e., beta(1, 1)?
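To make my question concrete, here is a small Python sketch (not Stan; the data, the beta(2, 2) prior, and all names are my own toy choices) of how I currently understand the two objectives. Both optimize over the unconstrained η = logit(θ); the only difference is whether the log Jacobian term log |dθ/dη| = log θ(1 − θ) is added:

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.special import expit  # inverse logit: theta = 1 / (1 + exp(-eta))

# Toy setup (my own choices): k successes out of n Bernoulli trials,
# beta(a, b) prior on theta, so the posterior is beta(a + k, b + n - k).
k, n, a, b = 3, 10, 2, 2

def log_posterior(theta):
    # log p(y | theta) + log p(theta), up to a constant.
    return (a + k - 1) * np.log(theta) + (b + n - k - 1) * np.log(1 - theta)

def neg_objective(eta, jacobian):
    theta = expit(eta)
    lp = log_posterior(theta)
    if jacobian:
        # log |d theta / d eta| for the logit transform
        lp += np.log(theta) + np.log(1 - theta)
    return -lp

mode_no_jac = expit(minimize_scalar(neg_objective, args=(False,)).x)
mode_jac = expit(minimize_scalar(neg_objective, args=(True,)).x)

# Without the Jacobian term: the constrained-space posterior mode,
# (a + k - 1) / (a + b + n - 2) = 4/12 here.
print(mode_no_jac)  # ~ 0.3333
# With the Jacobian term: the mode of the unconstrained-space density,
# which maps back to 5/14 here.
print(mode_jac)     # ~ 0.3571
```

If this sketch matches what the two settings of the Jacobian flag compute, then the gap between these two modes is exactly the difference I am asking about.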
I often see “(Note: in optimization, the default value is 0, for historical reasons.)” in the Stan/cmdstanr manual. What is the historical reason? Are there any recommended materials for reading on this topic?
Thanks a lot.