Alternative to Absolute Value Function

I would like to include the absolute difference of two unobservable, parameterized covariates in a model, i.e. abs(x_i - x_j), to capture differences between individuals i and j. Since x_i and x_j are parameters, the warnings in the manual about non-differentiable functions of parameters apply.

The only other discussion I found was an old thread on absolute value functions in the previous Stan forum, where, in a different context, Michael Betancourt suggested using x * coth(alpha * x) for some large alpha, referencing this paper: http://arxiv.org/pdf/1212.4693.pdf. In my case, x can be exactly zero, where coth is undefined, so that expression cannot be evaluated, and I am thrown back to my original problem: abs() is not continuously differentiable at zero. Is there a reasonable alternative to abs(x_i - x_j) that I am missing, e.g. an established soft absolute value function? Thanks in advance!
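
For concreteness, a minimal sketch of what I mean (the prior and indices are placeholders, not my actual model):

```stan
data {
  int<lower=2> N;    // number of individuals
}
parameters {
  vector[N] x;       // unobservable, parameterized covariates
}
model {
  x ~ std_normal();  // placeholder prior, just to make the sketch run
  // The term I would like, e.g. for individuals 1 and 2:
  //   target += -abs(x[1] - x[2]);
  // abs() is not continuously differentiable at x[1] == x[2],
  // which is exactly what the manual warns about for parameters.
}
```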

– Christian

Without more context: squared difference
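
In Stan that means replacing the abs() term with square(); a minimal sketch, reusing the placeholder setup from the question:

```stan
data {
  int<lower=2> N;
}
parameters {
  vector[N] x;
}
model {
  x ~ std_normal();  // placeholder prior
  // square(x[1] - x[2]) is (x[1] - x[2])^2: smooth everywhere,
  // unlike abs(x[1] - x[2]).
  target += -square(x[1] - x[2]);
}
```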

That worked! I had considered squared differences before, but thought I might be missing an established alternative. Thanks a lot for getting back to me so swiftly!

Squared difference is the established alternative because of its relation to means.

For example, the posterior mean is the point estimate that minimizes expected squared error under the posterior. Or, more simply, the sample mean is the value that minimizes the sum of squared deviations from the sample.

Minimizing absolute error corresponds to medians, but absolute error is a lot harder to work with.
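
In symbols, for a sample x_1, …, x_n and a candidate summary c:

$$
\bar{x} = \arg\min_{c} \sum_{i=1}^{n} (x_i - c)^2,
\qquad
\operatorname{median}(x) \in \arg\min_{c} \sum_{i=1}^{n} \lvert x_i - c \rvert.
$$

The absolute-error objective is piecewise linear with kinks at the data points (the same non-differentiability as abs() itself), while the squared-error objective is smooth everywhere.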

Of course! Thanks for pointing that out. I remember learning that in an econometrics class, but somehow it didn't occur to me in this context. It's fun when these connections between bits of knowledge click into place.