Some of the models I fit quite regularly have fairly straightforward analytical gradients but are very expensive to evaluate via autodiff. I know it’s in theory possible to hack together my own version of Stan with these analytical gradients defined on the back end, but I’d probably mess that up. Something like this would be extremely handy:

data {
  ...
}
parameters {
  real par1;
  real par2;
}
gradients {
  par1 = some expression;
  // gradient wrt par2 to be left to autodiff
}
model {
  ...
}


This isn’t much of a question, but would something like this be possible in theory?


Can you wrap it into its own function and then code the gradients by hand? You don’t need to recompile Stan to do this:
https://cran.r-project.org/web/packages/rstan/vignettes/external.html

I’ve done this for some parts of a model that have easy analytical gradients and are evaluated in the inner loop of the likelihood function.

Can you provide an example?
It would be nice to have an example prototype of how to extend Stan in C++.

Sure.

I needed to calculate \frac{x}{1 - x}, i.e. \exp(\mathrm{logit}(x)), in a hot path, but could not find it in the Stan math library. Its derivative is \frac{1}{(1-x)^2}.

So I added the following code to a separate .hpp file and included it as described in the tutorial.
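The derivative quoted above follows directly from the quotient rule:

```latex
\frac{d}{dx}\left(\frac{x}{1-x}\right)
  = \frac{(1-x)\cdot 1 - x\cdot(-1)}{(1-x)^2}
  = \frac{1}{(1-x)^2}
```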

inline double exp_logit(double u) {
  return u / (1 - u);
}

inline var exp_logit(const var& u) {
  const double uv = u.val();
  const double m_u = 1 - uv;
  const double m_u_sq = m_u * m_u;
  // attach the analytical derivative 1 / (1 - u)^2 to the autodiff tape
  return var(new precomp_v_vari(exp_logit(uv), u.vi_, 1 / m_u_sq));
}

struct exp_logit_fun {
  template <typename T>
  static inline T fun(const T& x) {
    return exp_logit(x);
  }
};

// vectorized version: applies exp_logit elementwise
template <typename T>
inline typename apply_scalar_unary<exp_logit_fun, T>::return_t
exp_logit(const T& x) {
  return apply_scalar_unary<exp_logit_fun, T>::apply(x);
}

// signature matching what the Stan compiler generates for a vector function
template <typename T0__>
Eigen::Matrix<typename boost::math::tools::promote_args<T0__>::type,
              Eigen::Dynamic, 1>
exp_logit(const Eigen::Matrix<T0__, Eigen::Dynamic, 1>& x,
          std::ostream* pstream__) {
  return exp_logit(x);
}

struct exp_logit_functor__ {
  template <typename T0__>
  Eigen::Matrix<typename boost::math::tools::promote_args<T0__>::type,
                Eigen::Dynamic, 1>
  operator()(const Eigen::Matrix<T0__, Eigen::Dynamic, 1>& x,
             std::ostream* pstream__) const {
    return exp_logit(x, pstream__);
  }
};


This is pretty much all the Stan C++ code I’ve ever written, so by no means take this to be best practices!


But while waiting for built-in support, what @aaronjg did doesn’t seem that complicated (assuming the process of including an additional .hpp is manageable for you).

I’d trust 0% of users to keep the gradients block properly updated as they evolve their model but I can see the attraction.

The user’s gradients could be cross-checked against Stan’s autodiff during an initial phase.


And for random draws after sampling stops. This is also in the plan.


We have to trust our users. There are so many ways to screw up a Stan program that we can’t really provide much protection.

I’d trust a fair bit more than 0% because some users are careful with software and test as they go.

And as others have noted, we wouldn’t do this without finite diff or autodiff tests.

What we’re likely to add is a way to define a function with gradients, not just define gradients w.r.t. the log density for parameters.

Then users will only define gradients w.r.t. constrained parameters, so we’ll need to chain the transforms and Jacobians for any constrained parameters.
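To sketch what that chaining involves (generic notation, not an actual Stan interface): if a constrained parameter \theta is obtained from an unconstrained y via \theta = g(y), the gradient Stan needs on the unconstrained scale is

```latex
\frac{\partial}{\partial y}\!\left[\log p\bigl(g(y)\bigr)
    + \log\left|\frac{dg}{dy}\right|\right]
  = \frac{\partial \log p}{\partial \theta}\,\frac{dg}{dy}
  + \frac{d}{dy}\log\left|\frac{dg}{dy}\right|
```

so a user-supplied \partial \log p / \partial \theta would be multiplied by the transform’s Jacobian and combined with the log-Jacobian correction. For example, with a lower bound of zero, g(y) = e^y, the unconstrained gradient is (\partial \log p / \partial \theta)\,e^y + 1.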

@Bob_Carpenter That sounds even better than a gradients block. Exciting!


Hi Bob,
I just found this from almost three years ago. Will this still happen? (soon? eventually?)

It’s something we talk about and this idea is reasonably popular but there’s no work on it as of yet :D.


I really look forward to being able to define something like a _grad function for my custom _lpdf/_lpmf functions, and not have to deal with C++.
I can’t code something like this myself, but I volunteer to help with the testing and documentation :)


I don’t think there is, but if you ever get curious about the status just ask again. Hopefully it’ll be a yes in the future sometime :P

Hey, is there a way to include external C++ code using the new PyStan 3 interface?

Currently, no.

Is there a plan to add support for this?

I don’t think Stan has any ‘official’ way to do this, so the answer is probably no.


Yeah, I don’t know any way to do this other than the rstan approach above or modifying the sources and rebuilding manually.

The research project I am working on will require custom gradients. Is there a guide or some instructions on how to modify the source and recompile for use with PyStan 3? Thanks in advance for any answers.