Hey,

I’m trying to figure out how to test the refactored operands_and_partials for mixed mode and I’m not sure how to get the gradients flowing all the way through both forward and reverse modes. Does anyone know of an example I can look at? What’s the paradigm here?

Thanks,

Sean

Talked about this with Bob. The basic paradigm behind forward mode is that we pick one of the `fvar`s to differentiate with respect to and set that one's `.d_ = 1`. Then the dual-number arithmetic works things out to give the appropriate derivative when you do arithmetic normally on that `fvar`. Once you've done that math, you can take the second derivative with `f.d_.grad()`. (Need to figure out whether you just pass `f.d_` in a `std::vector` as the first arg or what.)

See Fvars for some discussion of how forward mode works and some references. Sounds like Bob was able to help, but I’m also happy to go through some of the more fun structure of higher-order autodiff when I get back.

For what it’s worth, I dug around in the scalar function tests to figure out what was going on. It was test/unit/math/mix/scal/fun/hypot_test.cpp stuff that convinced me I didn’t know what was actually happening.

Those tests are small enough though that you can scratch out the math by hand without getting too stressed over things being vectors.