Constructing precomputed_gradients with Eigen objects?

I’m working on interfacing Stan with a set of Fortran subroutines that compute a function and its gradient, and I’m a little stuck on how to properly return the gradient.

So far, I’ve been trying as hard as possible to avoid copying the vector and matrix that the Fortran subroutines populate with the function and gradient values (respectively), to avoid overhead from the copy operations. The Eigen::VectorXd and Eigen::MatrixXd objects have been nice for this, since it’s pretty easy to use Eigen::Map to “wrap” them around existing memory. Using this approach, I’ve been able to successfully use the Fortran subroutines from Stan to compute and extract the function value.

From the function prototype, though, it looks like precomputed_gradients wants std::vectors for its arguments, and as far as I can tell, it isn’t possible to construct those from already-allocated memory without copying the values into the new container.

Is there a way that I can create a precomputed_gradients object directly from an Eigen::VectorXd and an Eigen::MatrixXd?

You’ll need to bite the bullet and go ahead and copy.

Don’t stress about it. It probably took way more time to compute those gradients than the copy is gonna take.

Good luck with the rest of your modeling! :P

You can create a std::vector without having to copy the elements. See the example here:

What is the reason for using std::vector and not std::array? The interface, yes, but why not use the faster array type? Does this imply that Stan is able to use variable-length parameters to functions under the hood?

Yeah, the issue is that Stan doesn’t actually know the lengths of things at compile time. I asked @Bob_Carpenter something similar and he pointed out that:

data {
  int N;
  real y[N];
}

is dynamically sized, so it messes up the compile-time sizes.