I’m working on interfacing Stan with a set of Fortran subroutines that compute a function and its gradient, and I’m a little stuck on how to properly return the gradient.
So far, I’ve been trying as hard as possible to avoid copying the vector and matrix that the Fortran subroutines populate with the function and gradient values (respectively), to avoid the overhead of the copy operations. The Eigen::VectorXd and Eigen::MatrixXd objects have been nice for this, since it’s pretty easy to use Eigen::Map to “wrap” them around existing memory. Using this approach, I’ve been able to successfully call the Fortran subroutines from Stan and extract the function value.
From the function prototype, though, it looks like precomputed_gradients wants std::vectors for its arguments, and as far as I can tell, a std::vector can’t be constructed from already-allocated memory without copying the values into the new container.
Is there a way that I can create a precomputed_gradients object directly from an Eigen::VectorXd and an Eigen::MatrixXd?