Not really. The repo above contains a modified DDSolve in which I use the complex-step derivative approximation, which is equivalent to a forward-sensitivity calculation. That's why complex-number support is needed.
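(As an aside, here is a minimal, self-contained sketch of the complex-step idea in C++; the function f is an arbitrary stand-in, not anything from the repo. For analytic f, Im f(x + ih)/h approximates f'(x) with no subtractive cancellation, but it requires the whole computation to run on complex numbers, which is why the solver needs complex support.)

#include <cmath>
#include <complex>
#include <cstdio>

// Toy analytic function standing in for the real model code.
template <typename T>
T f(const T& x) { return x * x * std::exp(x); }

int main() {
    double x = 1.5;
    double h = 1e-20;                      // step can sit far below machine epsilon
    std::complex<double> xs(x, h);         // perturb along the imaginary axis
    double dfdx = std::imag(f(xs)) / h;    // Im f(x+ih)/h ~= f'(x), no cancellation
    std::printf("f'(%g) ~= %.15g\n", x, dfdx);
    return 0;
}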
Building statically compiled functions will be really hard for this, since it relies on staged compilation. That's why I was looking for a way to dynamically link against libjulia.
Ah, I see. I've done something like this with Python. The Stan model declares, but doesn't implement, a function:
functions {
  real userfunc2(vector theta);
}
then implements it in C++ (the model is compiled with stanc's --allow-undefined flag so the definition can come from a user header):
template <>
var userfunc2(const Eigen::Matrix<var, -1, 1>& theta, std::ostream* pstream__) {
  // Strip the autodiff type: the external code only sees plain doubles.
  const Eigen::Matrix<double, -1, 1> theta_val = value_of(theta);
  double fa;                                // function value filled by callpy::foo
  std::vector<double> ga(theta.rows());     // gradient filled by callpy::foo
  callpy::foo(theta.rows(), theta_val.data(), &fa, ga.data());
  // Repackage the inputs and register fa on the autodiff tape with gradient ga.
  std::vector<var> theta_std(theta.data(), theta.data() + theta.rows());
  return precomputed_gradients(fa, theta_std, ga);
}
where callpy::foo just calls into Python through the Python C API:
// `main` is the Python __main__ module and np_x a NumPy array holding the
// parameters (their setup is omitted in this excerpt).
double c_lp;
PyObject *lp = PyObject_GetAttrString(main, "lp");
PyObject *lp_val = PyObject_CallFunctionObjArgs(lp, np_x, NULL);
PyArray_ScalarAsCtype(lp_val, (void*) &c_lp);  // copy the 0-d result into c_lp
Py_DECREF(lp_val);
Py_DECREF(lp);
which is just calling a Python function
import autograd.numpy as np  # autograd's NumPy wrapper keeps lp differentiable

def lp(x):
    return np.sum(-x**2)
where the autograd package (autograd.grad(lp)) is used to compute the gradient that gets passed back through ga. The working example is over here:
I'm less familiar with the Julia C API, but something equivalent should be possible.
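(For the record, the bare-bones embedding pattern from the Julia manual looks roughly like this; it just evaluates an expression and unboxes the result, not the staged-compilation case discussed above, and the build flags depend on the Julia install:)

#include <stdio.h>
#include <julia.h>

// Build along the lines of:
//   cc embed.c -I$JULIA_DIR/include/julia -L$JULIA_DIR/lib -ljulia -o embed
int main(void) {
    jl_init();                                    // boot the embedded Julia runtime
    jl_value_t *ret = jl_eval_string("sqrt(2.0)");
    if (jl_exception_occurred()) {
        // a real wrapper would translate this into a C++/Stan error
        printf("Julia exception: %s\n", jl_typeof_str(jl_exception_occurred()));
    } else if (jl_typeis(ret, jl_float64_type)) {
        double v = jl_unbox_float64(ret);         // unbox the Float64 result
        printf("sqrt(2.0) = %g\n", v);
    }
    jl_atexit_hook(0);                            // run finalizers before exit
    return 0;
}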
Thanks. Yes, that route should be possible. I’m bookmarking this for a rainy day.