Adjoint sensitivity method for stochastic differential equations

This might be useful:

Scalable Gradients for Stochastic Differential Equations

The adjoint sensitivity method scalably computes gradients of solutions to ordinary differential equations. We generalize this method to stochastic differential equations, allowing time-efficient and constant-memory computation of gradients with high-order adaptive solvers. Specifically, we derive a stochastic differential equation whose solution is the gradient, a memory-efficient algorithm for caching noise, and conditions under which numerical solutions converge. In addition, we combine our method with gradient-based stochastic variational inference for latent stochastic differential equations. We use our method to fit stochastic dynamics defined by neural networks, achieving competitive performance on a 50-dimensional motion capture dataset.
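For what it's worth, the paper's authors released a companion library, torchsde, which implements this stochastic adjoint on top of PyTorch. A minimal sketch of how it is used (the toy drift/diffusion networks, step size, and loss here are placeholders I made up, not anything from the paper):

```python
import torch
import torchsde

class ToySDE(torch.nn.Module):
    noise_type = "diagonal"  # g returns one diffusion term per state
    sde_type = "ito"

    def __init__(self, dim):
        super().__init__()
        self.drift = torch.nn.Linear(dim, dim)
        self.diffusion = torch.nn.Linear(dim, dim)

    def f(self, t, y):
        return self.drift(y)      # drift term, shape (batch, dim)

    def g(self, t, y):
        return self.diffusion(y)  # diagonal diffusion term, shape (batch, dim)

sde = ToySDE(dim=3)
y0 = torch.full((16, 3), 0.1)    # batch of 16 initial states
ts = torch.linspace(0.0, 1.0, 20)

# sdeint_adjoint backpropagates by solving a backward SDE driven by the
# same Brownian sample path, so memory stays constant in solver steps.
ys = torchsde.sdeint_adjoint(sde, y0, ts, method="euler", dt=0.05)
loss = ys[-1].pow(2).mean()
loss.backward()                  # fills .grad for all SDE parameters
```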


We should start with adjoint sensitivity solvers for ODEs! That's why I have a strong interest in AMICI (https://icb-dcm.github.io/AMICI/), which supports:

  • events
  • adjoint solves
  • sparsity in the ODE RHS

In short: this tool scales well in both the number of states and the number of parameters. A rough sketch of enabling adjoint sensitivities in AMICI's Python interface follows below.
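As a sketch only: this assumes a model has already been imported from SBML, the module name my_model is made up, and the exact enum spellings may differ between AMICI versions.

```python
import amici

# assumes amici.SbmlImporter("model.xml").sbml2amici("my_model", "my_model")
# was run beforehand; "my_model" is a hypothetical module name
model_module = amici.import_model_module("my_model", "my_model")
model = model_module.getModel()

solver = model.getSolver()
solver.setSensitivityOrder(amici.SensitivityOrder.first)
solver.setSensitivityMethod(amici.SensitivityMethod.adjoint)  # adjoint solve

# attach measurements via an ExpData object; rdata["sllh"] then holds the
# gradient of the log-likelihood w.r.t. the model parameters
edata = amici.ExpData(model.get())
rdata = amici.runAmiciSimulation(model, solver, edata)
print(rdata["sllh"])
```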
