Torsten v0.86 and an alternative for MPI parallelization

We just released v0.86 of Torsten, a PKPD library based on (forked from) Stan. In v0.86 we introduce an alternative to map_rect for PMX population models, motivated by the desire to:

  • Avoid building and depending on boost.mpi and boost.serialization.
  • Reduce the cognitive load of map_rect’s argument packing and unpacking by providing a simplified function signature.
  • Automate (static) load balancing, relieving the user of “sharding”.
  • Prepare for future parallelization planned on our roadmap.

To achieve the above, MPI support is currently limited to ODE-based (PKPD) population models and is provided through a few group solvers, built on Torsten’s own implementation of the ODE integrators

  • pmx_integrate_ode_adams
  • pmx_integrate_ode_bdf
  • pmx_integrate_ode_rk45
    These integrators have the same signatures as Stan’s built-in integrators, with additional support for the time steps ts being parameters.
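As a hedged sketch of how a sequential integrator might be used (the ODE right-hand side and all data/variable names below are illustrative, not from the release), a call mirrors Stan’s integrate_ode_rk45, except that ts may be declared as parameters:

```stan
functions {
  // illustrative one-compartment elimination model: dy/dt = -k * y
  real[] ode_rhs(real t, real[] y, real[] theta,
                 real[] x_r, int[] x_i) {
    real dydt[1];
    dydt[1] = -theta[1] * y[1];
    return dydt;
  }
}
data {
  int<lower=1> nt;
  real y0[1];
  real t0;
  real x_r[0];
  int x_i[0];
}
parameters {
  real<lower=0> k;
  positive_ordered[nt] ts_param;   // time points as parameters
}
transformed parameters {
  real ts[nt] = to_array_1d(ts_param);
  // same call pattern as integrate_ode_rk45, but ts depends on parameters
  real y[nt, 1] = pmx_integrate_ode_rk45(ode_rhs, y0, t0, ts,
                                         {k}, x_r, x_i);
}
```

This is only a sketch under the stated signature; consult the Torsten manual for the authoritative argument list and any tolerance-control overloads.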

The population/group versions of these integrators are

  • pmx_integrate_ode_group_adams
  • pmx_integrate_ode_group_bdf
  • pmx_integrate_ode_group_rk45
    with signature
  matrix pmx_integrate_ode_group_xxx(f, real[ , ] y0, real t0,
                                       int[] len, real[] ts,
                                       real[ , ] theta, real[ , ] x_r, int[ , ] x_i,
                                       real rtol, real atol, int max_step);

That is, the corresponding ODE integrator arguments now carry one entry (or one row) per subject, with the array len giving the record length for each subject within the population. The return value is a matrix that is ragged column-wise: subject i’s solution occupies the next len[i] columns.
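To illustrate the ragged layout (the subject counts, ODE system, and variable names below are hypothetical, and the exact array shapes should be checked against the Torsten manual), a population call assuming ts concatenates each subject’s time points could look like:

```stan
functions {
  // illustrative two-state absorption/elimination system
  real[] ode_rhs(real t, real[] y, real[] theta,
                 real[] x_r, int[] x_i) {
    real dydt[2];
    dydt[1] = -theta[1] * y[1];
    dydt[2] = theta[1] * y[1] - theta[2] * y[2];
    return dydt;
  }
}
data {
  int<lower=1> np;          // number of subjects
  int<lower=1> nt;          // total number of records, sum(len)
  int<lower=1> len[np];     // record length for each subject
  real ts[nt];              // concatenated per-subject time points
  real y0[np, 2];           // initial condition for each subject
  real t0;
  real x_r[np, 0];
  int x_i[np, 0];
}
parameters {
  real<lower=0> theta[np, 2];   // per-subject ODE parameters
}
transformed parameters {
  // Ragged column-wise return (orientation assumed for illustration):
  // the first len[1] columns belong to subject 1, the next len[2] to
  // subject 2, and so on.
  matrix[2, nt] y = pmx_integrate_ode_group_rk45(ode_rhs, y0, t0, len, ts,
                                                 theta, x_r, x_i,
                                                 1e-6, 1e-6, 10000);
}
```

The simplification relative to map_rect is visible here: per-subject records are passed as plain (concatenated or row-per-subject) arrays with len describing the raggedness, rather than being manually packed into shards.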

Similarly, for PMX modeling, population solvers are provided for the NONMEM-compatible input format found in previous releases. The PMX population solvers are

  • pmx_solve_adams
  • pmx_solve_bdf
  • pmx_solve_rk45

The above MPI solvers do not use “sharding”: the population workload is currently distributed statically and evenly across the processes.

Some example models for the MPI solvers can be found in the example-models folder of the repo.

Note that due to a conflict in process management between Torsten’s scheme and map_rect, Torsten’s MPI solvers cannot be used when STAN_MPI is turned on.