Gradient estimation for the .optimizing() method

Hello everyone,

I am currently fitting a hierarchical logistic regression and using the .optimizing() method in PyStan to find the MAP estimate of the posterior.

My goal is then to use this estimate to build a Laplace approximation of the posterior. For this I need the gradients of the log likelihood, so that I can approximate the Hessian by the outer product of the per-observation gradients. However, I cannot find an option in the .optimizing() method to return these gradients. Does anyone know a way to achieve this?
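To make the outer-product-of-gradients (OPG) idea concrete, here is a minimal NumPy sketch for a plain (non-hierarchical) logistic regression, outside of PyStan. The function names (`per_obs_scores`, `opg_hessian`) and the toy data are my own illustration, not part of any PyStan API: each row of the score matrix is the gradient of one observation's log likelihood, and summing the outer products gives the OPG Hessian approximation whose inverse serves as the Laplace covariance.

```python
import numpy as np

def per_obs_scores(beta, X, y):
    """Per-observation gradients (scores) of the logistic log likelihood.

    Row i of the result is grad_beta log p(y_i | x_i, beta) = (y_i - p_i) * x_i.
    """
    p = 1.0 / (1.0 + np.exp(-X @ beta))   # predicted probabilities
    return (y - p)[:, None] * X

def opg_hessian(beta, X, y):
    """Outer-product-of-gradients approximation to the negative Hessian."""
    S = per_obs_scores(beta, X, y)
    return S.T @ S                        # sum_i s_i s_i^T

# Toy data: intercept plus one covariate
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(200), rng.normal(size=200)])
true_beta = np.array([0.5, -1.0])
y = (rng.random(200) < 1.0 / (1.0 + np.exp(-X @ true_beta))).astype(float)

# Evaluate the OPG Hessian at some parameter value (in practice, the MAP)
H = opg_hessian(true_beta, X, y)
cov = np.linalg.inv(H)                    # Laplace-style covariance approximation
```

The same recipe carries over once the per-observation gradients at the MAP are available from the model; the hard part, as asked above, is getting those gradients out of PyStan.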

I am looking for something similar to the grad_log_prob() method, which can be called on a fit object after running HMC with the sampling() method. Is there an efficient way to do this in combination with the .optimizing() method?
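One workaround I have seen discussed is to obtain a fit object from a very short sampling() call and then evaluate grad_log_prob() (after mapping the MAP values to the unconstrained space with unconstrain_pars()), though I have not verified this end to end. A model-agnostic fallback is to differentiate the log posterior numerically at the MAP. Below is a hedged sketch of central finite differences; `num_grad` and the toy `log_post` are made-up names for illustration, and in practice `log_post` would be the model's log density:

```python
import numpy as np

def num_grad(f, x, eps=1e-6):
    """Central-difference gradient of a scalar function f at point x."""
    g = np.zeros_like(x, dtype=float)
    for i in range(len(x)):
        e = np.zeros_like(x, dtype=float)
        e[i] = eps
        g[i] = (f(x + e) - f(x - e)) / (2.0 * eps)
    return g

# Toy log posterior: standard normal, whose exact gradient at x is -x
log_post = lambda x: -0.5 * np.sum(x ** 2)
g = num_grad(log_post, np.array([1.0, -2.0]))
# g is close to [-1.0, 2.0]
```

Finite differences need 2d evaluations of the log density per gradient, so this is only practical when the dimension is modest, but it works with any black-box objective.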

Many thanks in advance