I’m asking if it’s possible to use Stan and the HMC algorithm to sample the 25 parameters until the likelihood reaches its maximum (MLE)

So HMC is gonna generate samples from your posterior. If it’s an MLE you want, then you can just run an optimizer on that instead of sampling (Stan ships one — L-BFGS — behind its optimize/optimizing interfaces).
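To make the distinction concrete: an MLE is a single optimization, not a sampling run. Here’s a toy sketch in Python (a binomial likelihood with one parameter, standing in for the real 25-parameter model) using SciPy’s L-BFGS, the same family of optimizer Stan uses for MLE:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit  # logistic function

# Toy data: k successes out of n trials, one probability parameter
# expressed on the unconstrained logit scale (as Stan would do internally).
k, n = 37, 100

def neg_log_lik(theta):
    p = expit(theta[0])  # map unconstrained parameter into (0, 1)
    return -(k * np.log(p) + (n - k) * np.log(1.0 - p))

fit = minimize(neg_log_lik, x0=np.array([0.0]), method="L-BFGS-B")
p_hat = expit(fit.x[0])
print(round(p_hat, 3))  # MLE of a binomial proportion is k/n = 0.37
```

One optimizer run replaces the thousands of log-density evaluations that sampling would need — which matters a lot when each evaluation is a PDE solve.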

Since the PDE solve inside the log-posterior uses iterative numerical methods, I don’t know whether Stan’s auto-differentiation algorithm will work or not.

It’s possible it can, but autodiff through something like this is probably a little slow. If you want to generate a couple thousand samples from your posterior and NUTS needs to take 32 HMC steps for each one (which is pretty efficient sampling), then you’re looking at 64,000 model evaluations/gradient calculations. With a PDE or something you probably want to think about custom autodiff.
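Custom autodiff for an iterative solver usually means differentiating the *solution*, not replaying every iteration — e.g. via the implicit function theorem. A minimal sketch (my toy fixed-point problem, not the OP’s PDE): solve f(x, θ) = x − cos(θx) = 0 by iteration, then get dx/dθ analytically from the converged x:

```python
import numpy as np

def solve(theta, tol=1e-12, max_iter=200):
    # Fixed-point iteration for f(x, theta) = x - cos(theta * x) = 0.
    x = 1.0
    for _ in range(max_iter):
        x_new = np.cos(theta * x)
        if abs(x_new - x) < tol:
            break
        x = x_new
    return x_new

def grad(theta):
    # Implicit function theorem: df/dx * dx/dtheta + df/dtheta = 0,
    # so dx/dtheta = -(df/dtheta) / (df/dx), evaluated at the solution.
    x = solve(theta)
    dfdx = 1.0 + theta * np.sin(theta * x)
    dfdtheta = x * np.sin(theta * x)
    return -dfdtheta / dfdx

theta, eps = 1.0, 1e-6
fd = (solve(theta + eps) - solve(theta - eps)) / (2 * eps)
print(grad(theta), fd)  # the two gradients should agree closely
```

The payoff: the gradient costs one extra linear solve at the converged point, instead of taping and reversing 200 iterations.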

That said, there are numerous pretty complicated functions in the Stan math library that are autodiffed — the eigenvalue solver is one.

Random sampling the 25 parameters

Hmm, Stan doesn’t really directly draw samples from its posterior distribution. It’s less of a guess-and-check process and more of an exploring process. Check out the Michael Betancourt or Radford Neal intro HMC papers. There are some cool animations here: https://chi-feng.github.io/mcmc-demo/app.html#HamiltonianMC,standard
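If it helps to see the “exploring” mechanism in code, here’s a bare-bones HMC sampler for a 1-D standard normal (a toy of mine, not Stan’s actual implementation): each draw simulates Hamiltonian dynamics with a leapfrog integrator, so proposals glide across the distribution rather than being guessed and checked:

```python
import numpy as np

rng = np.random.default_rng(0)

def leapfrog(q, p, grad_U, step, n_steps):
    # Simulate Hamiltonian dynamics: position q explores the target,
    # momentum p carries the trajectory across the distribution.
    p = p - 0.5 * step * grad_U(q)
    for _ in range(n_steps - 1):
        q = q + step * p
        p = p - step * grad_U(q)
    q = q + step * p
    p = p - 0.5 * step * grad_U(q)
    return q, p

# Target: standard normal, so U(q) = q^2 / 2 and grad U(q) = q.
U = lambda q: 0.5 * q * q
grad_U = lambda q: q

samples, q = [], 0.0
for _ in range(5000):
    p0 = rng.normal()  # fresh momentum each iteration
    q_new, p_new = leapfrog(q, p0, grad_U, step=0.2, n_steps=10)
    # Metropolis correction for the integrator's discretization error.
    if np.log(rng.random()) < (U(q) + 0.5 * p0**2) - (U(q_new) + 0.5 * p_new**2):
        q = q_new
    samples.append(q)

print(np.mean(samples), np.std(samples))  # should be close to 0 and 1
```

Note that every one of those leapfrog steps needs a gradient of U — which is exactly why the evaluation count adds up so fast when each gradient involves a PDE solve.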

Plug into the PDEs, then do numerical analysis (iteration)

Is this like an iterative sparse solver?

Calculate a set of results which can be treated as probabilities of a binomial distribution

The results of the PDE are a distribution? Or get fed into a distribution as data? Can you write out what a hypothetical Stan model would look like? Do you have one with maybe just a few parameters – 3 or 4 instead of 25?
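I don’t know the OP’s actual model, but the structure being asked about might look something like this — sketched in Python with a made-up two-parameter iterative “solver” standing in for the PDE, whose outputs become binomial probabilities in the likelihood:

```python
import numpy as np
from scipy.special import expit
from scipy.stats import binom

# Stand-in for the PDE solve: a made-up fixed-point iteration that maps
# 2 parameters to a vector of values in (0, 1). The real model would
# run its iterative PDE solver here instead.
def fake_solver(theta, n_iter=50):
    c = np.array([0.5, 1.0, 1.5])  # hypothetical per-observation covariate
    x = np.full(3, 0.5)
    for _ in range(n_iter):
        x = expit(theta[0] + theta[1] * c * x)
    return x  # interpreted as binomial success probabilities

def log_lik(theta, k, n):
    p = fake_solver(theta)
    return binom.logpmf(k, n, p).sum()

k = np.array([12, 30, 45])  # hypothetical observed successes
n = np.array([50, 50, 50])  # trials
print(log_lik(np.array([0.0, 1.0]), k, n))
```

If the real model has this shape — parameters → solver → probabilities → `binomial` likelihood — then in Stan the solver would live in the model block (or a user-defined function), and that is exactly the piece autodiff has to get through.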