I have a Stan model that I run in a loop 100 times, and I take the mean of the 100 estimates as the final parameter estimate. After completing the 100 runs, I found that some of the Stan results were significantly different from the true parameter values. For each fit I run four chains; in the problem runs, three of the chains converge but the fourth is far away from the other three. This makes the estimate from that run very different from the true parameters, and it ultimately causes a serious bias in the average over the 100 runs. How should I deal with this situation? Thank you for your advice!

Just to clarify, do you have a Stan model that you’re re-running 100 times, or do you have a Stan model that you’re running for 100 iterations?

I’m sorry, my goal is to run the Stan model 100 times, but some of those 100 runs don’t converge. I looked into one non-convergent case and found that three of the four chains had converged, while the remaining chain was far away from the others. What improvements do I need to make? Thank you very much.
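For detecting this automatically across your 100 runs, the R-hat statistic is exactly the number that flags a stray chain. As a rough sketch of the idea, here is the basic Gelman–Rubin R-hat in plain numpy (Stan itself reports a more robust split-R-hat, and the chain values below are synthetic, just for illustration):

```python
import numpy as np

def rhat(chains):
    """Basic (non-split) Gelman-Rubin R-hat for an array of shape (m chains, n draws)."""
    m, n = chains.shape
    chain_means = chains.mean(axis=1)
    B = n * chain_means.var(ddof=1)        # between-chain variance
    W = chains.var(axis=1, ddof=1).mean()  # mean within-chain variance
    var_plus = (n - 1) / n * W + B / n     # pooled variance estimate
    return np.sqrt(var_plus / W)

rng = np.random.default_rng(0)
good = rng.normal(0.0, 1.0, size=(4, 1000))  # four well-mixed chains
bad = good.copy()
bad[3] += 5.0                                # one chain stuck far from the rest

print(rhat(good))  # close to 1 -> chains agree
print(rhat(bad))   # well above 1.1 -> not converged, discard or investigate
```

A simple guard in your loop would be to skip (or re-run) any replicate whose largest R-hat exceeds roughly 1.01–1.1, rather than letting a stuck chain contaminate the averaged estimate.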

If one of the chains is not converging, then this indicates issues with the model itself. Try reparameterising your model (more information in this section of the manual) or adjusting your priors.

Out of curiosity, why are you running the model 100 times? Are you using different data each time?

Thank you very much for your advice! Yes, I fix the random seed and simulate a different dataset for each run, so I cycle 100 times to observe how well the model recovers the parameters across the simulated datasets.
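One small workflow suggestion for a simulation study like this: using a distinct, reproducible seed per replicate (rather than one shared seed) makes it easy to re-run just the replicates that failed to converge. A minimal sketch, where the Stan fit is replaced by a placeholder sample-mean estimator and the parameter values are made up for illustration:

```python
import numpy as np

MU_TRUE = 2.5  # hypothetical true parameter value
N_REPS = 100   # number of simulation replicates
N_OBS = 50     # observations per simulated dataset

estimates = []
for rep in range(N_REPS):
    # one reproducible seed per replicate: rep 37 can be regenerated in isolation
    rng = np.random.default_rng(rep)
    y = rng.normal(MU_TRUE, 1.0, size=N_OBS)  # simulated dataset

    # stand-in for the Stan fit: here just the sample mean
    # (in practice, fit the Stan model to y, check R-hat for every parameter,
    #  and keep the replicate only if all chains converged)
    estimates.append(y.mean())

print(np.mean(estimates))  # averaged over replicates, close to MU_TRUE
```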

Ah that makes sense. Good luck with the troubleshooting!

Thank you very much! I don’t have a deep understanding of reparameterisation. If it’s convenient, could you point me to some more information?

It’s pretty much as the linked chapter says. Some posteriors can be tricky to sample, so you reparameterise the model: express the same parameters through a different construction that is easier for the sampler to explore.
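For what it’s worth, here is a toy numpy illustration (not Stan code) of the most common example, the centred vs. non-centred parameterisation: instead of drawing a parameter directly from `normal(mu, tau)`, you draw a standard normal and shift/scale it. The distribution is identical, but in a hierarchical Stan model the second form often gives the sampler a much easier geometry:

```python
import numpy as np

rng = np.random.default_rng(42)
mu, tau = 1.0, 2.0  # illustrative location and scale

# centred: draw theta directly from its distribution
theta_centred = rng.normal(mu, tau, size=100_000)

# non-centred: draw a standard normal, then shift and scale;
# same distribution, written as a different construction
theta_raw = rng.normal(0.0, 1.0, size=100_000)
theta_noncentred = mu + tau * theta_raw

print(theta_centred.mean(), theta_noncentred.mean())  # both close to mu
print(theta_centred.std(), theta_noncentred.std())    # both close to tau
```

In Stan, the non-centred version corresponds to declaring the raw standard-normal parameter and building the actual parameter in `transformed parameters`, as the linked chapter describes.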