Semi-automated way of diagnosing slow run times

Hello,

I have a complex time series model with latent values at each step, where the typical tree depth at each iteration is 8-10, even after a long warmup. This necessarily makes the model slow to run.

I expect a higher tree depth because neighbouring points in my latent values are correlated.

Notwithstanding this, is there any way to identify specific parameters that might be contributing to the large tree depth? I have far too many parameters to inspect pairs plots directly.

The best ideas I have had are:

  1. Looking at the matrix of posterior correlations to find highly correlated parameters (although I may not be able to do anything about them; see the sketch after this list), and
  2. Building a supervised ML model to predict the tree depth at each iteration as a function of the parameter values, and gaining insight from that.
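
For idea 1, a minimal sketch of what I mean. It assumes the draws are already in a 2-D array `draws` of shape (n_draws, n_params) with matching names in `param_names`; both are hypothetical placeholders here, filled with random numbers just so the snippet runs:

```python
import numpy as np

# Hypothetical inputs: replace with however you extract your posterior.
rng = np.random.default_rng(0)
draws = rng.standard_normal((4000, 50))           # (n_draws, n_params)
param_names = [f"theta[{i}]" for i in range(50)]  # placeholder names

# Correlation (not covariance), so different parameter scales don't mislead.
corr = np.corrcoef(draws, rowvar=False)

# Rank parameter pairs by absolute posterior correlation.
iu, ju = np.triu_indices_from(corr, k=1)
order = np.argsort(-np.abs(corr[iu, ju]))
for k in order[:10]:
    i, j = iu[k], ju[k]
    print(f"{param_names[i]} ~ {param_names[j]}: {corr[i, j]:+.3f}")
```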

Alas, #2 doesn't seem likely to be fruitful, as there isn't much variation in the tree depth over time; a rough sketch of it follows anyway.
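
This is roughly what I had in mind for #2, assuming the per-iteration treedepth__ sampler diagnostic is available alongside the draws. The random-forest regressor is an arbitrary choice, and all inputs below are hypothetical placeholders:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Hypothetical inputs: draws as before; treedepth is the per-iteration
# treedepth__ diagnostic (length n_draws).
rng = np.random.default_rng(0)
draws = rng.standard_normal((4000, 50))
treedepth = rng.integers(8, 11, size=4000)

# Fit a regressor and inspect which parameters it leans on. With little
# variation in tree depth, expect weak signal and noisy importances.
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(draws, treedepth)
top = np.argsort(-model.feature_importances_)[:10]
print(top, model.feature_importances_[top])
```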

Any other suggestions?

I may just have to live with the high tree depth.

Thanks!


Probably just do this. And look at the correlation matrix; the different scales in a covariance matrix will just be misleading otherwise.

When you do this, generate lots of posterior draws. Estimating a covariance matrix is a noisy business. How many parameters do you have?
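
To make the scale point concrete, a tiny sketch converting a covariance matrix to a correlation matrix (pure NumPy, fake draws just for illustration):

```python
import numpy as np

# Fake draws with wildly different scales per parameter.
rng = np.random.default_rng(0)
draws = rng.standard_normal((4000, 3)) * np.array([1.0, 100.0, 0.01])

cov = np.cov(draws, rowvar=False)
sd = np.sqrt(np.diag(cov))
corr = cov / np.outer(sd, sd)  # unit diagonal, scale-free off-diagonals

print(np.round(cov, 4))   # dominated by the large-scale parameter
print(np.round(corr, 4))  # comparable across parameters
```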