Does anyone know of a time series model where non-centered yields a better fit than centering? I played around with autoregressive models with sparse observations where the unobserved states would presumably be best modeled with a non-centered parameterization but I consistently see fitting problems with anything but a centered parameterization of the unobserved states.

Pretty sure non-centered has worked better for me every time for state-space time series. I have seen multi-modality with AR + Gaussian noise that didn’t go away no matter what, even for simple data, so I don’t doubt there’s plenty that can go wrong, but centered formulations seem to have much worse mixing problems.
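For concreteness, here is a minimal numpy sketch of the two parameterizations of a latent AR(1) chain. The centered version treats the states `x` themselves as parameters; the non-centered version treats the standardized innovations `z ~ N(0, 1)` as parameters and derives the states deterministically. The two are a bijective change of variables (the values `mu`, `rho`, `sigma` and the simple scaled initialization are arbitrary choices for illustration):

```python
import numpy as np

def states_from_innovations(mu, rho, sigma, z):
    """Non-centered view: latent AR(1) states as a deterministic
    transform of the hyperparameters and standardized innovations z."""
    x = np.empty_like(z)
    x[0] = mu + sigma * z[0]  # illustrative initialization, not stationary
    for t in range(1, len(z)):
        x[t] = mu + rho * (x[t - 1] - mu) + sigma * z[t]
    return x

rng = np.random.default_rng(0)
mu, rho, sigma = 2.0, 0.8, 0.5
z = rng.standard_normal(1_000)
x = states_from_innovations(mu, rho, sigma, z)

# Centered -> non-centered: recover the standardized innovations from
# the states by inverting the same transform.
z_back = np.empty_like(x)
z_back[0] = (x[0] - mu) / sigma
z_back[1:] = (x[1:] - mu - rho * (x[:-1] - mu)) / sigma
assert np.allclose(z_back, z)
```

The model is identical either way; what changes is which quantities the sampler moves in, and hence the posterior geometry it sees.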

I thought you usually recommended noncentered too?

I mostly use a centered parameterization, not from any real principle but just that they seem to fit just fine like that.

The one exception is in our risk engine, where it was just easier to model the seasonality/portfolio-shock module with a multivariate normal state-space model using a non-centered parameterization. Again, that fits very cleanly.

Did you remember to make the unconditional mean a parameter and the intercept a local parameter equal to `mu * (1 - rho)`?
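To spell out why that identity works: for an AR(1) with intercept, `y_n = alpha + rho * y_{n-1} + eps_n`, taking expectations at stationarity gives `E[y] = alpha + rho * E[y]`, so `E[y] = alpha / (1 - rho)`. Setting `alpha = mu * (1 - rho)` therefore makes `mu` the unconditional mean, which is usually the quantity you actually want a prior on. A quick simulation check (all parameter values here are made up for illustration):

```python
import numpy as np

mu, rho, sigma = 3.0, 0.8, 0.5
alpha = mu * (1 - rho)          # intercept implied by unconditional mean mu
assert np.isclose(alpha / (1 - rho), mu)

rng = np.random.default_rng(42)
T = 100_000
y = np.empty(T)
y[0] = mu
for t in range(1, T):
    y[t] = alpha + rho * y[t - 1] + sigma * rng.standard_normal()

print(y.mean())  # should land close to mu = 3.0
```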

I’ve been putting together a series of exercises on autoregressive time series models, but on the classic models that are autoregressive on the observed states, for example E[y_n] = \sum_{k = 1}^{K} \beta_k y_{n - k}, and not on unobserved latent states as is more common in an HMM.
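With the states fully observed, that AR(K) model is just a linear regression of each observation on its K lags, which is one way to see why its geometry is so benign. A small sketch with made-up coefficients, fit by ordinary least squares:

```python
import numpy as np

# Simulate an AR(K) series on observed states: y_n = sum_k beta_k * y_{n-k} + noise.
rng = np.random.default_rng(1)
K, T = 3, 2_000
beta_true = np.array([0.5, 0.2, 0.1])  # hypothetical stable coefficients

y = np.zeros(T)
for t in range(K, T):
    y[t] = beta_true @ y[t - K:t][::-1] + 0.1 * rng.standard_normal()

# Design matrix of lags: row for y_n holds (y_{n-1}, ..., y_{n-K}).
X = np.column_stack([y[K - 1 - k:T - 1 - k] for k in range(K)])
beta_hat, *_ = np.linalg.lstsq(X, y[K:], rcond=None)

print(beta_hat)  # recovers something near beta_true
```

In a Bayesian fit the conditional likelihood has the same regression structure, so there is no latent funnel for a non-centered parameterization to fix.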

If some of the states are unobserved then the corresponding y_n become parameters with conditional Gaussian priors amenable to non-centering, but the geometry isn’t quite right for that to be helpful. Which in hindsight perhaps isn’t surprising given the strong constraint provided by the observed states (which also becomes an issue for a fully observed time series when K is more than 5 or so).
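That constraint is visible in the full conditional of a single missing value. For an AR(1) chain y_t = rho * y_{t-1} + eps_t with eps_t ~ N(0, sigma^2), a missing y_n given its two observed neighbors is Gaussian with precision (1 + rho^2)/sigma^2 and mean rho * (y_{n-1} + y_{n+1}) / (1 + rho^2), i.e. tightly pinned from both sides. A sketch verifying that formula against direct conditioning in the stationary joint Gaussian (the numeric values are arbitrary):

```python
import numpy as np

rho, sigma = 0.7, 0.4
y_prev, y_next = 1.0, 0.2  # observed neighbors of the missing y_n

# Closed-form full conditional from the Markov structure.
cond_mean = rho * (y_prev + y_next) / (1 + rho**2)
cond_var = sigma**2 / (1 + rho**2)

# Cross-check: condition the stationary joint of (y_{n-1}, y_n, y_{n+1}),
# with cov(y_s, y_t) = sigma^2 * rho^|s-t| / (1 - rho^2), on the outer two.
lags = np.abs(np.subtract.outer(np.arange(3), np.arange(3)))
Sigma = sigma**2 * rho**lags / (1 - rho**2)
obs = np.array([0, 2])
S_oo = Sigma[np.ix_(obs, obs)]
S_mo = Sigma[1, obs]
m = S_mo @ np.linalg.solve(S_oo, np.array([y_prev, y_next]))
v = Sigma[1, 1] - S_mo @ np.linalg.solve(S_oo, S_mo)

assert np.isclose(m, cond_mean) and np.isclose(v, cond_var)
```

Because each missing state is squeezed between observed neighbors, its conditional scale barely depends on the hyperparameters, which is exactly the regime where centered parameterizations tend to win.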

Thanks for the suggestions.

I see, almost everything I deal with is not directly observed so it makes sense that I’m used to non-centered being the way to go.