# Fitting regime switching model to binary time series

Hi, I’m new to Stan and am unsure if it’s an appropriate tool for my problem. Any suggestions?

I have a binary time series X_t that I’d like to fit. It clearly has long memory effects (high Hurst exponent), and I believe that it is generated by a Bernoulli process where the probability shifts in discrete jumps:

U_t = \begin{cases} U_{t-1} & \text{w.p. } p \\ N(0, \sigma) & \text{w.p. } 1 - p \end{cases} \qquad X_t \sim \text{Bernoulli}(\Phi(U_t))

where \Phi is the standard normal CDF (the probit link).

What is a good way to fit \sigma and p to my data? I don’t see any obvious way to model the switching behavior of the latent variable U_t in Stan.
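For concreteness, here is a quick simulation of the generative process described above (a sketch in Python using only the standard library; `phi` is the standard normal CDF, and the function and parameter names are just illustrative):

```python
import math
import random

def phi(u):
    # standard normal CDF, Phi(u)
    return 0.5 * (1.0 + math.erf(u / math.sqrt(2.0)))

def simulate(N, p, sigma, seed=0):
    """Simulate U_t (regime-switching latent level) and X_t (binary observations)."""
    rng = random.Random(seed)
    U = [rng.gauss(0.0, sigma)]
    for t in range(1, N):
        # with probability p hold the previous level, otherwise jump to a fresh draw
        U.append(U[-1] if rng.random() < p else rng.gauss(0.0, sigma))
    X = [1 if rng.random() < phi(u) else 0 for u in U]
    return U, X
```

With p close to 1, jumps are rare and long runs at a single success probability produce the long-memory behavior described.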

This seems to be an example of a mixture model, and there is a chapter about them in the Stan User's Guide. Essentially, you need to model both branches of U_t and weight the likelihood by the probability that each branch occurs. This is not easy, but there is a case study about the difficulties at
http://mc-stan.org/users/documentation/case-studies/identifying_mixture_models.html


Thanks, I’ve made an attempt at the model, but I can’t figure out how to reference the state from the previous time step.

In the code below, I generate N jumps and, with probability p, use jump_{n-1} instead of jump_n. But what I really want is a state U_n that stays at U_{n-1} with probability p and otherwise updates to a fresh jump_n:

```stan
data {
  int<lower=0> N;
  int<lower=0, upper=1> x[N];
}
parameters {
  real<lower=0, upper=1> p;
  real<lower=0> sigma;
  vector[N] jumps;
}
model {
  jumps ~ normal(0, sigma);

  for (n in 2:N) {
    target += log_mix(p,
                      bernoulli_logit_lpmf(x[n] | jumps[n - 1]),
                      bernoulli_logit_lpmf(x[n] | jumps[n]));
  }
}
```
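For reference, Stan's `log_mix(theta, lp1, lp2)` computes log(theta * exp(lp1) + (1 - theta) * exp(lp2)). A numerically stable pure-Python equivalent, just to make the mixture weighting concrete:

```python
import math

def log_mix(theta, lp1, lp2):
    # log(theta * exp(lp1) + (1 - theta) * exp(lp2)),
    # computed via log-sum-exp to avoid underflow
    a = math.log(theta) + lp1
    b = math.log1p(-theta) + lp2
    m = max(a, b)
    return m + math.log(math.exp(a - m) + math.exp(b - m))
```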


The way you wrote the generative model originally, wouldn’t the Stan program be something like

```stan
data {
  int<lower=0> N;
  int<lower=0, upper=1> x[N];
}
parameters {
  real<lower=0, upper=1> p;
  real<lower=0> sigma;
  vector[N] U;
}
model {
  // do something when n = 1
  for (n in 2:N) {
    target += log_mix(p, 0.0, normal_lpdf(U[n] | 0, sigma));
    target += log_mix(p,
                      bernoulli_lpmf(x[n] | Phi(U[n - 1])),
                      bernoulli_lpmf(x[n] | Phi(U[n])));
  }
  // priors on p and sigma
}
```


Yes, that looks more like what I want.

I just added U[1] ~ normal(0,sigma) and weak priors on p and sigma, but even with synthetic data the fit is very bad with Rhats around 5.

Not sure where to go from here… I guess I’ll read some general tutorials on Stan.

EDIT: Actually, I think the model still isn't correct. I don't want U_{n-1}; I want U_{n-k}, where k is the number of steps since the last jump.

In general, it sounds like you're trying to build something like a state-space model. You can implement these following the HMM pattern, using the forward algorithm. There's also an example in the Cormack-Jolly-Seber (CJS) model (there are two states for an animal, alive or dead, and dying is one-way; but the data are censored, so the state needs to be marginalized out).

The conversion to HMMs only works with a finite state space. Otherwise, you have to design custom dynamic-programming algorithms to compute the likelihood. It looks like the steps-since-last-jump formulation may involve an unbounded history.
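To make the finite-state case concrete, here is a minimal Python sketch of the forward algorithm for a generic K-state HMM; the state space, transition matrix, and emission tables are hypothetical placeholders, just to show how the discrete state gets marginalized out:

```python
import math

def logsumexp(xs):
    # stable log(sum(exp(x) for x in xs))
    m = max(xs)
    return m + math.log(sum(math.exp(x - m) for x in xs))

def forward_loglik(obs, log_init, log_trans, log_emit):
    """Marginal log-likelihood of obs under a finite-state HMM.

    log_init[k]     : log P(z_1 = k)
    log_trans[j][k] : log P(z_t = k | z_{t-1} = j)
    log_emit[k][x]  : log P(x_t = x | z_t = k)
    """
    K = len(log_init)
    # alpha[k] = log P(x_1..x_t, z_t = k), updated over t
    alpha = [log_init[k] + log_emit[k][obs[0]] for k in range(K)]
    for x in obs[1:]:
        alpha = [logsumexp([alpha[j] + log_trans[j][k] for j in range(K)])
                 + log_emit[k][x]
                 for k in range(K)]
    return logsumexp(alpha)
```

The per-step cost is O(K^2), which is why the state space has to be finite; an unbounded steps-since-last-jump counter has no fixed K to sum over.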