# Updating parameters for every iteration of a Stan model (MGARCH-Dirichlet Process Mixture)

Dear Stan Community,

My question in short:
Is it possible to update the parameters of a Stan model at every iteration?

My question:
I am a student trying to implement the MGARCH-DPM methodology of Maheu and Shamsi's Nonparametric Dynamic Conditional Beta (see attachments). It involves a hierarchical model in which the returns follow an infinite normal mixture with a mixed mean \mu and a mixed component B, where B scales the covariance matrix H_t. The mixing parameters, \mu and B, are generated by a Dirichlet process.
p\left(r_{t} \mid \mu, B, W, H_{t}\right)=\sum_{j=1}^{\infty} \omega_{j} N\left(r_{t} \mid \mu_{j}, H_{t}^{1 / 2} B_{j}\left(H_{t}^{1 / 2}\right)^{\prime}\right)

The hierarchical model:
\begin{array}{ll} r_{t} \mid \phi_{t}, H_{t} & \sim N\left(\xi_{t}, H_{t}^{1/2} \Lambda_{t}\left(H_{t}^{1/2}\right)^{\prime}\right), \quad t=1, \ldots, T \\ \phi_{t} \equiv \left\{\xi_{t}, \Lambda_{t}\right\} \mid G & \sim G \\ G \mid \alpha, G_{0} & \sim DP\left(\alpha, G_{0}\right) \\ G_{0} & \equiv N\left(\mu_{0}, D\right) \times \mathcal{W}^{-1}\left(B_{0}, \nu_{0}\right) \\ H_{t} & = \Gamma_{0}+\Gamma_{1} \odot\left(r_{t-1}-\eta\right)\left(r_{t-1}-\eta\right)^{\prime}+\Gamma_{2} \odot H_{t-1} \end{array}
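For concreteness, here is a minimal sketch of the last line above, the Hadamard-product GARCH recursion for H_t, in Python (the actual implementation discussed here lives in bmgarch/Stan; the parameter values below are arbitrary placeholders, not estimates):

```python
import numpy as np

def garch_step(H_prev, r_prev, eta, Gamma0, Gamma1, Gamma2):
    """One step of H_t = Gamma0 + Gamma1 ⊙ (r_{t-1}-eta)(r_{t-1}-eta)' + Gamma2 ⊙ H_{t-1}."""
    e = (r_prev - eta).reshape(-1, 1)          # demeaned return as a column vector
    return Gamma0 + Gamma1 * (e @ e.T) + Gamma2 * H_prev  # '*' is element-wise (⊙)

d = 2
Gamma0 = np.array([[0.05, 0.01], [0.01, 0.05]])  # illustrative symmetric intercept
Gamma1 = np.full((d, d), 0.10)                   # ARCH loadings (placeholder)
Gamma2 = np.full((d, d), 0.85)                   # GARCH loadings (placeholder)
eta = np.zeros(d)

H = np.eye(d)
rng = np.random.default_rng(0)
for _ in range(100):
    r = rng.standard_normal(d)
    H = garch_step(H, r, eta, Gamma0, Gamma1, Gamma2)

print(np.allclose(H, H.T))  # symmetric Gammas keep H_t symmetric
```

Because every term in the recursion is symmetric whenever the Gamma matrices are, H_t stays symmetric along the whole path.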

To implement this, I want to use Philippe Rast's bmgarch package to handle the multivariate GARCH process (I am opting for a DCC parametrization, deviating from the attached paper). The bmgarch package models the MGARCH process in Stan (see attachments).

It would be ideal to run the entire MGARCH-DPM model in Stan, but I found that Stan does not support sampling discrete parameters, which a Dirichlet process mixture requires.

Hence, to estimate both the MGARCH parameters and the Dirichlet process, I thought of using Ross and Markwick's dirichletprocess package to fit \mu and B. I wrote my own distribution, 'dirichletprocess_mvnormalgarchdpm.R' (attached), to work with the package. In addition to the usual multivariate normal prior parameters, this distribution also takes the covariance matrix \mathbf{H} generated by bmgarch as input.

However, given the many iterations and samples, optimizing the two processes separately is not very efficient.

I therefore think a better approach would be to nest the two processes, so that every iteration of the bmgarch sampler can use the updated \mu and B generated by dirichletprocess, and vice versa. Hence, I would really like to know whether updating the parameters in this way is possible in Stan!

Any suggestions how to approach this issue are greatly appreciated.

Regards,
Floor

P.s. tagging @Bob_Carpenter as kindly suggested by @ph-rast :)

Attachments:
maheu_paper.pdf (1.7 MB)

I don’t think this would work particularly well. As much as I love DPs, and as much as I love Stan, the two don’t work well together. I also don’t know whether the ‘bouncing back and forth’ between Stan and dirichletprocess would work; I suspect it would not.

You could try an identified finite mixture within Stan. In practice, DPs are often fit by choosing a large ‘max number of components’ and fitting that truncated mixture (e.g., with a stick-breaking prior on the weights). This can be done in Stan, though it is very slow. [You do not need discrete sampling for mixtures; you can marginalize out the discrete parameters, just as in the likelihood you wrote in your post.] Identification for mixtures in Stan can also be very problem- and data-specific; sometimes it takes a lot of tinkering.
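To make the truncated stick-breaking idea and the marginalization concrete, here is a minimal univariate sketch in Python (not Stan); the truncation level K, the concentration alpha, and the normal components are illustrative stand-ins for the multivariate model:

```python
import numpy as np

def stick_breaking(v):
    """Turn stick fractions v_1..v_K into mixture weights w_k = v_k * prod_{j<k}(1 - v_j)."""
    w = np.empty_like(v)
    remaining = 1.0
    for k, vk in enumerate(v):
        w[k] = vk * remaining
        remaining *= (1.0 - vk)
    return w

def normal_logpdf(x, mu, sigma):
    return -0.5 * np.log(2 * np.pi) - np.log(sigma) - 0.5 * ((x - mu) / sigma) ** 2

def mixture_loglik(x, w, mu, sigma):
    """Marginalize the discrete label: log sum_k w_k N(x | mu_k, sigma_k), via log-sum-exp."""
    lps = np.log(w) + normal_logpdf(x, mu, sigma)
    m = lps.max()
    return m + np.log(np.exp(lps - m).sum())

rng = np.random.default_rng(1)
K, alpha = 20, 1.0
v = rng.beta(1.0, alpha, size=K)   # stick fractions, v_k ~ Beta(1, alpha)
v[-1] = 1.0                        # truncation: the last stick takes the remainder
w = stick_breaking(v)

mu = rng.standard_normal(K)        # placeholder component means
sigma = np.ones(K)                 # placeholder component scales
ll = mixture_loglik(0.3, w, mu, sigma)

print(np.isclose(w.sum(), 1.0))    # weights sum to one under this truncation
```

The `mixture_loglik` line is exactly the marginalization Stan needs (it would use `log_sum_exp` in the model block); no discrete sampling is involved.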

I would suggest either using a non-DP mixture and finding a way to identify it well for your data, or choosing an alternative to DPs for nonparametric modeling.

Some possibly helpful discussions of related topics: