Hierarchical hidden Markov structure for dynamic correlations: the hierarchical RSDC model (revised version)

Philippe Charlot, Vêlayoudom Marimoutou

Institut Français de Pondichéry / Centre de Sciences Humaines
2011
22 p.

USR 3330 "Savoirs et Mondes Indiens" Working papers series n°1


English


(Not for sale). Available online at http://hal.archives-ouvertes.fr/hal-00605965


This paper presents a new multivariate GARCH model with a time-varying conditional correlation structure, a special case of the Regime Switching Dynamic Correlation (RSDC) model of Pelletier (2006). Our model, which we call the Hierarchical RSDC (HRSDC), is built on the hierarchical generalization of the hidden Markov model introduced by Fine et al. (1998). It can be viewed graphically as a tree structure with two types of states. The first, called production states, can emit observations, as in the classical Markov-switching approach. The second, called abstract states, cannot emit observations; instead, they establish the vertical and horizontal transition probabilities that define the dynamics of the hidden hierarchical structure. The main advantage of this approach over the classical Markov-switching model is that it improves the granularity of the regimes. Our model is also comparable to the Double Smooth Transition Conditional Correlation GARCH (DSTCC) model, a STAR approach to dynamic correlations proposed by Silvennoinen and Teräsvirta (2007), since under certain assumptions the DSTCC and our model represent the two classical competing approaches to modeling regime switching. We performed Monte Carlo simulations and applied the model to two empirical applications studying the conditional correlations of selected stock returns. The results show that the HRSDC provides a good measure of the correlations and possesses interesting explanatory power.
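To make the hierarchical structure concrete, the following is a minimal simulation sketch, not the authors' estimation code: two hypothetical abstract states each govern a sub-chain of production states, and only production states emit a (here, made-up) correlation level. A horizontal move at the abstract level triggers a vertical re-entry into the new sub-chain; all probabilities and correlation values are illustrative assumptions.

```python
import random

# Illustrative two-level hierarchy. All probabilities and correlation
# values are invented for this sketch and do not come from the paper.
ABSTRACT_TRANS = {0: {0: 0.95, 1: 0.05},   # horizontal transitions between abstract states
                  1: {0: 0.05, 1: 0.95}}
VERTICAL_ENTRY = {0: {0: 0.5, 1: 0.5},     # vertical entry probabilities into production states
                  1: {0: 0.5, 1: 0.5}}
PROD_TRANS = {0: {0: {0: 0.9, 1: 0.1}, 1: {0: 0.1, 1: 0.9}},  # horizontal moves within a sub-chain
              1: {0: {0: 0.9, 1: 0.1}, 1: {0: 0.1, 1: 0.9}}}
CORR = {(0, 0): 0.1, (0, 1): 0.2,          # correlation emitted by each production state
        (1, 0): 0.6, (1, 1): 0.8}

def sample(dist, rng):
    """Draw a state from a {state: probability} dictionary."""
    u, acc = rng.random(), 0.0
    for state, p in dist.items():
        acc += p
        if u < acc:
            return state
    return state  # guard against floating-point rounding

def simulate(T, seed=0):
    """Simulate T steps of the two-level hierarchy; return emitted correlations."""
    rng = random.Random(seed)
    a = 0
    p = sample(VERTICAL_ENTRY[a], rng)     # vertical transition into the sub-chain
    path = []
    for _ in range(T):
        path.append(CORR[(a, p)])          # only production states emit
        a_next = sample(ABSTRACT_TRANS[a], rng)
        if a_next != a:                    # abstract switch forces vertical re-entry
            a = a_next
            p = sample(VERTICAL_ENTRY[a], rng)
        else:                              # otherwise move horizontally in the sub-chain
            p = sample(PROD_TRANS[a][p], rng)
    return path

path = simulate(100)
```

Because the abstract chain is persistent, the emitted path stays within one sub-chain's correlation levels for long stretches, which is the extra granularity the hierarchy buys over a flat Markov-switching chain.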

Keywords: multivariate GARCH, dynamic correlations, regime switching, Markov chain, hidden Markov models, hierarchical hidden Markov models