Dimension Reduction for High Dimensional Vector Autoregressive Models

Gianluca Cubadda, Alain Hecq
CEIS Research Paper
This paper aims to decompose a large-dimensional vector autoregressive (VAR) model into two components, the first generated by a small-scale VAR and the second a white noise sequence. Hence, a reduced number of common components generates the entire dynamics of the large system through a VAR structure. This modelling, which we label the dimension-reducible VAR, extends the common feature approach to high-dimensional systems, and it differs from the dynamic factor model, in which the idiosyncratic component can also embed a dynamic pattern. We show the conditions under which this decomposition exists. We provide statistical tools to detect its presence in the data and to estimate the parameters of the underlying small-scale VAR model. Based on our methodology, we propose a novel approach to identify the shock that is responsible for most of the common variability at business cycle frequencies. We evaluate the practical value of the proposed methods by simulations as well as by an empirical application to a large set of US economic variables.
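As a rough illustration of the decomposition described in the abstract, the sketch below simulates a panel whose common part is driven by a small-scale VAR and whose remaining part is white noise, and then recovers the factor dynamics with a principal-components proxy. All names (Phi, A, f, y) and the estimation shortcut are illustrative assumptions for exposition only; the paper's own procedure is based on reduced-rank regression rather than principal components.

```python
import numpy as np

rng = np.random.default_rng(0)

N, q, T = 50, 2, 500          # panel size, number of common components, sample length

# Small-scale VAR(1) for the q common components: f_t = Phi f_{t-1} + eta_t
Phi = np.array([[0.5, 0.1],
                [0.0, 0.4]])
A = rng.normal(size=(N, q))    # loadings mapping the common components to the panel

f = np.zeros((T, q))
for t in range(1, T):
    f[t] = Phi @ f[t - 1] + rng.normal(scale=0.5, size=q)

# Dimension-reducible structure: y_t = A f_t + u_t, with u_t white noise
y = f @ A.T + rng.normal(scale=1.0, size=(T, N))

# Crude check: the first q principal components proxy the common components
# (only up to an invertible rotation, so the VAR matrix below is similar to
# Phi rather than equal to it), while the residual part should look like noise.
U, S, Vt = np.linalg.svd(y - y.mean(0), full_matrices=False)
f_hat = U[:, :q] * S[:q]       # factor estimates from the first q components

X, Y = f_hat[:-1], f_hat[1:]
Phi_hat = np.linalg.lstsq(X, Y, rcond=None)[0].T
print("Estimated factor VAR(1) matrix:\n", Phi_hat.round(2))
```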
 


Number: 534
Keywords: Vector autoregressive models, dimension reduction, reduced-rank regression, multivariate autoregressive index model, common features, business cycle shock.
Volume: 20
Issue: 2
Date: Thursday, March 24, 2022
Revision Date: Thursday, March 24, 2022