High-dimensional vector autoregressive modeling via tensor decomposition

  • Speaker: Guodong LI (The University of Hong Kong)

  • Time: Nov 18, 2018, 10:30-11:30

  • Location: Conference Room 415, Hui Yuan 3#

The classical vector autoregressive (VAR) model is a fundamental tool for multivariate time series analysis. However, it involves too many parameters when the dimension is high and hence suffers from the curse of dimensionality. In this paper, we rearrange the parameter matrices of a VAR model into a tensor and use a tensor decomposition to restrict the parameter space in three directions. Compared with reduced-rank regression, which can restrict the parameter space in one direction only, the proposed method dramatically improves the ability of VAR models to handle high-dimensional time series. We study the asymptotic properties of the resulting estimator and suggest an alternating least squares algorithm for computation. Moreover, for the case of much higher dimension, we further assume that the three loading matrices are sparse, and a regularized method is considered for estimation and variable selection. An ADMM-based algorithm is proposed for the regularized estimator, and oracle inequalities for its global minimizer are established.
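
For intuition, the following minimal sketch (not the speaker's code) illustrates the basic idea of the rearrangement: the lag matrices A_1, ..., A_P of a VAR(P) model are stacked into an N x N x P coefficient tensor, and a low-multilinear-rank approximation is obtained via a Tucker decomposition using the tensorly package. The dimension, lag order, ranks, and the randomly generated stand-in coefficients are illustrative assumptions, not values from the paper.

```python
# Minimal sketch, assuming a VAR(P) model y_t = A_1 y_{t-1} + ... + A_P y_{t-P} + e_t
# with N-dimensional y_t. The lag matrices are stacked into an N x N x P tensor,
# and a Tucker decomposition restricts the parameter space in all three directions.
import numpy as np
import tensorly as tl
from tensorly.decomposition import tucker

rng = np.random.default_rng(0)
N, P = 10, 3                                  # dimension and lag order (illustrative)
A = rng.normal(scale=0.1, size=(N, N, P))     # stand-in for the VAR lag matrices A_1..A_P

ranks = (2, 2, 2)                             # multilinear ranks in the three directions
core, factors = tucker(tl.tensor(A), rank=ranks)

# Reconstruct the low-rank approximation of the coefficient tensor and check the fit.
A_lowrank = tl.tucker_to_tensor((core, factors))
rel_err = np.linalg.norm(A - A_lowrank) / np.linalg.norm(A)
print("relative approximation error:", rel_err)

# With ranks (r1, r2, r3), the decomposition has r1*r2*r3 + N*r1 + N*r2 + P*r3 free
# parameters instead of N*N*P, which is the source of the dimension reduction.
free_params = np.prod(ranks) + N * ranks[0] + N * ranks[1] + P * ranks[2]
print("parameters:", free_params, "vs unrestricted:", N * N * P)
```

Here a generic Tucker fit stands in for the estimator discussed in the talk; the paper's alternating least squares and ADMM-based algorithms estimate the components jointly with the time series data rather than from a pre-estimated coefficient tensor.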