We modeled long memory with just one lag!
Luc Bauwens 1, Guillaume Chevillon 2, Sébastien Laurent 3,*
1: Center for Operations Research and Econometrics (CORE)
2: ESSEC Business School
3: Aix-Marseille School of Economics, Aix-Marseille University
*: Corresponding author

A large-dimensional network or system can generate long memory in its components, as shown by Chevillon, Hecq and Laurent (2018, CHL) and Schennach (2018). These authors derive conditions under which the variables generated by an infinite-dimensional vector autoregressive model of order 1, a VAR(1), exhibit long memory. We go one step further and show how these theoretical results can be put into practice for modeling and forecasting series with long-range dependence that belong to a large network or system. We estimate the VAR(1) model equation by equation, shrinking the parameters toward generic conditions matching those of CHL and Schennach via ridge and Bayesian estimation. We consider two large-dimensional applications in which long memory is a well-established empirical feature. Our proposal significantly outperforms ARFIMA and HAR models when forecasting a non-parametric estimate of the log of the integrated variance of 250 assets, as well as seasonally adjusted historical monthly streamflow series recorded at 97 locations in the Columbia river basin.
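To fix ideas, the equation-by-equation ridge step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the closed-form ridge estimator shrinks one equation's coefficient vector toward a non-zero target, and the synthetic data, the choice of target (a flat row of weights summing to less than one), and the penalty value `lam` are all illustrative assumptions rather than the conditions derived in CHL or Schennach.

```python
import numpy as np

def ridge_var1_equation(y, X, target, lam):
    """Ridge estimate of one VAR(1) equation, shrinking the
    coefficient vector toward `target` rather than toward zero:
    minimizes ||y - X b||^2 + lam * ||b - target||^2, whose
    closed form is b = (X'X + lam I)^{-1} (X'y + lam * target)."""
    k = X.shape[1]
    A = X.T @ X + lam * np.eye(k)
    return np.linalg.solve(A, X.T @ y + lam * target)

# Synthetic N-dimensional VAR(1) data (illustrative only).
rng = np.random.default_rng(0)
N, T = 20, 200
Phi = 0.9 / N * np.ones((N, N))   # a flat, "generic" coefficient matrix
Y = np.zeros((T, N))
for t in range(1, T):
    Y[t] = Y[t - 1] @ Phi.T + rng.standard_normal(N)

X = Y[:-1]                        # lagged values as regressors
y = Y[1:, 0]                      # left-hand side of equation 1
target = np.full(N, 0.9 / N)      # assumed shrinkage target, not CHL's
b_hat = ridge_var1_equation(y, X, target, lam=10.0)
```

Each of the N equations can be estimated this way independently, which is what makes the approach feasible when N is large (250 assets, 97 streamflow locations); only the shrinkage target encodes the cross-equation structure.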

