TY - JOUR

T1 - Adaptive estimation of vector autoregressive models with time-varying variance

T2 - Application to testing linear causality in mean

AU - Patilea, Valentin

AU - Raïssi, Hamdi

N1 - Funding Information:
Valentin Patilea gratefully acknowledges financial support from the Romanian National Authority for Scientific Research, CNCS-UEFISCDI, project PN-II-ID-PCE-2011-3-0893.

PY - 2012/11

Y1 - 2012/11

N2 - Linear vector autoregressive (VAR) models whose innovations may be unconditionally heteroscedastic are considered. The volatility structure is deterministic and quite general, including breaks or trending variances as special cases. In this framework we propose ordinary least squares (OLS), generalized least squares (GLS) and adaptive least squares (ALS) procedures. The GLS estimator requires knowledge of the time-varying variance structure, while in the ALS approach the unknown variance is estimated by kernel smoothing of the outer product of the OLS residual vectors. Different bandwidths for the different cells of the time-varying variance matrix are also allowed. We derive the asymptotic distribution of the proposed estimators of the VAR model coefficients and compare their properties. In particular, we show that the ALS estimator is asymptotically equivalent to the infeasible GLS estimator. This asymptotic equivalence holds uniformly with respect to the bandwidth(s) in a given range and hence justifies data-driven bandwidth rules. Using these results we build Wald tests for linear Granger causality in mean that are adapted to VAR processes driven by errors with nonstationary volatility. It is also shown that the commonly used standard Wald test for linear Granger causality in mean is potentially unreliable in our framework (incorrect level and lower asymptotic power). Monte Carlo experiments illustrate the use of the different estimation approaches for the analysis of VAR models with time-varying variance innovations.

KW - Adaptive least squares

KW - Bahadur relative efficiency

KW - Heteroscedastic errors

KW - Kernel smoothing

KW - Linear causality in mean

KW - Ordinary least squares

KW - VAR model

UR - http://www.scopus.com/inward/record.url?scp=84862995394&partnerID=8YFLogxK

DO - 10.1016/j.jspi.2012.04.005

M3 - Article

AN - SCOPUS:84862995394

SN - 0378-3758

VL - 142

SP - 2891

EP - 2912

JO - Journal of Statistical Planning and Inference

JF - Journal of Statistical Planning and Inference

IS - 11

ER -