Geert Dhaene (KU Leuven, Belgium)
Polynomial shrinkage of large-dimensional covariance matrices
We derive an optimal rule for shrinking large-dimensional sample covariance matrices under Frobenius loss. The rule generalizes the optimal linear shrinkage rule of Ledoit and Wolf (Journal of Multivariate Analysis, 2004) to broader parametric families of rules. The families include, for example, polynomial and piecewise linear rules. The oracle version of the optimal rule is very simple and attains the lower bound on the Frobenius loss in finite samples. Feasible versions attain the lower bound under large-dimensional asymptotics where p/n → c > 0, but are not generally available in closed form. The polynomial family of rules, however, admits a closed-form estimator of the optimal rule. We derive it using results from random matrix theory and an algorithm to calculate Wishart moments of arbitrary order. In settings that have been studied earlier, we find that polynomial shrinkage substantially reduces the Frobenius loss compared to linear shrinkage. Polynomial shrinkage is conceptually simple, does not require non-convex optimization in high dimensions, and also allows p > n. Joint with Nicolas Tavernier.
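Within a polynomial family, a shrinkage rule is linear in its coefficients, so the oracle version reduces to a least-squares problem in trace inner products of powers of the sample covariance. The following is a minimal sketch of that oracle calculation only; the population covariance, dimensions, and polynomial degrees below are illustrative assumptions, and the paper's feasible (closed-form) estimator is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)
p, n = 50, 100

# Hypothetical population covariance: eigenvalues spread over [0.5, 5]
eigs = np.linspace(0.5, 5.0, p)
Q, _ = np.linalg.qr(rng.standard_normal((p, p)))
Sigma = Q @ np.diag(eigs) @ Q.T

X = rng.multivariate_normal(np.zeros(p), Sigma, size=n)
S = X.T @ X / n  # sample covariance

def oracle_poly_shrinkage(S, Sigma, degree):
    """Oracle coefficients a_k minimizing ||sum_k a_k S^k - Sigma||_F.
    The objective is linear in a, so it solves the normal equations
    built from trace inner products <S^j, S^k> and <S^j, Sigma>."""
    powers = [np.linalg.matrix_power(S, k) for k in range(degree + 1)]
    G = np.array([[np.trace(Pj @ Pk) for Pk in powers] for Pj in powers])
    b = np.array([np.trace(Pj @ Sigma) for Pj in powers])
    a = np.linalg.solve(G, b)
    return sum(ak * Pk for ak, Pk in zip(a, powers))

loss = lambda M: np.linalg.norm(M - Sigma, "fro")
print(f"sample covariance loss : {loss(S):.3f}")
print(f"linear oracle (deg 1)  : {loss(oracle_poly_shrinkage(S, Sigma, 1)):.3f}")
print(f"cubic oracle (deg 3)   : {loss(oracle_poly_shrinkage(S, Sigma, 3)):.3f}")
```

Because the degree-1 family contains the sample covariance itself and each higher-degree family nests the lower ones, the oracle Frobenius loss is weakly decreasing in the degree, mirroring the improvement over linear shrinkage reported in the abstract.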
Frank Windmeijer (Bristol University, United Kingdom)
A simple underidentification test for linear IV models, with an application to dynamic panel data models
For linear IV models it is shown that standard underidentification tests such as the Cragg-Donald and Kleibergen-Paap tests are Sargan-type tests for instrument-error orthogonality in the linear IV model in which one endogenous explanatory variable is regressed on the others, estimated by LIML. This insight is then used to extend this type of test to more complex data structures, such as dynamic panel data models, leading to underidentification tests that are very simple to calculate.
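The Sargan statistic underlying this equivalence measures instrument-error orthogonality as n times the share of residual variation explained by the instruments. A minimal sketch of that statistic, using 2SLS residuals for concreteness rather than the LIML estimation the abstract describes; the data-generating process and instrument set below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500

# Hypothetical DGP: one endogenous regressor x, three valid instruments
z = rng.standard_normal((n, 3))
v = rng.standard_normal(n)
x = z @ np.array([1.0, 0.5, 0.5]) + v
y = 2.0 * x + v + rng.standard_normal(n)  # x endogenous via the shared v

Z = np.column_stack([np.ones(n), z])  # instruments (with constant)
X = np.column_stack([np.ones(n), x])  # regressors (with constant)

# 2SLS: project X on the instrument space, regress y on the projection
Pz = Z @ np.linalg.solve(Z.T @ Z, Z.T)
beta = np.linalg.solve(X.T @ Pz @ X, X.T @ Pz @ y)
u = y - X @ beta

# Sargan statistic: n * u'P_Z u / u'u, asymptotically chi-square with
# (number of instruments - number of endogenous regressors) df
sargan = n * (u @ Pz @ u) / (u @ u)
print(f"Sargan statistic: {sargan:.2f} (df = 2)")
```

In the abstract's setting the same quadratic form is evaluated at LIML residuals from regressing one endogenous variable on the others, which is what turns an orthogonality test into a test of underidentification.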