Abstract: In modern data sets, the number of available variables can greatly exceed the number of observations. In this paper we show how valid confidence intervals can be constructed by approximating the inverse covariance matrix with a scaled Moore-Penrose pseudoinverse, and using the lasso to perform a bias correction. In addition, we propose random least squares, a new regularization technique which yields narrower confidence intervals with the same theoretical validity. Random least squares estimates the inverse covariance matrix using multiple low-dimensional random projections of the data. This is shown to be equivalent to a generalized form of ridge regularization. The methods are illustrated in Monte Carlo experiments and an empirical example using quarterly data from the FRED-QD database, where gross domestic product is explained by a large number of macroeconomic and financial indicators.
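As a rough illustration of the two ingredients described in the abstract, the Python sketch below builds an approximate precision (inverse covariance) matrix either from the scaled Moore-Penrose pseudoinverse or from averaged low-dimensional random projections (the random least squares idea), and then applies a one-step lasso bias correction. Function names, the Gaussian projection matrices, the fixed projection dimension, and the lasso penalty `alpha` are illustrative assumptions, not details taken from the paper.

```python
import numpy as np
from sklearn.linear_model import Lasso

def pinv_precision(X):
    # Scaled Moore-Penrose pseudoinverse of the sample covariance X'X/n,
    # standing in for the inverse covariance matrix when p > n.
    n = X.shape[0]
    return np.linalg.pinv(X.T @ X / n)

def rls_precision(X, n_draws=100, proj_dim=10, seed=0):
    # Random-least-squares-style estimate (illustrative): average
    # R (R' S R)^{-1} R' over low-dimensional random projections R,
    # where S = X'X/n; this acts like a generalized ridge regularization.
    rng = np.random.default_rng(seed)
    n, p = X.shape
    S = X.T @ X / n
    out = np.zeros((p, p))
    for _ in range(n_draws):
        R = rng.standard_normal((p, proj_dim))
        out += R @ np.linalg.solve(R.T @ S @ R, R.T)
    return out / n_draws

def debiased_lasso(X, y, theta, alpha=0.1):
    # One-step bias correction of an initial lasso fit using an
    # approximate precision matrix theta; confidence intervals would
    # then rest on the asymptotic normality of this corrected estimator.
    n = X.shape[0]
    beta_lasso = Lasso(alpha=alpha).fit(X, y).coef_
    return beta_lasso + theta @ X.T @ (y - X @ beta_lasso) / n
```

On simulated data, `debiased_lasso(X, y, rls_precision(X))` versus `debiased_lasso(X, y, pinv_precision(X))` would give the two bias-corrected estimates whose interval widths the abstract compares.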
PhD Lunch Seminars Rotterdam
- Speaker(s): Didier Nibbering (Erasmus University Rotterdam)
- Date: Tuesday, 18 April 2017
- Location: Rotterdam