In this paper we consider the conservative Lasso, which we argue penalizes more appropriately than the Lasso, and show how it may be desparsified in the sense of van de Geer et al. (2014) in order to construct asymptotically honest (uniform) confidence bands. In particular, we develop an oracle inequality for the conservative Lasso assuming only the existence of a certain number of moments. This is done by means of the Marcinkiewicz-Zygmund inequality, which in our context provides sharper bounds than Nemirovski's inequality. We allow for heteroskedastic, non-subgaussian error terms and covariates. Next, we desparsify the conservative Lasso estimator and derive the asymptotic distribution of tests involving an increasing number of parameters. As a stepping stone towards this, we also provide a feasible, uniformly consistent estimator of the asymptotic covariance matrix of an increasing number of parameters which is robust against conditional heteroskedasticity. To our knowledge, we are the first to do so. Next, we show that our confidence bands are honest over sparse high-dimensional subvectors of the parameter space and that they contract at the optimal rate. All our results are valid in high-dimensional models, and the number of parameters involved in our tests can diverge to infinity. Our simulations reveal that the desparsified conservative Lasso estimates the parameters more precisely than the desparsified Lasso, has better size properties, and produces confidence bands with superior coverage rates. Joint with Mehmet Caner.
Keywords and phrases: conservative Lasso, honest inference, high-dimensional data, uniform inference, confidence intervals, tests.