
Linear and conic programming estimators in high dimensional errors-in-variables models


Bibliographic Details
Published in: Journal of the Royal Statistical Society. Series B, Statistical Methodology, 2017-06, Vol. 79 (3), p. 939-956
Main Authors: Belloni, Alexandre, Rosenbaum, Mathieu, Tsybakov, Alexandre B.
Format: Article
Language:English
Description
Summary: We consider the linear regression model with observation error in the design. In this setting, we allow the number of covariates to be much larger than the sample size. Several new estimation methods have recently been introduced for this model. Indeed, the standard lasso estimator or Dantzig selector becomes unreliable when only noisy regressors are available, which is quite common in practice. In this work, we propose and analyse a new estimator for the errors-in-variables model. Under suitable sparsity assumptions, we show that this estimator attains the minimax efficiency bound. Importantly, this estimator can be written as a second-order cone programming minimization problem, which can be solved numerically in polynomial time. Finally, we show that the procedure introduced by Rosenbaum and Tsybakov, which is almost optimal in a minimax sense, can be efficiently computed by a single linear programming problem despite non-convexities.
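The abstract notes that estimators of this family reduce to linear or second-order cone programs. As a minimal illustration of the LP reduction (this is the classical Dantzig selector in its standard LP reformulation, not the authors' errors-in-variables estimator; the function name and the scipy-based setup are assumptions for this sketch), one can split beta into positive and negative parts and solve with an off-the-shelf LP solver:

```python
import numpy as np
from scipy.optimize import linprog

def dantzig_selector(X, y, lam):
    """Dantzig selector: min ||beta||_1  s.t.  ||X^T (y - X beta)||_inf <= lam.
    Cast as an LP by writing beta = u - v with u, v >= 0."""
    n, p = X.shape
    G = X.T @ X
    Xty = X.T @ y
    c = np.ones(2 * p)                      # objective: sum(u) + sum(v) = ||beta||_1
    # Constraints:  G(u - v) <= lam + X^T y   and   -G(u - v) <= lam - X^T y
    A_ub = np.block([[G, -G], [-G, G]])
    b_ub = np.concatenate([lam + Xty, lam - Xty])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub)  # linprog's default bounds give u, v >= 0
    u, v = res.x[:p], res.x[p:]
    return u - v

# Noiseless sanity check: with lam = 0 and n > p, the LP recovers beta exactly.
rng = np.random.default_rng(0)
n, p = 30, 5
X = rng.standard_normal((n, p))
beta_true = np.array([2.0, 0.0, -1.5, 0.0, 0.0])
y = X @ beta_true
beta_hat = dantzig_selector(X, y, lam=0.0)
```

The same split-variable device underlies LP formulations of related l1-constrained estimators; the second-order cone version additionally carries a quadratic (Euclidean-norm) constraint and so needs a conic rather than a plain LP solver.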
ISSN:1369-7412
1467-9868
DOI:10.1111/rssb.12196