The robust desparsified lasso and the focused information criterion for high-dimensional generalized linear models
Published in: Statistics (Berlin, DDR), 2023-01, Vol. 57 (1), p. 1-25
Main Authors:
Format: Article
Language: English
Summary: The classical lasso estimator for sparse, high-dimensional regression models is typically biased and lacks the oracle properties. Desparsified versions of the lasso have been proposed in the literature to overcome these drawbacks. In this paper, we propose an outlier-robust version of the desparsified lasso for high-dimensional generalized linear models. The robustness, consistency and high-dimensional asymptotics are investigated rigorously in a general framework of M-estimation under potential model misspecification. The desparsification mechanism is subsequently utilized to construct the focused information criterion (FIC), thereby facilitating robust, focused model selection in high dimensions. The applications are demonstrated with Poisson regression under a robust quasi-likelihood loss function. The empirical performance of the proposed methods is examined via simulations and a real data example.
ISSN: 0233-1888, 1029-4910
DOI: 10.1080/02331888.2022.2154769
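The desparsified (debiased) lasso mentioned in the summary can be illustrated compactly in the classical linear-model setting. The sketch below is a toy version only, assuming the generic nodewise-lasso construction of Zhang and Zhang / van de Geer et al. on simulated Gaussian data; it is not the robust M-estimation extension to generalized linear models developed in the paper, and all variable names, tuning values and the noise estimate are illustrative assumptions.

```python
# Minimal sketch of a desparsified (debiased) lasso in the linear model.
# Assumptions: nodewise-lasso debiasing, simulated data, ad hoc tuning.
import numpy as np
from sklearn.linear_model import Lasso, LassoCV

rng = np.random.default_rng(0)
n, p = 100, 200
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:3] = [1.5, -1.0, 0.5]                      # sparse ground truth
y = X @ beta + rng.standard_normal(n)

# Step 1: initial lasso fit; biased and without valid standard errors on its own.
lasso = LassoCV(cv=5).fit(X, y)
beta_hat = lasso.coef_
resid = y - lasso.predict(X)

def desparsify(j, lam_node=0.1):
    """Debias coordinate j via a nodewise lasso of X[:, j] on the remaining columns."""
    X_rest = np.delete(X, j, axis=1)
    node = Lasso(alpha=lam_node).fit(X_rest, X[:, j])
    z = X[:, j] - node.predict(X_rest)            # relaxed projection residual
    denom = z @ X[:, j]
    b_j = beta_hat[j] + z @ resid / denom         # desparsified point estimate
    sigma2 = resid @ resid / n                    # crude plug-in noise estimate
    se_j = np.sqrt(sigma2 * (z @ z)) / abs(denom) # plug-in standard error
    return b_j, se_j

for j in range(4):
    b, se = desparsify(j)
    print(f"beta_{j}: lasso={beta_hat[j]:+.3f}  debiased={b:+.3f} +/- {1.96 * se:.3f}")
```

The debiased coordinates come with approximate standard errors, which is what makes constructions of this kind usable for inference and, as in the paper, for focused model selection criteria.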