Sparse covariance thresholding for high-dimensional variable selection

X. Jessie Jeng, Z. John Daye

Research output: Contribution to journal › Article › peer-review


Abstract

In high dimensions, variable selection methods such as the lasso are often limited by excessive variability and rank deficiency of the sample covariance matrix. Covariance sparsity is a natural phenomenon in high-dimensional applications, such as microarray analysis and image processing, in which a large number of predictors are independent or only weakly correlated. In this paper, we propose the covariance-thresholded lasso, a new class of regression methods that can utilize covariance sparsity to improve variable selection. We establish theoretical results, under the random design setting, that relate covariance sparsity to variable selection. Data and simulations indicate that our method can improve variable selection performance.
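The general idea described in the abstract admits a compact sketch: hard-threshold the off-diagonal entries of the sample covariance of the (standardized) predictors, then solve the lasso stationarity conditions with the thresholded matrix by coordinate descent. The snippet below is a minimal illustration of this thresholding idea under those assumptions, not the authors' implementation; the names `cov_thresholded_lasso`, `tau` (threshold level), and `lam` (penalty level) are illustrative, and because the thresholded covariance need not be positive semidefinite a fixed number of coordinate sweeps is used rather than a convergence guarantee.

```python
# Minimal sketch of a covariance-thresholded lasso via coordinate descent.
# Not the paper's implementation; names and defaults are illustrative.
import numpy as np

def soft_threshold(z, lam):
    """Soft-thresholding operator used in lasso coordinate descent."""
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

def cov_thresholded_lasso(X, y, lam, tau, n_iter=200):
    """Lasso-type estimate computed from a hard-thresholded sample covariance.

    X   : (n, p) predictor matrix, columns standardized (mean 0, variance 1)
    y   : (n,) centered response
    lam : lasso penalty level
    tau : hard-threshold level for off-diagonal covariance entries
    """
    n, p = X.shape
    S = X.T @ X / n                      # sample covariance of predictors
    rho = X.T @ y / n                    # sample covariance with the response
    # Hard-threshold small off-diagonal entries; keep the diagonal intact.
    S_thr = np.where(np.abs(S) >= tau, S, 0.0)
    np.fill_diagonal(S_thr, np.diag(S))
    beta = np.zeros(p)
    for _ in range(n_iter):              # fixed number of coordinate sweeps
        for j in range(p):
            # Partial residual correlation with predictor j.
            partial = rho[j] - S_thr[j] @ beta + S_thr[j, j] * beta[j]
            beta[j] = soft_threshold(partial, lam) / S_thr[j, j]
    return beta

# Toy usage: sparse signal with many independent noise predictors (p > n).
rng = np.random.default_rng(0)
n, p = 50, 200
X = rng.standard_normal((n, p))
X = (X - X.mean(axis=0)) / X.std(axis=0)
beta_true = np.zeros(p)
beta_true[:3] = [2.0, -1.5, 1.0]
y = X @ beta_true + rng.standard_normal(n)
y = y - y.mean()
beta_hat = cov_thresholded_lasso(X, y, lam=0.2, tau=0.2)
print(np.nonzero(np.abs(beta_hat) > 1e-8)[0])
```

Setting `tau = 0` recovers ordinary lasso coordinate descent in its covariance form; larger `tau` zeroes out weak empirical correlations, which is the mechanism the paper exploits when many predictors are independent or weakly correlated.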

Original language: English (US)
Pages (from-to): 625-657
Number of pages: 33
Journal: Statistica Sinica
Volume: 21
Issue number: 2
DOIs
State: Published - Apr 2011
Externally published: Yes

Keywords

  • Consistency
  • Covariance sparsity
  • Large p small n
  • Random design
  • Regression
  • Regularization

ASJC Scopus subject areas

  • Statistics and Probability
  • Statistics, Probability and Uncertainty
