On the adaptive elastic-net with a diverging number of parameters

Research output: Contribution to journal › Article › peer-review

539 Scopus citations


We consider the problem of model selection and estimation in situations where the number of parameters diverges with the sample size. When the dimension is high, an ideal method should have the oracle property [J. Amer. Statist. Assoc. 96 (2001) 1348-1360] and [Ann. Statist. 32 (2004) 928-961], which ensures optimal large-sample performance. Furthermore, high dimensionality often induces collinearity, which should be properly handled by the ideal method. Many existing variable selection methods fail to achieve both goals simultaneously. In this paper, we propose the adaptive elastic-net, which combines the strengths of quadratic regularization and adaptively weighted lasso shrinkage. Under weak regularity conditions, we establish the oracle property of the adaptive elastic-net. We show by simulations that the adaptive elastic-net deals with the collinearity problem better than other oracle-like methods, thus enjoying much improved finite-sample performance.
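The two-stage idea in the abstract, an initial elastic-net fit followed by an adaptively reweighted L1 penalty, can be sketched with scikit-learn. This is an illustrative approximation, not the authors' implementation: the column-rescaling trick used here reweights both the L1 and L2 penalty terms, whereas the paper's estimator applies the adaptive weights only to the L1 part, and the simulated data, tuning constants `alpha`, `l1_ratio`, `gamma`, and the stabilizer `eps` are all assumptions.

```python
import numpy as np
from sklearn.linear_model import ElasticNet

# Hypothetical sparse regression problem: 3 true signals among 20 predictors.
rng = np.random.default_rng(0)
n, p = 100, 20
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:3] = [3.0, -2.0, 1.5]
y = X @ beta + rng.standard_normal(n)

# Stage 1: ordinary elastic-net fit gives initial coefficient estimates.
enet = ElasticNet(alpha=0.1, l1_ratio=0.5).fit(X, y)

# Stage 2: adaptive weights w_j = (|beta_hat_j| + eps)^(-gamma).
# Large initial coefficients get light penalties; near-zero ones get heavy penalties.
gamma, eps = 1.0, 1e-3
w = (np.abs(enet.coef_) + eps) ** (-gamma)

# Emulate a weighted penalty by rescaling columns: fit on X_j / w_j,
# then map the coefficients back by dividing by w_j.
aenet = ElasticNet(alpha=0.1, l1_ratio=0.5).fit(X / w, y)
coef = aenet.coef_ / w

print(coef[:3])          # large true signals survive
print(np.sum(np.abs(coef) > 0.1))  # most noise coefficients are shrunk away
```

In this sketch the adaptivity is what delivers the oracle-like behavior: the heavily weighted noise coordinates are driven to (near) zero while the true signals are barely shrunk.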

Original language: English (US)
Pages (from-to): 1733-1751
Number of pages: 19
Journal: Annals of Statistics
Issue number: 4
State: Published - Aug 2009

ASJC Scopus subject areas

  • Statistics and Probability
  • Statistics, Probability and Uncertainty


