Discussion on "Doubly sparsity kernel learning with automatic variable selection and data extraction"

Research output: Contribution to journal › Article › peer-review

1 Scopus citation


Kernel methods provide powerful and flexible tools for nonlinear learning in high dimensional data analysis, but feature selection remains a challenge in kernel learning. The proposed DOSK method provides a new unified framework for implementing kernel methods that automatically selects important variables and identifies a parsimonious subset of knots at the same time. A double penalty is employed to encourage sparsity in both the feature weights and the representer coefficients. The authors present the computational algorithm as well as the theoretical properties of the DOSK method. In this discussion, we first highlight DOSK's major contributions to the machine learning toolbox. We then discuss its connections to other nonparametric methods in the literature and point out some possible future research directions. AMS 2000 subject classifications: Primary 62H20, 62F07; secondary 62J05.
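To make the "double penalty" idea concrete, the objective the abstract describes can be sketched as a kernel regression loss plus l1 penalties on both the per-feature kernel weights and the representer coefficients. The kernel form, function names, and penalty choices below are illustrative assumptions for exposition, not the authors' actual implementation:

```python
import numpy as np

def weighted_gaussian_kernel(X, Z, w):
    """Gaussian kernel with per-feature weights w; a zero weight
    effectively removes that variable from the kernel."""
    diff = (X[:, None, :] - Z[None, :, :]) * w
    return np.exp(-np.sum(diff ** 2, axis=2))

def dosk_objective(X, y, w, alpha, lam1, lam2):
    """Squared-error loss plus the double penalty: an l1 term on the
    feature weights (variable selection) and an l1 term on the
    representer coefficients (knot/data extraction)."""
    K = weighted_gaussian_kernel(X, X, w)
    resid = y - K @ alpha
    return (np.mean(resid ** 2)
            + lam1 * np.sum(np.abs(w))       # sparsity over features
            + lam2 * np.sum(np.abs(alpha)))  # sparsity over knots
```

Minimizing this over `w` and `alpha` jointly would drive some feature weights and some representer coefficients exactly to zero, which is the sense in which the method is "doubly sparse."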

Original language: English (US)
Pages (from-to): 425-428
Number of pages: 4
Journal: Statistics and its Interface
Issue number: 3
State: Published - 2018

Keywords


  • High dimensional data analysis
  • Kernel methods
  • Penalty
  • Reproducing kernel Hilbert space (RKHS)
  • Variable selection

ASJC Scopus subject areas

  • Statistics and Probability
  • Applied Mathematics


