Sparse Penalized Forward Selection for Support Vector Classification

Subhashis Ghosal, Bradley Turnbull, Hao Helen Zhang, Wook Yeon Hwang

Research output: Contribution to journal › Article › peer-review



We propose a new binary classification and variable selection technique especially designed for high-dimensional predictors. Among many predictors, typically only a small fraction have a significant impact on prediction. In such a situation, more interpretable models with better prediction accuracy can be obtained by performing variable selection along with classification. By adding an ℓ1-type penalty to the loss function, common classification methods such as logistic regression or support vector machines (SVM) can perform variable selection. Existing penalized SVM methods attempt to solve for all parameters of the penalized problem jointly. When the data dimension is very high, this joint optimization problem is complex and requires substantial memory. In this article, we propose a new penalized forward search technique that reduces the high-dimensional optimization problem to one-dimensional optimization by iterating over selection steps. The new algorithm can be regarded as a forward selection version of the penalized SVM and its variants. The advantage of optimizing in one dimension is that the optimum can be located by an intelligent search that exploits the convexity and piecewise linear or quadratic structure of the criterion function. In each step, the predictor most able to predict the outcome is added to the model. The search is then repeated iteratively until convergence. Comparisons of the new classification rule with the ℓ1-SVM and other common methods show very promising performance: the proposed method leads to much leaner models without compromising misclassification rates, particularly for high-dimensional predictors.
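To make the idea concrete, here is a minimal sketch of the kind of coordinate-wise penalized forward search the abstract describes, for a linear SVM without intercept. This is an illustrative reconstruction, not the authors' published algorithm: the function names (`solve_1d`, `forward_sparse_svm`) and all implementation details are assumptions. The key point it demonstrates is the one stated in the abstract: with hinge loss, the one-dimensional criterion in a single coefficient is convex and piecewise linear, so its minimizer lies at one of finitely many kink points and can be found by direct evaluation rather than a generic high-dimensional solver.

```python
import numpy as np

def solve_1d(r, a, lam):
    """Minimize g(b) = sum_i max(0, r_i - a_i * b) + lam * |b| over b.
    g is convex and piecewise linear, so (for lam > 0) a minimizer
    lies at a kink: b = r_i / a_i for some i with a_i != 0, or b = 0."""
    kinks = [0.0]
    nz = a != 0
    kinks.extend((r[nz] / a[nz]).tolist())
    kinks = np.asarray(kinks)
    # Evaluate g at every kink (rows: candidate b values, cols: observations).
    vals = np.maximum(0.0, r[None, :] - np.outer(kinks, a)).sum(axis=1)
    vals += lam * np.abs(kinks)
    i = int(np.argmin(vals))
    return kinks[i], vals[i]

def forward_sparse_svm(X, y, lam=1.0, max_iter=50, tol=1e-6):
    """Hypothetical penalized forward selection for a linear SVM:
    at each step, solve the one-dimensional penalized hinge-loss
    problem for every coordinate, update the single coordinate that
    most reduces the objective, and stop when no update helps."""
    n, p = X.shape
    beta = np.zeros(p)
    margins = np.zeros(n)                       # y_i * x_i^T beta
    obj = np.maximum(0.0, 1.0 - margins).sum()  # beta = 0, so no penalty yet
    for _ in range(max_iter):
        best = None
        for j in range(p):
            a = y * X[:, j]
            # Margins with coordinate j's current contribution removed.
            r = 1.0 - (margins - beta[j] * a)
            b, val = solve_1d(r, a, lam)
            # val already includes lam * |b|; add the penalty on the rest.
            total = val + lam * (np.abs(beta).sum() - np.abs(beta[j]))
            if best is None or total < best[0]:
                best = (total, j, b)
        new_obj, j, b = best
        if obj - new_obj < tol:
            break
        margins += (b - beta[j]) * (y * X[:, j])
        beta[j] = b
        obj = new_obj
    return beta
```

On a toy problem where only the first predictor is informative, the search selects it and leaves the irrelevant coefficient exactly at zero, which is the "leaner model" behavior the abstract emphasizes:

```python
X = np.array([[1.0, 0.0], [2.0, 0.0], [-1.0, 0.0], [-2.0, 0.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])
beta = forward_sparse_svm(X, y, lam=0.5)  # beta[1] stays exactly 0
```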

Original language: English (US)
Pages (from-to): 493-514
Number of pages: 22
Journal: Journal of Computational and Graphical Statistics
Issue number: 2
State: Published - Apr 2 2016


Keywords

  • High dimension
  • Penalization
  • SVM
  • Sparsity
  • Variable selection

ASJC Scopus subject areas

  • Statistics and Probability
  • Discrete Mathematics and Combinatorics
  • Statistics, Probability and Uncertainty


