Abstract
We propose a new binary classification and variable selection technique especially designed for high-dimensional predictors. Among many predictors, typically only a small fraction have a significant impact on prediction. In such a situation, more interpretable models with better prediction accuracy can be obtained by performing variable selection along with classification. By adding an ℓ1-type penalty to the loss function, common classification methods such as logistic regression or support vector machines (SVM) can perform variable selection. Existing penalized SVM methods attempt to solve for all parameters of the penalized problem jointly; when the data dimension is very high, this joint optimization is complex and memory-intensive. In this article, we propose a new penalized forward search technique that reduces the high-dimensional optimization problem to a sequence of one-dimensional optimizations by iterating the selection steps. The new algorithm can be regarded as a forward selection version of the penalized SVM and its variants. The advantage of optimizing in one dimension is that the optimum can be located by an intelligent search that exploits the convexity and piecewise linear or quadratic structure of the criterion function. At each step, the predictor most predictive of the outcome is added to the model, and the search is repeated iteratively until convergence. Comparisons of the new classification rule with the ℓ1-SVM and other common methods show very promising performance: the proposed method yields much leaner models without compromising misclassification rates, particularly for high-dimensional predictors.
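The core computational idea of the abstract, that each selection step reduces to a one-dimensional convex problem whose hinge-loss criterion is piecewise linear so a minimizer sits at a kink, can be illustrated with a short sketch. The Python code below is not the authors' implementation; it is a minimal greedy coordinate-wise search for the ℓ1-penalized linear SVM, and the function names (`solve_1d`, `penalized_forward_svm`) and default settings are illustrative assumptions.

```python
import numpy as np

def objective(X, y, beta, lam):
    """L1-penalized SVM objective: mean hinge loss + lam * ||beta||_1."""
    margins = 1.0 - y * (X @ beta)
    return np.mean(np.maximum(margins, 0.0)) + lam * np.sum(np.abs(beta))

def solve_1d(X, y, beta, j, lam):
    """Minimize the objective over coordinate j alone.

    With the partial margins r_i = 1 - y_i * x_i' beta_{-j} held fixed,
    the map t -> mean(max(r_i - y_i x_ij t, 0)) + lam * |t| is convex and
    piecewise linear, so a minimizer lies at a kink: t = 0 or
    t = r_i / (y_i x_ij).
    """
    r = 1.0 - y * (X @ beta) + y * X[:, j] * beta[j]  # margins without coord j
    a = y * X[:, j]
    kinks = r[a != 0] / a[a != 0]
    candidates = np.concatenate(([0.0], kinks))
    vals = [np.mean(np.maximum(r - a * t, 0.0)) + lam * abs(t)
            for t in candidates]
    return candidates[int(np.argmin(vals))]

def penalized_forward_svm(X, y, lam=0.1, max_iter=100, tol=1e-6):
    """Greedy forward search: on each pass, update the single coordinate
    whose one-dimensional solution most reduces the penalized objective;
    stop when no coordinate improves it."""
    n, p = X.shape
    beta = np.zeros(p)
    current = objective(X, y, beta, lam)
    for _ in range(max_iter):
        best_j, best_val, best_t = None, current, None
        for j in range(p):
            t = solve_1d(X, y, beta, j, lam)
            trial = beta.copy()
            trial[j] = t
            val = objective(X, y, trial, lam)
            if val < best_val - tol:
                best_j, best_val, best_t = j, val, t
        if best_j is None:  # no coordinate improves: converged
            break
        beta[best_j], current = best_t, best_val
    return beta
```

Because the restricted criterion is piecewise linear in a single coefficient, evaluating it at the kinks and at zero suffices to find an exact one-dimensional minimizer; this kink search, rather than a generic solver over all p coordinates at once, is the efficiency payoff of the one-dimensional reduction described in the abstract.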
| Original language | English (US) |
|---|---|
| Pages (from-to) | 493-514 |
| Number of pages | 22 |
| Journal | Journal of Computational and Graphical Statistics |
| Volume | 25 |
| Issue number | 2 |
| DOIs | |
| State | Published - Apr 2 2016 |
Keywords
- High dimension
- Penalization
- SVM
- Sparsity
- Variable selection
ASJC Scopus subject areas
- Statistics and Probability
- Discrete Mathematics and Combinatorics
- Statistics, Probability and Uncertainty
Datasets
- Sparse Penalized Forward Selection for Support Vector Classification. Ghosal, S. (Contributor), Turnbull, B. (Creator), Zhang, H. H. (Creator) & Hwang, W. Y. (Creator), Taylor & Francis, 2015. Dataset. DOI: 10.6084/m9.figshare.1378868.v1, https://figshare.com/articles/Sparse_Penalized_Forward_Selection_for_Support_Vector_Classification/1378868/1
- Sparse Penalized Forward Selection for Support Vector Classification. Ghosal, S. (Contributor), Turnbull, B. (Creator), Zhang, H. H. (Creator) & Hwang, W. Y. (Creator), figshare, 2015. Dataset. DOI: 10.6084/m9.figshare.c.2054972.v1, https://figshare.com/collections/Sparse_Penalized_Forward_Selection_for_Support_Vector_Classification/2054972/1