Abstract
We examine support vector machines (SVMs) from the point of view of solutions to variational problems in a reproducing kernel Hilbert space. We discuss the Generalized Comparative Kullback-Leibler distance (GCKL) as a target for choosing tuning parameters in SVMs, and we propose that the Generalized Approximate Cross Validation (GACV) estimate of the GCKL is a reasonable proxy for this target. We indicate an interesting relationship between the GACV and the SVM margin.
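To make the tuning problem concrete, the following is a minimal sketch of choosing a regularization parameter and an RBF kernel width by minimizing a held-out hinge-loss surrogate, in the spirit of using a cross-validation proxy for the GCKL target. The kernel ridge fit, the toy data, and the parameter grid are illustrative assumptions; this is not the paper's GACV formula, which avoids refitting on held-out folds.

```python
# Hypothetical sketch (not the paper's GACV): pick (lambda, sigma) by
# minimizing k-fold held-out hinge loss for a simple kernel classifier.
import numpy as np

def rbf_kernel(X, Z, sigma):
    """Gram matrix K[i, j] = exp(-||x_i - z_j||^2 / (2 sigma^2))."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def fit_predict(X_tr, y_tr, X_te, lam, sigma):
    """Kernel ridge fit (stand-in for the SVM optimization) on labels in
    {-1, +1}; returns real-valued decision values f(x) on X_te."""
    K = rbf_kernel(X_tr, X_tr, sigma)
    alpha = np.linalg.solve(K + lam * np.eye(len(y_tr)), y_tr)
    return rbf_kernel(X_te, X_tr, sigma) @ alpha

def cv_hinge(X, y, lam, sigma, n_folds=5, seed=0):
    """Average held-out hinge loss max(0, 1 - y f(x)) over n_folds folds."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(y))
    losses = []
    for te in np.array_split(idx, n_folds):
        tr = np.setdiff1d(idx, te)
        f = fit_predict(X[tr], y[tr], X[te], lam, sigma)
        losses.append(np.maximum(0.0, 1.0 - y[te] * f).mean())
    return float(np.mean(losses))

# Toy data: two Gaussian blobs with labels in {-1, +1}.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-1, 0.7, (40, 2)), rng.normal(+1, 0.7, (40, 2))])
y = np.hstack([-np.ones(40), np.ones(40)])

# Grid search: keep the (lambda, sigma) pair with the smallest CV loss.
grid = [(lam, sig) for lam in (1e-3, 1e-1, 1e1) for sig in (0.3, 1.0, 3.0)]
best = min(grid, key=lambda p: cv_hinge(X, y, *p))
print("selected (lambda, sigma):", best)
```

The refitting per fold is what the GACV is designed to avoid: it approximates the leave-out computation from quantities available after a single fit.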
| Original language | English (US) |
|---|---|
| Pages | 12-20 |
| Number of pages | 9 |
| State | Published - 1999 |
| Externally published | Yes |
| Event | Proceedings of the 1999 9th IEEE Workshop on Neural Networks for Signal Processing (NNSP'99) - Madison, WI, USA |
| Duration | Aug 23 1999 → Aug 25 1999 |
Conference
| Conference | Proceedings of the 1999 9th IEEE Workshop on Neural Networks for Signal Processing (NNSP'99) |
|---|---|
| City | Madison, WI, USA |
| Period | 8/23/99 → 8/25/99 |
ASJC Scopus subject areas
- Signal Processing
- Software
- Electrical and Electronic Engineering