Please use this identifier to cite or link to this item: http://hdl.handle.net/1783.1/72673

Normalization of Linear Support Vector Machines

Authors Feng, Yiyong; Palomar, Daniel P.
Issue Date 2015
Source IEEE Transactions on Signal Processing, v. 63, no. 17, September 2015, article number 7120173, p. 4673-4688
Summary In this paper, we start with the standard support vector machine (SVM) formulation and extend it to a general SVM formulation with a normalized margin. This yields a unified convex framework that admits many variations of the formulation with very diverse numerical performance. The proposed unified framework captures existing methods, e.g., the standard soft-margin SVM, the ℓ1-SVM, and SVMs with standardization, feature selection, scaling, and many others, as special cases. Furthermore, the proposed framework not only provides more insight into different SVMs from the "energy" and "penalty" points of view, which helps clarify the connections and differences among them in a unified way, but also enables us to propose new SVMs that outperform existing ones in some scenarios.
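For reference, the standard soft-margin SVM that the summary takes as its starting point can be written as below. This is a minimal sketch in conventional notation (training pairs (x_i, y_i) with y_i in {-1, +1}, weight vector w, bias b, slacks xi_i, and trade-off parameter C are standard SVM quantities, not taken from this record); the paper's general normalized-margin formulation is not reproduced here.

% Standard soft-margin SVM (conventional formulation, not the paper's general form)
\begin{align*}
\min_{\mathbf{w},\, b,\, \boldsymbol{\xi}} \quad & \tfrac{1}{2}\|\mathbf{w}\|_2^2 \;+\; C \sum_{i=1}^{n} \xi_i \\
\text{s.t.} \quad & y_i\!\left(\mathbf{w}^{\top}\mathbf{x}_i + b\right) \ge 1 - \xi_i, \qquad \xi_i \ge 0, \qquad i = 1, \dots, n.
\end{align*}
% Hedged reading of the abstract: the l1-SVM it mentions replaces the quadratic
% "energy" term \|\mathbf{w}\|_2^2 with the sparsity-promoting penalty \|\mathbf{w}\|_1;
% the unified framework in the paper generalizes such margin-normalization choices.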
ISSN 1053-587X
Language English
Format Article