Please use this identifier to cite or link to this item: http://hdl.handle.net/1783.1/3179

Learning the kernel matrix by maximizing a KFD-based class separability criterion

Authors Yeung, Dit-Yan
Chang, Hong
Dai, Guang
Issue Date 2007
Source Pattern Recognition, v. 40, no. 7, July 2007, p. 2021-2028
Summary The performance of a kernel method often depends critically on a proper choice of the kernel function. A promising approach is to learn the kernel from data automatically. In this paper, we propose a novel method for learning the kernel matrix based on maximizing a class separability criterion similar to those used by linear discriminant analysis (LDA) and kernel Fisher discriminant (KFD). Notably, optimizing this criterion does not require inverting the possibly singular within-class scatter matrix, a computational problem encountered by many LDA and KFD methods. Experiments on both synthetic data and real-world data from the UCI and FERET databases show that our method consistently outperforms several previous kernel learning methods. © 2007 Pattern Recognition Society. Published by Elsevier Ltd. All rights reserved.
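To make the inversion-free property concrete, the following is a minimal illustrative sketch (in Python with NumPy) of one KFD-style class separability measure that can be evaluated directly from a kernel matrix: the trace ratio tr(S_b)/tr(S_t) of between-class to total scatter in the kernel-induced feature space. Both traces reduce to sums over entries of K, so no scatter matrix is ever formed or inverted. This is an assumed stand-in for illustration, not the paper's exact criterion; the function name kfd_separability and the toy data below are hypothetical.

    import numpy as np

    def kfd_separability(K, y):
        """Trace-ratio separability J = tr(S_b) / tr(S_t) from a kernel matrix.

        Illustrative sketch only (not the paper's exact criterion).  Uses the
        identities tr(S_t) = tr(K) - (1/n) 1'K1 and
        tr(S_b) = sum_c (1/n_c) 1_c'K1_c - (1/n) 1'K1, so nothing is inverted.
        """
        n = K.shape[0]

        # tr(S_t): total scatter of the feature vectors about the global mean.
        trace_total = np.trace(K) - K.sum() / n

        # tr(S_b): weighted scatter of the class means about the global mean.
        trace_between = -K.sum() / n
        for c in np.unique(y):
            idx = (y == c)
            trace_between += K[np.ix_(idx, idx)].sum() / idx.sum()

        return trace_between / trace_total

    # Hypothetical usage: compare two RBF bandwidths on toy two-class data.
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0, 1, (20, 2)), rng.normal(3, 1, (20, 2))])
    y = np.array([0] * 20 + [1] * 20)
    for gamma in (0.1, 1.0):
        d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
        K = np.exp(-gamma * d2)
        print(gamma, kfd_separability(K, y))

A kernel (or a parameterized family of kernels) scoring higher under such a criterion yields better class separation in feature space, which is the sense in which the kernel matrix itself can be learned by maximizing it.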
ISSN 0031-3203
Rights Pattern Recognition © 2007 Elsevier. The journal's web site is located at http://www.sciencedirect.com/
Language English
Format Article
Access View full-text via DOI
View full-text via Web of Science
View full-text via Scopus
Find@HKUST
Files in this item:
File Description Size Format
yeung.pr2007b[1].pdf 646.88 kB Adobe PDF