HKUST Institutional Repository

Title: Learning the kernel matrix by maximizing a KFD-based class separability criterion
Authors: Yeung, Dit-Yan
Chang, Hong
Dai, Guang
Keywords: Kernel learning
Fisher discriminant
Kernel Fisher discriminant
Face recognition
Issue Date: Jul-2007
Citation: Pattern Recognition, v. 40, no. 7, July 2007, p. 2021-2028
Abstract: The advantage of a kernel method often depends critically on a proper choice of the kernel function. A promising approach is to learn the kernel from data automatically. In this paper, we propose a novel method for learning the kernel matrix based on maximizing a class separability criterion similar to those used by linear discriminant analysis (LDA) and kernel Fisher discriminant (KFD). It is interesting to note that optimizing this criterion function does not require inverting the possibly singular within-class scatter matrix, a computational problem encountered by many LDA and KFD methods. We have conducted experiments on both synthetic data and real-world data from the UCI and FERET databases, showing that our method consistently outperforms some previous kernel learning methods.
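The sketch below illustrates the general idea described in the abstract: a Fisher-style class separability score can be evaluated directly from a kernel matrix using only traces of the between-class and within-class scatter, so the within-class scatter matrix is never inverted. This is a generic trace-ratio criterion for illustration; the paper's exact criterion and its optimization over kernel matrices may differ.

```python
import numpy as np

def trace_ratio_criterion(K, y):
    """Fisher-style class separability computed from a kernel matrix K
    (n x n) and labels y. Uses only traces of the feature-space scatter
    matrices, via the kernel trick, so no scatter matrix is inverted.
    Illustrative sketch; not necessarily the paper's exact criterion."""
    y = np.asarray(y)
    n = len(y)
    # Trace of the total scatter in feature space:
    # tr(S_t) = sum_i K_ii - (1/n) sum_{i,j} K_ij
    tr_total = np.trace(K) - K.sum() / n
    # Trace of the within-class scatter, summed over classes.
    tr_within = 0.0
    for c in np.unique(y):
        idx = np.where(y == c)[0]
        Kc = K[np.ix_(idx, idx)]
        tr_within += np.trace(Kc) - Kc.sum() / len(idx)
    # tr(S_b) = tr(S_t) - tr(S_w); larger ratio = better separated classes.
    tr_between = tr_total - tr_within
    return tr_between / tr_within

# Toy example: RBF kernel on two well-separated Gaussian blobs.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.2, (20, 2)), rng.normal(3, 0.2, (20, 2))])
y = np.array([0] * 20 + [1] * 20)
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-sq_dists)  # RBF kernel, gamma = 1 (illustrative choice)
print(trace_ratio_criterion(K, y))
```

With well-separated classes the criterion is large; relabeling the same points so each "class" mixes both blobs drives the within-class scatter up and the score down, which is exactly the behavior a kernel-learning objective can exploit.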
Rights: Pattern Recognition © 2007 Elsevier.
Appears in Collections:CSE Journal/Magazine Articles

Files in This Item:

File: yeung.pr2007b[1].pdf
Description: pre-published version
Size: 646Kb
Format: Adobe PDF


All items in this Repository are protected by copyright, with all rights reserved.