Please use this identifier to cite or link to this item:

A Scalable Kernel-Based Semisupervised Metric Learning Algorithm with Out-of-Sample Generalization Ability

Authors Yeung, Dit-Yan
Chang, Hong (HKUST affiliated)
Dai, Guang (HKUST affiliated)
Issue Date 2008
Source Neural Computation, v. 20, no. 11, November 2008, p. 2839-2861
Summary In recent years, metric learning in the semisupervised setting has attracted considerable research interest. One type of semisupervised metric learning utilizes supervisory information in the form of pairwise similarity or dissimilarity constraints. However, most methods proposed so far are either limited to linear metric learning or unable to scale well with the data set size. In this letter, we propose a nonlinear metric learning method based on the kernel approach. By applying low-rank approximation to the kernel matrix, our method can handle significantly larger data sets. Moreover, our low-rank approximation scheme naturally leads to out-of-sample generalization. Experiments performed on both artificial and real-world data show very promising results.
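The abstract's scalability claim rests on a low-rank approximation of the kernel matrix; a Nyström-style scheme is a common way to realize this, and it also yields an embedding for unseen points, matching the out-of-sample property described. The sketch below is illustrative only (not the authors' algorithm): the RBF kernel, the landmark count, and the helper names are all assumptions.

```python
# Hedged sketch, NOT the paper's implementation: Nystrom-style low-rank
# approximation of an RBF kernel matrix, the kind of scheme the abstract
# describes for scaling kernel metric learning beyond the full n x n matrix.
import numpy as np

def rbf_kernel(X, Y, gamma=0.5):
    """Gaussian RBF kernel between rows of X and rows of Y (assumed kernel)."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def nystrom_approx(X, m, gamma=0.5, seed=None):
    """Approximate K ~= C @ pinv(W) @ C.T using m random landmark points.

    Because the approximation only needs kernel values against the
    landmarks, an unseen (out-of-sample) point can be embedded the same
    way: evaluate its kernel row against the landmarks.
    """
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), size=m, replace=False)
    L = X[idx]                        # m landmark points
    C = rbf_kernel(X, L, gamma)       # n x m cross-kernel block
    W = rbf_kernel(L, L, gamma)       # m x m landmark kernel block
    return C @ np.linalg.pinv(W) @ C.T, L

# Demonstration on synthetic data: compare the rank-m approximation
# against the exact kernel matrix.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
K = rbf_kernel(X, X)
K_hat, landmarks = nystrom_approx(X, m=50, gamma=0.5, seed=0)
err = np.linalg.norm(K - K_hat) / np.linalg.norm(K)
```

The design point worth noting is cost: the full kernel matrix needs O(n^2) kernel evaluations and memory, while the landmark scheme needs only O(nm) with m << n, which is what makes "significantly larger data sets" feasible.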
ISSN 0899-7667
Rights We would like to give credit to MIT for granting us permission to repost this article.
Language English
Format Article
Access Full text available via DOI, Web of Science, and Scopus
Files in this item:
File Size Format
scal1.pdf 838,568 B Adobe PDF