Smooth and locally linear semi-supervised metric learning

Authors: Ruan, Yang
Issue Date: 2009
Summary: Many algorithms in pattern recognition and machine learning make use of some distance function, explicitly or implicitly, to characterize the relationships between data instances. Choosing a suitable distance function for the problem at hand thus plays a crucial role in delivering satisfactory performance. The goal of metric learning is to automate the design of the distance function (a metric or pseudometric in particular) by learning it from data. In this thesis we study a metric learning problem in which some supervisory information is available for the data in a semi-supervised learning setting, and propose a metric learning method called constrained moving least squares (CMLS). Specifically, CMLS performs a locally linear transformation that varies smoothly across the instance space, as guaranteed by the moving least squares approach. Learning the transformation can be cast as a convex optimization problem with an optimality guarantee, and the transformation thus obtained induces a pseudometric space. We demonstrate the effectiveness of CMLS on a synthetic problem for illustration as well as on classification and clustering tasks using UCI and other real-world image databases.
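The idea of a smoothly varying, locally linear transformation inducing a pseudometric can be sketched as follows. This is a simplified illustration, not the thesis's actual CMLS formulation: it blends per-anchor linear maps with Shepard-style inverse-distance weights as a stand-in for the full moving least squares machinery, and the anchor points and matrices are hypothetical rather than learned from constraints.

```python
import numpy as np

def mls_transform(x, anchors, mats, eps=1e-8):
    """Blend per-anchor linear maps with inverse-distance weights.

    Near each anchor the map is approximately that anchor's linear map,
    and the blend varies smoothly in between (hypothetical illustration,
    not the exact CMLS optimization from the thesis).
    """
    d2 = np.sum((anchors - x) ** 2, axis=1) + eps  # squared distances to anchors
    w = 1.0 / d2
    w /= w.sum()                                   # normalized smooth weights
    A = np.tensordot(w, mats, axes=1)              # weighted blend of linear maps
    return A @ x

# The blended transformation induces a pseudometric:
# d(x, y) = ||T(x) - T(y)||
anchors = np.array([[0.0, 0.0], [1.0, 1.0]])
mats = np.array([np.eye(2), 2.0 * np.eye(2)])      # hypothetical local maps
x, y = np.array([0.1, 0.0]), np.array([0.9, 1.0])
dist = np.linalg.norm(mls_transform(x, anchors, mats)
                      - mls_transform(y, anchors, mats))
```

At an anchor point the inverse-distance weight dominates, so the blended map reduces to that anchor's local linear map, which is the "locally linear" behavior the summary describes; the normalized weights change continuously with `x`, giving the smooth variation.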
Note: Thesis (M.Phil.)--Hong Kong University of Science and Technology, 2009
Language: English
Format: Thesis
Access: View full text via DOI
Copyrighted to the author. Reproduction is prohibited without the author’s prior written consent.