Please use this identifier to cite or link to this item: http://hdl.handle.net/1783.1/32202

Maximum margin clustering made practical

Authors Zhang, Kai
Tsang, Ivor
Kwok, James Tin-Yau
Issue Date 2007
Source ACM International Conference Proceeding Series , v. 227, 2007, p. 1119-1126
Summary Maximum margin clustering (MMC) is a recent large-margin unsupervised learning approach that often outperforms conventional clustering methods. Computationally, it involves non-convex optimization and is typically relaxed to semidefinite programs (SDPs). However, SDP solvers are computationally very expensive, so MMC has so far been limited to small data sets. To make MMC more practical, we avoid SDP relaxations and propose in this paper an efficient approach that performs alternating optimization directly on the original non-convex problem. A key step in avoiding premature convergence is the use of support vector regression (SVR) with the Laplacian loss, instead of the support vector machine (SVM) with the hinge loss, in the inner optimization subproblem. Experiments on a number of synthetic and real-world data sets demonstrate that the proposed approach is often more accurate, is much faster, and can handle much larger data sets.
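The alternating scheme described in the summary can be illustrated with a minimal sketch, not the authors' implementation: labels are initialized (here with k-means, a common choice), then the inner step fits an SVR — setting `epsilon=0` in scikit-learn's epsilon-insensitive loss recovers the Laplacian (absolute) loss — and the outer step relabels points by thresholding the regression outputs. Thresholding at the median is a simple stand-in, assumed here, for the paper's class-balance constraint; the function name and all parameter defaults are hypothetical.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

def iter_svr_cluster(X, n_iter=20, C=1.0, seed=0):
    """Sketch of alternating optimization for two-cluster MMC."""
    # Initialize labels in {-1, +1} from a k-means partition.
    km = KMeans(n_clusters=2, n_init=10, random_state=seed).fit_predict(X)
    y = np.where(km == 0, -1.0, 1.0)
    for _ in range(n_iter):
        # Inner step: SVR with epsilon=0, i.e. the Laplacian
        # (absolute) loss, fit to the current label assignment.
        svr = SVR(kernel="rbf", gamma="scale", C=C, epsilon=0.0)
        svr.fit(X, y)
        f = svr.predict(X)
        # Outer step: relabel by thresholding at the median of f,
        # a crude proxy for a class-balance constraint.
        y_new = np.where(f > np.median(f), 1.0, -1.0)
        if np.array_equal(y_new, y):
            break  # assignment stabilized
        y = y_new
    return y

# Toy usage on two well-separated Gaussian blobs.
X, y_true = make_blobs(n_samples=200, centers=2, cluster_std=0.8,
                       random_state=0)
labels = iter_svr_cluster(X)
```

On data this clean the k-means initialization already separates the clusters and the SVR step preserves the assignment; the point of the Laplacian loss in the paper is that on harder problems the hinge-loss SVM tends to reproduce its own training labels exactly and stall, whereas SVR outputs vary continuously and let labels flip.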
Language English
Format Conference paper