
Making large-scale Nyström approximation possible

Authors Li, Mu
Kwok, James Tin-Yau
Lü, Baoliang
Issue Date 2010
Source ICML 2010 - Proceedings, 27th International Conference on Machine Learning, 2010, pp. 631-638
Summary The Nyström method is an efficient technique for the eigenvalue decomposition of large kernel matrices. However, to ensure an accurate approximation, a sufficiently large number of columns has to be sampled. On very large data sets, the SVD step on the resultant data submatrix soon dominates the computation and becomes prohibitive. In this paper, we propose an accurate and scalable Nyström scheme that first samples a large column subset from the input matrix, but then performs only an approximate SVD on the inner submatrix, using recent randomized low-rank matrix approximation algorithms. Theoretical analysis shows that the proposed algorithm is as accurate as the standard Nyström method, which directly performs a large SVD on the inner submatrix, while its time complexity is only that of performing a small SVD. Experiments are performed on a number of large-scale data sets for low-rank approximation and spectral embedding. In particular, spectral embedding of an MNIST data set with 3.3 million examples takes less than an hour on a standard PC with 4 GB of memory. Copyright 2010 by the author(s)/owner(s).
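The abstract describes combining the Nyström extension with a randomized SVD of the sampled inner submatrix. Below is a minimal NumPy sketch of that general idea, not the paper's exact algorithm: the randomized SVD follows the standard Halko-Martinsson-Tropp recipe, and all function names, parameters, and defaults (`n_oversample`, `n_iter`, etc.) are illustrative assumptions.

```python
import numpy as np

def randomized_svd(A, k, n_oversample=10, n_iter=4, seed=None):
    """Approximate rank-k SVD via random range finding (Halko et al. style)."""
    rng = np.random.default_rng(seed)
    # Random test matrix; oversampling improves the captured range.
    Omega = rng.standard_normal((A.shape[1], k + n_oversample))
    Y = A @ Omega
    for _ in range(n_iter):          # power iterations sharpen the spectrum
        Y = A @ (A.T @ Y)
    Q, _ = np.linalg.qr(Y)           # orthonormal basis for the range of A
    B = Q.T @ A                      # small matrix; cheap exact SVD
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    return (Q @ Ub)[:, :k], s[:k], Vt[:k]

def nystrom_approx(C, idx, k):
    """Nyström extension with an approximate SVD of the inner submatrix.

    C   : n x m block of sampled kernel columns, C = K[:, idx]
    idx : indices of the sampled columns, so W = C[idx, :] is the m x m
          inner submatrix K[idx][:, idx]
    k   : target rank
    Returns U_ny, s with K approx= U_ny @ diag(s) @ U_ny.T
    """
    W = C[idx, :]
    # For a symmetric PSD W, the left singular vectors double as eigenvectors.
    U, s, _ = randomized_svd(W, k)
    # K approx= C W_k^+ C^T = (C U / s) diag(s) (C U / s)^T
    U_ny = C @ (U / s)
    return U_ny, s
```

The key point the abstract makes is visible here: the exact SVD is applied only to the small `(k + n_oversample)`-row matrix `B`, so sampling many columns (large `m`) no longer forces a large dense SVD.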
ISBN 978-160558907-7
Language English
Format Conference paper
Files in this item:
icml10.pdf (969250 B, Adobe PDF)