---
title: On the Nyström Approximation for Preconditioning in Kernel Machines
abstract: Kernel methods are a popular class of nonlinear predictive models in machine learning. Scalable algorithms for learning kernel models need to be iterative in nature, but convergence can be slow due to poor conditioning. Spectral preconditioning is an important tool to speed up the convergence of such iterative algorithms for training kernel models. However, computing and storing a spectral preconditioner can itself be expensive, imposing large computational and storage overheads that preclude the application of kernel methods to problems with large datasets. A Nyström approximation of the spectral preconditioner is often cheaper to compute and store, and has demonstrated success in practical applications. In this paper, we analyze the trade-offs of using such an approximated preconditioner. Specifically, we show that a sample of logarithmic size (as a function of the size of the dataset) enables the Nyström-based approximated preconditioner to accelerate gradient descent nearly as well as the exact preconditioner, while also reducing the computational and storage overheads.
layout: inproceedings
series: Proceedings of Machine Learning Research
publisher: PMLR
issn: 2640-3498
id: abedsoltan24a
month: 0
tex_title: On the {N}yström Approximation for Preconditioning in Kernel Machines
firstpage: 3718
lastpage: 3726
page: 3718-3726
order: 3718
cycles: false
bibtex_author: Abedsoltan, Amirhesam and Pandit, Parthe and Rademacher, Luis and Belkin, Mikhail
author:
- given: Amirhesam
  family: Abedsoltan
- given: Parthe
  family: Pandit
- given: Luis
  family: Rademacher
- given: Mikhail
  family: Belkin
date: 2024-04-18
address:
container-title: Proceedings of The 27th International Conference on Artificial Intelligence and Statistics
volume: 238
genre: inproceedings
issued:
  date-parts:
  - 2024
  - 4
  - 18
pdf:
extras:
---
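
Below is a minimal sketch of the idea the abstract describes, not the authors' implementation: a spectral preconditioner for gradient descent on kernel regression is built from the Nyström extension of the top eigenvectors computed on a small subsample. The kernel, the dataset, the rank `q`, the subsample size `m` (logarithmic in `n`, per the paper's analysis), and the plain Richardson-style iteration are all illustrative assumptions.

```python
# Illustrative sketch only: Nyström-approximated spectral preconditioner
# for gradient descent on kernel regression (solve K a = y). Not the paper's
# code; the kernel, data, and constants below are assumptions for demonstration.
import numpy as np

def rbf_kernel(X, Y, gamma=0.1):
    # Gaussian kernel exp(-gamma * ||x - y||^2), computed without 3-D intermediates.
    d2 = (X**2).sum(1)[:, None] + (Y**2).sum(1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

rng = np.random.default_rng(0)
n, q = 2000, 20                               # dataset size, preconditioner rank
X = rng.standard_normal((n, 5))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(n)
K = rbf_kernel(X, X)                          # n x n kernel matrix

# Nyström step: estimate the top-q eigenpairs of K from a small subsample
# (logarithmic in n, as in the paper's analysis; the constant 20 is arbitrary).
m = max(2 * q, int(20 * np.log(n)))
idx = rng.choice(n, size=m, replace=False)
lam_m, V = np.linalg.eigh(K[np.ix_(idx, idx)])
lam_m, V = lam_m[::-1][:q], V[:, ::-1][:, :q]    # top-q, descending order
lam = (n / m) * lam_m                            # approximate top-q eigenvalues of K
U = np.sqrt(m / n) * (K[:, idx] @ V) / lam_m     # Nyström extension of eigenvectors

# Spectral preconditioner P = I - sum_i (1 - tau/lam_i) u_i u_i^T flattens the
# top-q eigenvalues of K down to tau, so a much larger step size becomes stable.
tau = lam[-1]
eta = 0.9 / tau      # slack factor 0.9 hedges against Nyström approximation error

a = np.zeros(n)
for _ in range(200):
    r = K @ a - y                                          # residual
    a -= eta * (r - U @ ((1.0 - tau / lam) * (U.T @ r)))   # preconditioned GD step

print("relative residual:", np.linalg.norm(K @ a - y) / np.linalg.norm(y))
```

Replacing the subsample with the full index set recovers the exact spectral preconditioner; the point of the Nyström variant is that only an m x m eigendecomposition and an n x q factor need to be computed and stored.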