Date: Tue, 23 Jan 2024
Time: 14:00 - 14:30
Location: L6
Speaker: Paz Fink Shustin
Organisation: Oxford

Gaussian processes provide a powerful probabilistic kernel learning framework that enables high-quality nonparametric learning via methods such as Gaussian process regression. Nevertheless, the learning phase requires massive computations that are impractical for large datasets. In this talk, we present a quadrature-based approach for scaling up Gaussian process regression via a low-rank approximation of the kernel matrix. The low-rank structure is exploited for efficient hyperparameter learning, training, and prediction. Our Gauss-Legendre features method is inspired by the well-known random Fourier features approach, which also builds low-rank approximations via numerical integration. However, our method can produce a high-quality kernel approximation using a number of features that is only poly-logarithmic in the number of training points, whereas comparable guarantees for random Fourier features require a number of features at least linear in the number of training points. We demonstrate the utility of our method for learning with low-dimensional datasets through numerical experiments.
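For a concrete picture of the idea, the following is a minimal sketch, not the speaker's implementation: it builds Gauss-Legendre features for a one-dimensional Gaussian kernel by applying an m-point Gauss-Legendre rule to the Bochner integral that random Fourier features estimate by Monte Carlo, then uses the resulting low-rank feature matrix for fast GP regression in the weight-space view. The kernel choice, the truncation interval [-L, L], and all function names and defaults are illustrative assumptions.

import numpy as np

def gauss_legendre_features(x, m=64, ell=1.0, L=6.0):
    """Illustrative Gauss-Legendre feature map for the 1-D Gaussian
    kernel k(x, y) = exp(-(x - y)**2 / (2 * ell**2)).

    Bochner's theorem writes the kernel as an integral of cosines
    against a spectral density p(w); random Fourier features sample
    that integral, whereas here it is approximated with an m-point
    Gauss-Legendre rule on [-L, L]. All defaults are assumptions
    chosen for this toy example.
    """
    # Gauss-Legendre nodes and weights on [-1, 1], rescaled to [-L, L].
    nodes, weights = np.polynomial.legendre.leggauss(m)
    w = L * nodes
    wt = L * weights

    # Spectral density of the Gaussian kernel with lengthscale ell.
    p = (ell / np.sqrt(2.0 * np.pi)) * np.exp(-0.5 * (ell * w) ** 2)

    # Real feature map sqrt(weight * density) * [cos(wx), sin(wx)],
    # so that Z @ Z.T approximates the n x n kernel matrix.
    scale = np.sqrt(wt * p)
    phase = np.outer(x, w)                      # shape (n, m)
    return np.hstack([scale * np.cos(phase), scale * np.sin(phase)])

# Toy 1-D regression data.
rng = np.random.default_rng(0)
x_train = rng.uniform(-2.0, 2.0, size=500)
y_train = np.sin(3.0 * x_train) + 0.1 * rng.standard_normal(500)
x_test = np.linspace(-2.0, 2.0, 100)

Z = gauss_legendre_features(x_train)            # (n, 2m) with 2m << n
Z_test = gauss_legendre_features(x_test)

# Low-rank GP regression (weight-space view): with K ~ Z Z^T and noise
# variance s2, the posterior mean is Z_test (Z^T Z + s2 I)^{-1} Z^T y,
# costing O(n m^2 + m^3) instead of the O(n^3) of exact GP regression.
s2 = 0.01
A = Z.T @ Z + s2 * np.eye(Z.shape[1])
mean = Z_test @ np.linalg.solve(A, Z.T @ y_train)

# Sanity check: the feature map reproduces the exact kernel matrix.
K_exact = np.exp(-0.5 * (x_train[:, None] - x_train[None, :]) ** 2)
print(np.max(np.abs(Z @ Z.T - K_exact)))        # small approximation error

The intuition for using deterministic nodes rather than random frequencies is that Gauss-Legendre quadrature converges very rapidly for smooth integrands such as the Gaussian spectral density, which is consistent with the talk's claim that far fewer features suffice than with random Fourier features.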
