Daskalakis, C.; Dellaportas, P.; Panos, A. (2020) Scalable Gaussian Processes, with Guarantees: Kernel Approximations and Deep Feature Extraction. arXiv: Ithaca, NY, USA.
Abstract
We provide approximation guarantees for a linear-time inferential framework for Gaussian processes, using two low-rank kernel approximations based on random Fourier features and truncation of Mercer expansions. In particular, we bound the Kullback-Leibler divergence between the idealized Gaussian process and the one resulting from a low-rank approximation to its kernel. Additionally, we present strong evidence that these two approximations, enhanced by an initial automatic feature extraction through deep neural networks, outperform a broad range of state-of-the-art methods in terms of time efficiency, negative log-predictive density, and root mean squared error.
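The linear-time framework described above rests on replacing the exact n × n kernel matrix with a rank-D factorization k(x, y) ≈ φ(x)ᵀφ(y), after which GP regression reduces to Bayesian linear regression in feature space at O(nD² + D³) cost. As a minimal sketch of one of the two approximations, the random Fourier features route for an RBF kernel is shown below; this is an illustrative reconstruction, not the authors' implementation, and the function names, hyperparameters, and toy data are our own assumptions.

```python
import numpy as np

def rff_features(X, omega, b):
    """Map inputs to random Fourier features so that
    phi(x) @ phi(y) approximates an RBF kernel (Rahimi & Recht, 2007)."""
    D = omega.shape[1]
    return np.sqrt(2.0 / D) * np.cos(X @ omega + b)

def rff_gp_regression(X_train, y_train, X_test,
                      D=300, lengthscale=1.0, noise_var=0.1, seed=0):
    """GP regression with a rank-D kernel approximation.
    Solving a D x D system costs O(n D^2 + D^3): linear in n."""
    rng = np.random.default_rng(seed)
    d = X_train.shape[1]
    # The RBF kernel's spectral density is Gaussian: omega ~ N(0, I / lengthscale^2).
    omega = rng.standard_normal((d, D)) / lengthscale
    b = rng.uniform(0.0, 2.0 * np.pi, size=D)

    Phi = rff_features(X_train, omega, b)            # (n, D)
    A = Phi.T @ Phi + noise_var * np.eye(D)          # (D, D)
    w = np.linalg.solve(A, Phi.T @ y_train)          # posterior weight mean

    Phi_test = rff_features(X_test, omega, b)
    mean = Phi_test @ w
    # Predictive variance in the weight-space view, plus observation noise.
    quad = np.einsum("md,md->m", Phi_test, np.linalg.solve(A, Phi_test.T).T)
    var = noise_var * (1.0 + quad)
    return mean, var

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.uniform(-3.0, 3.0, size=(2000, 1))
    y = np.sin(2.0 * X[:, 0]) + 0.1 * rng.standard_normal(2000)
    X_test = np.linspace(-3.0, 3.0, 200)[:, None]
    mean, var = rff_gp_regression(X, y, X_test, D=300, lengthscale=0.5)
    print(mean[:3], var[:3])
```

Solving the D × D system in place of the n × n one is what makes the cost linear in n; the paper's contribution, per the abstract, is to bound the KL divergence between the exact GP and the GP induced by such a low-rank kernel, whether built from random Fourier features as above or by truncating the Mercer expansion k(x, y) = Σᵢ λᵢ φᵢ(x) φᵢ(y) after its leading terms.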
| Type: | Working / discussion paper |
|---|---|
| Title: | Scalable Gaussian Processes, with Guarantees: Kernel Approximations and Deep Feature Extraction |
| Open access status: | An open access version is available from UCL Discovery |
| Publisher version: | https://arxiv.org/abs/2004.01584v4 |
| Language: | English |
| Additional information: | This version is the version of record. For information on re-use, please refer to the publisher’s terms and conditions. |
| URI: | https://discovery-pp.ucl.ac.uk/id/eprint/10095391 |