Corpus ID: 11198138

Thoughts on Massively Scalable Gaussian Processes

@article{Wilson2015ThoughtsOM,
  title={Thoughts on Massively Scalable Gaussian Processes},
  author={Andrew Gordon Wilson and Christoph Dann and Hannes Nickisch},
  journal={ArXiv},
  year={2015},
  volume={abs/1511.01870}
}
  • Andrew Gordon Wilson, Christoph Dann, Hannes Nickisch
  • Published 2015
  • Mathematics, Computer Science
  • ArXiv

Abstract

We introduce a framework and early results for massively scalable Gaussian processes (MSGP), significantly extending the KISS-GP approach of Wilson and Nickisch (2015). The MSGP framework enables the use of Gaussian processes (GPs) on billions of datapoints, without requiring distributed inference or severe assumptions. In particular, MSGP reduces the standard $O(n^3)$ complexity of GP learning and inference to $O(n)$, and the standard $O(n^2)$ complexity per test point prediction to $O(1)$.
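
The complexity reduction builds on structured kernel interpolation: the $n \times n$ kernel matrix is approximated as $K \approx W K_{U,U} W^{\top}$, where $K_{U,U}$ is the covariance over $m$ inducing points placed on a regular grid and $W$ is a sparse matrix of local interpolation weights. Below is a minimal 1-D NumPy sketch of that approximation. The RBF kernel, the grid size, and the linear (rather than cubic) interpolation weights are illustrative assumptions, and the sketch uses dense matrices where a real implementation exploits sparsity in $W$ and Toeplitz/Kronecker structure in $K_{U,U}$.

```python
import numpy as np

def rbf_kernel(a, b, lengthscale=0.5):
    """Squared-exponential kernel matrix between 1-D point sets a and b."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def interp_weights(x, grid):
    """Linear-interpolation weights W (n x m), so that W @ f(grid) ~ f(x).
    Each row has two nonzeros: the weights of the two neighbouring grid points."""
    n, m = len(x), len(grid)
    h = grid[1] - grid[0]                          # spacing of the regular grid
    idx = np.clip(((x - grid[0]) / h).astype(int), 0, m - 2)
    frac = (x - grid[idx]) / h
    W = np.zeros((n, m))
    W[np.arange(n), idx] = 1.0 - frac
    W[np.arange(n), idx + 1] = frac
    return W

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 10.0, 2000))          # n = 2000 training inputs
grid = np.linspace(0.0, 10.0, 100)                 # m = 100 grid inducing points

W = interp_weights(x, grid)                        # sparse in a real implementation
K_uu = rbf_kernel(grid, grid)                      # Toeplitz for a regular 1-D grid

# SKI matrix-vector product: K v ~ W (K_uu (W^T v)).
# With sparse W and Toeplitz algebra this costs about O(n + m log m),
# instead of the O(n^2) of the exact product computed below for comparison.
v = rng.standard_normal(len(x))
mv_ski = W @ (K_uu @ (W.T @ v))
mv_exact = rbf_kernel(x, x) @ v

err = np.linalg.norm(mv_ski - mv_exact) / np.linalg.norm(mv_exact)
print(f"relative MVM error: {err:.2e}")
```

Because each row of $W$ has only a few nonzeros, fast matrix-vector products of this form are what drive iterative GP training toward $O(n)$ cost, and the $O(1)$ per-test-point prediction claimed in the abstract comes from precomputing grid-side quantities once and then interpolating each new test point with the same kind of sparse weights.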

    Citations

    Publications citing this paper. Showing 1-10 of 50 citations:

    • Kernel Distillation for Gaussian Processes (cites methods & background; highly influenced)
    • Large Linear Multi-output Gaussian Process Learning (cites methods & background; highly influenced)
    • When Gaussian Process Meets Big Data: A Review of Scalable GPs (cites methods)
    • Deep Kernel Learning (cites methods & background)
    • Learning Scalable Deep Kernels with Recurrent Structure (cites methods & background)
    • A Scalable Hierarchical Gaussian Process Classifier (cites methods)

    CITATION STATISTICS

    • 7 Highly Influenced Citations

    • Averaged 10 Citations per year from 2018 through 2020

    • Citing publications range from 2016 to 2020

    References

    Publications referenced by this paper. Showing 1-10 of 33 references:

    • Distributed Gaussian Processes
    • Sparse Gaussian Processes using Pseudo-inputs (highly influential)
    • Efficient variational inference in large-scale Bayesian compressed sensing (highly influential)
    • À la Carte - Learning Fast Kernels