Minimizing Negative Transfer of Knowledge in Multivariate Gaussian Processes: A Scalable and Regularized Approach

@article{Kontar2020MinimizingNT,
  title={Minimizing Negative Transfer of Knowledge in Multivariate Gaussian Processes: A Scalable and Regularized Approach},
  author={R. Kontar and G. Raskutti and S. Zhou},
  journal={IEEE Transactions on Pattern Analysis and Machine Intelligence},
  year={2020}
}
Abstract: Recently there has been increasing interest in the convolution process (CP) for constructing multivariate Gaussian processes (MGPs), which extend the Gaussian process (GP) to handle multiple outputs. The CP is based on the idea of sharing latent functions across several convolutions. Despite the elegance of the CP construction, it poses new challenges that have yet to be tackled. First, even with a moderate number of outputs, model building is computationally prohibitive due to the huge increase…
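The core CP idea described above can be illustrated with a minimal sketch: two outputs are generated by convolving the *same* latent process with output-specific smoothing kernels, which induces cross-correlation between the outputs. The grid, kernel widths, and white-noise latent process below are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

x = np.linspace(-5, 5, 401)           # common input grid (illustrative)
latent = rng.standard_normal(x.size)  # shared latent (white-noise) process

def smooth(signal, lengthscale):
    """Convolve the latent process with a normalized Gaussian kernel."""
    k = np.exp(-0.5 * (x / lengthscale) ** 2)
    k /= k.sum()
    return np.convolve(signal, k, mode="same")

# Each output uses its own smoothing kernel, but the latent function is shared.
y1 = smooth(latent, 0.5)   # output 1: narrower kernel
y2 = smooth(latent, 1.0)   # output 2: wider kernel

# Sharing the latent function is what correlates the outputs.
corr = np.corrcoef(y1, y2)[0, 1]
print(f"cross-correlation between outputs: {corr:.2f}")
```

Because both outputs are smoothings of one latent draw, their empirical correlation is substantially positive; with independent latent draws per output it would be near zero. This is the mechanism by which CP-based MGPs transfer information across outputs — and, when outputs are unrelated, the source of the negative transfer the paper aims to minimize.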
