Corpus ID: 165163967

Learning spectrograms with convolutional spectral kernels

@inproceedings{Shen2020LearningSW,
  title={Learning spectrograms with convolutional spectral kernels},
  author={Zheyan Shen and Markus Heinonen and Samuel Kaski},
  booktitle={AISTATS},
  year={2020}
}
We introduce the convolutional spectral kernel (CSK), a novel family of non-stationary, nonparametric covariance kernels for Gaussian process (GP) models, derived from the convolution between two imaginary radial basis functions. We present a principled framework to interpret CSK, as well as other deep probabilistic models, using an approximated Fourier transform, yielding a concise input-frequency spectrogram representation. Observing through the lens of the spectrogram, we provide insight on…
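
The paper's exact CSK construction is not reproduced here, but the family it belongs to can be illustrated with a minimal numpy sketch: a non-stationary spectral-style kernel combining a Gibbs-type envelope with input-dependent lengthscales l(x) and an input-dependent frequency mu(x), so that the implied input-frequency spectrogram varies across the domain. The functions lengthscale_fn and freq_fn below are hypothetical placeholders, not the paper's parameterization.

import numpy as np

def lengthscale_fn(x):
    # Hypothetical smooth positive lengthscale function l(x).
    return 0.5 + 0.3 * np.tanh(x)

def freq_fn(x):
    # Hypothetical smooth frequency function mu(x).
    return 1.0 + 0.5 * np.sin(x)

def nonstationary_spectral_kernel(X1, X2):
    """k(x, x') = Gibbs envelope * cos(2*pi*(mu(x)*x - mu(x')*x'))."""
    x1 = X1[:, None]          # shape (n1, 1)
    x2 = X2[None, :]          # shape (1, n2)
    l1, l2 = lengthscale_fn(x1), lengthscale_fn(x2)
    # Gibbs non-stationary RBF envelope; positive definite by construction.
    norm = np.sqrt(2.0 * l1 * l2 / (l1**2 + l2**2))
    envelope = norm * np.exp(-(x1 - x2)**2 / (l1**2 + l2**2))
    # Input-dependent oscillation, as in generalized spectral mixture kernels.
    phase = 2.0 * np.pi * (freq_fn(x1) * x1 - freq_fn(x2) * x2)
    return envelope * np.cos(phase)

X = np.linspace(-3, 3, 100)
K = nonstationary_spectral_kernel(X, X)
print(K.shape, np.allclose(K, K.T))  # (100, 100) True

Because the oscillation factors as cos a(x) cos a(x') + sin a(x) sin a(x') and the Gibbs envelope is itself a valid covariance, their product remains positive semi-definite.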

Citations

Convolutional Spectral Kernel Learning
TLDR
An interpretable convolutional spectral kernel network is built based on the inverse Fourier transform; generalization error bounds are derived via Rademacher complexity, and two regularizers are introduced to improve performance.
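
The inverse-Fourier-transform route from spectral density to kernel that this line of work builds on can be sketched via Bochner's theorem: sampling frequencies from a nonnegative spectral density yields the classic random Fourier feature approximation (Rahimi & Recht, 2007). This is a generic illustration, not the regularized network of the cited paper.

import numpy as np

rng = np.random.default_rng(0)
D = 2000                                  # number of spectral samples
omega = rng.normal(0.0, 1.0, size=D)      # S(w) = N(0, 1) <=> RBF kernel, l = 1

def rff_kernel(x1, x2):
    # k(x, x') ~= (1/D) * sum_i cos(w_i * (x - x'))
    return np.mean(np.cos(omega * (x1 - x2)))

def rbf_kernel(x1, x2):
    # The exact RBF kernel with unit lengthscale, for comparison.
    return np.exp(-0.5 * (x1 - x2)**2)

for r in [0.0, 0.5, 1.0, 2.0]:
    print(r, rff_kernel(0.0, r), rbf_kernel(0.0, r))
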
Deep Moment Matching Kernel for Multi-source Gaussian Processes.
TLDR
Results show that GP regression with the DMM kernels is effective when applied to standard synthetic and real-world multi-fidelity data sets.
Multi-source Deep Gaussian Process Kernel Learning
TLDR
The approximation of the prior-posterior DGP can be considered a novel kernel composition which blends the kernels in different layers and has explicit dependence on the data, suggesting that data-informed approximate DGPs are a powerful tool for integrating data across sources.
Shallow and Deep Nonparametric Convolutions for Gaussian Processes
TLDR
A nonparametric process convolution formulation for GPs is introduced that alleviates the weaknesses of earlier approaches by using a functional sampling approach based on Matheron’s rule to perform fast sampling via interdomain inducing variables, and allows the covariance functions of the intermediate layers to be inferred from the data.
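
Matheron's rule, which the summary above refers to, expresses a posterior sample as a prior sample plus a pathwise correction: f_post(x) = f_prior(x) + K_xz K_zz^{-1} (u - f_prior(z)). Below is a minimal exact-GP sketch in numpy; the cited paper applies the same idea with interdomain inducing variables, which this toy version does not include.

import numpy as np

rng = np.random.default_rng(1)

def rbf(A, B, l=1.0):
    return np.exp(-0.5 * (A[:, None] - B[None, :])**2 / l**2)

x = np.linspace(-3, 3, 200)        # test locations
z = np.array([-2.0, 0.0, 1.5])     # inducing inputs
u = np.array([0.5, -1.0, 0.8])     # values to condition on at z

# Joint prior draw over [x, z] via a jittered Cholesky factor.
xz = np.concatenate([x, z])
L = np.linalg.cholesky(rbf(xz, xz) + 1e-6 * np.eye(len(xz)))
f = L @ rng.normal(size=len(xz))
f_x, f_z = f[:len(x)], f[len(x):]

# Pathwise update: prior sample + correction toward u at the inducing points.
K_xz, K_zz = rbf(x, z), rbf(z, z)
f_post = f_x + K_xz @ np.linalg.solve(K_zz + 1e-8 * np.eye(len(z)), u - f_z)

print(f_post[np.searchsorted(x, z)])  # approximately equals u at z
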
Conditional Deep Gaussian Processes: Multi-Fidelity Kernel Learning
TLDR
The conditional DGP model is proposed, in which the latent GPs are directly supported by the fixed lower-fidelity data and the effective kernel encodes the inductive bias for the true function while allowing compositional freedom.
Sample-efficient reinforcement learning using deep Gaussian processes
TLDR
This work introduces deep Gaussian processes in which the depth of the composition adds model complexity while prior knowledge of the dynamics brings smoothness and structure, and demonstrates greatly improved early sample efficiency over competing methods.
Data-Driven Wireless Communication Using Gaussian Processes
TLDR
This paper presents a promising family of nonparametric Bayesian machine learning methods, Gaussian processes (GPs), and their applications in wireless communication, highlighting their interpretable learning with uncertainty quantification, and reviews distributed GPs with promising scalability.

References

SHOWING 1-10 OF 39 REFERENCES
Deep Kernel Learning
We introduce scalable deep kernels, which combine the structural properties of deep learning architectures with the non-parametric flexibility of kernel methods. Specifically, we transform the inputs of a spectral mixture base kernel with a deep architecture, using local kernel interpolation, inducing points, and structure exploiting (Kronecker and Toeplitz) algebra for a scalable kernel representation.
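
The core construction can be sketched in a few lines: a base kernel is evaluated on inputs warped by a neural network whose weights are learned jointly with the kernel hyperparameters. The tiny random network below is a stand-in for that learned warp; the paper itself uses a spectral mixture base kernel with structure-exploiting algebra for scalability.

import numpy as np

rng = np.random.default_rng(2)

# Deep kernel idea: k_deep(x, x') = k_base(g(x; w), g(x'; w)), where g is a
# neural network learned jointly with the GP hyperparameters.
W1 = rng.normal(size=(1, 16))
W2 = rng.normal(size=(16, 2))

def g(X):
    # Feature extractor: 1-D input -> 2-D learned representation.
    return np.tanh(X[:, None] @ W1) @ W2

def rbf(A, B, l=1.0):
    d2 = ((A[:, None, :] - B[None, :, :])**2).sum(-1)
    return np.exp(-0.5 * d2 / l**2)

def deep_kernel(X1, X2):
    return rbf(g(X1), g(X2))

X = np.linspace(-2, 2, 50)
K = deep_kernel(X, X)
print(K.shape, np.allclose(K, K.T))  # (50, 50) True
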
Harmonizable mixture kernels with variational Fourier features
TLDR
It is shown that harmonizable mixture kernels interpolate between local patterns, and that variational Fourier features offer a robust kernel learning framework for the new kernel family.
Learning Stationary Time Series using Gaussian Processes with Nonparametric Kernels
TLDR
A novel variational free-energy approach based on inter-domain inducing variables is developed that efficiently learns the continuous-time linear filter and infers the driving white-noise process, leading to new Bayesian nonparametric approaches to spectrum estimation.
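
A discrete-time toy version of the underlying model may help: a stationary process is obtained by convolving a driving white-noise process with a linear filter, and the implied kernel is the filter's autocorrelation. The Gaussian filter shape below is an arbitrary placeholder; in the cited work the filter itself is learned.

import numpy as np

rng = np.random.default_rng(3)

t = np.arange(-5, 5, 0.01)
h = np.exp(-t**2)                       # hypothetical Gaussian filter
h /= np.sqrt(np.sum(h**2))              # normalize to unit output variance

noise = rng.normal(size=4000)           # driving white-noise process
f = np.convolve(noise, h, mode="valid") # stationary sample path

# Implied stationary kernel k(tau) = sum_s h(s) h(s + tau) (autocorrelation).
k = np.correlate(h, h, mode="full")
print(f.shape, k[len(k) // 2])          # k(0) = 1 after normalization
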
Non-Stationary Spectral Kernels
TLDR
It is shown with case studies that these kernels are necessary when modelling even rather simple time series, image, or geospatial data with non-stationary characteristics, and efficient inference is derived using model whitening and a marginalized posterior.
Variational Fourier Features for Gaussian Processes
TLDR
This work hinges on a key result that there exist spectral features related to a finite domain of the Gaussian process which exhibit almost-independent covariances, and derives these expressions for Matérn kernels in one dimension, generalizing to more dimensions using kernels with specific structures.
Avoiding pathologies in very deep networks
TLDR
It is shown that in standard architectures, the representational capacity of the network tends to capture fewer degrees of freedom as the number of layers increases, retaining only a single degree of freedom in the limit.
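
This pathology is easy to reproduce numerically: repeatedly composing independent draws from a GP prior tends to flatten the resulting function. The simulation below is a minimal illustration, not the paper's analysis; the grid size, lengthscale, and depth are arbitrary choices.

import numpy as np

rng = np.random.default_rng(4)

def gp_draw(x, l=1.0, jitter=1e-6):
    # One sample path from a zero-mean RBF GP evaluated at points x.
    K = np.exp(-0.5 * (x[:, None] - x[None, :])**2 / l**2)
    L = np.linalg.cholesky(K + jitter * np.eye(len(x)))
    return L @ rng.normal(size=len(x))

grid = np.linspace(-2, 2, 200)
h = grid.copy()
for depth in range(1, 7):
    # Feed the previous layer's output through a fresh GP draw by
    # interpolating that draw at the current values.
    sample = gp_draw(grid)
    h = np.interp(h, grid, sample)
    print(depth, np.std(h))  # the spread typically shrinks with depth
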
Bayesian Nonparametric Spectral Estimation
TLDR
A joint probabilistic model for signals, observations and spectra is proposed, where SE is addressed as an inference problem and Bayes' rule is applied to find the analytic posterior distribution of the spectrum given a set of observations.
Non-Stationary Gaussian Process Regression with Hamiltonian Monte Carlo
TLDR
This work proposes to infer the full parameter posterior with Hamiltonian Monte Carlo (HMC), which conveniently extends analytical gradient-based GPR learning by guiding the sampling with model gradients, and learns the MAP solution from the posterior by gradient ascent.
Sparse Gaussian Processes using Pseudo-inputs
TLDR
It is shown that this new Gaussian process (GP) regression model can match full GP performance with a small number M of pseudo-inputs, i.e. very sparse solutions, and that it significantly outperforms other approaches in this regime.
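
The low-rank structure that makes pseudo-input methods cheap can be shown directly: with M inducing inputs z, the N x N kernel matrix is approximated by the Nystrom form K_xz K_zz^{-1} K_zx, reducing cost from O(N^3) to O(N M^2). The sketch below shows only this approximation; SPGP/FITC additionally corrects the diagonal and optimizes the pseudo-input locations.

import numpy as np

rng = np.random.default_rng(5)

def rbf(A, B, l=1.0):
    return np.exp(-0.5 * (A[:, None] - B[None, :])**2 / l**2)

x = np.sort(rng.uniform(-3, 3, size=500))   # N = 500 training inputs
z = np.linspace(-3, 3, 20)                  # M = 20 pseudo-inputs

K = rbf(x, x)
K_xz, K_zz = rbf(x, z), rbf(z, z)
Q = K_xz @ np.linalg.solve(K_zz + 1e-8 * np.eye(len(z)), K_xz.T)

print(np.abs(K - Q).max())  # small: 20 pseudo-inputs capture K well here
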
Inference in Deep Gaussian Processes using Stochastic Gradient Hamiltonian Monte Carlo
TLDR
This work provides evidence for the non-Gaussian nature of the posterior and applies the Stochastic Gradient Hamiltonian Monte Carlo method to generate samples, which results in significantly better predictions at a lower computational cost than its VI counterpart.
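
For reference, the SGHMC update itself (Chen et al., 2014) is short; here it is on a toy 1-D quadratic potential, with a noisy gradient standing in for a minibatch estimate. The step size, friction, and noise model are illustrative choices, not the cited paper's settings.

import numpy as np

rng = np.random.default_rng(6)

def noisy_grad_U(theta):
    # Stochastic gradient of U(theta) = theta^2 / 2 (target is N(0, 1)).
    return theta + 0.1 * rng.normal()

eps, alpha = 0.01, 0.1                  # step size and friction
theta, v = 3.0, 0.0
samples = []
for step in range(20000):
    # v update: gradient force, friction, and compensating injected noise.
    v = v - eps * noisy_grad_U(theta) - alpha * v \
        + np.sqrt(2 * alpha * eps) * rng.normal()
    theta = theta + v
    samples.append(theta)

samples = np.array(samples[5000:])      # discard burn-in
print(samples.mean(), samples.var())    # should approach 0 and 1
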