Corpus ID: 235358931

Kernel approximation on algebraic varieties

Jason M. Altschuler and Pablo A. Parrilo
Low-rank approximation of kernels is a fundamental mathematical problem with widespread algorithmic applications. Often the kernel is restricted to an algebraic variety, e.g., in problems involving sparse or low-rank data. We show that significantly better approximations are obtainable in this setting: the rank required to achieve a given error depends on the variety’s dimension rather than the ambient dimension, which is typically much larger. This is true in both high-precision and high… 
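The headline claim can be illustrated numerically (a toy sketch, not the authors' construction): a Gaussian kernel matrix on points confined to a low-dimensional variety, here the variety of sparse vectors, shows much faster eigenvalue decay than on generic points in the same ambient dimension, so far fewer rank-one terms reach a given accuracy.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, k = 200, 20, 2  # points, ambient dimension, sparsity level

def gaussian_kernel(X, sigma):
    # K[i, j] = exp(-||x_i - x_j||^2 / (2 sigma^2))
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq / (2 * sigma**2))

def numerical_rank(K, tol=1e-6):
    # Smallest r such that the best rank-r approximation has
    # spectral error at most tol * ||K||.
    w = np.sort(np.linalg.eigvalsh(K))[::-1]
    return int(np.sum(w > tol * w[0]))

X_dense = rng.standard_normal((n, d))   # generic points in R^d
X_sparse = np.zeros((n, d))             # points on the variety of k-sparse vectors
for i in range(n):
    idx = rng.choice(d, size=k, replace=False)
    X_sparse[i, idx] = rng.standard_normal(k)

sigma = np.sqrt(d)  # bandwidth on the scale of the data
r_dense = numerical_rank(gaussian_kernel(X_dense, sigma))
r_sparse = numerical_rank(gaussian_kernel(X_sparse, sigma))
print(r_dense, r_sparse)  # sparse-support data needs a much smaller rank
```

The gap between the two ranks is the phenomenon the paper quantifies: the decay rate is governed by the variety's dimension (here effectively k), not the ambient dimension d.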


Flows, Scaling, and Entropy Revisited: a Unified Perspective via Optimizing Joint Distributions

This paper describes a unified algorithmic perspective on several classical problems that have traditionally been studied in different communities, leading to a simple and unified framework spanning problem formulation, algorithm development, and runtime analysis.



Algebraic Variety Models for High-Rank Matrix Completion

This work considers a generalization of low-rank matrix completion to the case where the data belongs to an algebraic variety, i.e., each data point is a solution to a system of polynomial equations, and proposes an efficient matrix completion algorithm that minimizes a convex or non-convex surrogate of the rank of the matrix of monomial features.
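The rank drop that such variety models exploit is easy to see on a toy example (an illustrative sketch, not the authors' algorithm): lifting data on an algebraic variety to monomial features produces a rank-deficient feature matrix, because each defining polynomial induces a linear dependence among the monomial columns.

```python
import numpy as np

rng = np.random.default_rng(1)

# 50 points on the unit circle, the variety cut out by x^2 + y^2 - 1 = 0.
t = rng.uniform(0, 2 * np.pi, size=50)
x, y = np.cos(t), np.sin(t)

# Degree-2 monomial feature map: [1, x, y, x^2, x*y, y^2] -- 6 columns.
Phi = np.column_stack([np.ones_like(x), x, y, x**2, x * y, y**2])

# The relation x^2 + y^2 - 1 = 0 forces -col(1) + col(x^2) + col(y^2) = 0,
# so rank(Phi) <= 5 even though Phi has 6 columns.
print(np.linalg.matrix_rank(Phi))  # 5, not 6
```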


This paper proposes an efficient structured low-rank approximation method, the block basis factorization (BBF), together with a fast construction algorithm for approximating radial basis function kernel matrices, and demonstrates its stability and superiority over state-of-the-art kernel approximation algorithms.

Oblivious Sketching of High-Degree Polynomial Kernels

This work gives a general method for applying sketching techniques developed in numerical linear algebra over the past decade to a tensoring of data points without forming the tensoring explicitly, yielding the first oblivious sketch for the polynomial kernel whose target dimension depends only polynomially on the degree of the kernel function.
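Oblivious sketches in this line of work build on CountSketch-style maps of tensored data. A minimal degree-2 TensorSketch (a standard construction shown here for illustration, not the paper's exact method) approximates the polynomial kernel ⟨x, y⟩² without ever forming x ⊗ x:

```python
import numpy as np

def count_sketch(x, h, s, m):
    # Hash coordinate i into bucket h[i] with sign s[i].
    c = np.zeros(m)
    np.add.at(c, h, s * x)
    return c

def tensor_sketch2(x, h1, s1, h2, s2, m):
    # FFT trick: the CountSketch of x (tensor) x equals the circular
    # convolution of two independent CountSketches of x.
    f1 = np.fft.fft(count_sketch(x, h1, s1, m))
    f2 = np.fft.fft(count_sketch(x, h2, s2, m))
    return np.fft.ifft(f1 * f2).real

rng = np.random.default_rng(0)
d, m = 10, 4096
h1, h2 = rng.integers(0, m, d), rng.integers(0, m, d)
s1, s2 = rng.choice([-1.0, 1.0], d), rng.choice([-1.0, 1.0], d)

x = rng.standard_normal(d); x /= np.linalg.norm(x)
y = rng.standard_normal(d); y /= np.linalg.norm(y)

exact = x.dot(y) ** 2  # degree-2 polynomial kernel
approx = tensor_sketch2(x, h1, s1, h2, s2, m).dot(
    tensor_sketch2(y, h1, s1, h2, s2, m))
print(exact, approx)   # close for large sketch size m
```

The sketch maps d-dimensional inputs to m dimensions independently of the data, which is what "oblivious" means here; the estimate is unbiased with variance shrinking as 1/m.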

Near Input Sparsity Time Kernel Embeddings via Adaptive Sampling

This paper gives a near-input-sparsity-time algorithm for sampling the high-dimensional feature space implicitly defined by a kernel transformation, and shows how its subspace embedding bounds imply new statistical guarantees for kernel ridge regression.
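Sampling-based kernel embeddings of this kind are related to the classical Nyström method. A minimal uniform-sampling Nyström approximation (an illustrative baseline, not the paper's adaptive scheme) looks like:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, sigma = 200, 40, 0.2

# 1-D data: the Gaussian kernel matrix here is numerically low-rank.
X = np.sort(rng.uniform(0, 1, n))
sq = (X[:, None] - X[None, :]) ** 2
K = np.exp(-sq / (2 * sigma**2))

idx = rng.choice(n, size=m, replace=False)   # uniform landmark sample
C = K[:, idx]                                # n x m kernel columns
W = K[np.ix_(idx, idx)]                      # m x m landmark block
K_hat = C @ np.linalg.pinv(W) @ C.T          # Nystrom approximation

rel_err = np.linalg.norm(K - K_hat) / np.linalg.norm(K)
print(rel_err)  # small: the sampled columns capture the low-rank structure
```

Adaptive or leverage-score sampling, as in the paper, improves on this uniform baseline precisely when the kernel's important directions are unevenly distributed across the data.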

Power Series Kernels

We introduce a class of analytic positive definite multivariate kernels which includes infinite dot product kernels as sometimes used in machine learning, certain new nonlinearly factorizable…

Memory Efficient Kernel Approximation

This paper proposes a new kernel approximation algorithm, Memory Efficient Kernel Approximation (MEKA), which considers both the low-rank and the clustering structure of the kernel matrix, and shows that the resulting algorithm outperforms state-of-the-art low-rank kernel approximation methods in terms of speed, approximation error, and memory usage.
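A toy version of the cluster-plus-low-rank idea (a simplified sketch in the spirit of MEKA, not the authors' algorithm, which additionally shares bases across blocks): partition the points into clusters and compress each kernel block with a truncated SVD.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two well-separated clusters in the plane.
X = np.vstack([rng.normal(0, 0.3, (100, 2)), rng.normal(5, 0.3, (100, 2))])
labels = np.repeat([0, 1], 100)
sigma = 1.0

sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-sq / (2 * sigma**2))

def truncated(block, r):
    # Best rank-r approximation of one kernel block via SVD.
    U, s, Vt = np.linalg.svd(block, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r]

r = 5
K_hat = np.zeros_like(K)
for a in (0, 1):
    for b in (0, 1):
        ia, ib = np.where(labels == a)[0], np.where(labels == b)[0]
        K_hat[np.ix_(ia, ib)] = truncated(K[np.ix_(ia, ib)], r)

rel_err = np.linalg.norm(K - K_hat) / np.linalg.norm(K)
print(rel_err)  # each block is nearly low-rank, so the blockwise error is small
```

Within-cluster blocks are smooth (hence low-rank) and cross-cluster blocks are nearly zero for well-separated clusters, which is why a small per-block rank suffices.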

Open Problem: Kernel methods on manifolds and metric spaces. What is the probability of a positive definite geodesic exponential kernel?

Evidence is presented that large intervals of bandwidths exist where geodesic exponential kernels have high probability of being positive definite over finite datasets, while still having significant predictive power.

On spectral distribution of kernel matrices related to radial basis functions

The paper shows how the spectral distribution of a kernel matrix relates to the smoothness of the underlying kernel function, and discusses the analytic eigenvalue distribution of Gaussian kernels, which has important applications to the stable computation of Gaussian radial basis functions.

Kernel Approximation on Manifolds II: The L∞ Norm of the L2 Projector

This article addresses two topics of significant mathematical and practical interest in the theory of kernel approximation: the existence of local and stable bases and the $L_p$ boundedness of the…

Kernel Approximation on Manifolds I: Bounding the Lebesgue Constant

It is established that for any compact, connected $C^\infty$ Riemannian manifold there exists a robust family of kernels of increasing smoothness that are well suited for interpolation and generate Lagrange functions that are uniformly bounded and decay away from their center at an exponential rate.