Kernels and Regularization on Graphs
Alex Smola and Risi Kondor

We introduce a family of kernels on graphs based on the notion of regularization operators. This generalizes in a natural way the notion of regularization and Green's functions, as commonly used for real-valued functions, to graphs. It turns out that diffusion kernels arise as a special case of our reasoning. We show that the class of positive, monotonically decreasing functions on the unit interval leads to kernels and corresponding regularization operators.
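As an illustrative sketch (the path graph and the specific choices of the spectral function r below are assumed examples, not taken from the paper), a kernel in this family can be computed as K = r(L)^(-1), applying r to the eigenvalues of the graph Laplacian:

```python
import numpy as np

# Toy example: path graph on four vertices (an assumed illustration).
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A        # combinatorial graph Laplacian

def regularization_kernel(L, r):
    """Kernel K = r(L)^{-1}, where r acts on the spectrum of L."""
    lam, U = np.linalg.eigh(L)         # L is symmetric positive semidefinite
    return (U / r(lam)) @ U.T          # U diag(1/r(lam)) U^T

K_reg  = regularization_kernel(L, lambda lam: 1.0 + lam)          # regularized Laplacian kernel
K_diff = regularization_kernel(L, lambda lam: np.exp(0.5 * lam))  # diffusion kernel
```

For r(λ) = 1 + λ the result is the inverse of I + L, while r(λ) = exp(σ²λ/2) recovers the diffusion kernel as the special case mentioned in the abstract.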
Regularization Kernels and Softassign
This paper analyzes the use of regularization kernels on graphs to weight the quadratic cost function used in the Softassign graph-matching algorithm and suggests that kernel combination could be a key point to address in the future.
Kernel-Based Implicit Regularization of Structured Objects
This paper proposes to extend the weighted-graph regularization framework to objects implicitly defined by their kernel, thereby performing the regularization within the Hilbert space associated with the kernel, which opens the door to the regularization of structured objects.
Nonparametric Transforms of Graph Kernels for Semi-Supervised Learning
An algorithm based on convex optimization for constructing kernels for semi-supervised learning that incorporates order constraints during optimization results in flexible kernels and avoids the need to choose among different parametric forms.
Binet-Cauchy Kernels
We propose a family of kernels based on the Binet-Cauchy theorem and its extension to Fredholm operators. This includes as special cases all currently known kernels derived from the behavioral framework.
Estimating a smooth function on a large graph by Bayesian Laplacian regularisation
Theoretical results are derived that show how asymptotically optimal Bayesian regularization can be achieved under an asymptotic shape assumption on the underlying graph and a smoothness condition on the target function, both formulated in terms of the graph Laplacian.
Combining Graph Laplacians for Semi-Supervised Learning
This work proposes to use a method which optimally combines a number of differently constructed graphs to solve an extended regularization problem which requires a joint minimization over both the data and the set of graph kernels.
High-Order Regularization on Graphs
This paper develops a discrete analogue of the Laplace-de Rham operator, which naturally generalizes the discrete Laplace-Beltrami operator, and can be used to define harmonic functions on arbitrary paths in a graph.
Graph Kernels by Spectral Transforms
An approach to searching over a nonparametric family of spectral transforms by using convex optimization to maximize kernel alignment to the labeled data and results in a flexible family of kernels that is more data-driven than the standard parametric spectral transforms.
Kernels on Graphs as Proximity Measures
It is observed that normalized heat-type similarity measures with a logarithmic modification generally perform best; this can be useful for recommending the adoption of one or another similarity measure in a machine learning method.
Nonlocal Discrete Regularization on Weighted Graphs: A Framework for Image and Manifold Processing
A nonlocal discrete regularization framework on weighted graphs of arbitrary topology for image and manifold processing, which leads to a family of simple and fast nonlinear processing methods based on the weighted p-Laplace operator, parameterized by the degree of regularity, the graph structure, and the graph weight function.


Diffusion kernels on graphs and other discrete structures
This paper focuses on generating kernels on graphs, proposing a special class of exponential kernels based on the heat equation, called diffusion kernels, and shows that these can be regarded as the discretization of the familiar Gaussian kernel of Euclidean space.
Diffusion Kernels on Graphs and Other Discrete Input Spaces
This paper proposes a general method of constructing natural families of kernels over discrete structures, based on the matrix exponentiation idea, and focuses on generating kernels on graphs, for which a special class of exponential kernels called diffusion kernels is proposed.
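The matrix-exponentiation idea behind diffusion kernels can be sketched numerically (the triangle graph and β value below are assumed for illustration): the kernel is K = exp(−βL), computed through the Laplacian's eigensystem.

```python
import numpy as np

# Toy graph: triangle on three vertices (assumed for illustration).
A = np.ones((3, 3)) - np.eye(3)
L = np.diag(A.sum(axis=1)) - A       # graph Laplacian; H = -L generates diffusion

def diffusion_kernel(L, beta):
    """K = exp(-beta * L), computed through the eigensystem of L."""
    lam, U = np.linalg.eigh(L)
    return (U * np.exp(-beta * lam)) @ U.T

K = diffusion_kernel(L, beta=0.5)
```

Because L annihilates the constant vector, each row of K sums to one, and exponentiation gives the heat-flow semigroup property K(β₁)K(β₂) = K(β₁ + β₂).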
The connection between regularization operators and support vector kernels
Discrete Green's functions for products of regular graphs
Discrete Green's functions are the inverses or pseudo-inverses of combinatorial Laplacians. We present compact formulas for discrete Green's functions in terms of the eigensystems of the corresponding factor graphs.
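The pseudo-inverse characterization can be checked directly on a small example (the 4-cycle below is an assumed toy graph): for a connected graph, the Green's function G = L⁺ satisfies LG = I − J/n, the projection off the constant vector.

```python
import numpy as np

# Discrete Green's function as the Moore-Penrose pseudo-inverse of a
# combinatorial Laplacian (the 4-cycle is an assumed toy graph).
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A
G = np.linalg.pinv(L)                # discrete Green's function

# For a connected graph, L @ G projects off the constant vector:
n = L.shape[0]
P = np.eye(n) - np.ones((n, n)) / n
```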
Spectral Graph Theory
Contents include: Eigenvalues and the Laplacian of a graph; Isoperimetric problems; Diameters and eigenvalues; Paths, flows, and routing; Eigenvalues and quasi-randomness; Expanders and explicit constructions.
Laplacian Eigenmaps for Dimensionality Reduction and Data Representation
This work proposes a geometrically motivated algorithm for representing the high-dimensional data that provides a computationally efficient approach to nonlinear dimensionality reduction that has locality-preserving properties and a natural connection to clustering.
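A minimal sketch of the embedding step (the 5-node neighborhood graph below is assumed, not from the paper): solve the generalized eigenproblem Ly = λDy via symmetric normalization and keep the eigenvectors after the trivial one as coordinates.

```python
import numpy as np

# Laplacian-eigenmaps sketch on an assumed 5-node neighborhood graph.
A = np.array([[0, 1, 1, 0, 0],
              [1, 0, 1, 0, 0],
              [1, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [0, 0, 0, 1, 0]], dtype=float)
d = A.sum(axis=1)
L = np.diag(d) - A

# Solve the generalized problem L y = lam D y via symmetric normalization.
D_isqrt = np.diag(1.0 / np.sqrt(d))
lam, V = np.linalg.eigh(D_isqrt @ L @ D_isqrt)
Y = (D_isqrt @ V)[:, 1:3]   # drop the trivial eigenvector; 2-D embedding
```

The smallest eigenvalue is zero (the constant direction); for a connected graph the next eigenvalue is strictly positive, and its eigenvector carries the locality-preserving coordinate.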
Discrete Green's Functions
Discrete Green's functions can be used to deal with diffusion-type problems on graphs, such as chip-firing, load balancing, and discrete Markov chains.
Linear Operators
Linear Analysis: Measure and Integral, Banach and Hilbert Space, Linear Integral Equations. By Prof. Adriaan Cornelis Zaanen. (Bibliotheca Mathematica: a Series of Monographs on Pure and Applied Mathematics.)
Normalized cuts and image segmentation
  • Jianbo Shi, J. Malik
  • Computer Science
    Proceedings of IEEE Computer Society Conference on Computer Vision and Pattern Recognition
  • 1997
This work treats image segmentation as a graph partitioning problem and proposes a novel global criterion, the normalized cut, for segmenting the graph, which measures both the total dissimilarity between the different groups as well as the total similarity within the groups.
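The spectral relaxation of the normalized cut can be sketched on a toy similarity matrix (the 4×4 matrix W below is an assumed example; the actual algorithm operates on pixel affinity matrices): the second-smallest generalized eigenvector of (D − W, D) is thresholded to produce the bipartition.

```python
import numpy as np

# Normalized-cut bipartition sketch (W is an assumed toy similarity matrix:
# two tightly coupled pairs, weakly linked across pairs).
W = np.array([[1.0, 0.9, 0.1, 0.0],
              [0.9, 1.0, 0.0, 0.1],
              [0.1, 0.0, 1.0, 0.9],
              [0.0, 0.1, 0.9, 1.0]])
d = W.sum(axis=1)
D_isqrt = np.diag(1.0 / np.sqrt(d))
lam, V = np.linalg.eigh(D_isqrt @ (np.diag(d) - W) @ D_isqrt)
fiedler = D_isqrt @ V[:, 1]            # second-smallest generalized eigenvector
labels = (fiedler > 0).astype(int)     # thresholding at zero yields the cut
```

On this toy matrix the cut separates the two strongly-coupled pairs, since the within-pair similarity dominates the cross-pair similarity.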
Segmentation using eigenvectors: a unifying view
  • Yair Weiss
  • Computer Science
    Proceedings of the Seventh IEEE International Conference on Computer Vision
  • 1999
A unified treatment of eigenvectors of block matrices based on eigendecompositions in the context of segmentation is given, and close connections between them are shown while highlighting their distinguishing features.