# Uniform Convergence Rates for Lipschitz Learning on Graphs

@article{Bungert2021UniformCR, title={Uniform Convergence Rates for Lipschitz Learning on Graphs}, author={Leon Bungert and Jeff Calder and Tim Roith}, journal={ArXiv}, year={2021}, volume={abs/2111.12370} }

Lipschitz learning is a graph-based semi-supervised learning method where one extends labels from a labeled to an unlabeled data set by solving the infinity Laplace equation on a weighted graph. In this work we prove uniform convergence rates for solutions of the graph infinity Laplace equation as the number of vertices grows to infinity. Their continuum limits are absolutely minimizing Lipschitz extensions with respect to the geodesic metric of the domain where the graph vertices are sampled from…
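To make the mechanism concrete, here is a minimal sketch of label propagation via the *unweighted* graph infinity Laplacian (the paper treats the general weighted case; the function name `lipschitz_learning`, the Jacobi-style iteration, and the fixed iteration count are illustrative choices, not the authors' algorithm). At each unlabeled vertex, the value is replaced by the midpoint of its largest and smallest neighboring values, which drives the discrete infinity Laplacian toward zero while labeled values stay fixed:

```python
import numpy as np

def lipschitz_learning(neighbors, labels, n_iter=500):
    """Propagate labels by relaxing the unweighted graph infinity
    Laplace equation: at each unlabeled vertex x, set u(x) to the
    midpoint of the max and min of its neighbors' values, so that
        max_{y~x}(u(y) - u(x)) + min_{y~x}(u(y) - u(x)) -> 0.

    neighbors: list of neighbor-index lists, one per vertex
    labels:    dict mapping labeled vertex index -> real-valued label
    """
    u = np.zeros(len(neighbors))
    for v, val in labels.items():
        u[v] = val
    for _ in range(n_iter):
        new = u.copy()
        for x in range(len(neighbors)):
            if x not in labels:  # labeled vertices are Dirichlet data
                vals = u[neighbors[x]]
                new[x] = 0.5 * (vals.max() + vals.min())
        u = new
    return u

# Path graph 0-1-2-3-4 labeled at the endpoints: the infinity harmonic
# extension is the linear interpolation between the two labels.
path = [[1], [0, 2], [1, 3], [2, 4], [3]]
u = lipschitz_learning(path, {0: 0.0, 4: 1.0})
print(np.round(u, 4))  # close to [0, 0.25, 0.5, 0.75, 1]
```

On a path graph the midpoint update reproduces linear interpolation; on general graphs it approximates the absolutely minimizing Lipschitz extension whose continuum limit the paper analyzes.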

## 5 Citations

### Continuum Limit of Lipschitz Learning on Graphs

- Mathematics · Foundations of Computational Mathematics
- 2022

A sequence of functionals is defined which approximates the largest local Lipschitz constant of a graph function; $\Gamma$-convergence in the $L^\infty$-topology to the supremum norm of the gradient is proved as the graph becomes denser, and compactness of the functionals, which implies convergence of minimizers, is shown.

### Hamilton-Jacobi equations on graphs with applications to semi-supervised learning and data depth

- Computer Science, Mathematics · ArXiv
- 2022

It is shown that the continuum limit of the p-eikonal equation on a random geometric graph recovers a geodesic density weighted distance in the continuum, and the p→∞ limit recovers shortest path distances.

### Research

- Computer Science · ArXiv
- 2022

…impact and the team lift; (5) reputation and aging effects on the success in creative markets; and (6) established patterns of socio-demographic homophily on online social media.

### Ratio convergence rates for Euclidean first-passage percolation: Applications to the graph infinity Laplacian

- Mathematics · ArXiv
- 2022

In this paper we prove the first quantitative convergence rates for the graph infinity Laplace equation for length scales at the connectivity threshold. In the graph-based semi-supervised learning…

### Large data limit of the MBO scheme for data clustering: convergence of the dynamics

- Mathematics · ArXiv
- 2022

We prove that the dynamics of the MBO scheme for data clustering converge to a viscosity solution of mean curvature flow. The main ingredients are (i) a new abstract convergence result based on…

## References

Showing 1–10 of 49 references.

### Rates of Convergence for Laplacian Semi-Supervised Learning with Low Labeling Rates

- Computer Science, Mathematics · ArXiv
- 2020

In the well-posed setting, quantitative error estimates of $O(\varepsilon\beta^{-1/2})$, up to logarithmic factors, are proved for the difference between the solutions of the discrete problem and the continuum PDE.

### Continuum Limit of Lipschitz Learning on Graphs

- Mathematics · Foundations of Computational Mathematics
- 2022

A sequence of functionals is defined which approximates the largest local Lipschitz constant of a graph function; $\Gamma$-convergence in the $L^\infty$-topology to the supremum norm of the gradient is proved as the graph becomes denser, and compactness of the functionals, which implies convergence of minimizers, is shown.

### Lipschitz regularity of graph Laplacians on random data clouds

- Mathematics, Computer Science · SIAM J. Math. Anal.
- 2022

This paper proves high-probability interior and global Lipschitz estimates for solutions of graph Poisson equations, and obtains high-probability approximate convergence rates for the convergence of graph Laplacian eigenvectors towards eigenfunctions of the corresponding weighted Laplace-Beltrami operators.

### A continuum limit for the PageRank algorithm

- Mathematics, Computer Science · European Journal of Applied Mathematics
- 2020

A new framework for rigorously studying continuum limits of learning algorithms on directed graphs is proposed; the numerical scheme is proved to be consistent and stable, and explicit rates of convergence of the discrete solution to the solution of the continuum-limit PDE are computed.

### Asymptotic behavior of ℓp-based Laplacian regularization in semi-supervised learning

- Mathematics · ArXiv
- 2016

A theoretical study of $\ell_p$-based Laplacian regularization under a $d$-dimensional geometric random graph model, showing that the effect of the underlying density vanishes monotonically with $p$, yielding a function estimate $\hat{f}$ that is both smooth and non-degenerate while remaining maximally sensitive to the distribution $P$.

### Algorithms for Lipschitz Learning on Graphs

- Mathematics, Computer Science · COLT
- 2015

This work develops fast algorithms for solving regression problems on graphs where one is given the value of a function at some vertices, and must find its smoothest possible extension to all vertices using the absolutely minimal Lipschitz extension.

### Improved spectral convergence rates for graph Laplacians on ε-graphs and k-NN graphs

- Computer Science, Mathematics · Applied and Computational Harmonic Analysis
- 2022

### Properly-Weighted Graph Laplacian for Semi-supervised Learning

- Computer Science, Mathematics · Applied Mathematics & Optimization
- 2019

A way to correctly set the weights in Laplacian regularization is given so that the estimator remains well posed and stable in the large-sample limit, and it is proved that the semi-supervised learning algorithm converges to the smooth solution of a continuum variational problem that attains the labeled values continuously.

### Improved spectral convergence rates for graph Laplacians on epsilon-graphs and k-NN graphs

- Computer Science, Mathematics · ArXiv
- 2019

The results show that the eigenvalues and eigenvectors of the graph Laplacian converge to those of the Laplace-Beltrami operator at a rate of $O(n^{-1/(m+4)})$, up to log factors, where $m$ is the manifold dimension and $n$ is the number of vertices in the graph.

### Analysis of $p$-Laplacian Regularization in Semi-Supervised Learning

- Computer Science, Mathematics · SIAM J. Math. Anal.
- 2019

A new model is introduced which is as simple as the original model but overcomes this restriction, and it is proved that the minimizers of the discrete functionals in the random setting converge uniformly to the desired continuum limit.