Minimax lower bounds for function estimation on graphs

@article{Kirichenko2017MinimaxLB,
  title={Minimax lower bounds for function estimation on graphs},
  author={Alisa Kirichenko and Harry van Zanten},
  journal={arXiv: Statistics Theory},
  year={2017}
}
We study minimax lower bounds for function estimation problems on large graphs when the target function varies smoothly over the graph. We derive minimax rates for regression and classification problems on graphs satisfying an asymptotic shape assumption, under a smoothness condition on the target function; both conditions are formulated in terms of the graph Laplacian.
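
To make the smoothness condition concrete: one standard way to formalise "smoothly varying over the graph" is a Sobolev-type ball defined through powers of the graph Laplacian. The following Python/numpy sketch is illustrative notation rather than the paper's verbatim definition; beta and Q are placeholder symbols for the smoothness level and the radius of the ball.

    import numpy as np

    def graph_laplacian(A):
        # Combinatorial Laplacian L = D - A of a symmetric adjacency matrix A.
        return np.diag(A.sum(axis=1)) - A

    def sobolev_norm_sq(f, A, beta):
        # Sobolev-type seminorm f' L^beta f, computed in the Laplacian
        # eigenbasis ("graph Fourier" coordinates).
        lam, U = np.linalg.eigh(graph_laplacian(A))
        lam = np.clip(lam, 0.0, None)   # guard against tiny negative eigenvalues
        coords = U.T @ f
        return float(np.sum(lam ** beta * coords ** 2))

    # A target function f is "beta-smooth at level Q" if
    # sobolev_norm_sq(f, A, beta) <= Q.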

Estimating a smooth function on a large graph by Bayesian Laplacian regularisation

TLDR
Theoretical results are derived that show how asymptotically optimal Bayesian regularization can be achieved under an asymptotic shape assumption on the underlying graph and a smoothness condition on the target function, both formulated in terms of the graph Laplacian.

Minimax Optimal Regression over Sobolev Spaces via Laplacian Regularization on Neighborhood Graphs

TLDR
It is proved that Laplacian smoothing is manifold-adaptive: if X ⊆ R^d is an m-dimensional manifold with m < d, then the error rate of Laplacian smoothing depends only on m, in the same way it would if X were a full-dimensional set in R^d.
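
A minimal sketch of the Laplacian smoothing estimator this summary refers to, assuming a symmetrised k-nearest-neighbour graph built with scikit-learn's kneighbors_graph (the paper's graph construction and edge weighting may differ); the penalised least-squares problem reduces to a single sparse linear solve:

    import numpy as np
    from scipy import sparse
    from scipy.sparse.csgraph import laplacian
    from scipy.sparse.linalg import spsolve
    from sklearn.neighbors import kneighbors_graph

    def laplacian_smoothing(X, y, lam=1.0, k=10):
        # Laplacian smoothing: argmin_f ||y - f||^2 + lam * f' L f.
        A = kneighbors_graph(X, n_neighbors=k, mode='connectivity')
        A = 0.5 * (A + A.T)                      # symmetrise the k-NN graph
        L = laplacian(A, normed=False)
        n = X.shape[0]
        # First-order condition: (I + lam * L) f = y.
        return spsolve(sparse.identity(n, format='csc') + lam * L.tocsc(), y)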

Posterior Contraction Rates for Graph-Based Semi-Supervised Classification

TLDR
This paper studies Bayesian nonparametric estimation of a binary regression function in a semi-supervised setting, and uses unlabeled data to construct a sequence of graph-based priors over the regression function restricted to the given features.

Minimax Optimal Regression over Sobolev Spaces via Laplacian Eigenmaps on Neighborhood Graphs

TLDR
It is shown that PCR-LE achieves minimax rates of convergence for random design regression over Sobolev spaces, and that the method is manifold-adaptive, i.e., it adapts to the situation where the design is supported on a manifold of small intrinsic dimension m.
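
A minimal sketch of the PCR-LE idea under the same assumed k-NN graph construction (a dense eigendecomposition is used for clarity): project the responses onto the K lowest-frequency eigenvectors of the graph Laplacian.

    import numpy as np
    from scipy.sparse.csgraph import laplacian
    from sklearn.neighbors import kneighbors_graph

    def pcr_le(X, y, K=20, k=10):
        # Principal components regression over Laplacian eigenmaps:
        # regress y onto the K smoothest eigenvectors of the Laplacian.
        A = kneighbors_graph(X, n_neighbors=k, mode='connectivity')
        A = 0.5 * (A + A.T)
        L = laplacian(A, normed=False).toarray().astype(float)
        lam, U = np.linalg.eigh(L)        # eigenvalues in ascending order
        U_K = U[:, :K]                    # K lowest-frequency eigenvectors
        return U_K @ (U_K.T @ y)          # orthogonal projection of y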

Error analysis for denoising smooth modulo signals on a graph

  • Hemant Tyagi
  • Computer Science
    Applied and Computational Harmonic Analysis
  • 2021

General framework for projection structures

TLDR
A general framework for projection structures, unifying a very broad class of high-dimensional models, is developed, and several inference problems, interesting and important in their own right, are studied within this framework.

Robust inference for general framework of projection structures

TLDR
The excessive bias restriction (EBR) is introduced, under which the local (oracle) confidence optimality of the constructed confidence ball is established; adaptive minimax results over a range of scales also follow from the local results.

Bayesian inference in high-dimensional models

TLDR
Properties of Bayesian and related methods are reviewed for several high-dimensional models, such as the many normal means problem, linear regression, generalized linear models, and Gaussian and non-Gaussian graphical models.

References

SHOWING 1-10 OF 15 REFERENCES

Estimating a smooth function on a large graph by Bayesian Laplacian regularisation

TLDR
Theoretical results are derived that show how asymptotically optimal Bayesian regularization can be achieved under an asymptotic shape assumption on the underlying graph and a smoothness condition on the target function, both formulated in terms of the graph Laplacian.

Learning on Graph with Laplacian Regularization

TLDR
This work considers a general form of transductive learning on graphs with Laplacian regularization, derives margin-based generalization bounds using appropriate geometric properties of the graph, and suggests a limitation of the standard degree-based normalization.
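
The transductive estimator behind this kind of analysis can be sketched in a few lines; here L is a precomputed graph Laplacian (numpy array) and labeled nodes are fitted through a squared loss (a sketch, not the paper's exact formulation):

    import numpy as np

    def transductive_laplacian_reg(L, labeled_idx, y_labeled, lam=1.0):
        # argmin_f  sum_{i labeled} (f_i - y_i)^2 + lam * f' L f
        n = L.shape[0]
        J = np.zeros((n, n))
        J[labeled_idx, labeled_idx] = 1.0   # diagonal indicator of labeled nodes
        b = np.zeros(n)
        b[labeled_idx] = y_labeled
        # Normal equations: (J + lam * L) f = J y.
        return np.linalg.solve(J + lam * L, b)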

Kernels and Regularization on Graphs

TLDR
It is shown that the class of positive, monotonically decreasing functions on the unit interval leads to kernels and corresponding regularization operators, and that known constructions such as the diffusion kernel arise as special cases of this reasoning.
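
The construction can be sketched directly: apply a positive, decreasing spectral transfer function r to the Laplacian eigenvalues and reassemble the matrix. The diffusion kernel corresponds to r(lam) = exp(-t * lam). (Python/numpy sketch; r and t are placeholder names.)

    import numpy as np

    def spectral_graph_kernel(L, r):
        # K = U diag(r(lam)) U', where L = U diag(lam) U' is the Laplacian
        # eigendecomposition and r is a positive, decreasing transfer function.
        lam, U = np.linalg.eigh(L)
        return U @ np.diag(r(lam)) @ U.T

    # Diffusion kernel as a special case, with bandwidth t:
    # K = spectral_graph_kernel(L, lambda lam: np.exp(-t * lam))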

On the Effectiveness of Laplacian Normalization for Graph Semi-supervised Learning

TLDR
The analysis reveals the limitations of the standard degree-based normalization method in that the resulting normalization factors can vary significantly within each connected component with the same class label, which may cause inferior generalization performance.
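
For reference, the standard degree-based normalisation in question is L_sym = I - D^{-1/2} A D^{-1/2}; the per-node normalisation factors are d_i^{-1/2}, which is what can vary within a component. A quick numpy sketch, assuming no isolated vertices:

    import numpy as np

    def normalized_laplacian(A):
        # L_sym = I - D^{-1/2} A D^{-1/2}; assumes every node has degree > 0.
        d = A.sum(axis=1)
        D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
        return np.eye(A.shape[0]) - D_inv_sqrt @ A @ D_inv_sqrt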

Total Variation Classes Beyond 1d: Minimax Rates, and the Limitations of Linear Smoothers

TLDR
It is proved that these estimators, and more broadly all estimators given by linear transformations of the input data, are suboptimal over the class of functions with bounded variation, meaning that these computationally simpler methods cannot be used for such sophisticated denoising tasks without sacrificing statistical accuracy.
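
The suboptimal estimators here are linear smoothers; the nonlinear alternative usually contrasted with them is total variation denoising on the graph (fused-lasso style). A minimal sketch using cvxpy, with a dense edge-incidence matrix for clarity (edges is an assumed list of node-index pairs):

    import numpy as np
    import cvxpy as cp

    def graph_tv_denoise(y, edges, lam=1.0):
        # argmin_f  0.5 * ||y - f||^2 + lam * sum_{(i,j) in E} |f_i - f_j|
        n = len(y)
        D = np.zeros((len(edges), n))            # edge difference operator
        for e, (i, j) in enumerate(edges):
            D[e, i], D[e, j] = 1.0, -1.0
        f = cp.Variable(n)
        obj = cp.Minimize(0.5 * cp.sum_squares(y - f) + lam * cp.norm1(D @ f))
        cp.Problem(obj).solve()
        return f.value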

Nonparametric Bayesian label prediction on a graph

Regularization and Semi-supervised Learning on Large Graphs

We consider the problem of labeling a partially labeled graph. This setting may arise in a number of situations, from survey sampling to information retrieval to pattern recognition in manifold settings.
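
A classical baseline for this labeling problem is the harmonic-function solution (the paper itself studies regularised variants): clamp the labeled nodes and solve for the unlabeled ones. Sketch, assuming every connected component contains at least one labeled node so that the subsystem is invertible:

    import numpy as np

    def harmonic_labels(L, labeled_idx, y_labeled):
        # Solve L_UU f_U = -L_UL y_L, holding the labeled values fixed.
        n = L.shape[0]
        u = np.setdiff1d(np.arange(n), labeled_idx)
        L_UU = L[np.ix_(u, u)]
        L_UL = L[np.ix_(u, labeled_idx)]
        f = np.zeros(n)
        f[labeled_idx] = y_labeled
        f[u] = np.linalg.solve(L_UU, -L_UL @ y_labeled)
        return f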

An Introduction To The Theory Of Graph Spectra

Uncertainty Quantification in the Classification of High Dimensional Data

TLDR
This paper introduces, develops algorithms for, and investigates the properties of, a variety of Bayesian models for the task of binary classification, and shows that the probit and level set approaches are natural relaxations of the harmonic function approach.

Introduction to Nonparametric Estimation

  • A. Tsybakov
  • Mathematics
    Springer series in statistics
  • 2009
TLDR
The book introduces the fundamental concepts of the theory while keeping the exposition suitable for a first approach to the field, and provides many important and useful results on optimal and adaptive estimation.
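
The lower-bound machinery that the main paper draws from this monograph is the reduction of estimation to testing over a finite, well-separated subfamily of functions. Schematically, with constants and regularity details omitted (see Chapter 2 of the book):

    If $f_0, \dots, f_M$ satisfy $d(f_j, f_k) \ge 2s$ for all $j \ne k$, and the
    Kullback-Leibler divergences satisfy $K(P_{f_j}, P_{f_0}) \le \alpha \log M$
    for a sufficiently small $\alpha > 0$, then

        $$\inf_{\hat f} \max_{0 \le j \le M} P_{f_j}\big( d(\hat f, f_j) \ge s \big) \ge c(\alpha) > 0,$$

    so the minimax risk in $d^2$ is bounded below by a constant times $s^2$.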