Classification via local multi-resolution projections

@article{Monnier2011ClassificationVL,
  title={Classification via local multi-resolution projections},
  author={Jean-Baptiste Monnier},
  journal={arXiv: Statistics Theory},
  year={2011}
}
We focus on the supervised binary classification problem, which consists of guessing the label $Y$ associated with a covariate $X \in \mathbb{R}^d$, given a set of $n$ independent and identically distributed covariates and associated labels $(X_i,Y_i)$. We assume that the law of the random vector $(X,Y)$ is unknown and that the marginal law of $X$ admits a density supported on a set $\mathcal{A}$. In the particular case of plug-in classifiers, solving the classification problem boils down to the estimation of the…
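The plug-in principle mentioned in the abstract can be illustrated with a minimal sketch: estimate the regression function $\eta(x) = \mathbb{P}(Y=1 \mid X=x)$ with any nonparametric estimator, then predict 1 whenever the estimate exceeds 1/2. The k-nearest-neighbor average below is an illustrative stand-in, not the local multi-resolution projection estimator of the paper:

```python
import numpy as np

def knn_regression_estimate(X_train, y_train, x, k=5):
    """Estimate eta(x) = P(Y=1 | X=x) by averaging the labels of
    the k nearest training points (any nonparametric regression
    estimator could be substituted here)."""
    dists = np.linalg.norm(X_train - x, axis=1)
    nearest = np.argsort(dists)[:k]
    return y_train[nearest].mean()

def plug_in_classify(X_train, y_train, x, k=5):
    """Plug-in rule: predict label 1 iff the estimated regression
    function at x exceeds 1/2."""
    return int(knn_regression_estimate(X_train, y_train, x, k) >= 0.5)

# Toy example: the true label is determined by the sign of the first coordinate.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 2))
y_train = (X_train[:, 0] > 0).astype(int)
print(plug_in_classify(X_train, y_train, np.array([1.0, 0.0])))
print(plug_in_classify(X_train, y_train, np.array([-1.0, 0.0])))
```

The quality of the resulting classifier is then driven entirely by the quality of the regression estimate, which is why fast rates for plug-in classifiers reduce to estimation rates for $\eta$.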
Learning from Non-iid Data: Fast Rates for the One-vs-All Multiclass Plug-in Classifiers

We prove new fast learning rates for the one-vs-all multiclass plug-in classifiers trained either from exponentially strongly mixing data or from data generated by a converging drifting distribution.

Upper bounds and aggregation in bipartite ranking

TLDR: A ranking rule is designed by aggregating estimators of the regression function based on the empirical ranking risk, and it is shown that this procedure is adaptive to the margin and smoothness parameters and achieves the same rates as in the classification framework.

A Multi-Resolution Approach for Audio Classification

TLDR: The proposed approach uses a multi-resolution ensemble built from targeted feature extraction on the approximation and detail portions of the signal under multiple transforms, paired with an automatic machine-learning engine for algorithm and parameter selection, together with the LSTM algorithm.

Multi-resolution classification techniques for PTSD detection from audio interviews.

TLDR: A set of novel techniques for identifying PTSD in patients from audio interviews is described; these utilize a multi-resolution decomposition of the audio signal into coarser and finer scales.

References

Showing 1-10 of 54 references

Classification via local multi-resolution projections (extended version)

TLDR: This novel estimation procedure matches the theoretical performance of the celebrated local polynomial estimator (LPE) while outperforming it computationally, and can reach super-fast rates under a margin assumption.

Minimax nonparametric classification - Part I: Rates of convergence

  • Yuhong Yang
  • Computer Science, Mathematics
    IEEE Trans. Inf. Theory
  • 1999
TLDR: It is shown that the two problems are in fact of the same difficulty in terms of rates of convergence under a sufficient condition, which is satisfied by many function classes, including Besov (Sobolev), Lipschitz, and bounded variation.

Classification under polynomial entropy and margin assumptions and randomized estimators

TLDR: The aim of this paper is to develop the PAC-Bayesian point of view and show how the efficiency of a Gibbs estimator relies on the weights given by the prior distribution to the balls centered at the best function in the model and associated with the pseudo-distance.

Nonlinear orthogonal series estimates for random design regression

Model selection for regression on a random design

We consider the problem of estimating an unknown regression function when the design is random with values in . Our estimation procedure is based on model selection and does not rely on any prior…

ON POINTWISE ADAPTIVE CURVE ESTIMATION BASED ON INHOMOGENEOUS DATA

TLDR: This work proposes a method which adapts both to the local amount of data (the design density is unknown) and to the local smoothness of the regression function, allowing for situations with strong variations in the concentration of the observations.

Fast learning rates for plug-in classifiers

TLDR: This work constructs plug-in classifiers that can achieve not only fast but also super-fast rates, that is, rates faster than $n^{-1}$, and establishes minimax lower bounds showing that the obtained rates cannot be improved.

Optimal spatial adaptation to inhomogeneous smoothness: an approach based on kernel estimates with variable bandwidth selectors

A new variable bandwidth selector for kernel estimation is proposed. The application of this bandwidth selector leads to kernel estimates that achieve optimal rates of convergence over Besov classes.

De-noising by soft-thresholding

  • D. Donoho
  • Computer Science
    IEEE Trans. Inf. Theory
  • 1995
TLDR: The authors prove two results about this type of estimator that are unprecedented in several ways: with high probability, $\hat{f}^*_n$ is at least as smooth as $f$ in any of a wide variety of smoothness measures.
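The soft-thresholding rule at the heart of this reference is simple to state: each coefficient $y$ (e.g. a wavelet coefficient of the noisy signal) is shrunk toward zero by a threshold $t$, via $\eta_t(y) = \mathrm{sign}(y)\,(|y| - t)_+$. A minimal sketch of the rule itself (the choice of threshold is application-dependent and illustrative here):

```python
import numpy as np

def soft_threshold(y, t):
    """Soft-thresholding: shrink each coefficient toward zero by t,
    zeroing out anything whose magnitude is below t."""
    return np.sign(y) * np.maximum(np.abs(y) - t, 0.0)

# Example: small coefficients are killed, large ones are shrunk by t.
coeffs = np.array([-3.0, -0.5, 0.2, 2.0])
print(soft_threshold(coeffs, t=1.0))  # magnitudes become [2, 0, 0, 1]
```

Applied to empirical wavelet coefficients with a suitably chosen threshold, this shrinkage is what yields the smoothness and near-minimax properties described in the summary above.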

Convergence rates for pointwise curve estimation with a degenerate design

The nonparametric regression with a random design model is considered. We want to recover the regression function at a point x where the design density is vanishing or exploding. Depending on…
...