• Publications
Analysis of a Random Forests Model
  • G. Biau
  • Computer Science
    J. Mach. Learn. Res.
  • 3 May 2010
TLDR
An in-depth analysis of a random forests model suggested by Breiman (2004), which is very close to the original algorithm, and shows in particular that the procedure is consistent and adapts to sparsity, in the sense that its rate of convergence depends only on the number of strong features and not on how many noise variables are present.
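To make the sparsity setting concrete, here is a minimal sketch, assuming scikit-learn and synthetic data; it uses the off-the-shelf RandomForestRegressor rather than the simplified Breiman (2004) model analysed in the paper, on a problem with two strong features and many noise variables.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic regression problem: 2 strong features, 48 pure-noise features.
n, d_strong, d_noise = 2000, 2, 48
X = rng.uniform(-1, 1, size=(n, d_strong + d_noise))
y = np.sin(np.pi * X[:, 0]) + X[:, 1] ** 2 + 0.1 * rng.standard_normal(n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Off-the-shelf random forest (not the simplified variant analysed in the
# paper); its splits still concentrate on the two informative coordinates.
forest = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("test R^2:", forest.score(X_te, y_te))
print("total importance of the strong features:",
      forest.feature_importances_[:d_strong].sum())
```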
Consistency of Random Forests
TLDR
A step forward in forest exploration is taken by proving a consistency result for Breiman's original algorithm in the context of additive regression models, shedding an interesting light on how random forests can adapt to sparsity.
A random forest guided tour
TLDR
The present article reviews the most recent theoretical and methodological developments for random forests, with special attention given to the selection of parameters, the resampling mechanism, and variable importance measures.
Consistency of Random Forests and Other Averaging Classifiers
TLDR
A number of theorems are given that establish the universal consistency of averaging rules, and it is shown that some popular classifiers, including one suggested by Breiman, are not universally consistent.
On the Performance of Clustering in Hilbert Spaces
TLDR
The main result states that, for an almost surely bounded space, the expected excess clustering risk is O(√(1/n)), and it is argued that random projections work better than other simplistic dimension reduction schemes.
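A minimal sketch of the projection-then-cluster pipeline touched on above, assuming scikit-learn; the data (noisy curves on a fine grid) and all parameter choices are hypothetical, and this is a generic recipe rather than the paper's construction.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.random_projection import GaussianRandomProjection

rng = np.random.default_rng(1)

# Hypothetical high-dimensional (here: discretised functional) data:
# noisy sine curves with two different phases, observed on a fine grid.
t = np.linspace(0, 1, 500)
phase = rng.integers(0, 2, size=400)          # true grouping
X = np.sin(2 * np.pi * (t[None, :] + 0.25 * phase[:, None]))
X += 0.3 * rng.standard_normal(X.shape)

# Reduce dimension with a Gaussian random projection, then run k-means.
X_low = GaussianRandomProjection(n_components=20, random_state=1).fit_transform(X)
labels = KMeans(n_clusters=2, n_init=10, random_state=1).fit_predict(X_low)

agreement = max(np.mean(labels == phase), np.mean(labels != phase))
print("agreement with the true grouping:", agreement)
```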
Nonparametric Spatial Prediction
Let (ℕ*)^N be the integer lattice points in the N-dimensional Euclidean space. We define a nonparametric spatial predictor for the values of a random field indexed by (ℕ*)^N using a kernel method.
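A minimal sketch of a kernel-based spatial predictor in this spirit, using a plain Gaussian kernel smoother over lattice sites; the field, bandwidth, and observation layout are hypothetical and this is not the paper's exact construction.

```python
import numpy as np

def kernel_predict(sites, values, target, bandwidth=1.5):
    """Kernel (Nadaraya-Watson style) prediction of a random field value
    at `target` from observations `values` made at lattice `sites`."""
    sites = np.asarray(sites, dtype=float)
    d2 = np.sum((sites - np.asarray(target, dtype=float)) ** 2, axis=1)
    w = np.exp(-d2 / (2.0 * bandwidth ** 2))
    return np.sum(w * values) / np.sum(w)

# Hypothetical observations of a smooth field on part of the 2-D integer lattice.
rng = np.random.default_rng(2)

def field(i, j):
    return np.sin(0.3 * i) + np.cos(0.4 * j)

sites = [(i, j) for i in range(1, 11) for j in range(1, 11) if (i, j) != (5, 5)]
values = np.array([field(i, j) + 0.1 * rng.standard_normal() for i, j in sites])

print("predicted value at (5, 5):", kernel_predict(sites, values, target=(5, 5)))
print("noise-free value at (5, 5):", field(5, 5))
```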
Sparse single-index model
TLDR
This work considers the single-index model estimation problem from a sparsity perspective using a PAC-Bayesian approach and offers a sharp oracle inequality, which is more powerful than the best known oracle inequalities for other common procedures of single-index recovery.
Functional classification in Hilbert spaces
TLDR
Universal weak consistency of a nearest neighbor-type classifier based on n independent copies of the pair (X,Y) is established, extending the classical result of Stone to infinite-dimensional Hilbert spaces.
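A minimal sketch of a nearest neighbor-type rule on discretised functional data, assuming scikit-learn; the curves and labels are synthetic, and the discretised Euclidean distance stands in for the Hilbert-space norm.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(3)

# Hypothetical functional data: each observation X is a curve on [0, 1],
# discretised on a grid, with label Y determined by the curve's frequency.
t = np.linspace(0, 1, 100)

def sample_curves(n):
    y = rng.integers(0, 2, size=n)
    base = np.where(y[:, None] == 0, np.sin(2 * np.pi * t), np.sin(4 * np.pi * t))
    return base + 0.4 * rng.standard_normal((n, len(t))), y

X_train, y_train = sample_curves(200)
X_test, y_test = sample_curves(100)

# k-nearest neighbour rule with the discretised L2 distance between curves.
knn = KNeighborsClassifier(n_neighbors=5).fit(X_train, y_train)
print("test accuracy:", knn.score(X_test, y_test))
```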
New insights into Approximate Bayesian Computation
TLDR
This paper analyzes the ABC procedure from the point of view of k-nearest neighbor theory and explores the statistical properties of its outputs, including some asymptotic features of the genuine conditional density estimate associated with ABC.
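A toy sketch of the k-nearest neighbor view of ABC on a conjugate Gaussian example (hypothetical numbers throughout), where the exact posterior mean is available for comparison.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy ABC: infer the mean theta of a Normal(theta, 1) sample, prior Normal(0, 1).
# Observed summary statistic: the sample mean.
n_obs = 20
theta_true = 1.0
s_obs = rng.normal(theta_true, 1.0, size=n_obs).mean()

# Simulate (theta, summary) pairs from the prior and the model.
n_sim, k = 100_000, 500
theta = rng.normal(0.0, 1.0, size=n_sim)
s_sim = rng.normal(theta, 1.0 / np.sqrt(n_obs))   # law of the sample mean

# k-nearest-neighbor acceptance: keep the k simulations whose summary
# statistic is closest to the observed one.
keep = np.argsort(np.abs(s_sim - s_obs))[:k]
posterior_sample = theta[keep]

# Exact conjugate posterior mean for comparison.
exact_mean = n_obs * s_obs / (n_obs + 1)
print("ABC posterior mean:", posterior_sample.mean(), "exact:", exact_mean)
```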
Supervised reconstruction of biological networks with local models
TLDR
This work introduces a novel method which predicts whether there is an edge from a newly added vertex to each of the vertices of a known network, using local models.
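A minimal sketch of the local-model idea, assuming scikit-learn and hypothetical vertex features: one classifier per known vertex predicts whether a new vertex connects to it.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(5)

# Hypothetical data: each vertex carries a feature vector, and an edge between
# two vertices is more likely when their feature vectors are similar.
n_vertices, d = 60, 10
features = rng.standard_normal((n_vertices, d))
similarity = features @ features.T
adjacency = (similarity > np.quantile(similarity, 0.75)).astype(int)
np.fill_diagonal(adjacency, 0)

# One "local model" per known vertex v, trained to predict from a vertex's
# features whether that vertex is connected to v.
local_models = {}
for v in range(n_vertices):
    others = np.array([u for u in range(n_vertices) if u != v])
    y = adjacency[others, v]
    if 0 < y.sum() < len(y):  # need both classes to fit a classifier
        local_models[v] = LogisticRegression(max_iter=1000).fit(features[others], y)

# For a newly added vertex, query every local model to score candidate edges.
new_vertex = rng.standard_normal(d)
scores = {v: m.predict_proba(new_vertex[None, :])[0, 1] for v, m in local_models.items()}
top = sorted(scores, key=scores.get, reverse=True)[:5]
print("top-5 predicted neighbours of the new vertex:", top)
```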
...