• Publications
Universal approximation bounds for superpositions of a sigmoidal function
  • A. Barron
  • Mathematics, Computer Science
  • IEEE Trans. Inf. Theory
  • 1 May 1993
Approximation properties of a class of artificial neural networks are established.
  • 2,288 citations (223 highly influential)
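For context (a standard statement of this paper's main result, paraphrased from memory rather than quoted from the page; constants as usually cited): for a target function f on a ball B_r whose Fourier transform has finite first moment C_f = ∫ |ω| |f̂(ω)| dω, there is an n-node sigmoidal network f_n with

```latex
\int_{B_r} \bigl( f(x) - f_n(x) \bigr)^2 \, \mu(dx) \;\le\; \frac{(2 r C_f)^2}{n}
```

i.e. a squared-error rate of order 1/n that does not deteriorate with the input dimension, the property for which the bound is best known.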
Risk bounds for model selection via penalization
Performance bounds for criteria for model selection are developed using recent theory for sieves. The model selection criteria are based on an empirical loss or contrast function with an…
  • 702 citations (113 highly influential)
The Minimum Description Length Principle in Coding and Modeling
We review the principles of minimum description length and stochastic complexity as used in data compression and statistical modeling.
  • 1,046 citations (89 highly influential)
Minimum complexity density estimation
The authors introduce an index of resolvability that is proved to bound the rate of convergence of minimum complexity density estimators as well as the information-theoretic redundancy of the corresponding total description length.
  • 522 citations (66 highly influential)
Information-theoretic asymptotics of Bayes methods
In the absence of knowledge of the true density function, Bayesian models take the joint density function for a sequence of n random variables to be an average of densities with respect to a prior.
  • 433 citations (54 highly influential)
Information-theoretic determination of minimax rates of convergence
We present some general results determining minimax bounds on statistical risk for density estimation based on certain information-theoretic considerations. These bounds depend only on metric entropy…
  • 451 citations (54 highly influential)
The consistency of posterior distributions in nonparametric problems
We give conditions that guarantee that the posterior probability of every Hellinger neighborhood of the true distribution tends to 1 almost surely. The conditions are (1) a requirement that the prior…
  • 332 citations (48 highly influential)
ENTROPY AND THE CENTRAL LIMIT THEOREM
We extend an argument of Brown (1982) to show that the Fisher informations converge to the reciprocal of the variance.
  • 327 citations (45 highly influential)
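In symbols, the statement summarized above reads as follows (a sketch of the standard formulation, not quoted from the paper): for i.i.d. X_i with mean 0 and variance σ², and standardized sums S_n = (X_1 + ⋯ + X_n)/√n, the Fisher information of S_n converges to that of the limiting Gaussian,

```latex
I(S_n) \;\longrightarrow\; \frac{1}{\sigma^2} \;=\; I\bigl( N(0, \sigma^2) \bigr)
```

which, combined with de Bruijn's identity, gives convergence of the entropy of S_n to the Gaussian entropy, an entropy-theoretic strengthening of the central limit theorem.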
APPROXIMATION OF DENSITY FUNCTIONS BY SEQUENCES OF EXPONENTIAL FAMILIES
Consider the estimation of a probability density function p(x) defined on a bounded interval. We approximate the logarithm of the density by a basis function expansion consisting…
  • 231 citations (45 highly influential)
Information Theory and Mixing Least-Squares Regressions
  • G. Leung, A. Barron
  • Mathematics, Computer Science
  • IEEE Transactions on Information Theory
  • 1 August 2006
We develop and analyze methods for combining estimators from various models for squared-error loss and Gaussian regression with Bayes procedures.
  • 222 citations (37 highly influential)