
- Andrew R. Barron
- IEEE Trans. Information Theory
- 1993

Approximation properties of a class of artificial neural networks are established. It is shown that feedforward networks with one layer of sigmoidal nonlinearities achieve integrated squared error of…
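For context, the bound this abstract is describing takes roughly the following form (a sketch from the published result, with $f_n$ an $n$-unit one-hidden-layer sigmoidal network, $B_r$ a ball of radius $r$, and $\hat f$ the Fourier transform of $f$):

```latex
\int_{B_r} \bigl(f(x) - f_n(x)\bigr)^2 \,\mu(dx) \;\le\; \frac{(2 r C_f)^2}{n},
\qquad
C_f = \int_{\mathbb{R}^d} |\omega|\,\bigl|\hat f(\omega)\bigr|\, d\omega .
```

The notable feature is that the $O(1/n)$ rate does not degrade with the input dimension $d$; the dimension enters only through the spectral norm $C_f$.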

- Andrew R. Barron, Jorma Rissanen, Bin Yu
- IEEE Trans. Information Theory
- 1998

We review the principles of minimum description length and stochastic complexity as used in data compression and statistical modeling. Stochastic complexity is formulated as the solution to optimum…
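As a rough sketch of the quantity being reviewed: in its normalized maximum likelihood form, the stochastic complexity of data $x^n$ relative to a model class $\{p(\cdot \mid \theta)\}$ with maximum-likelihood estimator $\hat\theta(\cdot)$ can be written (normalization shown for discrete data; an integral replaces the sum in the continuous case):

```latex
\mathrm{SC}(x^n) \;=\; -\log \frac{p\bigl(x^n \mid \hat\theta(x^n)\bigr)}
{\sum_{y^n} p\bigl(y^n \mid \hat\theta(y^n)\bigr)} .
```

The normalizer penalizes model classes that can fit many data sets well, which is the sense in which stochastic complexity solves an optimum coding problem.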

- Andrew R. Barron, Thomas M. Cover
- IEEE Trans. Information Theory
- 1991

The authors introduce an index of resolvability that is proved to bound the rate of convergence of minimum complexity density estimators as well as the information-theoretic redundancy of the…

We present some general results determining minimax bounds on statistical risk for density estimation based on certain information-theoretic considerations. These bounds depend only on metric entropy…

- Bertrand S. Clarke, Andrew R. Barron
- IEEE Trans. Information Theory
- 1990

In the absence of knowledge of the true density function, Bayesian models take the joint density function for a sequence of n random variables to be an average of densities with respect to a prior…
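The central asymptotic result of this line of work can be sketched as follows: for a smooth $d$-parameter family with Fisher information $I(\theta)$, prior density $w(\theta)$, and Bayes mixture $M_n = \int P_\theta^n \, w(\theta)\, d\theta$, the relative entropy between the true joint distribution and the mixture behaves as

```latex
D\bigl(P_\theta^n \,\big\|\, M_n\bigr)
\;=\; \frac{d}{2}\,\log\frac{n}{2\pi e}
\;+\; \log\frac{\sqrt{\det I(\theta)}}{w(\theta)}
\;+\; o(1).
```

The leading $\tfrac{d}{2}\log n$ term is the familiar per-parameter cost that also appears in MDL model-selection criteria.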

- Andrew R. Barron
- Machine Learning
- 1991

For a common class of artificial neural networks, the mean integrated squared error between the estimated network and a target function f is shown to be bounded by O(…

1. Introduction. Consider the estimation of a probability density function p(x) defined on a bounded interval. We approximate the logarithm of the density by a basis function expansion consisting…

- Andrew R. Barron
- 1986

An argument of Brown (1982) is extended to show that the Fisher informations converge to the reciprocal of the variance.

We give conditions that guarantee that the posterior probability of every Hellinger neighborhood of the true distribution tends to 1 almost surely. The conditions are (1) a requirement that the prior…

- Jonathan Q. Li, Andrew R. Barron
- NIPS
- 1999

Gaussian mixtures (or so-called radial basis function networks) for density estimation provide a natural counterpart to sigmoidal neural networks for function fitting and approximation. In both…
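The setting this abstract describes, Gaussian-mixture density estimation, is commonly fit by expectation-maximization. A minimal 1-D sketch (function names and initialization scheme are illustrative choices, not taken from the paper):

```python
import numpy as np

def fit_gaussian_mixture(x, k=2, iters=100, seed=0):
    """Fit a 1-D k-component Gaussian mixture to samples x by EM."""
    rng = np.random.default_rng(seed)
    n = len(x)
    # Initialize: means drawn from the data, common variance, uniform weights.
    mu = rng.choice(x, size=k, replace=False)
    var = np.full(k, np.var(x))
    w = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: responsibilities r[i, j] proportional to w_j * N(x_i; mu_j, var_j),
        # computed in log space for numerical stability.
        d2 = (x[:, None] - mu[None, :]) ** 2
        log_p = np.log(w) - 0.5 * (np.log(2 * np.pi * var) + d2 / var)
        log_p -= log_p.max(axis=1, keepdims=True)
        r = np.exp(log_p)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and variances from responsibilities.
        nk = r.sum(axis=0)
        w = nk / n
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu[None, :]) ** 2).sum(axis=0) / nk
    return w, mu, var

def mixture_density(x, w, mu, var):
    """Evaluate the fitted mixture density at points x."""
    d2 = (x[:, None] - mu[None, :]) ** 2
    return (w * np.exp(-0.5 * d2 / var) / np.sqrt(2 * np.pi * var)).sum(axis=1)
```

On well-separated data (e.g. two clusters around -3 and +3) the recovered means land near the true centers; the analogy the abstract draws is that each Gaussian bump plays the role a sigmoidal unit plays in function fitting.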