Multilayer Feedforward Networks with a Non-Polynomial Activation Function Can Approximate Any Function
@article{Leshno1993MultilayerFN,
  title   = {Multilayer Feedforward Networks with a Non-Polynomial Activation Function Can Approximate Any Function},
  author  = {Moshe Leshno and Vladimir Ya. Lin and Allan Pinkus and Shimon Schocken},
  journal = {New York University Stern School of Business Research Paper Series},
  year    = {1993}
}
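The paper's headline result is that a single hidden layer with any non-polynomial activation suffices for universal approximation. A minimal sketch of this in practice (not code from the paper; the target function, width, and weight scales are arbitrary illustrative choices) is to fix random hidden-layer weights, use a non-polynomial activation such as tanh, and solve for the output weights by linear least squares:

```python
import numpy as np

# Illustrative sketch: approximate f(x) = sin(x) on [-pi, pi] with one hidden
# layer of tanh units (a non-polynomial activation). The hidden-layer weights
# are random; only the output weights are fitted, via least squares.
rng = np.random.default_rng(0)
n_hidden = 50
x = np.linspace(-np.pi, np.pi, 200)
target = np.sin(x)

# Random input weights and biases for the hidden layer.
w = rng.normal(scale=2.0, size=n_hidden)
b = rng.normal(scale=2.0, size=n_hidden)
hidden = np.tanh(np.outer(x, w) + b)  # design matrix, shape (200, n_hidden)

# Solve for the output weights by linear least squares.
c, *_ = np.linalg.lstsq(hidden, target, rcond=None)
approx = hidden @ c

max_err = np.max(np.abs(approx - target))
print(f"max error: {max_err:.4f}")
```

With a polynomial activation in place of `tanh`, the hidden units would span only a fixed finite-dimensional polynomial space regardless of width, which is why non-polynomiality is the decisive condition in the theorem.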
1,536 Citations
Neural Networks for Optimal Approximation of Smooth and Analytic Functions
- Mathematics, Computer Science · Neural Computation
- 1996
We prove that neural networks with a single hidden layer are capable of providing an optimal order of approximation for functions assumed to possess a given number of derivatives, if the activation…
Three-Layer Feedforward Structures Smoothly Approximating Polynomial Functions
- Computer Science · ICANN
- 2010
This paper considers a structure of three-layer feedforward networks that approximate polynomial functions and shows that the obtained feedforward network smoothly approximates the polynomial function.
On smooth activation functions
- Mathematics
- 1997
We had earlier constructed neural networks which are capable of providing optimal approximation rates for smooth target functions. The activation functions evaluated by the principal elements of…
A Single Hidden Layer Feedforward Network with Only One Neuron in the Hidden Layer Can Approximate Any Univariate Function
- Computer Science · Neural Computation
- 2016
This work constructs algorithmically a smooth, sigmoidal, almost monotone activation function providing approximation to an arbitrary continuous function within any degree of accuracy.
On the Approximation Properties of Neural Networks
- Computer Science · arXiv
- 2019
This work improves upon existing results in the literature by significantly relaxing the required assumptions on the activation function and by providing a better rate of approximation of a two-layer neural network as the number of neurons increases.
Simultaneous approximations of multivariate functions and their derivatives by neural networks with one hidden layer
- Mathematics, Computer Science · Neurocomputing
- 1996
Approximation to continuous functionals and operators using adaptive higher-order feedforward neural networks
- Computer Science · IJCNN'99. International Joint Conference on Neural Networks. Proceedings (Cat. No.99CH36339)
- 1999
Universal approximation theorems for AHFNN on continuous functionals and continuous operators are given, and learning algorithms based on the steepest descent rule are derived to tune the free parameters in NAF as well as the connection weights between neurons.
Some negative results for single layer and multilayer feedforward neural networks
- Computer Science
- 2018
This work proves a negative result for approximation of functions defined on compact subsets of $\mathbb{R}^d$ with single layer feedforward neural networks with arbitrary activation functions, and claims the existence of learning functions f(x) which are as difficult to approximate with these neural networks as one may want.
References
Showing 1–10 of 53 references
Approximation capabilities of multilayer feedforward networks
- Computer Science · Neural Networks
- 1991
Approximating and learning unknown mappings using multilayer feedforward networks with bounded weights
- Computer Science · 1990 IJCNN International Joint Conference on Neural Networks
- 1990
It is shown that feedforward networks having bounded weights are not undesirably restricted, but are in fact universal approximators, provided that the hidden-layer activation function belongs to one…
Multilayer feedforward networks are universal approximators
- Computer Science, Mathematics · Neural Networks
- 1989
Approximation by superpositions of a sigmoidal function
- Computer Science · Math. Control. Signals Syst.
- 1989
In this paper we demonstrate that finite linear combinations of compositions of a fixed, univariate function and a set of affine functionals can uniformly approximate any continuous function of n real…
On the approximate realization of continuous mappings by neural networks
- Computer Science · Neural Networks
- 1989
There exists a neural network that does not make avoidable mistakes
- Computer Science · IEEE 1988 International Conference on Neural Networks
- 1988
The authors show that a multiple-input, single-output, single-hidden-layer feedforward network with (known) hardwired connections from input to hidden layer, monotone squashing at the hidden layer…
Theory of the backpropagation neural network
- Computer Science
- 1989
A speculative neurophysiological model illustrating how the backpropagation neural network architecture might plausibly be implemented in the mammalian brain for corticocortical learning between nearby regions of the cerebral cortex is presented.
Approximation by superpositions of a sigmoidal function
- Mathematics · Math. Control. Signals Syst.
- 1992
The reduction of multidimensional density to one-dimensional density as in the proof of Lemma 1 had previously been obtained by Dahmen and Micchelli, using the same techniques, in work on ridge functions.