Multilayer feedforward networks are universal approximators

@article{Hornik1989MultilayerFN,
  title={Multilayer feedforward networks are universal approximators},
  author={Kurt Hornik and Maxwell B. Stinchcombe and Halbert L. White},
  journal={Neural Networks},
  year={1989},
  volume={2},
  pages={359-366}
}
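
As a point of reference for the results listed below, the network class the paper proves to be universally approximating computes a finite sum of squashed affine functions of the input. The NumPy sketch below only illustrates that functional form; the logistic squashing function, the random weights, and all variable names are assumptions made for the example, not anything taken from the paper.

```python
import numpy as np

def sigmoid(z):
    # Logistic squashing function; the theorem allows any non-constant,
    # bounded, monotonically increasing activation.
    return 1.0 / (1.0 + np.exp(-z))

def one_hidden_layer_net(x, W, b, beta):
    # sum_j beta_j * sigmoid(w_j . x + b_j), evaluated for each row of x.
    return sigmoid(x @ W.T + b) @ beta

# Toy usage with arbitrary (unfitted) parameters.
rng = np.random.default_rng(0)
n_hidden, n_in = 8, 2
W = rng.normal(size=(n_hidden, n_in))   # input-to-hidden weights
b = rng.normal(size=n_hidden)           # hidden biases
beta = rng.normal(size=n_hidden)        # hidden-to-output weights

x = rng.uniform(size=(5, n_in))
print(one_hidden_layer_net(x, W, b, beta))  # shape (5,)
```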

Neural networks for localized approximation

We prove that feedforward artificial neural networks with a single hidden layer and an ideal sigmoidal response function cannot provide localized approximation in a Euclidean space of dimension ...

Neural networks with a continuous squashing function in the output are universal approximators

A constructive method for multivariate function approximation by multilayer perceptrons

TLDR
It is shown how to construct a perceptron with two hidden layers that approximates multivariate functions by a linear combination of local functions, in the same manner as networks based on Gaussian potential functions.
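
To make "linear combination of local functions" concrete, the sketch below builds a sigmoid-based bump that is close to 1 inside an axis-aligned box and close to 0 outside, using two layers of sigmoidal units, and then sums weighted bumps over a grid. This is a generic illustration of localized approximation with two hidden layers, not the construction used in the paper; the steepness constant, the target function, and all names are assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def box_bump(x, lower, upper, steep=50.0):
    # First hidden layer: two steep sigmoids per coordinate give a value
    # near 1 when lower_i < x_i < upper_i and near 0 otherwise.
    inside = sigmoid(steep * (x - lower)) - sigmoid(steep * (x - upper))
    # Second hidden layer: one more sigmoid fires only when all d
    # coordinates lie in their intervals (the sum is close to d).
    d = x.shape[-1]
    return sigmoid(steep * (inside.sum(axis=-1) - d + 0.5))

def local_combination(x, centers, half_width, values):
    # Approximate f by sum_k f(c_k) * bump_k(x), one bump per grid cell.
    out = np.zeros(x.shape[0])
    for c, v in zip(centers, values):
        out += v * box_bump(x, c - half_width, c + half_width)
    return out

# Toy usage: approximate f(x) = sin(pi * x0) * x1 on [0, 1]^2 with a
# 10x10 grid of bumps (grid spacing, width, and f are arbitrary choices).
f = lambda p: np.sin(np.pi * p[..., 0]) * p[..., 1]
grid = np.linspace(0.05, 0.95, 10)
centers = np.array([[cx, cy] for cx in grid for cy in grid])
values = f(centers)
x = np.random.default_rng(1).uniform(size=(4, 2))
print(local_combination(x, centers, 0.05, values))
print(f(x))  # compare with the true values
```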

Constructive Approximation of Discontinuous Functions by Neural Networks

TLDR
A constructive proof that a real, piecewise continuous function can be almost uniformly approximated by single hidden-layer feedforward neural networks (SLFNNs) is given.
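
One standard device behind such constructions (stated here as general background, not as the paper's specific argument) is that a steep sigmoid converges pointwise to a unit step, so a jump discontinuity can be matched up to an arbitrarily thin transition region:

```latex
% Pointwise limit of a steep logistic sigmoid at a jump located at c
% (valid for x \neq c); the approximation fails only on an interval
% around c whose width shrinks like 1/k.
\[
  \lim_{k \to \infty} \sigma\bigl(k\,(x - c)\bigr)
  \;=\; H(x - c)
  \;=\; \begin{cases} 1, & x > c, \\ 0, & x < c. \end{cases}
\]
```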

Function approximation using a partition of the input space

  • P. Koiran
  • Computer Science, Mathematics
    Proceedings of the 1992 IJCNN International Joint Conference on Neural Networks
  • 1992
TLDR
It is shown that a simple geometric proof of the universal approximation theorem can be extended to networks of units using a smooth output function, and a recent result on the approximation of polynomials by fixed-size networks is improved.

Neural networks for optimal approximation of smooth and analytic functions

We prove that neural networks with a single hidden layer are capable of providing an optimal order of approximation for functions assumed to possess a given number of derivatives, if the activation ...
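
For orientation, the kind of bound meant by "optimal order of approximation" can be written as follows. This is a hedged paraphrase of the standard result for functions with r bounded derivatives in d variables, with constants and the technical conditions on the activation omitted; it is not quoted from the paper.

```latex
% For f with r bounded derivatives on a compact subset of R^d and a
% suitable activation, there is a single-hidden-layer network N_n with
% n hidden units such that
\[
  \sup_{x} \,\bigl| f(x) - N_n(x) \bigr| \;\le\; C\, n^{-r/d},
\]
% and the exponent r/d cannot be improved uniformly over that class,
% which is the sense in which the order is optimal.
```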

Feedforward networks with one hidden layer and their rates of approximation

We present an overview of some rates of approximation with respect to various computational units in the hidden layer of a feedforward neural network with one hidden layer. The problem of estimating the number
...

References

Showing 1-10 of 22 references.

Universal approximation using feedforward networks with non-sigmoid hidden layer activation functions

TLDR
Multilayer feedforward networks possess universal approximation capabilities by virtue of the presence of intermediate layers with sufficiently many parallel processors; the properties of the intermediate-layer activation function are not so crucial.

There exists a neural network that does not make avoidable mistakes

  • A. Gallant, H. White
  • Computer Science
    IEEE 1988 International Conference on Neural Networks
  • 1988
The authors show that a multiple-input, single-output, single-hidden-layer feedforward network with (known) hardwired connections from input to hidden layer, monotone squashing at the hidden layer ...

Approximation by superpositions of a sigmoidal function

  • G. Cybenko
  • Computer Science
    Math. Control Signals Syst.
  • 1989
In this paper we demonstrate that finite linear combinations of compositions of a fixed, univariate function and a set of affine functionals can uniformly approximate any continuous function of n real variables ...
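
Written out, the approximating sums described in the abstract take the form below; Cybenko's theorem states that such sums are dense in C([0,1]^n) whenever σ is a continuous sigmoidal function (the symbols are the usual notation for this result).

```latex
\[
  G(x) \;=\; \sum_{j=1}^{N} \alpha_j\, \sigma\!\bigl( y_j^{\top} x + \theta_j \bigr),
  \qquad x \in [0,1]^{n},\quad \alpha_j, \theta_j \in \mathbb{R},\quad y_j \in \mathbb{R}^{n}.
\]
```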

Nonlinear dynamics of artificial neural systems

TLDR
It is shown that there are many powerful techniques for reducing the number of spurious terms, and that the high-order approach has many advantages over a cascaded slab approach in certain problem areas.

Parallel distributed processing: explorations in the microstructure of cognition, vol. 1: foundations

The fundamental principles, basic mechanisms, and formal analyses involved in the development of parallel distributed processing (PDP) systems are presented in individual chapters contributed by ...

On the Representation of Continuous Functions of Several Variables as Superpositions of Continuous Functions of one Variable and Addition

The aim of this paper is to present a brief proof of the following theorem: Theorem. For any integer n ≥ 2 there are continuous real functions ψ_pq(x) on the closed unit interval E^1 = [0, 1] such ...
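
The representation asserted by the theorem is the classical superposition formula, stated below in its standard form; the outer univariate functions (written χ_q here) depend on f, while the inner functions ψ_pq are continuous on [0, 1] and do not depend on f.

```latex
\[
  f(x_1, \dots, x_n) \;=\; \sum_{q=1}^{2n+1} \chi_q\!\left( \sum_{p=1}^{n} \psi_{pq}(x_p) \right),
  \qquad (x_1, \dots, x_n) \in [0,1]^{n}.
\]
```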

Principles of mathematical analysis

Chapter 1: The Real and Complex Number Systems (Introduction; Ordered Sets; Fields; The Real Field; The Extended Real Number System; The Complex Field; Euclidean Spaces; Appendix; Exercises). Chapter 2: Basic ...

Theory of the Back Propagation Neural Network