Multilayer feedforward networks with a nonpolynomial activation function can approximate any function

@article{Leshno1993MultilayerFN,
  title={Multilayer feedforward networks with a nonpolynomial activation function can approximate any function},
  author={Moshe Leshno and Vladimir Ya. Lin and Allan Pinkus and Shimon Schocken},
  journal={Neural Networks},
  year={1993},
  volume={6},
  pages={861--867}
}
  • Moshe Leshno, Vladimir Ya. Lin, Allan Pinkus, Shimon Schocken
  • Published in Neural Networks, 1993
  • Mathematics, Computer Science
  • Several researchers have characterized the activation functions under which multilayer feedforward networks can act as universal approximators. We show that all the characterizations that were reported thus far in the literature are special cases of the following general result: a standard multilayer feedforward network can approximate any continuous function to any degree of accuracy if and only if the network's activation functions are not polynomial. We also emphasize the important role of the…
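
In symbols, the main result quoted above can be restated as follows. This is a hedged paraphrase for a single hidden layer with a continuous activation σ; the paper itself treats multilayer networks and a broader class of locally bounded activations.

```latex
% Hedged restatement (single hidden layer, continuous activation \sigma).
\[
  \operatorname{span}\{\, x \mapsto \sigma(w \cdot x - \theta) \;:\; w \in \mathbb{R}^n,\ \theta \in \mathbb{R} \,\}
  \ \text{is dense in } C(K) \text{ (uniform norm) for every compact } K \subset \mathbb{R}^n
  \quad\Longleftrightarrow\quad
  \sigma \text{ is not a polynomial.}
\]
```

The "only if" direction is easy to see in one dimension: if σ is a polynomial of degree d, then every network output Σᵢ cᵢ σ(aᵢx + bᵢ) is again a polynomial of degree at most d, so the family cannot be dense in C([a, b]). The "if" direction can be made concrete with a ReLU construction. The NumPy sketch below (illustrative only, not taken from the paper) assembles a one-hidden-layer ReLU network that realizes the piecewise-linear interpolant of sin on [0, 2π]; refining the knots drives the uniform error to zero.

```python
import numpy as np

def relu(t):
    return np.maximum(t, 0.0)

# Target: a continuous function on [0, 2*pi].
f = np.sin
knots = np.linspace(0.0, 2 * np.pi, 41)  # breakpoints; one hidden unit per slope change
vals = f(knots)
slopes = np.diff(vals) / np.diff(knots)  # slope of the interpolant on each sub-interval

def relu_net(x):
    """One-hidden-layer ReLU network whose output is the piecewise-linear
    interpolant of f at the knots:
        net(x) = f(knots[0]) + sum_i w_i * relu(x - knots[i])
    """
    weights = np.concatenate(([slopes[0]], np.diff(slopes)))  # slope changes at the knots
    hidden = relu(x[:, None] - knots[None, :-1])              # (n_points, 40) hidden activations
    return vals[0] + hidden @ weights

x = np.linspace(0.0, 2 * np.pi, 2001)
print("sup-norm error with 40 ReLU units:", np.max(np.abs(relu_net(x) - f(x))))
# By contrast, if relu were replaced by a fixed polynomial of degree d, every such
# network would itself be a polynomial of degree <= d in x, so no amount of width
# could make the family dense in C([0, 2*pi]).
```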



    Citations

    Publications citing this paper (638 total; a selection is shown below).

    On the Bias-Variance Tradeoff: Textbooks Need an Update


    Universal Approximation Using Feedforward Neural Networks: A Survey of Some Existing Methods, and Some New Results


    Universal Approximation by Ridge Computational Models and Neural Networks: A Survey


    An adaptive higher-order neural networks (AHONN) and its approximation capabilities

    • Shuxiang Xu, Ming Zhang
    • Computer Science
    • Proceedings of the 9th International Conference on Neural Information Processing, 2002. ICONIP '02.
    • 2002

    Neural Networks for Pattern Classification and Universal Approximation


    CITATION STATISTICS

    • Citing publications span 1993–2020

    • 33 highly influenced citations

    • Averaged 63 citations per year from 2017 through 2019

    • 58% increase in citations per year in 2019 over 2018

    References

    Publications referenced by this paper (19 total; a selection is shown below).

    Approximation capabilities of multilayer feedforward networks

    • Kurt Hornik
    • Mathematics, Computer Science
    • Neural Networks
    • 1991

    Approximation by superpositions of a sigmoidal function

    • G. Cybenko
    • Mathematics of Control, Signals, and Systems
    • 1989

    Connectionist Learning Procedures


    On the approximate realization of continuous mappings by neural networks


    Capabilities of three-layered perceptrons
