Universal approximation of an unknown mapping and its derivatives using multilayer feedforward networks

@article{Hornik1990UniversalAO,
  title={Universal approximation of an unknown mapping and its derivatives using multilayer feedforward networks},
  author={Kurt Hornik and Maxwell B. Stinchcombe and Halbert L. White},
  journal={Neural Networks},
  year={1990},
  volume={3},
  pages={551--560}
}


Application of an artificial neural network to the control of an active external orthosis of the lower limb
  • D. Guiraud
  • Engineering
    Medical and Biological Engineering and Computing
  • 2006
TLDR
A real-time application of an artificial neural network to a motorised orthosis with six degrees of freedom, a ‘walking machine’ for use by a paraplegic, is presented.
Extension of approximation capability of three layered neural networks to derivatives
  • Y. Ito
  • Computer Science
    IEEE International Conference on Neural Networks
  • 1993
TLDR
The author considers the problem of approximating arbitrary differentiable functions defined on compact sets of R^d, as well as their derivatives, by finite sums of the form a_0 + Σ_{i=1}^{p} a_i g(w_i·x + b_i), where g is an arbitrary non-polynomial C^∞ function fixed beforehand.
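The sum form above is an ordinary single-hidden-layer network. A minimal numeric sketch (not Ito's construction; hidden weights drawn at random, output weights fitted by least squares — all parameter choices here are illustrative assumptions) showing that the same fitted weights approximate both a target function and its derivative:

```python
# Single-hidden-layer form a_0 + sum_i a_i * g(w_i * x + b_i) with g = tanh.
# Hidden weights w_i, b_i are fixed at random; output weights a_i are fitted
# by least squares to f(x) = sin(x). The analytic derivative of the network,
# sum_i a_i * w_i * (1 - tanh^2(w_i * x + b_i)), then approximates f'(x) = cos(x).
import numpy as np

rng = np.random.default_rng(0)
p = 50                                   # number of hidden units (arbitrary)
w = rng.normal(scale=2.0, size=p)        # hidden weights, fixed at random
b = rng.uniform(-3.0, 3.0, size=p)       # hidden biases, fixed at random

x = np.linspace(-np.pi, np.pi, 400)
H = np.tanh(np.outer(x, w) + b)          # hidden activations, shape (400, p)
A = np.column_stack([np.ones_like(x), H])  # prepend a column of ones for a_0

# Fit a = (a_0, a_1, ..., a_p) by least squares against sin(x).
a, *_ = np.linalg.lstsq(A, np.sin(x), rcond=None)

f_hat = A @ a                            # network approximation of sin(x)
df_hat = ((1.0 - H**2) * w) @ a[1:]      # analytic derivative, approximates cos(x)
```

With 50 random tanh features, the least-squares fit to sin is very accurate on the interval, and the derivative inherits a comparable (if somewhat looser, especially near the endpoints) accuracy without ever being fitted to cos directly.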
Overcoming The Limitations of Neural Networks in Composite-Pattern Learning with Architopes
TLDR
It is demonstrated that the feed-forward architecture, for most commonly used activation functions, is incapable of approximating functions comprised of multiple sub-patterns while simultaneously respecting their composite-pattern structure; a simple architecture modification is therefore implemented that reallocates the neurons of any single feed-forward network across several smaller sub-networks, each specialized on a distinct part of the input space.
Orthogonal least squares algorithm for the approximation of a map and its derivatives with a RBF network
Approximation of Curves Contained on the Surface by Feed-Forward Neural Networks
TLDR
Based on feed-forward neural networks, a new method is developed to approximate curves contained on a given surface, converting the problem of space-curve approximation on surfaces into plane-curve approximation by point projection.
Differentiating Functions of the Jacobian with Respect to the Weights
TLDR
The J-prop algorithm is introduced, an efficient general method for computing the exact partial derivatives of a variety of simple functions of the Jacobian of a model with respect to its free parameters.
Degree of Approximation Results for Feedforward Networks Approximating Unknown Mappings and Their Derivatives
TLDR
This work extends Barron's results to feedforward networks with possibly non-sigmoid activation functions approximating mappings and their derivatives simultaneously, showing that the approximation error decreases at rates as fast as n^(−1/2), where n is the number of hidden units.

References

SHOWING 1-10 OF 42 REFERENCES
Universal approximation using feedforward networks with non-sigmoid hidden layer activation functions
TLDR
Multilayer feedforward networks possess universal approximation capabilities by virtue of the presence of intermediate layers with sufficiently many parallel processors; the properties of the intermediate-layer activation function are not so crucial.
Generic constraints on underspecified target trajectories
  • Michael I. Jordan
  • Computer Science
    International 1989 Joint Conference on Neural Networks
  • 1989
TLDR
The author discusses a third possibility in which domain-specific knowledge is incorporated directly in a network learning rule via a set of constraints on activations, which uses the notion of a forward model to give constraints a domain-specific interpretation.
A general explanation and interrogation system for neural networks
TLDR
Techniques are presented for converting the information in a trained network into symbolic form as a set of rules, and for obtaining explanations from the network for specific inputs, giving the neurocomputer one advantage of expert systems while retaining the learning and generalization capability of the neural network.
Multilayer feedforward networks are universal approximators
Nonlinear signal processing using neural networks: Prediction and system modelling
TLDR
It is demonstrated that the backpropagation learning algorithm for neural networks may be used to predict points in a highly chaotic time series with orders-of-magnitude increases in accuracy over conventional methods, including the Linear Predictive Method and the Gabor-Volterra-Wiener Polynomial Method.
Hilbert Space Methods for Partial Differential Equations
Preface: This book is an outgrowth of a course which we have given almost periodically over the last eight years. It is addressed to beginning graduate students of mathematics, engineering, and the …
An elasticity can be estimated consistently without a priori knowledge of functional form
We consider an open question in applied price theory: without a priori knowledge of a firm's cost function or a consumer's indirect utility function, is it possible to estimate price and substitution …
Non-parametric estimation of econometric functionals
In this paper, the author reviews and explores the nonparametric density estimation approach for analyzing various econometric functionals. The applications of density estimation are emphasized in …