Synthesis of feedforward networks in supremum error bound

@article{Ciesielski2000SynthesisOF,
  title={Synthesis of feedforward networks in supremum error bound},
  author={Krzysztof Chris Ciesielski and Jaroslaw P. Sacha and Krzysztof J. Cios},
  journal={IEEE Transactions on Neural Networks},
  year={2000},
  volume={11},
  number={6},
  pages={1213--1227}
}
The main result of this paper is a constructive proof of a formula for the upper bound of the approximation error, in the L∞ (supremum) norm, of multidimensional functions by feedforward networks with one hidden layer of sigmoidal units and a linear output. This result is applied to formulate a new method of neural-network synthesis. The result can also be used to estimate the complexity of the maximum-error network and/or to initialize that network's weights. An example of the network synthesis is…
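The network class the abstract describes (one hidden layer of sigmoidal units, linear output, error measured in the supremum norm) can be sketched numerically. The sketch below is an illustration, not the paper's synthesis method: it uses randomly drawn affine functionals and a least-squares output layer (an assumption for this example), then measures the resulting L∞ error on a grid.

```python
import numpy as np

# Illustrative sketch only: a one-hidden-layer sigmoidal network with a
# linear output, evaluated in the supremum (L-infinity) norm.
# The target function and random-feature construction are assumptions
# for this example, not the paper's synthesis procedure.
rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

n_hidden = 50
# Random affine functionals w*x + b for the hidden sigmoidal units.
w = rng.normal(scale=10.0, size=n_hidden)
b = rng.uniform(-10.0, 10.0, size=n_hidden)

# Target: f(x) = sin(pi * x) on [0, 1], sampled on a grid.
x = np.linspace(0.0, 1.0, 200)
f = np.sin(np.pi * x)

# Hidden-layer activations, shape (n_points, n_hidden).
H = sigmoid(np.outer(x, w) + b)

# Linear output layer solved by least squares (an L2 proxy; the paper
# itself bounds the supremum-norm error).
c, *_ = np.linalg.lstsq(H, f, rcond=None)

# Supremum-norm (L-infinity) error on the grid.
sup_error = np.max(np.abs(H @ c - f))
print(f"sup-norm error with {n_hidden} sigmoidal units: {sup_error:.2e}")
```

The sup-norm error reported here is a grid estimate of the quantity the paper bounds analytically.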
Characterization of Degree of Approximation for Neural Networks with One Hidden Layer
TLDR
By establishing both upper and lower bound estimations on degree of approximation, the essential approximation ability of a class of FNNs is clarified in terms of the modulus of smoothness of functions to be approximated.
ON THE GREEDY RADIAL BASIS FUNCTION NEURAL NETWORKS FOR APPROXIMATION MULTIDIMENSIONAL FUNCTIONS
The aim of this paper is to approximate multidimensional functions f ∈ C(R^s) by using a type of feedforward neural network (FFNN) called greedy radial basis function neural networks.
APPROXIMATION OF MULTIDIMENSIONAL FUNCTIONS BY RADON RADIAL BASIS NEURAL NETWORKS
TLDR
A new method to approximate multidimensional function by using Radial Basis Neural Network with application of Radon Transform, and its inverse, to reduce the dimension of the space is presented.
Approximation of Green Warranty Function by Radon Radial Basis Function Network
TLDR
A new method to approximate a bivariate warranty function by using Radial Basis Function Network with application of Radon Transform and its inverse which is used to reduce the dimension of the warranty space is presented.
PRE-DIAGNOSIS OF LUNG CANCER USING FEED FORWARD NEURAL NETWORK AND BACK PROPAGATION ALGORITHM
TLDR
The aim of the paper is to propose a model for early detection and correct diagnosis of the disease, helping the doctor save the patient's life.
Detection and discrimination between unbalanced supply and phase loss in PMSM using ANN-based protection scheme
TLDR
The presented approach gives a high degree of accuracy in detecting and diagnosing phase-loss faults and faults due to supply-voltage unbalance using an artificial neural network, as proven through experimental validation.
The combination of a histogram-based clustering algorithm and support vector machine for the diagnosis of osteoporosis
TLDR
An automatic approach utilizing a histogram-based automatic clustering (HAC) algorithm with a support vector machine (SVM) to analyse dental panoramic radiographs (DPRs) to improve diagnostic accuracy by identifying postmenopausal women with low BMD or osteoporosis is presented.

References

Showing 1–10 of 36 references
On the approximate realization of continuous mappings by neural networks
Approximating Functions by Neural Networks: A Constructive Solution in the Uniform Norm
Construction of neural nets using the radon transform
The authors present a method for constructing a feedforward neural net implementing an arbitrarily good approximation to any L2 function over (−1, 1)^n. The net uses n input nodes, a…
Universal approximation bounds for superpositions of a sigmoidal function
  • A. Barron
  • Computer Science
    IEEE Trans. Inf. Theory
  • 1993
TLDR
The approximation rate and the parsimony of the parameterization of the networks are shown to be advantageous in high-dimensional settings, and the integrated squared approximation error cannot be made smaller than order 1/n^(2/d) uniformly for functions satisfying the same smoothness assumption.
Approximation by superpositions of a sigmoidal function
  • G. Cybenko
  • Computer Science
    Math. Control. Signals Syst.
  • 1989
In this paper we demonstrate that finite linear combinations of compositions of a fixed, univariate function and a set of affine functionals can uniformly approximate any continuous function of n real…
An Integral Representation of Functions Using Three-layered Networks and Their Approximation Bounds
  • N. Murata
  • Computer Science, Mathematics
    Neural Networks
  • 1996
A universal mapping for kolmogorov's superposition theorem
Approximation theory and feedforward networks