Neural Network with Unbounded Activations is Universal Approximator

@article{Sonoda2015NeuralNW,
  title={Neural Network with Unbounded Activations is Universal Approximator},
  author={Sho Sonoda and N. Murata},
  journal={ArXiv},
  year={2015},
  volume={abs/1505.03654}
}
  • Sho Sonoda, N. Murata
  • Published 2015
  • Computer Science, Mathematics
  • ArXiv
  • This paper investigates the approximation property of neural networks with unbounded activation functions, such as the rectified linear unit (ReLU), which is the new de-facto standard of deep learning. The ReLU network can be analyzed by the ridgelet transform with respect to Lizorkin distributions, which is introduced in this paper. By showing two reconstruction formulas using the Fourier slice theorem and the Radon transform, it is shown that the neural network with unbounded activations…
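  A minimal sketch of the ridgelet machinery the abstract refers to, with notation and normalization assumed here rather than copied from the paper: for a function f on \mathbb{R}^m, a univariate function \psi, and parameters (a, b) \in \mathbb{R}^m \times \mathbb{R}, the ridgelet transform and its dual with respect to another univariate function \eta (e.g. the ReLU) take the form

  \[
    \mathcal{R}_\psi f(a,b) = \int_{\mathbb{R}^m} f(x)\, \overline{\psi(a \cdot x - b)}\, dx,
    \qquad
    \mathcal{R}^\dagger_\eta T(x) = \int_{\mathbb{R}^m \times \mathbb{R}} T(a,b)\, \eta(a \cdot x - b)\, da\, db,
  \]

  and a reconstruction formula asserts that, under an admissibility condition on the pair (\psi, \eta),

  \[
    \mathcal{R}^\dagger_\eta \mathcal{R}_\psi f = K_{\psi,\eta}\, f,
  \]

  where the constant K_{\psi,\eta} and the exact weighting of the measure da\, db are placeholders of this sketch, not the paper's definitions. Discretizing the dual transform gives a finite sum \sum_j c_j\, \eta(a_j \cdot x - b_j), i.e. a one-hidden-layer network with activation \eta, which is how a reconstruction formula of this kind translates into a universal approximation statement.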
