Multilayer Perceptron Algebra
@article{Peng2017MultilayerPA,
  title={Multilayer Perceptron Algebra},
  author={Zhao Peng},
  journal={ArXiv},
  year={2017},
  volume={abs/1701.04968}
}
Artificial Neural Networks (ANNs) have been phenomenally successful on various pattern recognition tasks. However, the design of neural networks relies heavily on the experience and intuition of individual developers. In this article, the author introduces a mathematical structure called MLP algebra on the set of all Multilayer Perceptron neural networks (MLPs), which can serve as a guiding principle for building MLPs suited to particular data sets, and for building complex MLPs from simpler ones.
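The idea of building complex MLPs from simpler ones can be illustrated with a minimal sketch. The paper's actual algebraic operations are not reproduced here; the `MLP` class and the serial `compose` operation below are illustrative assumptions, showing only that two networks with matching dimensions stack into a larger one:

```python
import numpy as np

class MLP:
    """A minimal fully connected network: a list of (weight, bias) layers
    with tanh activations. Illustrative only, not the paper's construction."""
    def __init__(self, layer_sizes, seed=0):
        rng = np.random.default_rng(seed)
        self.layers = [
            (rng.standard_normal((m, n)) * 0.1, np.zeros(m))
            for n, m in zip(layer_sizes[:-1], layer_sizes[1:])
        ]

    def forward(self, x):
        for W, b in self.layers:
            x = np.tanh(W @ x + b)
        return x

def compose(f, g):
    """Serial composition g ∘ f: f's output dimension must equal g's input
    dimension; the result is a deeper MLP computing g(f(x))."""
    assert f.layers[-1][0].shape[0] == g.layers[0][0].shape[1]
    h = MLP([1, 1])               # placeholder instance; layers replaced below
    h.layers = f.layers + g.layers
    return h

f = MLP([4, 8, 3])   # maps R^4 -> R^3
g = MLP([3, 5, 2])   # maps R^3 -> R^2
h = compose(f, g)    # maps R^4 -> R^2
print(h.forward(np.ones(4)).shape)  # (2,)
```

Serial composition is only one plausible operation on the set of MLPs; an algebra in the paper's sense would also need to specify how such operations interact.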
2 Citations
Deep learning in electron microscopy
- Computer Science, Mach. Learn. Sci. Technol.
- 2021
This review paper offers a practical perspective aimed at developers with limited familiarity of deep learning in electron microscopy that discusses hardware and software needed to get started with deep learning and interface with electron microscopes.
Convolutional Neural Network for Centrality Determination in Fixed Target Experiments
- Physics, Physics of Particles and Nuclei
- 2020
Fixed target experiments have a unique possibility to measure the centrality of colliding systems with hadronic calorimeters on the beam line. This is usually achieved by the detection of all forward…
References
Neural Networks and Deep Learning
- Computer Science, Springer International Publishing
- 2018
This chapter explains a fast algorithm for computing such gradients, an algorithm known as backpropagation, which is based on repeated application of the chain rule.
Big Neural Networks Waste Capacity
- Computer Science, ICLR
- 2013
The experiments on ImageNet LSVRC-2010 show that this may be due to highly diminishing returns for capacity in terms of training error, leading to underfitting, and suggesting that the optimization method — first-order gradient descent — fails in this regime.
Approximation by superpositions of a sigmoidal function
- Computer Science, Math. Control. Signals Syst.
- 1989
In this paper we demonstrate that finite linear combinations of compositions of a fixed, univariate function and a set of affine functionals can uniformly approximate any continuous function of n real…
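The approximating form in this result (Cybenko's universal approximation theorem) can be written explicitly; a sketch in standard notation, with $\sigma$ a fixed sigmoidal function:

```latex
G(x) = \sum_{j=1}^{N} \alpha_j \, \sigma\!\left(w_j^{\top} x + \theta_j\right)
```

For any continuous $f$ on $[0,1]^n$ and any $\varepsilon > 0$, there exist $N$, $\alpha_j \in \mathbb{R}$, $w_j \in \mathbb{R}^n$, and $\theta_j \in \mathbb{R}$ such that $|G(x) - f(x)| < \varepsilon$ for all $x \in [0,1]^n$.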
Deep Learning
- Computer ScienceNature
- 2015
Deep learning is making major advances in solving problems that have resisted the best attempts of the artificial intelligence community for many years, and will have many more successes in the near future because it requires very little engineering by hand and can easily take advantage of increases in the amount of available computation and data.
Do Deep Nets Really Need to be Deep?
- Computer Science, NIPS
- 2014
This paper empirically demonstrates that shallow feed-forward nets can learn the complex functions previously learned by deep nets and achieve accuracies previously only achievable with deep models.
Do Deep Convolutional Nets Really Need to be Deep and Convolutional?
- Computer Science, ICLR
- 2017
Yes, they do. This paper provides the first empirical demonstration that deep convolutional models really need to be both deep and convolutional, even when trained with methods such as distillation…