Who Is the Father of Deep Learning?

@article{Tappert2019WhoIT,
  title={Who Is the Father of Deep Learning?},
  author={Charles C. Tappert},
  journal={2019 International Conference on Computational Science and Computational Intelligence (CSCI)},
  year={2019},
  pages={343-348}
}
This paper evaluates candidates for the father of deep learning. We conclude that Frank Rosenblatt developed and explored all the basic ingredients of the deep learning systems of today, and that he should be recognized as a Father of Deep Learning, perhaps together with Hinton, LeCun, and Bengio, who recently received the Turing Award as the fathers of the deep learning revolution.
Citations

A Deep-Network Piecewise Linear Approximation Formula
  • G. Zeng
  • Medicine, Computer Science
  • IEEE Access
  • 2021
TLDR
An explicit architecture for a universal deep network is developed using the Gray code order, together with an explicit formula for the network's weights that is independent of the target function and that, for an arbitrary target function, gives the same result as the shallow piecewise linear interpolation function.
A brief history of AI: how to prevent another winter (a critical review)
TLDR
A brief rundown of artificial intelligence's evolution is provided, highlighting its crucial moments and major turning points from inception to the present, in an attempt to learn from the past, anticipate the future, and discuss what steps may be taken to prevent another winter.
Application of the Deep CNN-Based Method in Industrial System for Wire Marking Identification
Industry 4.0, a term invented by Wolfgang Wahlster in Germany, is celebrating its 10th anniversary in 2021. Still, the digitalization of the production environment is one of the hottest topics in the …

References

Showing 1–10 of 19 references
Learning representations by back-propagating errors
TLDR
Back-propagation repeatedly adjusts the weights of the connections in the network so as to minimize a measure of the difference between the actual output vector of the net and the desired output vector; as a result of the weight adjustments, internal hidden units come to represent important features of the task domain.
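
The mechanism summarized above can be made concrete with a short sketch. The following is a minimal, self-contained illustration of back-propagation on the XOR problem, not code from the cited paper; the network size, sigmoid activation, learning rate, and squared-error loss are all illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

# The four XOR input/output pairs.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer of four sigmoid units (sizes are arbitrary choices).
W1 = rng.normal(scale=0.5, size=(2, 4)); b1 = np.zeros(4)
W2 = rng.normal(scale=0.5, size=(4, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for _ in range(20000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)      # hidden activations
    out = sigmoid(h @ W2 + b2)    # actual output vector

    # Backward pass: error signals at each layer (derivative of squared error).
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Repeatedly adjust the weights to reduce the output error.
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)

print(np.round(out, 2))  # should approach [[0], [1], [1], [0]]

After training, the hidden activations encode the intermediate feature the task requires (roughly, "exactly one input is on"), which is the sense in which hidden units come to represent features of the task domain.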
Parallel distributed processing: explorations in the microstructure of cognition, vol. 1: foundations
The fundamental principles, basic mechanisms, and formal analyses involved in the development of parallel distributed processing (PDP) systems are presented in individual chapters contributed by …
ImageNet classification with deep convolutional neural networks
TLDR
A large, deep convolutional neural network was trained to classify the 1.2 million high-resolution images of the ImageNet LSVRC-2010 contest into 1000 different classes; to reduce overfitting, it employed a recently developed regularization method called "dropout" that proved to be very effective.
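
The "dropout" regularizer mentioned in this summary is simple enough to sketch in a few lines. The version below is the common "inverted dropout" formulation, which rescales surviving activations during training so inference needs no change; the cited paper instead halves the outputs at test time. The function name and parameters here are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

def dropout(activations, p=0.5, training=True):
    # At test time (or with p = 0) the layer is the identity.
    if not training or p == 0.0:
        return activations
    # Zero each unit independently with probability p ...
    mask = rng.random(activations.shape) >= p
    # ... and rescale survivors by 1 / (1 - p) to preserve the expected value.
    return activations * mask / (1.0 - p)

h = np.ones((2, 8))                 # a small batch of hidden activations
print(dropout(h))                   # about half the entries become 0, the rest 2.0
print(dropout(h, training=False))   # unchanged at inference

Because a different random mask is drawn on every forward pass, each unit must learn features that remain useful without relying on the presence of any particular other unit, which is what makes the method an effective regularizer.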
The legacy of Donald O. Hebb: more than the Hebb Synapse
TLDR
The work of Donald O. Hebb and its lasting influence on neuroscience are reviewed in honour of the 2004 centenary of his birth.
Artificial Intelligence for Advanced Human-Machine Symbiosis
TLDR
Augmented cognition seeks to understand the current state of the art in AI and how it may best be applied to advance human-machine symbiosis.
Perceptrons - an introduction to computational geometry
TLDR
The aim of this book is to seek general results from the close study of an abstract version of the devices known as perceptrons.
Parallel Distributed Processing: Explorations in the Microstructure of Cognition, vol 1: Foundations, vol 2: Psychological and Biological Models
TLDR
Artificial neural network research began in the early 1940s and advanced in fits and starts until the late 1960s, when Minsky and Papert published Perceptrons, in which they proved that neural networks as then conceived (single-layer perceptrons) could not compute certain simple functions.
Receptive fields of single neurones in the cat's striate cortex
TLDR
The present investigation, made in acute preparations, includes a study of receptive fields of cells in the cat's striate cortex, which, like retinal ganglion-cell receptive fields, comprised excitatory and inhibitory areas, although their shape and arrangement differed strikingly from the concentric pattern found in retinal ganglion cells.
Principles of neurodynamics