Who Is the Father of Deep Learning?

@article{Tappert2019WhoIT,
  title={Who Is the Father of Deep Learning?},
  author={Charles C. Tappert},
  journal={2019 International Conference on Computational Science and Computational Intelligence (CSCI)},
  year={2019},
  pages={343-348}
}
  • C. Tappert
  • Published 1 December 2019
  • Computer Science
  • 2019 International Conference on Computational Science and Computational Intelligence (CSCI)
This paper evaluates candidates for the father of deep learning. We conclude that Frank Rosenblatt developed and explored all the basic ingredients of the deep learning systems of today, and that he should be recognized as a Father of Deep Learning, perhaps together with Hinton, LeCun, and Bengio, who have just received the Turing Award as the fathers of the deep learning revolution.

A Deep-Network Piecewise Linear Approximation Formula

  • G. Zeng
  • Computer Science
    IEEE Access
  • 2021
TLDR
An explicit architecture for a universal deep network is developed using the Gray code order, together with an explicit, target-function-independent formula for the network's weights that yields the same result as the shallow piecewise linear interpolation function for an arbitrary target function.
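
For orientation, the shallow baseline mentioned in the summary can be sketched in a few lines of Python. This is generic piecewise linear interpolation of an arbitrary target function, not the paper's Gray-code construction; the target function and grid are illustrative assumptions:

  import numpy as np

  # Shallow piecewise linear interpolation of an arbitrary target function
  # on a uniform grid of breakpoints -- the baseline that the deep-network
  # formula is reported to reproduce. Target and grid are illustrative.
  f = lambda x: np.sin(2 * np.pi * x)      # example target function
  knots = np.linspace(0.0, 1.0, 9)         # breakpoints of the interpolant
  x = np.linspace(0.0, 1.0, 201)           # evaluation points
  approx = np.interp(x, knots, f(knots))   # piecewise linear interpolant
  print("max interpolation error:", np.max(np.abs(approx - f(x))))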

Application of Long Short-Term Memory Recurrent Neural Networks Based on the BAT-MCS for Binary-State Network Approximated Time-Dependent Reliability Problems

  • W. Yeh
  • Computer Science
    ArXiv
  • 2022
TLDR
A new algorithm called the LSTM-BAT-MCS, based on long short-term memory (LSTM), Monte Carlo simulation (MCS), and the binary-adaption-tree algorithm (BAT), is proposed; its superiority is demonstrated by experimental results on three benchmark networks.
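
As background for the MCS component only, here is a minimal Monte Carlo estimate of binary-state network reliability on a hypothetical five-edge bridge network. The topology, edge reliability, and sample count are illustrative assumptions; none of this reflects the paper's BAT or LSTM machinery:

  import numpy as np

  # Hypothetical five-edge "bridge" network: nodes 0..3, source 0, sink 3.
  # Topology, edge reliability p, and sample count are illustrative only.
  edges = [(0, 1), (0, 2), (1, 2), (1, 3), (2, 3)]
  p = 0.9
  rng = np.random.default_rng(0)

  def connected(up):
      # depth-first search over the edges that survived this sample
      reach, stack = {0}, [0]
      while stack:
          u = stack.pop()
          for k, (a, b) in enumerate(edges):
              if up[k] and u in (a, b):
                  v = b if u == a else a
                  if v not in reach:
                      reach.add(v)
                      stack.append(v)
      return 3 in reach

  samples = 20000
  hits = sum(connected(rng.random(len(edges)) < p) for _ in range(samples))
  print("estimated source-to-sink reliability:", hits / samples)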

Application of the Deep CNN-Based Method in Industrial System for Wire Marking Identification

TLDR
An in-depth description is presented of the device's underlying methodology, its construction and, foremost, the industrial assembly processes through which it is implemented, along with the advantages and challenges of the device.

References

Learning representations by back-propagating errors

TLDR
Back-propagation repeatedly adjusts the weights of the connections in the network so as to minimize a measure of the difference between the actual output vector of the net and the desired output vector; as a result, internal "hidden" units come to represent important features of the task domain.
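
The procedure the summary describes can be illustrated with a minimal two-layer sigmoid network trained by back-propagation on XOR. The architecture (2-4-1), learning rate, and iteration count are assumptions chosen for the sketch, not details from the paper:

  import numpy as np

  # Two-layer sigmoid network trained by back-propagation to minimize the
  # squared difference between actual and desired outputs, on XOR.
  rng = np.random.default_rng(0)
  X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
  T = np.array([[0], [1], [1], [0]], float)   # desired output vectors
  W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)
  W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)
  sig = lambda z: 1.0 / (1.0 + np.exp(-z))
  for _ in range(10000):
      H = sig(X @ W1 + b1)            # hidden-unit activations
      Y = sig(H @ W2 + b2)            # actual output vector
      dY = (Y - T) * Y * (1 - Y)      # error signal at the output layer
      dH = (dY @ W2.T) * H * (1 - H)  # error propagated back to hidden layer
      W2 -= H.T @ dY; b2 -= dY.sum(axis=0)
      W1 -= X.T @ dH; b1 -= dH.sum(axis=0)
  print(Y.round(2))   # should approach the targets [0, 1, 1, 0]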

Parallel distributed processing: explorations in the microstructure of cognition, vol. 1: foundations

The fundamental principles, basic mechanisms, and formal analyses involved in the development of parallel distributed processing (PDP) systems are presented in individual chapters contributed by members of the PDP Research Group.

ImageNet classification with deep convolutional neural networks

TLDR
A large, deep convolutional neural network was trained to classify the 1.2 million high-resolution images in the ImageNet LSVRC-2010 contest into the 1000 different classes and employed a recently developed regularization method called "dropout" that proved to be very effective.
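
The dropout regularizer mentioned above can be sketched as follows. This uses the now-common "inverted" formulation (rescaling at training time), which differs slightly from the paper's original test-time rescaling:

  import numpy as np

  # Inverted dropout: each activation is zeroed with probability p during
  # training and the survivors are rescaled by 1/(1-p), so the layer is
  # left unchanged at test time.
  rng = np.random.default_rng(0)

  def dropout(h, p=0.5, train=True):
      if not train:
          return h
      mask = (rng.random(h.shape) >= p) / (1.0 - p)
      return h * mask

  h = np.ones((2, 8))
  print(dropout(h))                # about half the units zeroed, rest scaled by 2
  print(dropout(h, train=False))   # identity at test time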

The legacy of Donald O. Hebb: more than the Hebb Synapse

TLDR
The work of Donald O. Hebb and its lasting influence on neuroscience are reviewed in honour of the 2004 centenary of his birth.
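
Hebb's postulate, often summarized as "cells that fire together wire together", has a simple rate-based form, delta_w = eta * x * y. The patterns and learning rate below are illustrative assumptions:

  import numpy as np

  # Rate-based Hebbian update, delta_w = eta * x * y: a connection is
  # strengthened only where pre- and post-synaptic units are co-active.
  eta = 0.1
  X = np.array([[1, 0, 1],
                [0, 1, 0]], float)   # presynaptic activity patterns
  Y = np.array([1, 0], float)        # postsynaptic activity per pattern
  w = np.zeros(3)
  for x, y in zip(X, Y):
      w += eta * x * y               # Hebb's rule
  print(w)   # nonzero only where input was co-active with the output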

Artificial Intelligence for Advanced Human-Machine Symbiosis

TLDR
Augmented cognition seeks to understand the current state of the art of AI and how it may best be applied to advance human-machine symbiosis.

Perceptrons - an introduction to computational geometry

TLDR
The aim of this book is to seek general results from the close study of abstract versions of devices known as perceptrons.

Parallel Distributed Processing: Explorations in the Microstructure of Cognition, vol 1: Foundations, vol 2: Psychological and Biological Models

  • G. Kane
  • Computer Science, History
  • 1994
TLDR
Artificial neural network research began in the early 1940s, advancing in fits and starts until the late 1960s, when Minsky and Papert published Perceptrons, in which they proved fundamental limitations of the single-layer perceptrons then in use, such as their inability to compute the XOR function.
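
That limitation can be made concrete: the perceptron learning rule converges on a linearly separable predicate such as AND, but no weight vector and threshold can realize XOR, so training never settles on correct outputs. A minimal sketch:

  import numpy as np

  # Perceptron learning rule on two predicates: it converges on the
  # linearly separable AND, but XOR is not linearly separable, so the
  # final weights never classify all four points correctly.
  X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)

  def train(T, epochs=50):
      w, b = np.zeros(2), 0.0
      for _ in range(epochs):
          for x, t in zip(X, T):
              y = float(w @ x + b > 0)
              w += (t - y) * x       # perceptron update
              b += (t - y)
      return (X @ w + b > 0).astype(int)

  print(train(np.array([0, 0, 0, 1], float)))   # AND -> converges to [0 0 0 1]
  print(train(np.array([0, 1, 1, 0], float)))   # XOR -> never [0 1 1 0]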

Receptive fields of single neurones in the cat's striate cortex

TLDR
The present investigation, made in acute preparations, includes a study of receptive fields of cells in the cat's striate cortex; these fields resembled retinal ganglion-cell receptive fields, but the shape and arrangement of excitatory and inhibitory areas differed strikingly from the concentric pattern found in retinal ganglion cells.
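
The orientation selectivity described here, as opposed to a concentric center-surround organization, is often illustrated with a toy oriented filter; the 3x3 filter and images below are illustrative assumptions:

  import numpy as np

  # A 3x3 oriented edge filter as a toy analogue of an orientation-
  # selective simple cell: strong response to a vertical edge, no net
  # response to a horizontal one.
  vertical_filter = np.array([[-1, 0, 1]] * 3, float)   # inhibitory/excitatory flanks
  img_vertical = np.tile([0.0, 0.0, 1.0], (3, 1))       # vertical luminance edge
  img_horizontal = img_vertical.T                       # horizontal luminance edge
  print((vertical_filter * img_vertical).sum())     # 3.0: preferred orientation
  print((vertical_filter * img_horizontal).sum())   # 0.0: orthogonal orientation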

Principles of neurodynamics