Who Is the Father of Deep Learning?

  • C. Tappert
  • Published 1 December 2019
  • Computer Science
  • 2019 International Conference on Computational Science and Computational Intelligence (CSCI)
This paper evaluates candidates for the father of deep learning. We conclude that Frank Rosenblatt developed and explored all the basic ingredients of today's deep learning systems, and that he should be recognized as a Father of Deep Learning, perhaps together with Hinton, LeCun, and Bengio, who have just received the Turing Award as the fathers of the deep learning revolution.

A Deep-Network Piecewise Linear Approximation Formula

  • G. Zeng
  • Computer Science
    IEEE Access
  • 2021
An explicit architecture for a universal deep network is developed using the Gray code order, together with an explicit formula for the network's weights that is independent of the target function and yields the same result as the shallow piecewise linear interpolation function for an arbitrary target function.

Application of Long Short-Term Memory Recurrent Neural Networks Based on the BAT-MCS for Binary-State Network Approximated Time-Dependent Reliability Problems

  • W. Yeh
  • Computer Science
  • 2022
A new algorithm called the LSTM-BAT-MCS is proposed, combining long short-term memory (LSTM), Monte Carlo simulation (MCS), and the binary-adaption-tree algorithm (BAT); experimental results on three benchmark networks demonstrate the superiority of the proposed algorithm.

Application of the Deep CNN-Based Method in Industrial System for Wire Marking Identification

An in-depth description is presented of the underlying methodology of this device, its construction, and, foremost, the industrial assembly processes through which it is implemented, along with the advantages and challenges of the device.

Learning representations by back-propagating errors

Back-propagation repeatedly adjusts the weights of the connections in the network so as to minimize a measure of the difference between the actual output vector of the net and the desired output vector; as a result, the hidden units come to represent important features of the task domain.
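The weight-adjustment loop described above can be sketched concretely. The following is a minimal, generic illustration, not the authors' original code: a two-input network with one hidden layer trained on XOR (a task a single-layer perceptron cannot learn) by stochastic gradient descent on squared error. All names, the hidden-layer size, and the learning rate are illustrative assumptions.

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

random.seed(0)
H = 3  # hidden units (assumed; any small number works)
# hidden layer: H neurons, each with 2 input weights + a bias
w1 = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(H)]
# output neuron: H weights + a bias
w2 = [random.uniform(-1, 1) for _ in range(H + 1)]

# XOR training set: (inputs, target)
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]

def forward(x):
    h = [sigmoid(w[0] * x[0] + w[1] * x[1] + w[2]) for w in w1]
    y = sigmoid(sum(w2[j] * h[j] for j in range(H)) + w2[H])
    return h, y

lr = 0.5
losses = []
for epoch in range(5000):
    total = 0.0
    for x, t in data:
        h, y = forward(x)
        total += 0.5 * (y - t) ** 2
        # delta at the output unit: dE/dnet = (y - t) * sigmoid'(net)
        dy = (y - t) * y * (1 - y)
        # back-propagate the error to hidden deltas (before updating w2)
        dh = [dy * w2[j] * h[j] * (1 - h[j]) for j in range(H)]
        # gradient-descent weight updates
        for j in range(H):
            w2[j] -= lr * dy * h[j]
        w2[H] -= lr * dy
        for j in range(H):
            w1[j][0] -= lr * dh[j] * x[0]
            w1[j][1] -= lr * dh[j] * x[1]
            w1[j][2] -= lr * dh[j]
    losses.append(total)
```

The key step is computing the hidden-layer deltas from the output delta through the output weights, which is what lets the error signal reach weights that have no direct connection to the output.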

Parallel distributed processing: explorations in the microstructure of cognition, vol. 1: foundations

The fundamental principles, basic mechanisms, and formal analyses involved in the development of parallel distributed processing (PDP) systems are presented in individual chapters contributed by members of the PDP Research Group.

ImageNet classification with deep convolutional neural networks

A large, deep convolutional neural network was trained to classify the 1.2 million high-resolution images in the ImageNet LSVRC-2010 contest into the 1000 different classes and employed a recently developed regularization method called "dropout" that proved to be very effective.
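The "dropout" regularizer mentioned above randomly silences units during training so that no unit can rely on the presence of any other. A minimal sketch follows, using the "inverted" variant common in modern frameworks, which rescales surviving activations at training time rather than rescaling weights at test time as in the original formulation; the function name and parameters here are illustrative assumptions.

```python
import random

def dropout(activations, p_drop, training=True):
    """Inverted dropout: during training, zero each unit with probability
    p_drop and scale the survivors by 1/(1 - p_drop), so the expected
    value of each unit is unchanged and no adjustment is needed at test
    time."""
    if not training or p_drop == 0.0:
        return list(activations)
    keep = 1.0 - p_drop
    return [a / keep if random.random() < keep else 0.0 for a in activations]

random.seed(1)
acts = [1.0] * 10_000
dropped = dropout(acts, 0.5)
# Roughly half the units are zeroed; the rest are doubled, so the
# mean stays close to the original mean of 1.0.
mean = sum(dropped) / len(dropped)
```

At test time (`training=False`) the activations pass through unchanged, which is the practical payoff of the inverted formulation.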

The legacy of Donald O. Hebb: more than the Hebb Synapse

The work of Donald O. Hebb is reviewed, along with its lasting influence on neuroscience, in honour of the 2004 centenary of his birth.

Artificial Intelligence for Advanced Human-Machine Symbiosis

Augmented cognition seeks to understand the current state-of-the-art of AI and how it may be best applied for advancing human-machine symbiosis.

Perceptrons - an introduction to computational geometry

The aim of this book is to seek general results from the close study of abstract version of devices known as perceptrons.

Parallel Distributed Processing: Explorations in the Microstructure of Cognition, vol 1: Foundations, vol 2: Psychological and Biological Models

  • G. Kane
  • Computer Science, History
  • 1994
Artificial neural network research began in the early 1940s, advancing in fits and starts, until the late 1960s when Minsky and Papert published Perceptrons, in which they proved that neural networks, as then conceived, were incapable of computing certain simple predicates.

Receptive fields of single neurones in the cat's striate cortex

The present investigation, made in acute preparations, includes a study of receptive fields of cells in the cat's striate cortex; these fields resembled retinal ganglion-cell receptive fields, but the shape and arrangement of their excitatory and inhibitory areas differed strikingly from the concentric pattern found in retinal ganglion cells.

Principles of neurodynamics