A class of instantaneously trained neural networks

  • S. Kak
  • Published 1 December 2002
  • Computer Science
  • Inf. Sci.

Training of CC4 Neural Network with Spread Unary Coding

The modified CC4 algorithm is adapted to train the neural networks using spread unary inputs and it is shown that the number of misclassified points is not particularly sensitive to the chosen radius of generalization.
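A common formulation of spread unary coding (assumed here, since the paper's exact convention is not quoted) represents an integer k as a window of `width` consecutive 1s starting at position k. Adjacent values then differ in exactly two bits, so Hamming distance tracks numeric distance, which is what makes a radius of generalization meaningful for CC4-style networks. The function name is illustrative, not from the paper.

```python
def spread_unary(k, levels, width=5):
    """Encode integer k in 0..levels-1 as a binary code of length
    levels + width - 1 with `width` consecutive 1s starting at position k
    (one assumed definition of spread unary coding)."""
    code = [0] * (levels + width - 1)
    for i in range(k, k + width):
        code[i] = 1
    return code

# Example: 4 levels, window width 3 gives 6-bit codes.
print(spread_unary(0, 4, 3))  # → [1, 1, 1, 0, 0, 0]
print(spread_unary(2, 4, 3))  # → [0, 0, 1, 1, 1, 0]
```

Note that any two adjacent codes differ in exactly two bit positions, regardless of the window width.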

Hybrid Neural Network Architecture for On-Line Learning

This paper proposes a more realistic biologically inspired hybrid neural network architecture that uses two kinds of neural networks simultaneously to consider short-term and long-term characteristics of the signal.

The Basic Kak Neural Network with Complex Inputs

An introduction to the basic Kak network with complex inputs is presented; the network is part of a larger hierarchy of learning schemes that includes quantum models.

Fast Learning Neural Network Using Modified Corners Algorithm

This paper uses a different type of data modeling to solve the fast-learning problem, computing the most probable output of the neural network from the distance between the training data and an unknown input.

Instantaneously trained neural networks with complex inputs

The main contribution of this thesis is the 3C algorithm, which adapts the time-efficient corner classification approach to train feedforward neural networks on complex inputs using prescriptive learning, where the network weights are assigned simply upon examining the inputs.

Research and application of a neural network classifier based on dynamic threshold

The comparison results with the MLP method show that the MLBP classifier model achieves more satisfactory results and is more reliable and effective at solving the problem.

Variable Threshold Neurons Increase Capacity of a Feedback Network

The article presents new results on the use of variable thresholds to increase the capacity of a feedback neural network, along with experimental results from applying this approach to networks of different sizes.

Delta Learning Rule for the Active Sites Model

The recently proposed Active Sites model is extended by developing a delta rule to increase memory capacity and the binary neural network is extended to a multi-level (non-binary) neural network.

The NoN Approach to Autonomic Face Recognition

A method of autonomic face recognition based on the biologically plausible network of networks (NoN) model of information processing is presented, which models the structures in the cerebral cortex described by Mountcastle and the architecture based on that proposed for information processing by Sutton.

Memory Capacity of Neural Networks using a Circulant Weight Matrix

Children are capable of learning soon after birth, indicating that the brain's neural networks have a prior learned capacity arising from the regular structures of the brain's organization; motivated by this, the capacity of circulant matrices as weight matrices in a feedback network is considered.
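A circulant weight matrix is fully determined by its first row, with each subsequent row a cyclic shift of it. The sketch below, a generic illustration rather than the paper's capacity analysis, builds such a matrix and runs Hopfield-style synchronous feedback updates on a bipolar state; the function names are assumptions for illustration.

```python
import numpy as np

def circulant(row):
    """Build a circulant matrix whose i-th row is `row` rotated right by i,
    so the whole weight matrix is specified by a single row."""
    n = len(row)
    return np.array([np.roll(row, i) for i in range(n)])

def recall(W, state, steps=10):
    """Generic synchronous feedback updates on a bipolar (+1/-1) state,
    thresholding the net input at zero each step."""
    for _ in range(steps):
        state = np.where(W @ state >= 0, 1, -1)
    return state

# Example: a 4-neuron feedback network defined by one 4-element row.
W = circulant([0, 1, 0, 1])
print(recall(W, np.array([1, 1, 1, 1])))
```

The design point is storage economy: an n-neuron feedback network needs only n weights instead of n*n, at the cost of constraining which patterns can be stored.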



Instantaneous Learning Neural Networks.

This talk discusses several types of neural networks with the instantaneous learning property, including the CC4 Corner Classification neural network, and some comparison of generalization performance is presented.

On Generalization by Neural Networks

  • S. Kak
  • Computer Science
    Inf. Sci.
  • 1998

New algorithms for training feedforward neural networks

  • S. Kak
  • Computer Science
    Pattern Recognit. Lett.
  • 1994

On training feedforward neural networks

Another corner classification algorithm presented in this paper does not require any computations to find the weights and in its basic form it does not perform generalization.

A new corner classification approach to neural network training

The corner classification approach to neural network training has the excellent capability of prescriptive learning, where the network weights are prescribed merely by inspection of the training data.
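The prescriptive assignment described above can be sketched in a few lines. In the CC4 scheme, each training sample gets one hidden neuron: input-to-hidden weights are +1 for 1-bits and -1 for 0-bits, and a bias weight of r - s + 1 (s = number of 1s in the sample, r = radius of generalization) makes the neuron fire exactly when an input lies within Hamming distance r of the stored sample. The function names below are illustrative, not from the paper.

```python
import numpy as np

def cc4_train(X, y, r=1):
    """Prescriptively assign CC4 weights, one hidden neuron per sample;
    no iterative optimization is performed."""
    X = np.asarray(X)
    s = X.sum(axis=1)                          # number of 1s in each sample
    W_h = np.where(X == 1, 1, -1)              # +1 for 1-bits, -1 for 0-bits
    bias = r - s + 1                           # neuron fires iff Hamming dist <= r
    W_o = np.where(np.asarray(y) == 1, 1, -1)  # hidden-to-output weights
    return W_h, bias, W_o

def cc4_predict(X, W_h, bias, W_o):
    h = (np.asarray(X) @ W_h.T + bias > 0).astype(int)  # binary step hidden layer
    return (h @ W_o > 0).astype(int)                    # binary step output

# Example: two stored binary patterns, radius of generalization 1.
X = np.array([[1, 0, 1, 1], [0, 1, 0, 0]])
y = np.array([1, 0])
W_h, b, W_o = cc4_train(X, y, r=1)
# [1, 0, 1, 0] is Hamming distance 1 from the first pattern, so class 1.
print(cc4_predict([[1, 0, 1, 0]], W_h, b, W_o))  # → [1]
```

For an input at Hamming distance d from a stored sample, the hidden neuron's net input is r + 1 - d, which is positive exactly when d <= r, giving the radius-of-generalization behavior.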

Better Web Searches and Prediction with Instantaneously Trained Neural Networks

New neural network designs that model working memory in their ability to learn and generalize instantaneously are developed and their applications to two problems: time-series prediction and an intelligent Web metasearch engine design are described.

Implementing Kak Neural Networks on a Reconfigurable Computing Platform

This paper shows that the Kak algorithm is hardware friendly and is especially suited for implementation in reconfigurable computing using fine grained parallelism and demonstrates that on-line learning with the algorithm is possible through dynamic evolution of the topology of a Kak neural network.

Learning as self-organization

This book discusses the development of self-organization and the social collective in the context of postnatal development, as well as an exploration of the neural bases of memory representations of reward and context.

The Three Languages of the Brain: Quantum, Reorganizational, and Associative

It is increasingly recognized that stimulus-response constructs such as “drive” are often inadequate in providing explanations, and the category “effort” is invoked to explain autonomous behavior.

The information sciences

Through a human centered design project focused on an information science problem, students will gain experience and a better understanding of the process to develop an innovative solution addressing a societal need.