Learning internal representations by error propagation
This chapter contains sections titled: The Problem, The Generalized Delta Rule, Simulation Results, Some Further Generalizations, Conclusion
Learning representations by back-propagating errors
Back-propagation repeatedly adjusts the weights of the connections in the network so as to minimize a measure of the difference between the actual output vector of the net and the desired output vector; as a result, the hidden units come to represent important features of the task domain.
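The mechanism summarized above can be sketched in a few lines. This is a minimal illustrative example, not the paper's actual simulations: a tiny 2-2-1 sigmoid network trained by gradient descent on squared error for the AND task, with an assumed learning rate and random initialization.

```python
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Illustrative task: logical AND over two inputs.
X = [(0, 0), (0, 1), (1, 0), (1, 1)]
T = [0, 0, 0, 1]

# Weights: input->hidden (2x2), hidden->output (2), plus biases.
W1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(2)]
b1 = [0.0, 0.0]
W2 = [random.uniform(-1, 1) for _ in range(2)]
b2 = 0.0
lr = 0.5  # assumed learning rate

def forward(x):
    h = [sigmoid(sum(W1[j][i] * x[i] for i in range(2)) + b1[j])
         for j in range(2)]
    y = sigmoid(sum(W2[j] * h[j] for j in range(2)) + b2)
    return h, y

def loss():
    return sum((forward(x)[1] - t) ** 2 for x, t in zip(X, T))

initial = loss()
for _ in range(2000):
    for x, t in zip(X, T):
        h, y = forward(x)
        # Error signal at the output unit: dE/dnet = (y - t) * y * (1 - y).
        dy = (y - t) * y * (1 - y)
        # Propagate the error back through the old weights to the hidden units.
        dh = [dy * W2[j] * h[j] * (1 - h[j]) for j in range(2)]
        # Gradient-descent weight updates.
        for j in range(2):
            W2[j] -= lr * dy * h[j]
            b1[j] -= lr * dh[j]
            for i in range(2):
                W1[j][i] -= lr * dh[j] * x[i]
        b2 -= lr * dy
final = loss()
```

Running the loop drives the squared error down as the weights are repeatedly nudged against the gradient, the core of the procedure the abstract describes.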
Parallel distributed processing: explorations in the microstructure of cognition, vol. 1: foundations
The fundamental principles, basic mechanisms, and formal analyses involved in the development of parallel distributed processing (PDP) systems are presented in individual chapters contributed by …
An interactive activation model of context effects in letter perception: part 1.: an account of basic findings
A model of context effects in perception is applied to the perception of letters in various contexts; it exhibits the perceptual advantage for letters in words over letters in unrelated contexts and is consistent with the basic facts about the word advantage.
Forward Models: Supervised Learning with a Distal Teacher
This article demonstrates that certain classical problems associated with the notion of the "teacher" in supervised learning can be solved by judicious use of learned internal models as components of the adaptive system.
On learning the past tenses of English verbs: implicit rules or parallel distributed processing
It is shown how rule-like behavior can emerge from the interactions among a network of units encoding the root form to past tense mapping, and how details of the acquisition process not captured by the rule account emerge.
Toward an interactive model of reading.
This chapter adapts a formalism developed in the context of parallel computation to the specification of a model for reading and shows that such a model can account in a convenient way for those aspects of reading that appear puzzling in the contexts of more linear stage-oriented models.
The chapter discusses that the structure of stories is ordinarily more than pairwise relationships among sentences: strings of sentences combine into psychological wholes.
Feature discovery by competitive learning
This paper shows how a set of feature detectors that capture important aspects of the set of stimulus input patterns is discovered, and how these feature detectors form the basis of a multilayer system that learns categorizations of stimulus sets that are not linearly separable.
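The competitive-learning rule summarized above can be sketched in a few lines. This is a hedged illustration, not the paper's model: a winner-take-all layer of two units on hand-picked 2-D input patterns, with assumed starting weights and learning rate, where only the closest unit's weight vector moves toward each input.

```python
# Illustrative input patterns forming two clusters (not from the paper).
data = [(0.0, 0.0), (0.2, 0.1), (5.0, 5.0), (5.1, 4.9)]

# Two competitive units with assumed starting weight vectors.
units = [[1.0, 1.0], [4.0, 4.0]]
lr = 0.1  # assumed learning rate

def dist2(w, x):
    return (w[0] - x[0]) ** 2 + (w[1] - x[1]) ** 2

for _ in range(200):
    for x in data:
        # Winner-take-all competition: only the closest unit learns,
        # moving its weight vector a fraction lr toward the input.
        win = min(range(len(units)), key=lambda k: dist2(units[k], x))
        units[win][0] += lr * (x[0] - units[win][0])
        units[win][1] += lr * (x[1] - units[win][1])

# Each unit ends up centered on one cluster of input patterns,
# acting as a feature detector for that region of the input space.
```

Because each unit's weights are pulled only toward the inputs it wins, the units partition the stimulus set, which is the sense in which feature detectors are "discovered" rather than prespecified.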