Optimality: From Neural Networks to Universal Grammar

@article{Prince1997OptimalityFN,
  title={Optimality: From Neural Networks to Universal Grammar},
  author={Alan Prince and Paul Smolensky},
  journal={Science},
  year={1997},
  volume={275},
  pages={1604--1610}
}
Can concepts from the theory of neural computation contribute to formal theories of the mind? Recent research has explored the implications of one principle of neural computation, optimization, for the theory of grammar. Optimization over symbolic linguistic structures provides the core of a new grammatical architecture, optimality theory. The proposition that grammaticality equals optimality sheds light on a wide range of phenomena, from the gulf between production and comprehension in child… 
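In Optimality Theory, a generator proposes candidate analyses of an input, and the grammatical output is the candidate that best satisfies a language-particular strict ranking of universal violable constraints. A minimal sketch of that evaluation step, with invented constraints and candidates (not code from the paper):

```python
# Minimal sketch of Optimality Theory evaluation with invented constraints and
# candidates (not code from the paper). Candidates are lists of syllables
# written as C/V skeleta; constraints are ranked from highest to lowest, and
# the winner is chosen by lexicographic (strict-domination) comparison of
# violation profiles.

CONSTRAINTS = ["ONSET", "NOCODA"]   # ranked high to low; faithfulness omitted

def violations(candidate):
    """Violation profile of a candidate, ordered by constraint rank."""
    profile = []
    for constraint in CONSTRAINTS:
        count = 0
        for syllable in candidate:
            if constraint == "ONSET" and not syllable.startswith("C"):
                count += 1          # ONSET: syllables should begin with a consonant
            if constraint == "NOCODA" and syllable.endswith("C"):
                count += 1          # NOCODA: syllables should not end with a consonant
        profile.append(count)
    return profile

def optimal(candidates):
    # Python compares lists lexicographically, which is exactly strict domination
    # when the profile is ordered from highest- to lowest-ranked constraint.
    return min(candidates, key=violations)

print(optimal([["CVC"], ["CV", "CV"]]))   # -> ['CV', 'CV'] under this ranking
```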

Linguistic and Cognitive Explanation in Optimality Theory

Generative linguistics aims to provide an analysis of the grammar-forming capacity that individuals bring to the task of learning their native language (Chomsky 1965, 1981, 1991, 1995). Pursuing this

Computational and evolutionary aspects of language

TLDR
Understanding how Darwinian evolution gives rise to human language requires the integration of formal language theory, learning theory and evolutionary dynamics.

The hierarchical prediction network: Towards a neural theory of grammar acquisition

TLDR
It is argued that the formation of topologies, which occurs in the learning process of HPN, offers a neurally plausible explanation for categorization and abstraction in general.

Unifying syntactic theory and sentence processing difficulty through a connectionist minimalist parser

TLDR
A parser for Stabler's Minimalist Grammars is presented, in the framework of Smolensky’s Integrated Connectionist/Symbolic architectures, and it is demonstrated that the connectionist minimalist parser produces predictions which mirror global empirical findings from psycholinguistic research.
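The Integrated Connectionist/Symbolic architecture named here is built on tensor product representations, in which a symbol structure is encoded as a superposition of filler-role bindings realized as outer products of vectors. A minimal sketch of that encoding with arbitrary illustrative vectors, not the parser's actual representations:

```python
import numpy as np

# Sketch of a tensor product representation (the building block of Smolensky's
# ICS architecture): a symbol structure is encoded as the sum of outer products
# of filler vectors with role vectors. The vectors below are arbitrary
# illustrations, not the encodings used by the parser in this reference.

fillers = {"the": np.array([1.0, 0.0, 0.0]),
           "cat": np.array([0.0, 1.0, 0.0]),
           "sleeps": np.array([0.0, 0.0, 1.0])}
roles = {"pos1": np.array([1.0, 0.0, 0.0]),
         "pos2": np.array([0.0, 1.0, 0.0]),
         "pos3": np.array([0.0, 0.0, 1.0])}

def encode(bindings):
    """Superpose filler (x) role outer products into one distributed encoding."""
    return sum(np.outer(fillers[f], roles[r]) for f, r in bindings)

def unbind(encoding, role):
    """Recover the filler bound to a role (exact here because roles are orthonormal)."""
    return encoding @ roles[role]

T = encode([("the", "pos1"), ("cat", "pos2"), ("sleeps", "pos3")])
print(unbind(T, "pos2"))   # -> filler vector for "cat"
```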

Linguistic Optimization

Optimality Theory (OT) is a model of language that combines aspects of generative and connectionist linguistics. It is unique in the field in its use of a rank ordering on constraints, which is used
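The rank ordering referred to here is strict domination: in comparing candidates, a single violation of a higher-ranked constraint outweighs any number of violations of lower-ranked constraints, unlike the weighted summation typical of connectionist (Harmonic Grammar style) evaluation. A toy contrast with invented violation profiles and weights:

```python
# Contrast between OT's strict domination and a weighted, Harmonic Grammar
# style evaluation, using made-up violation profiles. Under strict domination
# one violation of a higher-ranked constraint is fatal; under weighted
# summation it need not be.

cand_a = [1, 0]    # violations of (C1, C2), with C1 ranked above C2
cand_b = [0, 5]

def ot_winner(a, b):
    return "A" if a < b else "B"          # lexicographic comparison

def hg_winner(a, b, weights=(2.0, 1.0)):  # illustrative weights
    harmony = lambda v: -sum(w * n for w, n in zip(weights, v))
    return "A" if harmony(a) > harmony(b) else "B"

print(ot_winner(cand_a, cand_b))   # B: the single C1 violation is decisive
print(hg_winner(cand_a, cand_b))   # A: harmony -2 beats -5 with these weights
```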

Learning biases predict a word order universal

The evolutionary dynamics of grammar acquisition.

TLDR
A mathematical theory is presented that places the problem of language acquisition in an evolutionary context, specifies the conditions for the evolution of universal grammar, and calculates the maximum size of the search space that is compatible with coherent communication in a population.
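Work in this line typically models a population distributed over a finite set of candidate grammars with replicator-mutator dynamics; one commonly used form of the language dynamics equation (notation illustrative, stated from general knowledge of this literature rather than quoted from the paper) is:

```latex
% Replicator-mutator dynamics for grammar acquisition (general form used in
% this literature; notation illustrative, not quoted from the paper).
% x_i   : fraction of the population speaking grammar G_i
% F_{jk}: communicative payoff between speakers of G_j and G_k
% Q_{ji}: probability that a learner exposed to G_j acquires G_i
% \phi  : average fitness, which keeps \sum_i x_i = 1
\[
  \dot{x}_i \;=\; \sum_{j=1}^{n} x_j \, f_j \, Q_{ji} \;-\; \phi\, x_i,
  \qquad
  f_j = \sum_{k=1}^{n} F_{jk}\, x_k,
  \qquad
  \phi = \sum_{j=1}^{n} x_j f_j .
\]
```

The bound on the search space mentioned in the summary then arises from requiring that the learning accuracy implicit in Q be high enough for a dominant grammar to persist in the population.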

Grammar-based connectionist approaches to language

Formal Grammars of Early Language

TLDR
This approach provides a testbed for evaluating theories of language acquisition, in particular with respect to the extent to which innate, language-specific mechanisms must be assumed.

Expressive and Interpretive Optimization

TLDR
This chapter is an outline of Optimality Theory (OT) as a model of grammar, with stochastic extensions for language variation and language change.
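The stochastic extension mentioned here is commonly implemented by placing constraints on a continuous ranking scale and perturbing each ranking value with Gaussian noise at evaluation time, so that near-ties between constraints yield variable outputs at predictable rates. A minimal sketch under those assumptions, with invented ranking values, noise level, and candidates:

```python
import random

# Minimal sketch of a stochastic OT evaluation in the continuous-ranking-value
# style: each constraint has a ranking value, Gaussian noise is added at every
# evaluation, and the resulting order is applied as ordinary strict domination.
# Ranking values, noise level, and violation profiles are invented.

ranking_values = {"C1": 100.0, "C2": 99.0}   # close values -> variable outputs
candidates = {"out_a": {"C1": 1, "C2": 0},
              "out_b": {"C1": 0, "C2": 1}}

def evaluate(noise_sd=2.0):
    noisy = {c: v + random.gauss(0.0, noise_sd) for c, v in ranking_values.items()}
    order = sorted(noisy, key=noisy.get, reverse=True)         # highest rank first
    profile = lambda cand: [candidates[cand][c] for c in order]
    return min(candidates, key=profile)

wins = sum(evaluate() == "out_b" for _ in range(10000))
print(wins / 10000)   # roughly the probability that C1 outranks C2 on a given evaluation
```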
...

References

Showing 1-10 of 29 references

On the proper treatment of connectionism

Abstract A set of hypotheses is formulated for a connectionist approach to cognitive modeling. These hypotheses are shown to be incompatible with the hypotheses underlying traditional cognitive

The Sound Pattern of English

Since this classic work in phonology was published in 1968, there has been no other book that gives as broad a view of the subject, combining generally applicable theoretical contributions with

The Learnability of Optimality Theory: An Algorithm and Some Basic Complexity Results

TLDR
The learnability of grammars in Optimality Theory is investigated, and a simple and efficient algorithm is presented that exploits the special nature of the learning problem in that theory.
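The algorithm in question is constraint demotion, which builds a stratified constraint hierarchy from winner/loser comparisons. A minimal sketch in the spirit of recursive constraint demotion, with invented comparison data:

```python
# Sketch in the spirit of recursive constraint demotion: from winner/loser
# comparisons, build a stratified ranking in which every loser-preferring
# constraint is dominated by some winner-preferring constraint.
# The comparison rows below are invented for illustration.

# Each row maps a constraint to 'W' (prefers winner), 'L' (prefers loser),
# or 'e' (no preference).
rows = [
    {"C1": "W", "C2": "L", "C3": "e"},
    {"C1": "e", "C2": "W", "C3": "L"},
]

def rcd(rows):
    constraints = set().union(*rows)
    strata = []
    while constraints:
        # Constraints that prefer no loser in any remaining row can be ranked next.
        stratum = {c for c in constraints
                   if all(row.get(c, "e") != "L" for row in rows)}
        if not stratum:
            raise ValueError("data not consistent with any ranking")
        strata.append(stratum)
        # Rows already explained by this stratum (some constraint in it prefers
        # the winner) are removed before ranking the remaining constraints.
        rows = [row for row in rows
                if not any(row.get(c, "e") == "W" for c in stratum)]
        constraints -= stratum
    return strata

print(rcd(rows))   # -> [{'C1'}, {'C2'}, {'C3'}]
```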

Parallel Distributed Processing: Explorations in the Microstructure of Cognition, Vol. 1: Foundations

The fundamental principles, basic mechanisms, and formal analyses involved in the development of parallel distributed processing (PDP) systems are presented in individual chapters contributed by

Networks and Theories: The Place of Connectionism in Cognitive Science

TLDR
It is argued that connectionist networks should not be thought of as theories or simulations of theories, but may nevertheless contribute to the development of theories.

Neural networks and physical systems with emergent collective computational abilities.

  • J. Hopfield
  • Computer Science
    Proceedings of the National Academy of Sciences of the United States of America
  • 1982
TLDR
A model of a system having a large number of simple equivalent components, based on aspects of neurobiology but readily adapted to integrated circuits, produces a content-addressable memory which correctly yields an entire memory from any subpart of sufficient size.
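The content-addressable memory described here can be reproduced in a few lines: patterns of ±1 units are stored with a Hebbian outer-product rule, and repeated threshold updates pull a partially corrupted cue back toward the nearest stored pattern. A toy sketch with invented patterns:

```python
import numpy as np

# Tiny Hopfield-style content-addressable memory: +/-1 patterns are stored with
# a Hebbian outer-product rule, and asynchronous threshold updates drive a
# corrupted cue toward the nearest stored pattern. Patterns are invented.

patterns = np.array([
    [ 1, -1,  1, -1,  1, -1,  1, -1],
    [ 1,  1,  1,  1, -1, -1, -1, -1],
])
n = patterns.shape[1]

# Hebbian storage: W = sum_p x_p x_p^T, with zero self-connections.
W = sum(np.outer(p, p) for p in patterns).astype(float)
np.fill_diagonal(W, 0.0)

def recall(state, steps=20):
    state = state.copy()
    for _ in range(steps):
        for i in np.random.permutation(n):          # asynchronous unit updates
            state[i] = 1 if W[i] @ state >= 0 else -1
    return state

cue = patterns[0].copy()
cue[:2] *= -1                                        # corrupt part of the memory
print(recall(cue))                                   # typically recovers patterns[0]
```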

Parallel Distributed Processing: Explorations in the Microstructure of Cognition, vol 1: Foundations, vol 2: Psychological and Biological Models

  • G. Kane
  • Computer Science, History
  • 1994
TLDR
Artificial neural network research began in the early 1940s, advancing in fits and starts, until the late 1960s, when Minsky and Papert published Perceptrons, in which they proved that neural networks, as then conceived, were subject to fundamental computational limitations.

Mathematical Perspectives on Neural Networks

TLDR
P. Smolensky, Overview: Computational, Dynamical, and Statistical Perspectives on the Processing and Learning Problems in Neural Network Theory.

Second Commentary: On the proper treatment of connectionism by Paul Smolensky (1988) - Neuromachismo Rekindled

Author(s): Freeman, Walter J, III | Abstract: Paul Smolensky's marvelous neologistic epithet "neuromacho" should not be allowed to drift into oblivion. Hence I come forth under the banner of