Computational models of language universals: Expressiveness, learnability and consequences

@inproceedings{Edelman2011ComputationalMO,
  title={Computational models of language universals: Expressiveness, learnability and consequences},
  author={Shimon Edelman and E. Stabler},
  year={2011}
}
Every linguist is struck by similarities among even the most different and most culturally isolated human languages. It is natural to assume that some of these common properties, these language universals, might reflect something about the way people can learn and use languages. In some relevant sense, some of these properties may arise and be maintained even in culturally isolated languages because of special restrictions on the range of structural options available for human language learners… 
The language faculty that wasn't: a usage-based account of natural language recursion
TLDR
It is argued that a language faculty is difficult to reconcile with evolutionary considerations, and that the authors' ability to process recursive structure does not rely on recursion as a property of the grammar, but instead emerges gradually by piggybacking on domain-general sequence learning abilities.
Productivity and Reuse in Language
TLDR
This thesis presents a formal model of productivity and reuse which treats the problem as a structure-by-structure inference in a Bayesian framework and is built around two proposals: that anything that can be computed can be stored and that any stored item can include subparts which must be computed productively.
On the role of locality in learning stress patterns
Abstract: This paper presents a previously unnoticed universal property of stress patterns in the world's languages: they are, for small neighbourhoods, neighbourhood-distinct.
Ìgbo transitivity in a derivational framework
According to the theory of Universal Grammar (Chomsky 1965, 6), data which may appear typologically ‘exotic’ can in principle turn round to explain previously ‘familiar’…
The VC dimension of constraint-based grammars
Multiple factors in second language acquisition: The CASP model
TLDR
The result is a broadly based theory of SLA, which can potentially solve some of the traditional puzzles in this field, e.g., involving when transfer from an L1 does and does not occur.
Two sides of the same slim Boojum: Further arguments for a lexical approach to argument structure
TLDR
This response focuses on those replies that are germane to the issue, while also addressing some other matters.

References (showing 10 of 105)
Formal Principles of Language Acquisition
TLDR
The authors of this book have developed a rigorous and unified theory that opens the study of language learnability to discoveries about the mechanisms of language acquisition in human beings and has important implications for linguistic theory, child language research, and the philosophy of language.
Natural Logic in Linguistic Theory
From Aristotle to Quine and Evans and many others, it is a familiar idea that certain semantic relations among sentences are determined by their "grammatical form" alone. That is, independent of…
On the Compositional Extension Problem
TLDR
A different version of the compositional extension problem is solved, corresponding to another type of linguistic situation in which the authors only have a partial semantics, and without assuming the Husserl property.
On the generative power of transformational grammars
The convergence of mildly context-sensitive grammar formalisms
TLDR
This paper has described a formalism, the linear context-free rewriting system (LCFRS), as a first attempt to capture the closeness of the derivation structures of these formalisms; it shows that LCFRSs are equivalent to multicomponent tree adjoining grammars (MCTAGs), and also briefly discusses some variants of TAG.
Derivational Minimalism
TLDR
A simple grammar formalism with these properties is presented here and briefly explored and can define languages that are not in the class of languages definable by tree adjoining grammars.
Computational Complexity and Lexical Functional Grammar
  • R. Berwick
  • Linguistics, Computer Science
    Am. J. Comput. Linguistics
  • 1982
TLDR
It is shown that moderately restricted Transformational Grammars (TGs) can generate languages whose recognition time is provably exponential; Rounds (1973, 1975) extended earlier work by demonstrating this result.
Foundational issues in natural language processing
In four separate essays the authors address the complex and difficult connections among grammatical theory, mathematical linguistics, and the operation of real natural-language-processing systems…
Strict Compositionality and Literal Movement Grammars
TLDR
It is argued that given this strict principle, quite powerful string handling mechanisms must be assumed, and Linear Context Free Rewrite Systems are not enough to generate human languages, but most likely Literal Movement Grammars will do.