Corpus ID: 238198566

Expectation-based Minimalist Grammars

@article{Chesi2021ExpectationbasedMG,
  title={Expectation-based Minimalist Grammars},
  author={Cristiano Chesi},
  journal={ArXiv},
  year={2021},
  volume={abs/2109.13871}
}
  • C. Chesi
  • Published 28 September 2021
  • Computer Science
  • ArXiv
Expectation-based Minimalist Grammars (e-MGs) are simplified versions of the (Conflated) Minimalist Grammars, (C)MGs, formalized by Stabler (1997, 2011, 2013), and of the Phase-based Minimalist Grammars, PMGs (Chesi, 2005, 2007; Stabler, 2011). The crucial simplification consists in driving structure building only by lexically encoded categorial top-down expectations. The commitment to a top-down derivation (as in e-MGs and PMGs, as opposed to (C)MGs; Chomsky, 1995; Stabler, 2011 …
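The core mechanism described above, structure building driven only by lexically encoded categorial top-down expectations, can be made concrete with a small sketch. The Python toy below is an illustration under invented assumptions (the category labels, the lexicon, and the `derive` routine are hypothetical and do not reproduce the e-MG definitions in the paper): each lexical item declares which categories it expects, and the derivation proceeds top-down by checking every incoming word against the leftmost pending expectation, which that word may replace with new expectations of its own.

```python
# Toy, expectation-driven top-down recognizer.
# Lexicon and category labels are hypothetical, chosen only to illustrate the
# control flow suggested by the abstract; this is not the e-MG formalism itself.
from collections import deque

# form -> (category, categories this item expects, in order)
LEXICON = {
    "C":     ("C", ["T"]),        # hypothetical complementizer head
    "T":     ("T", ["D", "V"]),   # T expects a subject DP and then a VP
    "the":   ("D", ["N"]),        # a determiner expects a noun
    "child": ("N", []),
    "sees":  ("V", ["D"]),        # a transitive verb expects an object DP
}

def derive(words, start="C"):
    """Consume words left to right, matching each against the leftmost
    pending categorial expectation; succeed iff no expectation remains."""
    expectations = deque([start])
    for word in words:
        if not expectations:
            return False                      # nothing left to expect
        expected = expectations.popleft()
        category, new_expectations = LEXICON[word]
        if category != expected:
            return False                      # category clash
        expectations.extendleft(reversed(new_expectations))
    return not expectations

if __name__ == "__main__":
    print(derive(["C", "T", "the", "child", "sees", "the", "child"]))  # True
    print(derive(["C", "the", "child"]))                               # False
```

In this sketch the queue of pending expectations plays the role of the derivational memory: every successful step either discharges an expectation or introduces new ones, so the derivation is driven entirely top-down by the lexical entries.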

References

Showing 1–10 of 43 references
An introduction to Phase-based Minimalist Grammars: why move is Top-Down from Left-to-Right
It is argued that long distance dependencies, such as successive cyclic A'-movement, are better understood within this unconventional (at least within the Minimalist Program) phase-based directional perspective.
Phases and Complexity in Phrase Structure Building
It is proposed that a Minimalist Grammar formalization can be used both in parsing and in generation if the directionality of the structure-building operations (Merge and Move) is re-oriented and the notion of phase is formalized.
Two Models of Minimalist, Incremental Syntactic Analysis
  • E. Stabler
  • Computer Science, Medicine
  • Top. Cogn. Sci.
  • 2013
Although the determinants of MG structure are narrowly and discretely defined, probabilistic influences from a much broader domain can influence even the earliest analytic steps, allowing frequency and context effects to come early and from almost anywhere, as expected in incremental models.
On Directionality of Phrase Structure Building
  • C. Chesi
  • Medicine, Computer Science
  • Journal of psycholinguistic research
  • 2015
It is suggested that, by looking at the elementary restrictions that apply to Merge, one can conclude that a re-orientation of the syntactic derivation is necessary to make the theory simpler and empirically more adequate, especially for long-distance (filler-gap) dependencies.
Derivational Minimalism Is Mildly Context-Sensitive
It is shown that this type of minimalist grammar constitutes a subclass of the mildly context-sensitive grammars, in the sense that for each MG there is a weakly equivalent linear context-free rewriting system (LCFRS).
MG Parsing as a Model of Gradient Acceptability in Syntactic Islands
This paper proposes the use of a top-down parser for Minimalist Grammars as a formal model of how gradient acceptability can arise from categorical grammars.
Derivational Minimalism
A simple grammar formalism with these properties is presented and briefly explored; it can define languages that are not in the class of languages definable by tree-adjoining grammars.
Computational Perspectives on Minimalism
While research in the 'principles and parameters' tradition [18] can be regarded as attributing as much as possible to universal grammar (UG) in order to understand how language acquisition is possible, …
Relative clauses as a benchmark for Minimalist parsing
It is shown that among those 1600 candidates, a few metrics (and only a few) can provide a unified account of all these contrasts, which significantly limits the number of viable metrics that may be applied to other phenomena, thus reducing theoretical indeterminacy.
Memory Resource Allocation in Top-Down Minimalist Parsing
This paper examines the transient stack states of a top-down parser for Minimalist Grammars as it analyzes embedded sentences in English, Dutch, and German, and finds that the number of time steps a derivation tree node persists on the parser's stack predicts the observed contrasts in English center embedding and the difference between German and Dutch embedding.
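To make the stack-based memory measure in the last reference concrete, the toy sketch below computes a node's "tenure", the number of time steps between its push onto the parser's stack and its pop. The event trace and node labels are invented for illustration; the actual metric is computed over the stack states of a real top-down Minimalist Grammar parser, not this toy.

```python
# Toy computation of stack tenure from a hypothetical push/pop trace.

def tenure(events):
    """events: iterable of (step, action, node) with action in {"push", "pop"}.
    Returns node -> number of steps the node stayed on the stack."""
    pushed_at, result = {}, {}
    for step, action, node in events:
        if action == "push":
            pushed_at[node] = step
        else:  # "pop"
            result[node] = step - pushed_at.pop(node)
    return result

if __name__ == "__main__":
    trace = [
        (1, "push", "TP"), (2, "push", "DP_subj"), (3, "pop", "DP_subj"),
        (4, "push", "VP"), (5, "pop", "VP"), (6, "pop", "TP"),
    ]
    print(tenure(trace))  # {'DP_subj': 1, 'VP': 1, 'TP': 5}
```

Nodes that must be held open across long embeddings accumulate large tenures, which is the kind of contrast that reference uses to compare center embedding in English with embedding in Dutch and German.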