The syntactic concept lattice: Another algebraic theory of the context-free languages?

@article{Clark2015TheSC,
  title={The syntactic concept lattice: Another algebraic theory of the context-free languages?},
  author={Alexander Clark},
  journal={J. Log. Comput.},
  year={2015},
  volume={25},
  pages={1203-1229}
}
  • Alexander Clark
  • Published 1 October 2015
  • Mathematics, Computer Science
  • J. Log. Comput.
The syntactic concept lattice is a residuated lattice associated with a given formal language; it arises naturally as a generalisation of the syntactic monoid in the analysis of the distributional structure of the language. In this paper we define the syntactic concept lattice and present its basic properties, and its relationship to the universal automaton and the syntactic congruence; we consider several different equivalent definitions, as Galois connections, as maximal factorisations and…
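To make the Galois connection mentioned in the abstract concrete, here is a minimal sketch (not taken from the paper) of the two polar maps between string sets and context sets, approximated on a finite sample; the membership oracle for the language a^n b^n and the sampled substrings and contexts below are purely illustrative assumptions.

```python
# Minimal sketch of the Galois connection between strings and contexts that
# underlies the syntactic concept lattice, approximated on a finite sample.
# The language a^n b^n and the sampled sets below are illustrative assumptions.

def member(w):
    """Membership oracle for the example language {a^n b^n : n >= 0}."""
    n = w.count("a")
    return w == "a" * n + "b" * n

def contexts_of(strings, sampled_contexts):
    """Polar map S -> S': the sampled contexts (l, r) with l w r in L for every w in S."""
    return {(l, r) for (l, r) in sampled_contexts
            if all(member(l + w + r) for w in strings)}

def strings_of(ctxts, sampled_strings):
    """Polar map C -> C': the sampled strings w with l w r in L for every (l, r) in C."""
    return {w for w in sampled_strings
            if all(member(l + w + r) for (l, r) in ctxts)}

sampled_strings  = {"", "a", "b", "ab", "aab", "abb", "aabb"}
sampled_contexts = {("", ""), ("a", "b"), ("aa", "bb"), ("a", ""), ("", "b")}

# A concept is a pair (S, C) closed under both maps: C = S' and S = C'.
S = {"ab"}
C = contexts_of(S, sampled_contexts)      # {('', ''), ('a', 'b'), ('aa', 'bb')}
closure = strings_of(C, sampled_strings)  # {'', 'ab', 'aabb'}: interchangeable with "ab" on this sample
print(C, closure)
```

Citations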
Language-Theoretic and Finite Relation Models for the (Full) Lambek Calculus
  • C. Wurm
  • Mathematics, Computer Science
  • J. Log. Lang. Inf.
  • 2017
A new semantics is presented, which combines languages and relations via closure operators based on automaton transitions, and is established via an isomorphism theorem for the syntactic concept lattice of a language and a construction for the universal automaton recognizing the same language.
Automatic Concepts and Automata-Theoretic Semantics for the Full Lambek Calculus
A new semantics for the full Lambek calculus is presented, based on an automata-theoretic construction that establishes a strong relation between two canonical constructions over a given language, namely its syntactic concept lattice and its universal automaton.
An Algebraic Approach to Multiple Context-Free Grammars
This work defines an algebraic structure, Paired Complete Idempotent Semirings (PCIS), which are appropriate for defining a denotational semantics for multiple context-free grammars of dimension 2 (2-MCFGs), and shows that this lattice is the unique minimal structure that will interpret the grammar faithfully.
Canonical Context-Free Grammars and Strong Learning: Two Approaches
This work presents two different classes of canonical context-free grammars: one based on all of the primes in the lattice; the other, more suitable for strong learning algorithms, based on a subset of primes that are irreducible in a certain sense.
Learning trees from strings: a strong learning algorithm for some context-free grammars
This work takes as its starting point a simple learning algorithm for substitutable context-free languages, based on principles of distributional learning, and modifies it so that it will converge to a canonical grammar for each language.
Distributional learning of conjunctive grammars and contextual binary feature grammars
  • Ryo Yoshinaka
  • Mathematics, Computer Science
  • J. Comput. Syst. Sci.
  • 2019
This paper presents a distributional learning algorithm for conjunctive grammars with the k-finite context property (k-FCP) for each natural number k, and shows that every exact CBFG has the k-FCP, while not all of them are learnable by their algorithm.
On Some Extensions of Syntactic Concept Lattices: Completeness and Finiteness Results
  • C. Wurm
  • Mathematics, Computer Science
  • FG
  • 2015
We provide some additional completeness results for the full Lambek calculus and syntactic concept lattices, where the underlying structure is extended to tuples of arbitrary finite and infinite…
Distributional Learning as a Theory of Language Acquisition
It is argued that all of the pieces are in place for a complete and explanatory theory of language acquisition based on distributional learning, and some of the nontrivial predictions of this theory about the syntax and syntax-semantics interface are sketched.
Computational Learning of Syntax
The computational issues involved in learning hierarchically structured grammars from strings of symbols alone are discussed, and methods based on an abstract notion of the derivational context of a syntactic category lead to learning algorithms based on a form of traditional distributional analysis.
A Hierarchy of Context-Free Languages Learnable from Positive Data and Membership Queries
We consider a generalization of the “dual” approach to distributional learning of context-free grammars, where each nonterminal A is associated with a string set X_A characterized by a finite set C of…

References

Showing 1-10 of 45 references
Learning Context Free Grammars with the Syntactic Concept Lattice
This work presents a learning algorithm for context-free grammars which uses positive data and membership queries, and proves its correctness under the identification in the limit paradigm.
Logical Grammars, Logical Theories
This paper uses the tools of algebraic logic to try to link the proof-theoretic ideas of the Lambek calculus with the more algebraic approach taken in grammatical inference, and reconceives grammars of various types as equational theories of the syntactic concept lattice of the language.
The Algebraic Theory of Context-Free Languages
This chapter discusses the several classes of sentence-generating devices that are closely related, in various ways, to the grammars of both natural languages and artificial…
A Learnable Representation for Syntax Using Residuated Lattices
It is claimed that this lattice-based formalism is plausibly both learnable from evidence about the grammatical strings of a language and powerful enough to represent natural languages, and thus presents a potential solution to the central problem of theoretical linguistics.
The semantics of grammar formalisms seen as computer languages
The nature of the feature systems used in augmented phrase-structure grammar formalisms, in particular those of recent versions of generalized phrase structure grammar, lexical functional grammar and PATR-II, is elucidated using Dana Scott's domain theory, and an operation of feature generalization is found that can be used to give a partial account of the effect of coordination on syntactic features.
Polynomial Identification in the Limit of Substitutable Context-free Languages
This paper formalises the idea of substitutability introduced by Zellig Harris in the 1950s and makes it the basis for a learning algorithm from positive data only for a subclass of context-free languages, and demonstrates that an implementation of this algorithm is capable of learning a classic example of structure-dependent syntax in English.
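As a small illustration of the substitutability idea behind this reference, the following sketch (a hedged assumption, not the authors' full learner) checks weak substitutability, i.e. whether two substrings share at least one common context in a positive sample; the sample sentences are invented for the example.

```python
# Hypothetical sketch of weak substitutability on a positive sample:
# two substrings are weakly substitutable if they occur in at least one common context.

def contexts_in_sample(sample, u):
    """All contexts (l, r) in which the substring u occurs within the sample strings."""
    out = set()
    for w in sample:
        i = w.find(u)
        while i != -1:
            out.add((w[:i], w[i + len(u):]))
            i = w.find(u, i + 1)
    return out

sample = ["the cat sleeps", "the dog sleeps", "the cat eats"]
shared = contexts_in_sample(sample, "cat") & contexts_in_sample(sample, "dog")
print(shared)  # {('the ', ' sleeps')} -- so "cat" and "dog" are treated as interchangeable
```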
Three Learnable Models for the Description of Language
This work defines context-free grammars whose non-terminals correspond to the syntactic congruence classes, and defines a residuated lattice structure from the Galois connection between strings and contexts, which yields a class of languages that includes some non-context-free languages, many context-free languages and all regular languages.
The universal automaton
It is shown how the universal automaton gives an elegant solution to the star height problem for some classes of languages (pure-group or reversible languages).
A Representation Theorem of Infinite Dimensional Algebras and Applications to Language Theory
  • G. Hotz
  • Computer Science
  • J. Comput. Syst. Sci.
  • 1986
The algebraic theory presented here is not restricted to the context-free languages but also applies to the whole Chomsky hierarchy, in a sense dual to the theory of formal power series as introduced by M. Schützenberger.
Syntactic Semiring of a Language
  • L. Polák
  • Mathematics, Computer Science
  • MFCS
  • 2001
Pin's refinement of Eilenberg's theorem gives a one-to-one correspondence between positive varieties of rational languages and pseudovarieties of ordered monoids.