Corpus ID: 209832351

Meaning updating of density matrices

  • Bob Coecke, Konstantinos Meichanetzidis
The DisCoCat model of natural language meaning assigns meaning to a sentence given: (i) the meanings of its words, and (ii) its grammatical structure. The recently introduced DisCoCirc model extends this to text consisting of multiple sentences. While in DisCoCat all meanings are fixed, in DisCoCirc each sentence updates the meanings of words. In this paper we explore different update mechanisms for DisCoCirc, in the case where meaning is encoded in density matrices, which come with several…
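One simple family of update mechanisms can be sketched as applying a context operator to a word's density matrix in the Lüders style, ρ ↦ kρk†/tr(kρk†). This is an illustration of the general idea, not the paper's specific proposal; the operator `k` and the two-dimensional sense space are hypothetical:

```python
import numpy as np

def luders_update(rho, k):
    """Update a word's density matrix rho by a context operator k:
    rho -> k rho k† / tr(k rho k†)."""
    new = k @ rho @ k.conj().T
    return new / np.trace(new)

# Two pure 'senses' of an ambiguous word, mixed 50/50.
v0 = np.array([1.0, 0.0])
v1 = np.array([0.0, 1.0])
rho = 0.5 * np.outer(v0, v0) + 0.5 * np.outer(v1, v1)

# A context operator that favours the first sense.
k = np.diag([1.0, 0.2])
updated = luders_update(rho, k)  # weight shifts towards the first sense
```

After the update the state remains a valid density matrix (unit trace, positive semi-definite), but the probability mass has moved towards the sense the context selects.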

The Mathematics of Text Structure

  • B. Coecke
  • Linguistics
    Joachim Lambek: The Interplay of Mathematics, Logic, and Linguistics
  • 2021
Both the compositional formalism and suggested meaning model are highly quantum-inspired, and implementation on a quantum computer would come with a range of benefits.

Putting a Spin on Language: A Quantum Interpretation of Unary Connectives for Linguistic Applications

This work extends the interpretation of the type system with an extra spin density matrix space, introduces a way of simultaneously representing co-existing interpretations of ambiguous utterances, and provides a uniform framework for the integration of lexical and derivational ambiguity.

Cats climb entails mammals move: preserving hyponymy in compositional distributional semantics

This paper proposes that positive semi-definite (psd) matrices for verbs, adjectives, and other functional words be lifted to completely positive maps that match their grammatical type. It gives a number of proposals for the structure of Compr, based on spiders, cups and caps, and generates a range of composition rules.
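To illustrate the kind of lifting involved, here is a minimal sketch of applying a completely positive map via its Kraus operators; the depolarizing-style channel and the two-dimensional meaning space are stand-in assumptions for illustration, not the paper's construction:

```python
import numpy as np

def apply_cp_map(kraus, rho):
    """Apply a completely positive map rho -> sum_i K_i rho K_i†.
    The map is trace-preserving iff sum_i K_i† K_i = I."""
    return sum(k @ rho @ k.conj().T for k in kraus)

# Kraus operators of a depolarizing-style channel with noise p.
p = 0.25
kraus = [np.sqrt(1 - p) * np.eye(2),
         np.sqrt(p / 3) * np.array([[0, 1], [1, 0]]),     # Pauli X
         np.sqrt(p / 3) * np.array([[0, -1j], [1j, 0]]),  # Pauli Y
         np.sqrt(p / 3) * np.array([[1, 0], [0, -1]])]    # Pauli Z

rho = np.array([[1.0, 0.0], [0.0, 0.0]])  # a pure meaning state
out = apply_cp_map(kraus, rho)            # a mixed meaning state
```

Applying the map mixes the pure input state while preserving trace and positivity, which is exactly why CP maps are a safe type for functional words acting on psd meanings.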

Foundations for Near-Term Quantum Natural Language Processing

The encoding of linguistic structure within quantum circuits also embodies a novel approach for establishing word-meanings that goes beyond the current standards in mainstream AI, by placing linguistic structure at the heart of Wittgenstein's meaning-is-context.

Talking Space: inference from spatial linguistic meanings

A mechanism is proposed for how space and linguistic structure can be made to interact in a matching compositional fashion, and for how the linguistic model of space can interact with other such models related to the senses and/or embodiment, such as the conceptual spaces of colour, taste and smell.

Compositionality as we see it, everywhere around us

Inspired by work on compositionality in quantum theory, and categorical quantum mechanics in particular, the notions of Schrödinger, Whitehead, and complete compositionality are proposed, which aim to capture the fact that compositionality is at its best when it is ‘real’, ‘non-trivial’, and even more so when it is also ‘complete’.

The Safari of Update Structures: Visiting the Lens and Quantum Enclosures

It is shown that update structures survive decoherence and are sufficiently general to capture quantum observables, pinpointing the additional assumptions required to make the two coincide.

Modelling Lexical Ambiguity with Density Matrices

Three new neural models for learning density matrices from a corpus are presented, and their ability to discriminate between word senses is tested on a range of compositional datasets; the best model outperforms existing vector-based compositional models as well as strong sentence encoders.

Parsing conjunctions in DisCoCirc

In distributional compositional models of meaning, logical words require special interpretations that specify the way in which other words in the sentence interact with each other. So far within the…

Fibrational linguistics: First concepts

We define a general mathematical framework for linguistics based on the theory of fibrations, called FibLang. We start by modelling the interaction between linguistics and cognition in the most general…




Mathematical Foundations for a Compositional Distributional Model of Meaning

This paper presents a mathematical framework that unifies the distributional theory of meaning in terms of vector space models with a compositional theory of grammatical types: the type reductions of pregroups are lifted to morphisms in a category, a procedure that transforms the meanings of constituents into a meaning of the (well-typed) whole.

Open System Categorical Quantum Semantics in Natural Language Processing

This paper shows that further developments in categorical quantum mechanics are relevant to natural language processing too, and provides preliminary evidence of the validity of the proposed new model for word meaning by demonstrating a passage from the vector space model to density matrices.

Ambiguity in Categorical Models of Meaning

This work describes ambiguous words as statistical ensembles of unambiguous concepts and extend the semantics of the previous model to a category that supports probabilistic mixing, and introduces two different Frobenius algebras representing different ways of composing the meaning of words.

Compositional distributional semantics with compact closed categories and Frobenius algebras

In the proposed extension, the concept of a distributional vector is replaced with that of a density matrix, which compactly represents a probability distribution over the potential different meanings of the specific word.
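The replacement described above can be sketched as mixing unit sense vectors into a density matrix ρ = Σᵢ pᵢ |vᵢ⟩⟨vᵢ|; the sense vectors and probabilities for 'bank' below are hypothetical placeholders:

```python
import numpy as np

def density_matrix(senses, probs):
    """Mix unit-normalized sense vectors into a density matrix:
    rho = sum_i p_i |v_i><v_i|, with probs summing to 1."""
    d = len(senses[0])
    rho = np.zeros((d, d))
    for v, p in zip(senses, probs):
        v = v / np.linalg.norm(v)
        rho += p * np.outer(v, v)
    return rho

# Hypothetical sense vectors for the ambiguous word 'bank'.
financial = np.array([1.0, 0.0, 0.0])
river = np.array([0.0, 1.0, 0.0])
rho_bank = density_matrix([financial, river], [0.7, 0.3])
```

The resulting matrix has unit trace and non-negative eigenvalues, so its diagonal (in the sense basis) reads directly as a probability distribution over the word's potential meanings.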

Dual Density Operators and Natural Language Meaning

It is shown that dual density operators can be used to simultaneously represent: ambiguity about word meanings, and lexical entailment, within a grammatical-compositional distributional framework for natural language meaning.

Internal Wiring of Cartesian Verbs and Prepositions

This work establishes the same for a large class of well-behaved transitive verbs, which the authors refer to as Cartesian verbs, and reduces the meaning space from a ternary tensor to a unary one.

Graded Entailment for Compositional Distributional Semantics

The main theorem shows that entailment strength lifts compositionally to the sentence level, giving a lower bound on sentence entailment.
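A graded entailment score in this spirit can be sketched as the largest k for which σ − kρ stays positive semidefinite (a Löwner-order-style check); this is an illustration under that assumption, not necessarily the paper's exact measure, and the 'cat'/'mammal' states are hypothetical:

```python
import numpy as np

def entailment_strength(rho, sigma, eps=1e-9):
    """Largest k in [0, 1] such that sigma - k * rho is positive
    semidefinite, found by bisection. Illustrative graded-hyponymy
    score in the Löwner-order style."""
    lo, hi = 0.0, 1.0
    for _ in range(50):  # bisection on k
        mid = (lo + hi) / 2
        if np.linalg.eigvalsh(sigma - mid * rho).min() >= -eps:
            lo = mid
        else:
            hi = mid
    return lo

# A pure 'cat' meaning and a broader 'mammal' mixture (hypothetical).
cat = np.array([[1.0, 0.0], [0.0, 0.0]])
mammal = 0.5 * np.eye(2)
strength = entailment_strength(cat, mammal)  # partial entailment
```

A score of 1 would mean full hyponymy (ρ entirely contained in σ), 0 no entailment at all; intermediate values give the graded strengths that the sentence-level lower bound is stated over.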

Distributional Structure

This discussion describes how each language can be characterised in terms of a distributional structure, i.e. in terms of the occurrence of parts relative to other parts, and how this description is complete without intrusion of other features such as history or meaning.