Publications
Modeling Sequences with Quantum States: A Look Under the Hood
TLDR
An understanding of the extra information contained in the reduced densities allows us to examine the mechanics of this DMRG algorithm and to study the generalization error of the resulting model.
Smoothness theorem for differential BV algebras
Given a differential Batalin–Vilkovisky algebra (V, Q, Δ), the associated odd differential graded Lie algebra (V, Q + Δ, [ , ]) is always smooth formal. The quantum differential graded Lie algebra …
Homotopy probability theory I
This is the first of two papers that introduce a deformation theoretic framework to explain and broaden a link between homotopy algebra and probability theory. In this paper, cumulants are proved to …
Deformations of associative algebras with inner products
We develop the deformation theory of A∞-algebras together with ∞-inner products and identify a differential graded Lie algebra that controls the theory. This generalizes the deformation …
Probabilistic Modeling with Matrix Product States
TLDR
An efficient training algorithm for a subset of classically simulable quantum circuit models, presented as a sequence of exactly solvable effective models, is introduced; it is a modification of the density matrix renormalization group procedure, adapted for learning a probability distribution.
Tensor Networks for Probabilistic Sequence Modeling
TLDR
A novel generative algorithm is introduced that gives trained u-MPS the ability to efficiently sample from a wide variety of conditional distributions, each defined by a regular expression; this permits the generation of richly structured text in a manner with no direct analogue in current generative models.
Language as a matrix product state
We propose a statistical model for natural language that begins by considering language as a monoid, then representing it in complex matrices with a compatible translation invariant probability …
Algebras over Cobar(coFrob)
We show that a square-zero, degree-one element in W(V), the Weyl algebra on a vector space V, is equivalent to providing V with the structure of an algebra over the properad Cobar(coFrob), the …
Tensor Networks for Language Modeling
TLDR
A uniform matrix product state (u-MPS) model for probabilistic modeling of sequence data is presented; it can condition or marginalize sampling on characters at arbitrary locations within a sequence, with no need for approximate sampling methods.
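The u-MPS models above assign probabilities via the Born rule: each symbol s gets a matrix A[s], and a string's unnormalized probability is the squared amplitude of a boundary-vector-capped matrix product. A minimal sketch of this scoring rule, with toy 2×2 parameters of my own choosing (not the papers' trained models or implementation):

```python
# Sketch of Born-rule scoring with a uniform matrix product state (u-MPS).
# Each symbol s has a matrix A[s]; the unnormalized probability of a string
# s1...sn is |l^T A[s1] ... A[sn] r|^2. All numeric values here are
# arbitrary toy parameters, not trained weights.

def matvec(m, v):
    """Multiply a 2x2 matrix (row-major nested lists) by a length-2 vector."""
    return [m[0][0] * v[0] + m[0][1] * v[1],
            m[1][0] * v[0] + m[1][1] * v[1]]

# One matrix per symbol of a binary alphabet (hypothetical parameters).
A = {
    "0": [[0.8, 0.1], [0.0, 0.5]],
    "1": [[0.2, 0.3], [0.6, 0.4]],
}
left = [1.0, 0.0]   # left boundary vector l
right = [1.0, 1.0]  # right boundary vector r

def score(string):
    """Unnormalized Born probability |l^T A[s1]...A[sn] r|^2."""
    v = right
    for s in reversed(string):          # contract matrices onto r, right to left
        v = matvec(A[s], v)
    amp = left[0] * v[0] + left[1] * v[1]
    return amp * amp

print(score("0110"))
```

Because the same matrices are reused at every position, the model is translation invariant and handles strings of any length; normalizing these scores into a distribution requires summing over all strings, which the papers handle with transfer-operator techniques.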