Corpus ID: 232068734

Named Tensor Notation

@article{Chiang2021NamedTN,
  title={Named Tensor Notation},
  author={David Chiang and Alexander M. Rush and Boaz Barak},
  journal={ArXiv},
  year={2021},
  volume={abs/2102.13196}
}
We propose a notation for tensors with named axes, which relieves the author, reader, and future implementers from the burden of keeping track of the order of axes and the purpose of each. It also makes it easy to extend operations on low-order tensors to higher order ones (e.g., to extend an operation on images to minibatches of images, or extend the attention mechanism to multiple attention heads). After a brief overview of our notation, we illustrate it through several examples from modern… 
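The idea in the abstract can be illustrated with a minimal sketch (not the paper's notation or any particular library's API): pairing an array with a list of axis names lets operations address axes by name rather than by position, so the same code applies whether or not a batch axis is present. All names below are illustrative assumptions.

```python
import numpy as np

def named_sum(data, names, axis_name):
    """Sum over the axis called `axis_name`, returning the result and its remaining names."""
    axis = names.index(axis_name)
    remaining = [n for n in names if n != axis_name]
    return np.sum(data, axis=axis), remaining

# An operation written for single images extends to minibatches unchanged,
# because "height" is looked up by name wherever it sits in the axis order.
images = np.ones((4, 3, 2))  # axes: batch, height, width
pooled, pooled_names = named_sum(images, ["batch", "height", "width"], "height")
```

Here `pooled` has shape `(4, 2)` with names `["batch", "width"]`; the caller never needed to know that `"height"` was axis 1.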
Proof Net Structure for Neural Lambek Categorial Parsing
TLDR
This paper presents the first statistical parser for Lambek categorial grammar (LCG), a grammatical formalism for which the graphical proof method known as *proof nets* is applicable, and derives novel loss functions by expressing proof net constraints as differentiable functions of the model output.
Correct Compilation of Semiring Contractions
We introduce a formal operational semantics that describes the fused execution of variable contraction problems, which compute indexed arithmetic over a semiring and generalize sparse and dense…
Visualizing quantum mechanics in an interactive simulation – Virtual Lab by Quantum Flytrap
Virtual Lab by Quantum Flytrap explores novel ways to represent quantum phenomena in an interactive and intuitive way. It is a no-code online laboratory with a real-time simulation of an optical…
Bayesian machine learning analysis of single-molecule fluorescence colocalization images
Multi-wavelength single-molecule fluorescence colocalization (CoSMoS) methods allow elucidation of complex biochemical reaction mechanisms. However, analysis of CoSMoS data is intrinsically…
TLDR
This work uses Bayesian probabilistic programming to implement Tapqir, an unsupervised machine learning method based on a holistic, physics-based causal model of CoSMoS data that objectively assigns spot classification probabilities, allowing accurate downstream analysis of molecular dynamics, thermodynamics, and kinetics.

References

Attention is All you Need
TLDR
A new simple network architecture, the Transformer, based solely on attention mechanisms and dispensing with recurrence and convolutions entirely, is proposed; it generalizes well to other tasks, having been applied successfully to English constituency parsing with both large and limited training data.
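The core operation of the Transformer is scaled dot-product attention, softmax(QKᵀ/√d)V, which the Named Tensor Notation paper uses as a running example. A minimal NumPy sketch (shapes and variable names are illustrative, not the paper's or any library's definitions):

```python
import numpy as np

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                  # query-key similarities
    scores -= scores.max(axis=-1, keepdims=True)   # stabilize the softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True) # each row sums to 1
    return weights @ V

Q = np.zeros((2, 4))      # 2 queries of dimension 4
K = np.zeros((3, 4))      # 3 keys
V = np.ones((3, 5))       # 3 values of dimension 5
out = attention(Q, K, V)  # uniform weights here, so each output row is all ones
```

Extending this to minibatches or multiple attention heads by positional axis juggling is exactly the bookkeeping the named-axis notation is meant to eliminate.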
Typesafe Abstractions for Tensor Operations
TLDR
A typesafe abstraction for tensors, exploiting the type-level programming capabilities of Scala through heterogeneous lists (HLists), together with various neural layers such as convolutional or recurrent networks, could lay the foundation of future typesafe deep learning frameworks that run on Scala/JVM.
PyTorch: An Imperative Style, High-Performance Deep Learning Library
TLDR
This paper details the principles that drove the implementation of PyTorch and how they are reflected in its architecture, and explains how the careful and pragmatic implementation of the key components of its runtime enables them to work together to achieve compelling performance.
Array programming with NumPy
TLDR
How a few fundamental array concepts lead to a simple and powerful programming paradigm for organizing, exploring and analysing scientific data is reviewed.
Dex: array programming with typed indices
TLDR
Dex is described, a functional array processing language in the Haskell/ML family that introduces a lightweight looping construct and a type system that captures common patterns of array shapes.
Tensor shape (annotation) library
  • Open-source software.
  • 2018
Named tensors
  • Torch Contributors. Open-source software.
  • 2019