Corpus ID: 11215251

Typesafe Abstractions for Tensor Operations

@article{Chen2017TypesafeAF,
  title={Typesafe Abstractions for Tensor Operations},
  author={Tongfei Chen},
  journal={ArXiv},
  year={2017},
  volume={abs/1710.06892}
}
We propose a typesafe abstraction for tensors (i.e. multidimensional arrays) exploiting the type-level programming capabilities of Scala through heterogeneous lists (HList), and showcase typesafe abstractions of common tensor operations and various neural layers such as convolution or recurrent neural networks. This abstraction could lay the foundation for future typesafe deep learning frameworks that run on Scala/JVM. CCS Concepts • Software and its engineering → Software libraries and…
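As a concrete illustration of the idea, here is a minimal sketch of how HList-indexed tensor types can reject shape mismatches at compile time. It assumes the shapeless library for HLists and Scala 3 / REPL-style top-level definitions; the names Axis, Tensor, and matMul are hypothetical placeholders for illustration, not the abstraction actually proposed in the paper.

```scala
// A minimal sketch (not the paper's actual API): axes are phantom types
// collected in a shapeless HList, so a contraction over a mismatched axis
// fails to compile instead of failing at runtime.
import shapeless.{::, HList, HNil}

trait Axis
trait Batch   extends Axis
trait Feature extends Axis
trait Hidden  extends Axis

// `A` records the axes of the tensor at the type level; `shape` holds
// the runtime sizes of those axes, `data` a flat row-major buffer.
final case class Tensor[A <: HList](data: Array[Float], shape: Vector[Int])

// Matrix multiplication as a contraction over the shared axis J:
// the operands only line up if their J axes are the same type.
def matMul[I <: Axis, J <: Axis, K <: Axis](
    x: Tensor[I :: J :: HNil],
    y: Tensor[J :: K :: HNil]): Tensor[I :: K :: HNil] = {
  val m = x.shape(0); val n = x.shape(1); val p = y.shape(1)
  val out = new Array[Float](m * p)
  for (i <- 0 until m; k <- 0 until p; j <- 0 until n)
    out(i * p + k) += x.data(i * n + j) * y.data(j * p + k)
  Tensor[I :: K :: HNil](out, Vector(m, p))
}

// Usage: a Batch×Feature input times a Feature×Hidden weight matrix
// yields a Batch×Hidden result; matMul(w, xs) would not compile.
val xs = Tensor[Batch :: Feature :: HNil](Array.fill(6)(1f), Vector(2, 3))
val w  = Tensor[Feature :: Hidden :: HNil](Array.fill(12)(0.5f), Vector(3, 4))
val h: Tensor[Batch :: Hidden :: HNil] = matMul(xs, w)
```

Tracking the axis list in the type means that transposes, contractions, and layer compositions can be checked before the program runs, which is the core claim the paper develops.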

Citations

Static Analysis of Python Programs using Abstract Interpretation: An Application to Tensor Shape Analysis
TLDR
A subset of Python with an emphasis on tensor operations is specified, based on The Python Language Reference, and an Abstract Interpreter is defined and its implementation presented, showing how each part of the Abstract Interpreter was built: the abstract domains defined and the abstract semantics.
ShapeFlow: Dynamic Shape Interpreter for TensorFlow
TLDR
ShapeFlow detects shape incompatibility errors highly accurately (no false positives and a single false negative) and highly efficiently, with an average speed-up of 499X and 24X over the first and second baseline, respectively.
Named Tensor Notation
We propose a notation for tensors with named axes, which relieves the author, reader, and future implementers from the burden of keeping track of the order of axes and the purpose of each…
Empowering big data analytics with polystore and strongly typed functional queries
TLDR
An implementation in Scala using Spark is developed, providing users with type safety and a schema inference mechanism that guarantees the technical and functional correctness of composed expressions on tensors at compile time, and makes it possible to outperform the Spark query optimizer using bind join.
Recognizing heterogeneous sequences by rational type expression
  • Jim E. Newton, D. Verna
  • Computer Science
    Proceedings of the 3rd ACM SIGPLAN International Workshop on Meta-Programming Techniques and Reflection
  • 2018
TLDR
The technique employs sequence recognition functions, generated at compile time and evaluated at run time, which extend the Common Lisp type system by exploiting the theory of rational languages, Binary Decision Diagrams, and the Turing-complete macro facility of Common Lisp.

References

Showing 1-10 of 16 references
TensorFlow: Large-Scale Machine Learning on Heterogeneous Distributed Systems
TLDR
The TensorFlow interface and an implementation of that interface built at Google are described; the system has been used for conducting research and for deploying machine learning systems into production across more than a dozen areas of computer science and other fields.
DyNet: The Dynamic Neural Network Toolkit
TLDR
DyNet is a toolkit for implementing neural network models based on dynamic declaration of network structure; it has an optimized C++ backend and a lightweight graph representation, and is designed to allow users to implement their models in a way that is idiomatic in their preferred programming language.
Theano: A Python framework for fast computation of mathematical expressions
TLDR
The performance of Theano is compared against Torch7 and TensorFlow on several machine learning models, and recently introduced functionalities and improvements are discussed.
Type inference for array programming with dimensioned vector spaces
TLDR
Experiments show that the explicit support for linear algebra increases type safety, and that it leads to a more functional and index-free style of programming.
Improved Semantic Representations From Tree-Structured Long Short-Term Memory Networks
TLDR
The Tree-LSTM is introduced, a generalization of LSTMs to tree-structured network topologies, which outperforms all existing systems and strong LSTM baselines on two tasks: predicting the semantic relatedness of two sentences and sentiment classification.
Statically typed linear algebra in Haskell
TLDR
The idea of exposing dimensions to the type system is called "strongly typed linear algebra", and a prototype implementation is written in Haskell, based on Alberto Ruiz's GSLHaskell and using techniques from Kiselyov and Shan's "Implicit Configurations" paper.
Experience report: type-checking polymorphic units for astrophysics research in Haskell
TLDR
This work demonstrates the utility of units by writing an astrophysics research paper, free of unit concerns because every quantity expression in the paper is rigorously type-checked.
A fold for all seasons
TLDR
A normalization algorithm that automatically calculates improvements to programs expressed in a language, based upon a generic promotion theorem rather than an analysis phase that searches for implicit structure, has important applications in program transformation, optimization, and theorem proving.
The NumPy Array: A Structure for Efficient Numerical Computation
TLDR
This effort shows that NumPy performance can be improved through three techniques: vectorizing calculations, avoiding copying data in memory, and minimizing operation counts.
Long Short-Term Memory
TLDR
A novel, efficient, gradient based method called long short-term memory (LSTM) is introduced, which can learn to bridge minimal time lags in excess of 1000 discrete-time steps by enforcing constant error flow through constant error carousels within special units.