Publications
TensorFlow: A system for large-scale machine learning
TLDR: TensorFlow is a machine learning system that operates at large scale and in heterogeneous environments.
  • Citations: 8,810 · Influential: 1,069
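The dataflow model the paper describes can be sketched in a few lines of modern TensorFlow; this toy training step (names, sizes, and the learning rate are illustrative, not from the paper) shows how Python code is traced into a graph the runtime can place across devices.

```python
import tensorflow as tf

w = tf.Variable(tf.zeros([3, 1]))      # model parameter
x = tf.random.normal([8, 3])           # toy inputs
y = tf.random.normal([8, 1])           # toy targets

@tf.function  # traces the Python body into a TensorFlow dataflow graph
def train_step():
    with tf.GradientTape() as tape:
        loss = tf.reduce_mean(tf.square(tf.matmul(x, w) - y))
    w.assign_sub(0.1 * tape.gradient(loss, w))   # one SGD update
    return loss

for _ in range(3):
    print(float(train_step()))
```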
TensorFlow: Large-Scale Machine Learning on Heterogeneous Distributed Systems
TLDR: This paper describes the TensorFlow interface for expressing machine learning algorithms, and an implementation of that interface that we have built at Google.
  • Citations: 8,049 · Influential: 913
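A minimal sketch of the graph-building interface this paper describes, using today's tf.compat.v1 session API (the values are illustrative): the computation is first expressed as a graph of operations, then handed to the runtime for execution.

```python
import tensorflow as tf

g = tf.Graph()
with g.as_default():
    a = tf.constant([[1.0, 2.0]])
    b = tf.constant([[3.0], [4.0]])
    c = tf.matmul(a, b)          # one node in the dataflow graph

# The runtime executes the graph; in the paper, this is where
# placement onto CPUs, GPUs, and multiple machines happens.
with tf.compat.v1.Session(graph=g) as sess:
    print(sess.run(c))           # [[11.]]
```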
Google's Neural Machine Translation System: Bridging the Gap between Human and Machine Translation
TLDR: Neural Machine Translation (NMT) is an end-to-end learning approach for automated translation, with the potential to overcome many of the weaknesses of conventional phrase-based translation systems.
  • Citations: 3,292 · Influential: 268
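At the core of the encoder-decoder NMT architecture is an attention step that aligns the decoder with source positions; this sketch uses simple dot-product attention for illustration (the paper's model uses an additive variant, and all names and shapes here are placeholders).

```python
import tensorflow as tf

def attention_context(decoder_state, encoder_states):
    # decoder_state: [batch, d]; encoder_states: [batch, src_len, d]
    scores = tf.einsum('bd,bsd->bs', decoder_state, encoder_states)
    weights = tf.nn.softmax(scores, axis=-1)   # soft alignment over source
    return tf.einsum('bs,bsd->bd', weights, encoder_states)

ctx = attention_context(tf.random.normal([2, 16]),
                        tf.random.normal([2, 10, 16]))
print(ctx.shape)   # (2, 16)
```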
Natural TTS Synthesis by Conditioning WaveNet on Mel Spectrogram Predictions
TLDR: This paper describes Tacotron 2, a neural network architecture for speech synthesis directly from text.
  • Citations: 775 · Influential: 183
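Tacotron 2's two networks communicate through mel spectrograms; as a sketch, here is one way to compute such a spectrogram with tf.signal (frame sizes, mel-bin count, and sample rate are illustrative rather than the paper's exact front end).

```python
import tensorflow as tf

def mel_spectrogram(waveform, sample_rate=22050):
    stft = tf.signal.stft(waveform, frame_length=1024, frame_step=256)
    power = tf.abs(stft) ** 2                     # [frames, 513]
    mel = tf.signal.linear_to_mel_weight_matrix(
        num_mel_bins=80,
        num_spectrogram_bins=power.shape[-1],
        sample_rate=sample_rate)
    return tf.tensordot(power, mel, axes=1)       # [frames, 80]

print(mel_spectrogram(tf.random.normal([22050])).shape)
```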
Tacotron: Towards End-to-End Speech Synthesis
TLDR: We present Tacotron, an end-to-end generative TTS model that synthesizes speech directly from characters, outperforming a production parametric system in terms of naturalness.
  • Citations: 645 · Influential: 140
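Because the model synthesizes directly from characters, its input pipeline reduces to character ids plus an embedding; a minimal sketch, with vocabulary and dimensions chosen only for illustration:

```python
import tensorflow as tf

chars = tf.strings.unicode_split(["hello world"], 'UTF-8')
lookup = tf.keras.layers.StringLookup(
    vocabulary=list("abcdefghijklmnopqrstuvwxyz "))
ids = lookup(chars)                               # ragged character ids
embed = tf.keras.layers.Embedding(input_dim=30, output_dim=256)
char_embeddings = embed(ids.to_tensor())          # [1, T, 256] encoder input
print(char_embeddings.shape)
```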
Google’s Multilingual Neural Machine Translation System: Enabling Zero-Shot Translation
TLDR: We propose a simple solution that uses a single Neural Machine Translation (NMT) model to translate between multiple languages, taking advantage of multilingual data to improve NMT.
  • Citations: 988 · Influential: 118
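The paper's central mechanism is an artificial token prepended to the source sentence that tells the shared model which language to produce, which is what enables zero-shot directions; a sketch (the token spelling is illustrative):

```python
def add_target_token(source_sentence: str, target_lang: str) -> str:
    # The shared multilingual model learns to emit the requested
    # target language from this single prepended token.
    return f"<2{target_lang}> {source_sentence}"

print(add_target_token("How are you?", "es"))   # "<2es> How are you?"
```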
Fast and memory-efficient regular expression matching for deep packet inspection
TLDR: Packet content scanning at high speed has become extremely important due to its applications in network security, network monitoring, HTTP load balancing, etc.
  • Citations: 459 · Influential: 50
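The approach builds on DFA-based matching, where scanning a payload costs one state transition per input byte regardless of pattern complexity; this toy matcher for a literal pattern illustrates the idea (the paper's contribution is rewriting regexes and compressing the resulting DFAs, which this sketch omits).

```python
def build_dfa(pattern):
    # KMP-style DFA over the pattern's own alphabet, for illustration.
    alphabet = set(pattern)
    dfa = [dict() for _ in range(len(pattern) + 1)]
    for state in range(len(pattern) + 1):
        for ch in alphabet:
            k = min(len(pattern), state + 1)
            while not (pattern[:state] + ch).endswith(pattern[:k]):
                k -= 1
            dfa[state][ch] = k
    return dfa

def scan(payload, dfa, accept):
    state = 0
    for ch in payload:
        state = dfa[state].get(ch, 0)   # one transition per character
        if state == accept:
            return True
    return False

dfa = build_dfa("evil")
print(scan("an evil payload", dfa, accept=4))   # True
```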
GPipe: Efficient Training of Giant Neural Networks using Pipeline Parallelism
TLDR: We introduce GPipe, a pipeline parallelism library that allows scaling any network that can be expressed as a sequence of layers.
  • Citations: 322 · Influential: 47
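The core idea is splitting each mini-batch into micro-batches so that layer partitions on different accelerators overlap in time; this single-process sketch replays GPipe's forward schedule on plain Python functions (no backward pass or re-materialization, and the stages here are toy lambdas rather than network partitions).

```python
def pipeline_forward(stages, microbatches):
    # At clock tick t, stage s works on micro-batch t - s, so stages
    # overlap once the pipeline fills; GPipe maps each stage to its
    # own accelerator, while here they just run in one loop.
    n_stages, n_micro = len(stages), len(microbatches)
    buffers = list(microbatches)          # activations per micro-batch
    for tick in range(n_stages + n_micro - 1):
        for s in range(n_stages):
            m = tick - s                  # micro-batch index at stage s
            if 0 <= m < n_micro:
                buffers[m] = stages[s](buffers[m])
    return buffers

stages = [lambda x: x + 1, lambda x: x * 2, lambda x: x - 3]
print(pipeline_forward(stages, [1, 2, 3, 4]))   # [1, 3, 5, 7]
```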
State-of-the-Art Speech Recognition with Sequence-to-Sequence Models
TLDR: In this work, we explore a variety of structural and optimization improvements to our Listen, Attend and Spell (LAS) model which significantly improve performance.
  • Citations: 615 · Influential: 30
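One optimization-side improvement explored in this work is label smoothing of the decoder targets; a minimal sketch (the epsilon value is illustrative):

```python
import tensorflow as tf

def smoothed_loss(labels, logits, num_classes, epsilon=0.1):
    # Mix the one-hot target with a uniform distribution so the model
    # is not pushed toward over-confident output distributions.
    onehot = tf.one_hot(labels, num_classes)
    soft = onehot * (1.0 - epsilon) + epsilon / num_classes
    return tf.nn.softmax_cross_entropy_with_logits(labels=soft, logits=logits)

loss = smoothed_loss(tf.constant([2, 0]), tf.random.normal([2, 5]), 5)
print(loss.shape)   # (2,)
```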
C-Miner: Mining Block Correlations in Storage Systems
TLDR: We propose C-Miner, an algorithm which uses a data mining technique called frequent sequence mining to discover block correlations in storage systems.
  • Citations: 192 · Influential: 29
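A toy illustration of the mining idea: count block-access subsequences that recur within a small lookahead window and keep those above a support threshold (the paper uses a closed-sequence miner over real traces; the window and support values here are illustrative).

```python
from collections import Counter

def frequent_pairs(access_stream, window=3, min_support=2):
    # Count ordered block pairs (a, b) where b follows a within the
    # window; high-support pairs suggest correlated blocks that can
    # inform prefetching and data layout.
    counts = Counter()
    for i, a in enumerate(access_stream):
        for b in access_stream[i + 1 : i + window]:
            counts[(a, b)] += 1
    return {pair: n for pair, n in counts.items() if n >= min_support}

print(frequent_pairs([1, 2, 7, 1, 2, 9, 1, 2, 5]))   # {(1, 2): 3}
```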