• Computer Science
  • Published in ArXiv 2016

Google's Neural Machine Translation System: Bridging the Gap between Human and Machine Translation

@article{Wu2016GooglesNM,
  title={Google's Neural Machine Translation System: Bridging the Gap between Human and Machine Translation},
  author={Yonghui Wu and Mike Schuster and Zhifeng Chen and Quoc V. Le and Mohammad Norouzi and Wolfgang Macherey and Maxim Krikun and Yuan Cao and Qin Gao and Klaus Macherey and Jeff Klingner and Apurva Shah and Melvin Johnson and Xiaobing Liu and Lukasz Kaiser and Stephan Gouws and Yoshikiyo Kato and Taku Kudo and Hideto Kazawa and Keith Stevens and George Kurian and Nishant Patil and Wei Wang and Cliff Young and Jason Smith and Jason Riesa and Alex Rudnick and Oriol Vinyals and Gregory S. Corrado and Macduff Hughes and Jeffrey Dean},
  journal={ArXiv},
  year={2016},
  volume={abs/1609.08144}
}
Highlight Information

Neural Machine Translation (NMT) is an end-to-end learning approach for automated translation, with the potential to overcome many of the weaknesses of conventional phrase-based translation systems. [...] Key Method: This method provides a good balance between the flexibility of "character"-delimited models and the efficiency of "word"-delimited models, naturally handles translation of rare words, and ultimately improves the overall accuracy of the system.
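The subword idea in the highlight can be illustrated with a minimal greedy longest-match segmenter in the spirit of wordpiece models: whole common words stay intact, while rare words decompose into known pieces. This is only a sketch; the toy vocabulary, the `##` continuation prefix, and the `wordpiece_segment` helper are illustrative assumptions, not the paper's actual model or vocabulary.

```python
def wordpiece_segment(word, vocab):
    """Greedily split `word` into the longest matching vocabulary pieces.

    Pieces that start inside a word carry a "##" prefix (an assumed
    convention here) so the original word can be reconstructed.
    """
    pieces = []
    start = 0
    while start < len(word):
        end = len(word)
        piece = None
        while start < end:
            candidate = word[start:end]
            if start > 0:
                candidate = "##" + candidate  # mark word-internal pieces
            if candidate in vocab:
                piece = candidate
                break
            end -= 1  # shrink the candidate until it matches
        if piece is None:
            return ["<unk>"]  # no vocabulary piece covers this position
        pieces.append(piece)
        start = end
    return pieces

# Toy vocabulary: frequent stems and suffixes survive as pieces.
vocab = {"trans", "##lat", "##ion", "jump", "##ed"}
print(wordpiece_segment("translation", vocab))  # ['trans', '##lat', '##ion']
print(wordpiece_segment("jumped", vocab))       # ['jump', '##ed']
```

A rare word thus never falls back to a single out-of-vocabulary token as long as its characters are covered by pieces, which is the balance between character- and word-level modeling the highlight refers to.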

Citations

Publications citing this paper.
SHOWING 1-10 OF 1,704 CITATIONS

An effective method for operations placement in TensorFlow (6 excerpts; cites background & methods; highly influenced)

Boosting LSTM Performance Through Dynamic Precision Selection (6 excerpts; cites background & methods; highly influenced)

CGPA: Coarse-Grained Pruning of Activations for Energy-Efficient RNN Inference (12 excerpts; cites methods & background; highly influenced)

Demystifying the MLPerf Benchmark Suite (7 excerpts; cites methods; highly influenced)

Improving ML Applications in Shared Computing Environments (5 excerpts; cites background & methods; highly influenced)

Interpreting Black Box Models via Hypothesis Testing (7 excerpts; cites methods; highly influenced)

Interpreting Black Box Models with Statistical Guarantees (5 excerpts; cites methods; highly influenced)

PipeDream: generalized pipeline parallelism for DNN training (11 excerpts; cites background & methods; highly influenced)

Gandiva: Introspective Cluster Scheduling for Deep Learning (5 excerpts; cites methods & background; highly influenced)


CITATION STATISTICS

  • 152 Highly Influenced Citations

  • Averaged 539 Citations per year from 2017 through 2019

  • 30% Increase in citations per year in 2019 over 2018

References

Publications referenced by this paper.
SHOWING 1-10 OF 40 REFERENCES

Adam: A Method for Stochastic Optimization (5 excerpts; highly influential)

Bidirectional recurrent neural networks (13 excerpts)

Japanese and Korean voice search (4 excerpts)

Neural Machine Translation by Jointly Learning to Align and Translate (6 excerpts; highly influential)