• Computer Science
  • Published in NeurIPS 2019

SuperGLUE: A Stickier Benchmark for General-Purpose Language Understanding Systems

@article{Wang2019SuperGLUEAS,
  title={SuperGLUE: A Stickier Benchmark for General-Purpose Language Understanding Systems},
  author={Alex Wang and Yada Pruksachatkun and Nikita Nangia and Amanpreet Singh and Julian Michael and Felix Hill and Omer Levy and Samuel R. Bowman},
  journal={ArXiv},
  year={2019},
  volume={abs/1905.00537}
}

In the last year, new models and methods for pretraining and transfer learning have driven striking performance improvements across a range of language understanding tasks. The GLUE benchmark, introduced one year ago, offers a single-number metric that summarizes progress on a diverse set of such tasks, but performance on the benchmark has recently surpassed the level of non-expert humans, suggesting limited headroom for further research. This paper recaps lessons learned from the GLUE…
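
As a minimal sketch of inspecting the benchmark data, the snippet below loads one SuperGLUE task (BoolQ, a yes/no question-answering task in the suite). It assumes the third-party HuggingFace datasets library and its "super_glue" loader, which are not software released with this paper:

from datasets import load_dataset

# Load one SuperGLUE task; BoolQ pairs a short passage with a
# yes/no question about it. Splits: train / validation / test.
boolq = load_dataset("super_glue", "boolq")

example = boolq["validation"][0]
print(example["passage"])   # supporting passage
print(example["question"])  # yes/no question about the passage
print(example["label"])     # 0 = false, 1 = true (hidden test labels appear as -1)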

Citations

Selected publications citing this paper (13 in total).

Evaluating Protein Transfer Learning with TAPE

  • NeurIPS 2019
  • Cites methods; highly influenced

HuggingFace's Transformers: State-of-the-art Natural Language Processing

  • ArXiv, 2019
  • Cites methods and background; highly influenced

LIAAD at SemDeep-5 Challenge: Word-in-Context (WiC)

  • Cites background; highly influenced

Green AI

  • Cites background
