Corpus ID: 174802633

Visualizing and Measuring the Geometry of BERT

@inproceedings{Coenen2019VisualizingAM,
  title={Visualizing and Measuring the Geometry of BERT},
  author={Andy Coenen and Emily Reif and Ann Yuan and Been Kim and Adam Pearce and Fernanda B. Vi{\'e}gas and Martin Wattenberg},
  booktitle={NeurIPS},
  year={2019}
}
Abstract: Transformer architectures show significant promise for natural language processing. Given that a single pretrained model can be fine-tuned to perform well on many different tasks, these networks appear to extract generally useful linguistic features. A natural question is how such networks represent this information internally. This paper describes qualitative and quantitative investigations of one particularly effective model, BERT. At a high level, linguistic features seem to be represented…


    Citations

    Publications citing this paper (showing 1-10 of 54 citations):

    What Happens To BERT Embeddings During Fine-tuning?



    Citation statistics

    • 2 highly influenced citations
    • Averaged 27 citations per year from 2019 through 2020
    • 208% increase in citations per year in 2020 over 2019

    References

    Publications referenced by this paper (showing 1-10 of 30 references):

    A Structural Probe for Finding Syntax in Word Representations

    Highly influential (10 excerpts)
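The structural probe referenced above (Hewitt & Manning) tests whether a learned linear map B exists such that squared L2 distances between transformed contextual word vectors approximate distances in the dependency parse tree. A minimal self-contained sketch of that idea follows; the embeddings, tree distances, and hyperparameters here are synthetic stand-ins for illustration, not the paper's actual data or training code.

```python
import numpy as np

# Sketch of a structural-probe objective: learn B so that
# ||B(h_i - h_j)||^2 approximates the parse-tree distance between
# words i and j. All data below is synthetic.

rng = np.random.default_rng(0)
n_words, dim, probe_rank = 6, 16, 4

H = rng.normal(size=(n_words, dim))            # stand-in contextual embeddings
T = rng.integers(1, 5, size=(n_words, n_words)).astype(float)
T = (T + T.T) / 2                              # symmetric stand-in tree distances
np.fill_diagonal(T, 0.0)

B = rng.normal(scale=0.1, size=(probe_rank, dim))  # probe parameters

def probe_distances(B, H):
    """Squared L2 distances ||B(h_i - h_j)||^2 for all word pairs."""
    X = H @ B.T                                # project into probe space
    diff = X[:, None, :] - X[None, :, :]
    return (diff ** 2).sum(-1)

def loss(B, H, T):
    """Mean absolute gap between probed and tree distances (L1, as in the probe)."""
    return np.abs(probe_distances(B, H) - T).mean()

# One step of finite-difference gradient descent on the probe parameters.
lr, eps = 1e-3, 1e-5
grad = np.zeros_like(B)
base = loss(B, H, T)
for i in range(B.shape[0]):
    for j in range(B.shape[1]):
        Bp = B.copy()
        Bp[i, j] += eps
        grad[i, j] = (loss(Bp, H, T) - base) / eps
B_new = B - lr * grad
```

In practice the probe is trained with a standard autodiff optimizer over real BERT activations and gold parse trees; the finite-difference step above only keeps the sketch dependency-free.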