Asking without Telling: Exploring Latent Ontologies in Contextual Representations
@article{Michael2020AskingWT,
  title   = {Asking without Telling: Exploring Latent Ontologies in Contextual Representations},
  author  = {Julian Michael and Jan A. Botha and Ian Tenney},
  journal = {ArXiv},
  year    = {2020},
  volume  = {abs/2004.14513}
}
The success of pretrained contextual encoders such as ELMo and BERT has generated a great deal of interest in what these models learn: do they, without explicit supervision, learn to encode meaningful notions of linguistic structure? If so, how is this structure encoded? To investigate this, we introduce latent subclass learning (LSL): a modification to existing classifier-based probing methods that induces a latent categorization (or ontology) of the probe's inputs. Without access to fine…
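The abstract describes LSL only at a high level. As a rough illustrative sketch (not the paper's implementation), a classifier-based probe could induce latent subclasses by splitting each gold label into several latent categories and marginalizing over them during training; the class names, `num_subclasses`, and the marginalization scheme below are assumptions for illustration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LatentSubclassProbe(nn.Module):
    """Linear probe where each gold label is split into k latent subclasses.

    The label probability is obtained by summing over its subclasses, so training
    only needs the ordinary labels; the subclass assignments emerge latently and
    can be read off afterwards as an induced categorization of the inputs.
    """

    def __init__(self, hidden_dim: int, num_labels: int, num_subclasses: int):
        super().__init__()
        self.num_labels = num_labels
        self.num_subclasses = num_subclasses
        # One logit per (label, subclass) pair.
        self.scorer = nn.Linear(hidden_dim, num_labels * num_subclasses)

    def forward(self, reps: torch.Tensor) -> torch.Tensor:
        # reps: (batch, hidden_dim) representations from a frozen contextual encoder.
        logits = self.scorer(reps)
        # Joint log-probability over all (label, subclass) pairs.
        log_joint = F.log_softmax(logits, dim=-1)
        log_joint = log_joint.view(-1, self.num_labels, self.num_subclasses)
        # Marginalize over subclasses to get per-label log-probabilities for the loss.
        return torch.logsumexp(log_joint, dim=-1)

    def latent_assignments(self, reps: torch.Tensor) -> torch.Tensor:
        # For analysis: which (label, subclass) cell best explains each input.
        logits = self.scorer(reps)
        return logits.argmax(dim=-1)  # index into the induced latent ontology


# Training uses only the marginal log-likelihood of the observed labels.
probe = LatentSubclassProbe(hidden_dim=768, num_labels=2, num_subclasses=8)
reps = torch.randn(4, 768)            # stand-in for frozen BERT/ELMo representations
labels = torch.tensor([0, 1, 1, 0])
loss = F.nll_loss(probe(reps), labels)
```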