Publications
Interpreting Pretrained Contextualized Representations via Reductions to Static Embeddings
Contextualized representations (e.g., ELMo, BERT) have become the default pretrained representations for downstream NLP applications. In some settings, this transition has rendered their static …
SPARSE: Structured Prediction using Argument-Relative Structured Encoding
We propose structured encoding as a novel approach to learning representations for relations and events in neural structured prediction. Our approach explicitly leverages the structure of available …
Long-Distance Dependencies Don't Have to Be Long: Simplifying through Provably (Approximately) Optimal Permutations
Neural models at the sentence level often operate on the constituent words/tokens in a way that encodes the inductive bias of processing the input much as humans do. However, …