Deep Contextualized Word Representations
A new type of deep contextualized word representation is introduced that models both complex characteristics of word use and how these uses vary across linguistic contexts, allowing downstream models to mix different types of semi-supervision signals.
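The "mixing" the summary describes can be pictured as a learned, softmax-weighted combination over the layers of a pretrained bidirectional language model. The sketch below is a hypothetical illustration of that scalar-mix idea; the function name, shapes, and parameters are illustrative, not the paper's actual API.

```python
import numpy as np

def scalar_mix(layer_reps, scalar_params, gamma=1.0):
    """Combine biLM layer outputs with softmax-normalized learned weights.

    layer_reps: list of L arrays, each of shape (seq_len, dim).
    scalar_params: array of L unnormalized layer weights (learned downstream).
    """
    weights = np.exp(scalar_params - np.max(scalar_params))
    weights = weights / weights.sum()            # softmax over layers
    mixed = sum(w * h for w, h in zip(weights, layer_reps))
    return gamma * mixed                         # gamma scales the whole mix

# toy example: 3 biLM layers, 4 tokens, 5-dimensional representations
rng = np.random.default_rng(0)
layers = [rng.standard_normal((4, 5)) for _ in range(3)]
out = scalar_mix(layers, scalar_params=np.zeros(3))
print(out.shape)  # (4, 5)
```

With all scalar parameters at zero the weights are uniform, so the result is simply the mean of the layer representations; a downstream task would instead learn which layers to emphasize.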
AllenNLP: A Deep Semantic Natural Language Processing Platform
AllenNLP is described, a library for applying deep learning methods to NLP research; it addresses common engineering issues through easy-to-use command-line tools, declarative configuration-driven experiments, and modular NLP abstractions.
Knowledge Enhanced Contextual Word Representations
After integrating WordNet and a subset of Wikipedia into BERT, the knowledge-enhanced BERT (KnowBert) demonstrates improved perplexity, the ability to recall facts as measured in a probing task, and downstream performance gains on relationship extraction, entity typing, and word sense disambiguation.
Dissecting Contextual Word Embeddings: Architecture and Representation
There is a tradeoff between speed and accuracy, but all architectures learn high-quality contextual representations that outperform word embeddings on four challenging NLP tasks, suggesting that unsupervised biLMs, independent of architecture, are learning much more about the structure of language than previously appreciated.
ScispaCy: Fast and Robust Models for Biomedical Natural Language Processing
ScispaCy, a new Python library and set of models for practical biomedical and scientific text processing that heavily leverages the spaCy library, is described; the performance of two packages of models released in scispaCy is detailed, and their robustness is demonstrated on several tasks and datasets.
Ontology alignment in the biomedical domain using entity definitions and context
- Lucy Lu Wang, Chandra Bhagavatula, Mark Neumann, Kyle Lo, Christopher Wilhelm, Waleed Ammar
- Computer Science, Philosophy · BioNLP
- 20 June 2018
This work proposes a method for enriching entities in an ontology with external definition and context information, uses this additional information for ontology alignment, and develops a neural architecture capable of encoding it when available.
Grammar-based Neural Text-to-SQL Generation
The sequence-to-sequence paradigm employed by neural text-to-SQL models typically performs token-level decoding and does not consider generating SQL hierarchically from a grammar. Grammar-based…
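The contrast the summary draws can be made concrete: a grammar-based decoder emits production rules of a SQL grammar rather than raw tokens, so every completed derivation is syntactically valid by construction. The toy grammar and rule-selection policy below are illustrative assumptions, not the paper's actual model.

```python
# A tiny, hypothetical SQL grammar: non-terminals map to lists of productions.
GRAMMAR = {
    "query":  [["SELECT", "column", "FROM", "table"]],
    "column": [["name"], ["age"]],
    "table":  [["people"]],
}

def expand(symbol, choose):
    """Expand a symbol by recursively applying chosen productions."""
    if symbol not in GRAMMAR:                    # terminal: emit as-is
        return [symbol]
    rule = choose(symbol, GRAMMAR[symbol])       # a neural model would score rules here
    tokens = []
    for sym in rule:
        tokens.extend(expand(sym, choose))
    return tokens

# greedy "policy" that always picks the first production
sql = " ".join(expand("query", lambda s, rules: rules[0]))
print(sql)  # SELECT name FROM people
```

In a neural text-to-SQL model, the `choose` callback would be replaced by a decoder that scores the candidate productions at each step conditioned on the input question.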
A Deep Semantic Natural Language Processing Platform
AllenNLP is designed to support researchers who want to build novel language understanding models quickly and easily and provides a flexible data API that handles intelligent batching and padding, and a modular and extensible experiment framework that makes doing good science easy.
Writing Code for NLP Research
This tutorial aims to share best practices for writing code for NLP research, drawing on the instructors' experience designing the recently released AllenNLP toolkit, a PyTorch-based library for deep learning NLP research.
Learning to Reason With Adaptive Computation
This work introduces the first model involving Adaptive Computation Time; it provides a small performance benefit over a similar model without the adaptive component and enables considerable insight into the model's reasoning process.
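Adaptive Computation Time, the mechanism named above, lets a learned halting unit decide per input how many recurrent steps to spend, with the output formed as a halting-weighted mix of the intermediate states. The sketch below is a minimal illustration of that loop; `step_fn` and `halt_fn` are illustrative stand-ins for learned components, not the paper's actual model.

```python
import numpy as np

def act(state, step_fn, halt_fn, eps=0.01, max_steps=10):
    """Run recurrent steps until accumulated halting probability nears 1."""
    total_halt = 0.0
    weighted_state = np.zeros_like(state)
    steps = 0
    while total_halt < 1.0 - eps and steps < max_steps:
        state = step_fn(state)
        p = halt_fn(state)
        w = min(p, 1.0 - total_halt)   # last step takes the remaining mass
        weighted_state += w * state
        total_halt += w
        steps += 1
    return weighted_state, steps

# toy run: a shrinking step and a constant halting probability of 0.5
mixed, n_steps = act(
    np.ones(3),
    step_fn=lambda s: 0.9 * s,
    halt_fn=lambda s: 0.5,
)
print(n_steps)  # halts after 2 steps, since 0.5 + 0.5 = 1.0
```

The number of steps taken per input is what makes the computation "adaptive": harder inputs can accumulate halting probability more slowly and therefore receive more steps.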