Corpus ID: 32481355

A Deep Semantic Natural Language Processing Platform

@inproceedings{Gardner2017ADS,
  title={A Deep Semantic Natural Language Processing Platform},
  author={Matt Gardner and Joel Grus and Mark Neumann and Oyvind Tafjord and Pradeep Dasigi and Nelson F. Liu and Matthew E. Peters and Michael Schmitz and Luke Zettlemoyer},
  year={2017}
}
This paper describes AllenNLP, a platform for research on deep learning methods in natural language understanding. AllenNLP is designed to support researchers who want to build novel language understanding models quickly and easily. It is built on top of PyTorch, allowing for dynamic computation graphs, and provides (1) a flexible data API that handles intelligent batching and padding, (2) high-level abstractions for common operations in working with text, and (3) a modular and extensible…
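A minimal sketch of the kind of batching-and-padding logic such a data API automates, written in plain Python; the function names are illustrative, not AllenNLP's actual API:

```python
# Illustrative sketch of "intelligent batching and padding" for token-id
# sequences. Not AllenNLP's API; names and signatures are hypothetical.

def pad_batch(token_id_seqs, pad_id=0):
    """Pad variable-length sequences to the batch maximum and build a mask."""
    max_len = max(len(seq) for seq in token_id_seqs)
    padded, mask = [], []
    for seq in token_id_seqs:
        pad = max_len - len(seq)
        padded.append(seq + [pad_id] * pad)       # fill to max_len with pad_id
        mask.append([1] * len(seq) + [0] * pad)   # 1 = real token, 0 = padding
    return padded, mask

def bucket_by_length(instances, batch_size):
    """Group instances of similar length so each batch needs minimal padding."""
    ordered = sorted(instances, key=len)
    return [ordered[i:i + batch_size]
            for i in range(0, len(ordered), batch_size)]
```

Bucketing by length keeps similarly sized sequences together, so little computation is wasted on padding; the mask lets a downstream model ignore padded positions.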
Exploring Pretrained Models for Joint Morpho-Syntactic Parsing of Russian
A joint morpho-syntactic parser for Russian is built which is significantly faster than, and as accurate as, a traditional pipeline of models; it is shown that character-level word embeddings can significantly improve parsing quality.
Compositional Semantic Parsing across Graphbanks
A compositional neural semantic parser is presented that achieves competitive accuracies across a diverse range of graphbanks for the first time; incorporating BERT embeddings and multi-task learning improves accuracy further.
Neighbor Contextual Information Learners for Joint Intent and Slot Prediction
Intent identification and slot identification are two important tasks for Natural Language Understanding (NLU). Exploration in this area has gained significance using networks like RNNs, LSTMs, and GRUs.
Strong and Light Baseline Models for Fact-Checking Joint Inference
This paper shows that, for inference based on transformer models, two effective approaches are a simple application of max pooling over the Transformer evidence vectors, or a weighted sum of the evidence vectors.
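The two aggregation strategies can be sketched as follows (an illustrative sketch only, not the paper's implementation; the evidence vectors would come from a Transformer encoder, and the weights from e.g. an attention layer):

```python
# Two ways to aggregate a set of per-evidence vectors into one vector.
# Plain-Python sketch; real systems would use tensor operations.

def max_pool(evidence_vecs):
    """Element-wise max across evidence vectors."""
    return [max(col) for col in zip(*evidence_vecs)]

def weighted_sum(evidence_vecs, weights):
    """Weighted sum of evidence vectors (weights e.g. from attention)."""
    dim = len(evidence_vecs[0])
    return [sum(w * v[i] for w, v in zip(weights, evidence_vecs))
            for i in range(dim)]
```

Max pooling keeps the strongest signal per dimension regardless of which evidence item produced it; the weighted sum lets the model emphasize more relevant evidence items as a whole.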
Knowledge Generation - Variational Bayes on Knowledge Graphs
This thesis is a proof of concept for the potential of Variational Auto-Encoders (VAEs) for representation learning on real-world Knowledge Graphs (KGs), and presents a new validation method for triples generated from the FB15K-237 dataset.
Discourse-Based Approach to Involvement of Background Knowledge for Question Answering
A concept of a virtual discourse tree is proposed to improve question answering (Q/A) recall for complex, multi-sentence questions; a substantial increase in performance is observed when answering complex questions from sources such as Yahoo! Answers and www.2carpros.com.
Beyond The Text: Analysis of Privacy Statements through Syntactic and Semantic Role Labeling
It is demonstrated that a solution combining syntactic DP with type-specific SRL tasks provides the highest accuracy for retrieving contextual privacy parameters from privacy statements, inspiring new NLP research to address this important problem in the privacy domain.
On GAP Coreference Resolution Shared Task: Insights from the 3rd Place Solution
The approach consists of fine-tuning the BERT language representation model and using external datasets during training; the resulting system almost eliminates the difference in log loss per gender during cross-validation while providing high performance.
Delving into Deep Imbalanced Regression
Motivated by the intrinsic difference between categorical and continuous label spaces, this work proposes distribution smoothing for both labels and features, which explicitly acknowledges the effects of nearby targets and calibrates both label and learned feature distributions.
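The label side of this idea can be sketched as smoothing the empirical label histogram with a Gaussian kernel, so each bin also reflects the density of nearby targets (an illustrative sketch, not the paper's implementation; the kernel width and truncation radius here are arbitrary choices):

```python
import math

def smooth_label_distribution(counts, sigma=1.0, radius=2):
    """Convolve a per-bin label histogram with a truncated Gaussian kernel,
    so each bin's effective density also reflects nearby label bins."""
    kernel = [math.exp(-(k * k) / (2 * sigma * sigma))
              for k in range(-radius, radius + 1)]
    z = sum(kernel)
    kernel = [w / z for w in kernel]  # normalize kernel weights to sum to 1

    smoothed = []
    for i in range(len(counts)):
        total = 0.0
        for offset, w in zip(range(-radius, radius + 1), kernel):
            j = i + offset
            if 0 <= j < len(counts):  # truncate at histogram boundaries
                total += w * counts[j]
        smoothed.append(total)
    return smoothed
```

The smoothed density can then be used, for example, to reweight the loss per target bin, so a bin with no samples but densely populated neighbors is not treated as entirely unseen.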
Bayesian Attention Belief Networks
On a variety of language understanding tasks, the proposed Bayesian attention belief networks outperform deterministic attention and state-of-the-art stochastic attention in accuracy, uncertainty estimation, generalization across domains, and robustness to adversarial attacks.

References

Showing 1–10 of 25 references
A large annotated corpus for learning natural language inference
The Stanford Natural Language Inference corpus is introduced: a new, freely available collection of labeled sentence pairs, written by humans doing a novel grounded task based on image captioning, which allows a neural network-based model to perform competitively on natural language inference benchmarks for the first time.
End-to-end learning of semantic role labeling using recurrent neural networks
This work proposes a deep bi-directional recurrent network as an end-to-end system for SRL, which takes only the original text as its input features, without using any syntactic knowledge.
Natural Language Processing with Python
This book offers a highly accessible introduction to natural language processing, the field that supports a variety of language technologies, from predictive text and email filtering to automatic…
Deep Semantic Role Labeling: What Works and What's Next
We introduce a new deep learning model for semantic role labeling (SRL) that significantly improves the state of the art, along with detailed analyses to reveal its strengths and limitations. We use…
A Decomposable Attention Model for Natural Language Inference
We propose a simple neural architecture for natural language inference. Our approach uses attention to decompose the problem into subproblems that can be solved separately, thus making it trivially…
ParlAI: A Dialog Research Software Platform
ParlAI (pronounced “par-lay”), an open-source software platform for dialog research implemented in Python, is introduced to provide a unified framework for sharing, training, and testing dialog models, with integration of Amazon Mechanical Turk for data collection, human evaluation, and online/reinforcement learning.
The Stanford CoreNLP Natural Language Processing Toolkit
The design and use of the Stanford CoreNLP toolkit, an extensible pipeline that provides core natural language analysis, is described; its adoption is attributed to a simple, approachable design, straightforward interfaces, and the inclusion of robust, good-quality analysis components, without requiring a large amount of associated baggage.
Question-Answer Driven Semantic Role Labeling: Using Natural Language to Annotate Natural Language
The results show that non-expert annotators can produce high-quality QA-SRL data and establish baseline performance levels for future work on this task; simple classifier-based models are introduced for predicting which questions to ask and what their answers should be.
Bidirectional Attention Flow for Machine Comprehension
The BiDAF network is introduced: a multi-stage hierarchical process that represents the context at different levels of granularity and uses a bi-directional attention flow mechanism to obtain a query-aware context representation without early summarization.
Semi-supervised sequence tagging with bidirectional language models
A general semi-supervised approach is presented for adding pre-trained context embeddings from bidirectional language models to NLP systems; applied to sequence labeling tasks, it surpasses previous systems that use other forms of transfer or joint learning with additional labeled data and task-specific gazetteers.