What do Neural Machine Translation Models Learn about Morphology?
TLDR: This work analyzes the representations learned by neural MT models at various levels of granularity and empirically evaluates the quality of the representations for learning morphology through extrinsic part-of-speech and morphological tagging tasks.
Robust Classification of Crisis-Related Data on Social Networks Using Convolutional Neural Networks
TLDR: This work introduces neural network based classification methods for identifying useful tweets during a crisis situation, makes the best use of out-of-event data, and achieves good results at the onset of a disaster.
Verifiably Effective Arabic Dialect Identification
TLDR: It is shown that effective dialect identification requires that the distinguishing lexical, morphological, and phonological phenomena of dialects be accounted for, and that doing so can improve dialect detection accuracy by nearly 10% absolute.
Applications of Online Deep Learning for Crisis Response Using Social Media Information
TLDR: A new online algorithm based on stochastic gradient descent is proposed to train DNNs in an online fashion during disaster situations, addressing two types of information needs of response organizations: identifying informative tweets and classifying them into topical classes.
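The core idea of online training is that the model is updated one incoming example at a time as messages stream in, rather than on a fixed offline dataset. A minimal sketch of such an update, assuming a simple logistic-regression learner as a stand-in for the DNNs used in the paper (the function name and feature vectors here are illustrative):

```python
import numpy as np

def online_sgd_step(w, x, y, lr=0.1):
    """One online SGD update on a single incoming example.

    w: weight vector; x: feature vector for the tweet; y: label in {0, 1}.
    Sketch only -- the paper trains full DNNs, not logistic regression.
    """
    p = 1.0 / (1.0 + np.exp(-w @ x))  # predicted probability of class 1
    grad = (p - y) * x                # gradient of the log loss w.r.t. w
    return w - lr * grad

# A stream of (features, label) pairs arriving during an event.
w = np.zeros(3)
stream = [(np.array([1.0, 0.0, 1.0]), 1), (np.array([0.0, 1.0, 0.0]), 0)]
for x, y in stream:
    w = online_sgd_step(w, x, y)
```

The weights move immediately after each example, which is what lets the classifier adapt at the onset of a new disaster instead of waiting for a batch of labeled in-event data.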
Evaluating Layers of Representation in Neural Machine Translation on Part-of-Speech and Semantic Tagging Tasks
TLDR: This paper investigates the quality of vector representations learned at different layers of NMT encoders and finds that higher layers are better at learning semantics while lower layers tend to be better for part-of-speech tagging.
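Layer quality in this line of work is typically measured with a probing classifier: freeze the encoder, extract per-token representations from a given layer, and train a simple classifier to predict the tag. A small synthetic sketch of that comparison, assuming random vectors as stand-ins for two layers' hidden states (a least-squares linear probe on training data; the paper trains a proper classifier per layer and evaluates held-out accuracy):

```python
import numpy as np

rng = np.random.default_rng(0)

def probe_accuracy(reps, labels):
    """Fit a least-squares linear probe on frozen representations and
    report its (training) accuracy. Sketch of the probing idea only."""
    X = np.hstack([reps, np.ones((len(reps), 1))])        # add bias column
    W, *_ = np.linalg.lstsq(X, np.eye(2)[labels], rcond=None)
    preds = (X @ W).argmax(axis=1)
    return float((preds == labels).mean())

# Synthetic stand-ins for two encoder layers' states; by construction,
# the second "layer" encodes the tag more strongly (an assumption for
# illustration, mirroring the paper's finding for its tasks).
labels = rng.integers(0, 2, size=100)
layer1 = rng.normal(size=(100, 8)) + 0.5 * labels[:, None]
layer2 = rng.normal(size=(100, 8)) + 2.0 * labels[:, None]
acc1 = probe_accuracy(layer1, labels)
acc2 = probe_accuracy(layer2, labels)
```

The probe itself is deliberately weak, so a higher probe accuracy is read as the layer encoding the property more directly, not as the probe being clever.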
Rapid Classification of Crisis-Related Data on Social Networks using Convolutional Neural Networks
TLDR: This work introduces neural network based classification methods for binary and multi-class tweet classification tasks and shows that these models require no feature engineering and perform better than state-of-the-art methods.
The AMARA Corpus: Building Parallel Language Resources for the Educational Domain
TLDR: The AMARA corpus of on-line educational content is presented: a new parallel corpus of educational video subtitles, multilingually aligned for 20 languages, i.e. 20 monolingual corpora and 190 parallel corpora.
Incremental Decoding and Training Methods for Simultaneous Translation in Neural Machine Translation
We address the problem of simultaneous translation by modifying the Neural MT decoder to operate with dynamically built encoder and attention. We propose a tunable agent which decides the best
Identifying and Controlling Important Neurons in Neural Machine Translation
TLDR: It is shown experimentally that translation quality depends on the discovered neurons, and that NMT translations can be controlled in predictable ways by modifying the activations of individual neurons.
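The intervention itself is mechanically simple: once a neuron of interest is identified (e.g. one tracking tense or gender), its activation is overwritten in the encoder states before decoding. A minimal sketch, assuming hidden states as a tokens-by-neurons matrix (the function name and dimensions are illustrative; identifying which neuron to modify is the substantive part of the paper):

```python
import numpy as np

def intervene(hidden_states, neuron_idx, value):
    """Clamp one neuron's activation across all time steps.

    hidden_states: array of shape (num_tokens, num_neurons).
    Returns a copy with neuron `neuron_idx` set to `value` everywhere.
    """
    out = hidden_states.copy()
    out[:, neuron_idx] = value
    return out

H = np.random.default_rng(1).normal(size=(5, 16))  # 5 tokens x 16 neurons
H_mod = intervene(H, neuron_idx=3, value=2.5)
```

Decoding from `H_mod` instead of `H` is then expected to shift the targeted property of the output while leaving the rest of the translation largely intact.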
Poor Man's BERT: Smaller and Faster Transformer Models
TLDR: A number of memory-light model reduction strategies that do not require model pre-training from scratch are explored, which are able to prune BERT, RoBERTa and XLNet models by up to 40%, while maintaining up to 98% of their original performance.
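One family of such strategies is layer dropping: remove whole transformer layers from the pre-trained stack (e.g. the top ones) and fine-tune the smaller model directly. A sketch of the selection step, assuming layers are held in an ordered list (names are illustrative; the paper compares several dropping patterns, such as top, bottom, and alternate):

```python
def drop_top_layers(layers, k):
    """Top-layer dropping: keep the bottom len(layers) - k layers
    of a pre-trained encoder stack."""
    return layers[: len(layers) - k]

# A 12-layer BERT-base-style stack; dropping 5 layers is a ~40% reduction.
layers = [f"layer_{i}" for i in range(12)]
small = drop_top_layers(layers, k=5)
```

Because the surviving layers keep their pre-trained weights, only task fine-tuning is needed afterwards, which is what makes the approach memory- and compute-light compared with distilling or pre-training a small model from scratch.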