Publications
Submodular Optimization-based Diverse Paraphrasing and its Effectiveness in Data Augmentation
TLDR
We provide a novel formulation of the problem in terms of monotone submodular function maximization, specifically targeted towards the task of paraphrasing. (A toy sketch of greedy submodular maximization follows this entry.)
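Since this entry hinges on monotone submodular maximization, here is a minimal, hypothetical sketch of the standard greedy algorithm (which carries the classic 1 - 1/e approximation guarantee for monotone submodular objectives), applied to picking a diverse subset of candidate paraphrases. The word-coverage objective and the candidate sentences are illustrative assumptions, not the objective or data used in the paper.

```python
# Greedy maximization of a monotone submodular function.
# The coverage() objective below is an illustrative assumption,
# NOT the objective proposed in the paper.

def coverage(selected):
    """Monotone submodular objective: count of distinct words covered."""
    covered = set()
    for sentence in selected:
        covered |= set(sentence.lower().split())
    return len(covered)

def greedy_select(candidates, k):
    """At each step, add the candidate with the largest marginal gain;
    for monotone submodular f, this greedy rule is a (1 - 1/e)-approximation."""
    selected, remaining = [], list(candidates)
    for _ in range(min(k, len(remaining))):
        best = max(remaining,
                   key=lambda c: coverage(selected + [c]) - coverage(selected))
        selected.append(best)
        remaining.remove(best)
    return selected

paraphrases = [
    "the cat sat on the mat",
    "a cat was sitting on the mat",
    "the feline rested on the rug",
]
print(greedy_select(paraphrases, 2))  # picks two maximally word-diverse paraphrases
```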
On the Computational Power of Transformers and Its Implications in Sequence Modeling
TLDR
We first provide an alternate and simpler proof that vanilla Transformers are Turing-complete, and then prove that Transformers with only positional masking and no positional encoding are also Turing-complete. (A minimal illustration of positional masking follows this entry.)
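As a concrete, hypothetical illustration of "positional masking without positional encoding", the sketch below computes single-head self-attention with only a causal mask and no positional encodings added to the input. The identity Q/K/V projections are a simplifying assumption; this is not the construction used in the paper's proof.

```python
import numpy as np

def causal_self_attention(X):
    """Self-attention over X (seq_len x d) with a causal positional mask,
    no positional encodings, and identity Q/K/V projections (a simplification)."""
    n, d = X.shape
    scores = X @ X.T / np.sqrt(d)                     # pairwise attention scores
    future = np.triu(np.ones((n, n), dtype=bool), k=1)
    scores[future] = -np.inf                          # mask out future positions
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # row-wise softmax
    return weights @ X

X = np.random.default_rng(0).normal(size=(4, 8))
print(causal_self_attention(X).shape)                 # (4, 8)
```

The mask alone breaks permutation invariance, which is why position information can survive without explicit encodings.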
Unsung Challenges of Building and Deploying Language Technologies for Low Resource Language Communities
TLDR
In this paper, we examine the challenges associated with developing and introducing language technologies to low-resource language communities.
Are NLP Models really able to Solve Simple Math Word Problems?
TLDR
The problem of designing NLP solvers for math word problems (MWPs) has seen sustained research activity and steady gains in test accuracy.
On the Ability of Self-Attention Networks to Recognize Counter Languages
TLDR
Transformers have supplanted recurrent models in a large number of NLP tasks. (A small counter-language example follows this entry.)
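For readers unfamiliar with counter languages, the hypothetical sketch below tests membership in Shuffle-Dyck-2, where each of two bracket types must be balanced independently with no nesting requirement, so two counters suffice and no stack is needed. The choice of example language is an illustrative assumption, not an excerpt from the paper.

```python
def is_shuffle_dyck2(s):
    """Membership in Shuffle-Dyck-2: the counts of '()' and '[]' must each
    stay non-negative left to right and end at zero. One counter per
    bracket type suffices; no stack is needed."""
    round_ct = square_ct = 0
    for ch in s:
        if ch == '(':   round_ct += 1
        elif ch == ')': round_ct -= 1
        elif ch == '[': square_ct += 1
        elif ch == ']': square_ct -= 1
        else:           return False       # symbol outside the alphabet
        if round_ct < 0 or square_ct < 0:  # a counter went negative
            return False
    return round_ct == 0 and square_ct == 0

assert is_shuffle_dyck2("([)]")            # balanced per type, even interleaved
assert not is_shuffle_dyck2("(()")
```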
On the Ability and Limitations of Transformers to Recognize Formal Languages
TLDR
Transformers have supplanted recurrent models in a large number of NLP tasks.
On the Practical Ability of Recurrent Neural Networks to Recognize Hierarchical Languages
TLDR
We study the performance of recurrent models on Dyck-n languages, a particularly important and well-studied class of context-free languages (CFLs). (A Dyck membership sketch follows this entry.)
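To make the object of study concrete, here is a minimal stack-based membership test for a Dyck language, shown for Dyck-2 (two bracket types); the bracket alphabet is an illustrative assumption. Recognizing Dyck-n exactly requires unbounded stack-like memory, which is what makes it a sharp probe for recurrent models.

```python
PAIRS = {')': '(', ']': '['}   # Dyck-2: two bracket types (illustrative choice)

def is_dyck(s):
    """Return True iff s is balanced over the bracket pairs in PAIRS."""
    stack = []
    for ch in s:
        if ch in PAIRS.values():                       # opening bracket: push
            stack.append(ch)
        elif ch in PAIRS:                              # closing bracket: must match top
            if not stack or stack.pop() != PAIRS[ch]:
                return False
        else:
            return False                               # symbol outside the alphabet
    return not stack                                   # balanced iff nothing left open

assert is_dyck("([])[]") and not is_dyck("([)]")
```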