Modeling Label Semantics for Predicting Emotional Reactions
- Radhika Gaonkar, Heeyoung Kwon, Mohaddeseh Bastan, Niranjan Balasubramanian, Nathanael Chambers
- Computer Science, ACL
- 9 June 2020
This work explicitly models label classes via label embeddings, adds mechanisms that track label-label correlations during both training and inference, and introduces a new semi-supervision strategy that regularizes for these correlations on unlabeled data.
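The core idea of scoring labels through embeddings while mixing in label-label correlations can be pictured with a toy sketch. This is an illustration of the general technique, not the paper's implementation; the labels, vectors, correlation values, and the mixing weight `alpha` are all made up.

```python
# Toy sketch: score emotion labels by similarity between a text vector
# and label embeddings, then adjust each score using correlations with
# the other labels. (All numbers here are hypothetical placeholders.)

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

# Hypothetical 3-d embeddings for three emotion labels.
label_emb = {
    "joy":     [0.9, 0.1, 0.0],
    "sadness": [-0.8, 0.2, 0.1],
    "fear":    [-0.5, 0.6, 0.3],
}

# Hypothetical label-label correlations (e.g. learned from co-occurring labels).
label_corr = {
    ("sadness", "fear"): 0.6,
    ("joy", "sadness"): -0.7,
}

def corr(a, b):
    """Symmetric lookup of a label-pair correlation (0.0 if unknown)."""
    return label_corr.get((a, b), label_corr.get((b, a), 0.0))

def score_labels(text_vec, alpha=0.3):
    """Base score = text/label-embedding similarity; final score mixes
    in the base scores of correlated labels, so correlated labels tend
    to be predicted consistently."""
    base = {lab: dot(text_vec, emb) for lab, emb in label_emb.items()}
    return {
        lab: base[lab] + alpha * sum(
            corr(lab, other) * base[other] for other in base if other != lab
        )
        for lab in base
    }
```

For a text vector pointing toward the "sadness" region, the correlation term also lifts "fear" and pushes down "joy", which is the consistency effect the paper's correlation mechanisms aim for.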
MDP-based Itinerary Recommendation using Geo-Tagged Social Media
This paper leverages social media, specifically photo uploads and their tags, to reverse-engineer historical user itineraries, and observes that the predicted itineraries are more accurate than those from standard path-planning algorithms.
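The reverse-engineering step can be illustrated with a minimal sketch: group a user's geo-tagged photos by day and order the visited points of interest by timestamp. This is only a simplified reading of the approach, not the paper's pipeline; the field names `time` and `poi` are made up.

```python
# Minimal sketch of recovering itineraries from photo uploads:
# group a user's photos by day and order POI visits by timestamp.
# (Illustrative only; record fields are hypothetical.)
from datetime import datetime

def itineraries(photos):
    """photos: list of dicts with 'time' (ISO-8601 string) and 'poi'.
    Returns {date: [poi, ...]} with consecutive duplicates merged,
    since several photos at one POI are a single visit."""
    by_day = {}
    for p in sorted(photos, key=lambda p: p["time"]):
        day = datetime.fromisoformat(p["time"]).date()
        seq = by_day.setdefault(day, [])
        if not seq or seq[-1] != p["poi"]:
            seq.append(p["poi"])
    return by_day
```

A real system would additionally map raw geo-coordinates to POIs and filter noisy uploads before feeding the sequences to the MDP-based recommender.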
Knowledge Infused Decoding
Knowledge Infused Decoding (KID) is a novel decoding algorithm for generative LMs that dynamically infuses external knowledge into each step of decoding: it maintains a local knowledge memory based on the current context, interacts with a dynamically created external knowledge trie, and continuously updates the local memory as a knowledge-aware constraint that guides decoding via reinforcement learning.
Exploring Low-Cost Transformer Model Compression for Large-Scale Commercial Reply Suggestions
- Vaishnavi Shrivastava, Radhika Gaonkar, Shashank Gupta, Abhishek Jha
- Computer Science, ArXiv
- 27 November 2021
Low-cost model compression techniques such as Layer Dropping and Layer Freezing are shown to be effective in large-data scenarios, reducing training time for a commercial email reply suggestion system by 42% without affecting model relevance or user engagement.
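The Layer Freezing idea is simple to express: the bottom layers of the network keep their pretrained weights and only the top layers receive gradient updates, which cuts backward-pass cost. The toy sketch below illustrates that idea with plain SGD on lists of numbers; the layer count, parameters, and learning rate are placeholders (a real setup would instead set `requires_grad=False` on the bottom transformer layers).

```python
# Toy sketch of Layer Freezing: only layers at or above the cutoff are
# updated; frozen bottom layers are returned unchanged.
# (Illustrative only; not the paper's training code.)

def train_step(layers, grads, lr=0.1, n_frozen=2):
    """Apply an SGD update to unfrozen layers only.

    layers, grads: lists of per-layer parameter lists.
    n_frozen: number of bottom layers left untouched.
    """
    return [
        params if i < n_frozen
        else [p - lr * g for p, g in zip(params, grad)]
        for i, (params, grad) in enumerate(zip(layers, grads))
    ]
```

Layer Dropping is the complementary trick: train (or fine-tune) with some layers removed entirely, so both forward and backward passes get cheaper.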