Transformers to Learn Hierarchical Contexts in Multiparty Dialogue for Span-based Question Answering
- Changmao Li, Jinho D. Choi
- Computer Science, Annual Meeting of the Association for…
- 7 April 2020
A novel transformer approach is introduced that learns hierarchical representations of multiparty dialogue, jointly learning token and utterance embeddings for a better understanding of dialogue contexts.
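The two-level idea can be sketched as follows. This is an illustrative toy, not the paper's code: token embeddings within each utterance are pooled into an utterance embedding, giving both token- and utterance-level views of a dialogue. The embedding size, random vectors, and mean pooling are all assumptions for demonstration.

```python
# Illustrative sketch (not the paper's implementation) of hierarchical
# token- and utterance-level representations for multiparty dialogue.
import numpy as np

rng = np.random.default_rng(0)
DIM = 16  # illustrative embedding size

dialogue = [
    ["where", "did", "ross", "go"],      # utterance 1
    ["he", "left", "with", "rachel"],    # utterance 2
]

# Token level: one random vector per token (stand-in for learned embeddings).
vocab = {tok for utt in dialogue for tok in utt}
token_emb = {tok: rng.normal(size=DIM) for tok in vocab}

def utterance_embedding(utterance):
    """Pool token embeddings into one utterance embedding (mean pooling)."""
    return np.mean([token_emb[tok] for tok in utterance], axis=0)

# Utterance level: a sequence of utterance embeddings that dialogue-level
# transformer layers would consume on top of the token-level sequence.
utt_embs = np.stack([utterance_embedding(u) for u in dialogue])
print(utt_embs.shape)  # (2, 16): one embedding per utterance
```

The point of the hierarchy is that span-based answers can be grounded at the token level while the dialogue structure is modeled at the utterance level.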
Transformer-based Context-aware Sarcasm Detection in Conversation Threads from Social Media
- Xiangjue Dong, Changmao Li, Jinho D. Choi
- Computer Science, FIGLANG
- 22 May 2020
A transformer-based sarcasm detection model is presented that accounts for context from the entire conversation thread to make more robust predictions; it ranked among the highest-performing systems of the 36 participants in this shared task.
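One way to feed thread context to a transformer is to pack the prior turns and the target response into a single input sequence. The sketch below shows that packing with BERT-style `[CLS]`/`[SEP]` markers; the example thread and the exact packing scheme are assumptions, not the paper's verbatim setup.

```python
# Hypothetical sketch of context-aware input construction for sarcasm
# detection: pair the target response with its full conversation thread.
thread = [
    "Great, another Monday.",
    "At least it's sunny out!",
]
response = "Oh yes, nothing beats rush hour in the rain."

# Concatenate prior turns as context, then append the target response,
# following common BERT-style segment conventions.
context = " ".join(thread)
model_input = f"[CLS] {context} [SEP] {response} [SEP]"
print(model_input)
```

Seeing the preceding turns is what lets the model distinguish a sincere reply from a sarcastic one that contradicts its context.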
Competence-Level Prediction and Resume-Job Description Matching Using Context-Aware Transformer Models
- Changmao Li, E. Fisher, Rebecca Thomas, S. Pittard, V. Hertzberg, Jinho D. Choi
- Computer Science, Conference on Empirical Methods in Natural…
- 1 November 2020
Novel transformer-based classification models are developed for resume classification to significantly reduce the time and labor needed to screen an overwhelming number of applications, while improving the selection of suitable candidates.
Design and Challenges of Cloze-Style Reading Comprehension Tasks on Multiparty Dialogue
- Changmao Li, Tianhao Liu, Jinho D. Choi
- Computer Science, ArXiv
- 2 November 2019
This paper analyzes challenges in cloze-style reading comprehension on multiparty dialogue, suggests two new tasks for more comprehensive prediction of personal entities in daily conversations, and provides a thorough error analysis.
Challenging On Car Racing Problem from OpenAI gym
- Changmao Li
- Computer Science, ArXiv
- 2 November 2019
This project tackles the car racing problem from the OpenAI Gym environment and concludes that, with limited hardware resources, a genetic multi-layer perceptron can sometimes be more efficient.
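The genetic multi-layer perceptron idea can be sketched as evolving MLP weight vectors by selection and mutation instead of backpropagation. The sketch below is a toy under stated assumptions: the layer sizes, the fixed-input fitness function, and all hyperparameters are illustrative, not the paper's car-racing setup.

```python
# Illustrative sketch: evolve a small MLP's weights with a simple genetic
# loop (elitism + Gaussian mutation) rather than gradient descent.
import numpy as np

rng = np.random.default_rng(0)

IN, HID, OUT = 4, 8, 2          # illustrative layer sizes
N_WEIGHTS = IN * HID + HID * OUT

def forward(weights, x):
    """Run a 2-layer tanh MLP encoded as one flat weight vector."""
    w1 = weights[: IN * HID].reshape(IN, HID)
    w2 = weights[IN * HID:].reshape(HID, OUT)
    return np.tanh(np.tanh(x @ w1) @ w2)

def fitness(weights):
    """Toy fitness: how well the MLP maps a fixed input to a fixed target.
    In the actual problem this would be the episode reward from the env."""
    x = np.ones(IN)
    target = np.array([1.0, -1.0])
    return -np.sum((forward(weights, x) - target) ** 2)

def evolve(pop_size=30, generations=40, elite=5, sigma=0.1):
    pop = rng.normal(size=(pop_size, N_WEIGHTS))
    for _ in range(generations):
        scores = np.array([fitness(ind) for ind in pop])
        best = pop[np.argsort(scores)[-elite:]]          # keep the elite
        children = best[rng.integers(elite, size=pop_size - elite)]
        children = children + rng.normal(scale=sigma, size=children.shape)
        pop = np.vstack([best, children])                # elitism + mutation
    scores = np.array([fitness(ind) for ind in pop])
    return pop[np.argmax(scores)]

best = evolve()
```

Because each individual only needs forward passes, this kind of search can run on modest hardware where training a deep RL agent by gradient descent would be slow, which matches the project's conclusion.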
Improving Neural Machine Translation with the Abstract Meaning Representation by Combining Graph and Sequence Transformers
- Changmao Li, Jeffrey Flanigan
- Computer Science, DLG4NLP
- 2022
A novel encoder-decoder architecture is presented that augments the Transformer model with a Heterogeneous Graph Transformer encoding source-sentence AMR graphs; it outperforms the Transformer baseline and previous non-Transformer-based models on two different language pairs in both high-resource and low-resource settings.
Ten-year Survival Prediction for Breast Cancer Patients
- Changmao Li, Han He, Yunze Hao, C. Ziems
- Medicine, ArXiv
- 2 November 2019
This report assesses different machine learning approaches to 10-year survival prediction of breast cancer patients.
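An assessment like this typically compares learned models against a simple baseline on held-out patients. The sketch below uses synthetic data (not the paper's dataset) to contrast a majority-class baseline with logistic regression trained by gradient descent; the features, signal, and split are all illustrative assumptions.

```python
# Toy sketch of comparing approaches for binary 10-year survival
# prediction on synthetic data: majority-class baseline vs. logistic
# regression fit by full-batch gradient descent.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic patients: a few numeric features with a known linear signal.
n, d = 500, 5
X = rng.normal(size=(n, d))
true_w = np.array([1.5, -2.0, 0.5, 0.0, 1.0])
y = (X @ true_w + rng.normal(scale=0.5, size=n) > 0).astype(float)

X_train, X_test = X[:400], X[400:]
y_train, y_test = y[:400], y[400:]

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Logistic regression via gradient descent on the log-loss.
w = np.zeros(d)
for _ in range(500):
    grad = X_train.T @ (sigmoid(X_train @ w) - y_train) / len(y_train)
    w -= 0.5 * grad

baseline_acc = max(y_test.mean(), 1 - y_test.mean())   # majority class
model_acc = ((sigmoid(X_test @ w) > 0.5) == y_test).mean()
```

Reporting every model against the same baseline and split is what makes the comparison across machine learning approaches meaningful.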