Continual Learning with Knowledge Transfer for Sentiment Classification

Zixuan Ke, Bing Liu, Hao Wang, and Lei Shu
This paper studies continual learning (CL) for sentiment classification (SC). In this setting, the CL system learns a sequence of SC tasks incrementally in a neural network, where each task builds a classifier to classify the sentiment of reviews of a particular product category or domain. Two natural questions arise: Can the system transfer knowledge learned from previous tasks to help it learn a better model for the new task? And can the models built for previous tasks be improved in the process?

Adapting BERT for Continual Learning of a Sequence of Aspect Sentiment Classification Tasks

A novel capsule network based model called B-CL markedly improves the ASC performance on both the new task and the old tasks via forward and backward knowledge transfer.

CLASSIC: Continual and Contrastive Learning of Aspect Sentiment Classification Tasks

A novel model called CLASSIC is proposed that enables both knowledge transfer across tasks and knowledge distillation from old tasks to the new task, which eliminates the need for task ids in testing.

Continual Learning of a Mixed Sequence of Similar and Dissimilar Tasks

This paper proposes a technique to learn a sequence of mixed similar and dissimilar tasks that can deal with forgetting and also transfer knowledge forward and backward, and demonstrates the effectiveness of the proposed model.

Achieving Forgetting Prevention and Knowledge Transfer in Continual Learning

A novel model called CTR is proposed to overcome catastrophic forgetting and encourage knowledge transfer across tasks in continual learning, and the experimental results demonstrate the effectiveness of CTR.

Memory Efficient Continual Learning for Neural Text Classification

This work devises a method to perform text classification with pre-trained models on a sequence of classification tasks, retaining predictive performance on par with state-of-the-art but less memory-efficient methods.

Continual Learning by Using Information of Each Class Holistically

This paper proposes a one-class learning based technique for CL that considers the features of each class holistically, rather than only the discriminative information for separating the classes seen so far, and represents a new approach to solving the CL problem.

Modeling a Functional Engine for the Opinion Mining as a Service using Compounded Score Computation and Machine Learning

This paper proposes a design framework for the evolution of a classification engine for opinion mining as a service, combining score-based computation via a customized VADER algorithm with a machine learning model that supports classification of a large corpus of unstructured text data.

Vector based sentiment and emotion analysis from text: A survey



Forward and Backward Knowledge Transfer for Sentiment Classification

Reverse knowledge transfer in lifelong learning (LL) is studied in the context of naive Bayes (NB) classification: the model of a previous task is improved by leveraging knowledge from later tasks, without retraining on that task's training data.

Lifelong Learning for Sentiment Classification

The proposed LL approach adopts a Bayesian optimization framework based on stochastic gradient descent, which demonstrates that lifelong learning is a promising research direction.

Continual learning: A comparative study on how to defy forgetting in classification tasks

This work focuses on task-incremental classification, where tasks arrive in a batch-like fashion and are delineated by clear boundaries. It studies the influence of model capacity, weight decay, dropout regularization, and the order in which tasks are presented, and compares methods in terms of required memory, computation time, and storage.

Gradient Episodic Memory for Continual Learning

A model for continual learning, called Gradient Episodic Memory (GEM), is proposed that alleviates forgetting while allowing beneficial transfer of knowledge to previous tasks.
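As a minimal sketch of GEM's core idea (the single-memory case, not the paper's full quadratic-program formulation over all past tasks), the new-task gradient is projected whenever it would increase the loss on the episodic memory:

```python
import numpy as np

def gem_project(g, g_ref):
    """Single-memory GEM step: if the new-task gradient g conflicts with
    the episodic-memory gradient g_ref (negative inner product), project g
    onto the half-space where the memory loss does not increase."""
    dot = np.dot(g, g_ref)
    if dot < 0:  # conflict: following g would raise the loss on old data
        g = g - (dot / np.dot(g_ref, g_ref)) * g_ref
    return g

# Example: a conflicting gradient is rotated to be orthogonal to g_ref.
g = gem_project(np.array([1.0, -1.0]), np.array([0.0, 1.0]))
# g is now [1.0, 0.0], so np.dot(g, g_ref) >= 0
```

When the inner product is non-negative, the gradient passes through unchanged, which is what permits positive backward transfer rather than merely freezing old knowledge.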

Distantly Supervised Lifelong Learning for Large-Scale Social Media Sentiment Analysis

The results show that the lifelong sentiment learning approach is feasible and effective for tackling continuously updated texts with dynamic topics in social media, and that the belief "the more training data, the better the performance" does not hold in large-scale social media sentiment analysis.

Lifelong Machine Learning Systems: Beyond Learning Algorithms

It is proposed that it is now appropriate for the AI community to move beyond learning algorithms to more seriously consider the nature of systems that are capable of learning over a lifetime.

Overcoming Catastrophic Forgetting by Incremental Moment Matching

IMM incrementally matches the moments of the posterior distributions of the neural networks trained on the first and second tasks, respectively, to make the search space of the posterior parameters smooth.
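As an illustrative sketch of the simplest variant (mean-IMM; the function and parameter names here are hypothetical, and the mixing weight is assumed equal by default), merging amounts to a weighted average of the two tasks' trained parameters:

```python
import numpy as np

def mean_imm(params_a, params_b, alpha=0.5):
    """Mean-IMM sketch: merge two task-specific networks by taking a
    weighted average of their parameters, i.e. matching the mean of
    the two approximate posteriors."""
    return {name: alpha * params_a[name] + (1.0 - alpha) * params_b[name]
            for name in params_a}

merged = mean_imm({"w": np.array([2.0, 0.0])}, {"w": np.array([0.0, 4.0])})
# merged["w"] is [1.0, 2.0]
```

The paper's mode-IMM variant additionally weights each parameter by its (Fisher-based) precision, which this equal-weight sketch omits.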

Lifelong Machine Learning, Second Edition

The purpose of this second edition is to expand the definition of lifelong learning, update the content of several chapters, and add a new chapter about continual learning in deep neural networks—which has been actively researched over the past two or three years.

Overcoming catastrophic forgetting with hard attention to the task

A task-based hard attention mechanism that preserves previous tasks' information without affecting the current task's learning, and features the possibility to control both the stability and compactness of the learned knowledge, which makes it also attractive for online learning or network compression applications.
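The gating idea can be sketched as follows (a minimal illustration, not the paper's full annealed training procedure; `task_embedding` and `s` stand in for the learned per-task embedding and its scaling factor):

```python
import numpy as np

def hat_mask(task_embedding, s):
    """Task-conditioned hard attention sketch: a sigmoid gate over a
    learned per-task embedding. A large scale s pushes the mask toward
    binary {0, 1}; units gated near 1 are reserved for this task and
    protected when later tasks train."""
    return 1.0 / (1.0 + np.exp(-s * task_embedding))

mask = hat_mask(np.array([4.0, -4.0]), s=10.0)
# mask is close to [1.0, 0.0]: the first unit is claimed, the second stays free
```

During training, `s` is annealed from soft to near-binary, which is what lets the same mechanism trade off stability (protecting old masks) against compactness (using few units per task).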

Lifelong Machine Learning

Zhiyuan Chen and Bing Liu. Synthesis Lectures on Artificial Intelligence and Machine Learning, 2016.
As statistical machine learning matures, it is time to make a major effort to break the isolated learning tradition and to study lifelong learning to bring machine learning to new heights.