Multi-Relation Message Passing for Multi-Label Text Classification

Muberra Ozmen, Hao Zhang, Pengyun Wang, and Mark J. Coates, "Multi-Relation Message Passing for Multi-Label Text Classification," ICASSP 2022 - 2022 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP).
  • Published 10 February 2022
  • Computer Science, Business
A well-known challenge associated with the multi-label classification problem is modelling dependencies between labels. Most attempts at modelling label dependencies focus on co-occurrences, ignoring the valuable information that can be extracted by detecting label subsets that rarely occur together. For example, consider customer product reviews; a product probably would not simultaneously be tagged by both "recommended" (i.e., reviewer is happy and recommends the product) and "urgent" (i.e… 
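The abstract's point about rarely co-occurring labels can be made concrete with simple co-occurrence statistics. A minimal sketch, assuming a small hypothetical binary label matrix (the labels and values below are illustrative, not from the paper):

```python
import numpy as np

# Hypothetical binary label matrix: rows are samples, columns are labels
# (say "recommended", "urgent", "shipping"). Values mark tag presence.
Y = np.array([
    [1, 0, 1],
    [1, 0, 0],
    [0, 1, 0],
    [1, 0, 1],
    [0, 1, 1],
])

# Pairwise co-occurrence counts: C[i, j] = samples tagged with both i and j.
C = Y.T @ Y
freq = np.diag(C)  # individual label frequencies sit on the diagonal

# Conditional co-occurrence rate P(label j | label i). A near-zero rate
# between two otherwise frequent labels is exactly the "rarely occur
# together" signal the abstract highlights.
cond = np.where(freq[:, None] > 0, C / freq[:, None], 0.0)

print(cond[0, 1])  # labels 0 and 1 never co-occur -> 0.0
```

Most co-occurrence-based models use only the large entries of such a matrix; the near-zero off-diagonal entries carry the negative-correlation information the paper argues is usually discarded.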


GCN-BERT and Memory Network Based Multi-Label Classification for Event Text of the Chinese Government Hotline

  • B. Liu
  • Computer Science
    IEEE Access
  • 2022
Comparative experimental results show that the proposed framework outperforms all baselines, and ablation studies demonstrate the effectiveness of each module.



Neural Message Passing for Multi-Label Classification

The proposed Label Message Passing (LaMP) neural networks efficiently model the joint prediction of multiple labels; they are simple, accurate, interpretable, structure-agnostic, and, because LaMP is highly parallelizable, applicable to predicting dense labels.

Classifier chains for multi-label classification

This paper presents a novel classifier chains method that can model label correlations while maintaining acceptable computational complexity, and illustrates the competitiveness of the chaining method against related and state-of-the-art methods in terms of both predictive performance and time complexity.
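The chaining idea summarized above can be sketched in a few lines: each classifier in the chain sees the original features plus the labels decided earlier in the chain. A minimal illustration with a toy perceptron base learner (all data and function names here are illustrative assumptions, not the paper's method or code):

```python
import numpy as np

def train_perceptron(X, y, epochs=20, lr=0.1):
    # Toy linear base learner for the chain (illustrative only).
    Xb = np.hstack([X, np.ones((len(X), 1))])  # append a bias column
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        for xi, yi in zip(Xb, y):
            pred = 1.0 if xi @ w > 0 else 0.0
            w += lr * (yi - pred) * xi
    return w

def predict_perceptron(w, X):
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return (Xb @ w > 0).astype(float)

def fit_chain(X, Y):
    # Train one classifier per label; each later classifier also sees the
    # earlier labels (ground truth during training) as extra features.
    models, X_aug = [], X.astype(float)
    for j in range(Y.shape[1]):
        models.append(train_perceptron(X_aug, Y[:, j]))
        X_aug = np.hstack([X_aug, Y[:, j:j + 1].astype(float)])
    return models

def predict_chain(models, X):
    # At test time, feed each classifier's *prediction* to the next link.
    preds, X_aug = [], X.astype(float)
    for w in models:
        p = predict_perceptron(w, X_aug)
        preds.append(p)
        X_aug = np.hstack([X_aug, p[:, None]])
    return np.stack(preds, axis=1)

# Toy data: the second label copies the first, so the chain's second link
# can simply learn to read the injected label feature.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
Y = np.array([[0, 0], [1, 1], [1, 1], [1, 1]])
models = fit_chain(X, Y)
```

The fixed label ordering baked into the chain is exactly what the next paper (on recurrent networks for subset accuracy) revisits.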

Maximizing Subset Accuracy with Recurrent Neural Networks in Multi-label Classification

This paper replaces classifier chains with recurrent neural networks, sequence-to-sequence prediction models that have recently been applied successfully to sequential prediction tasks in many domains; it compares different ways of ordering the label set and gives recommendations on suitable ordering strategies.

ML-KNN: A lazy learning approach to multi-label learning
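ML-KNN is a lazy, nearest-neighbour approach to multi-label learning: each label is decided from the label statistics of a query's k nearest neighbours. A deliberately simplified sketch (a plain majority vote over neighbours; ML-KNN proper replaces this vote with a MAP rule over Bayesian-smoothed neighbour counts, which is omitted here, and all data is illustrative):

```python
import numpy as np

def knn_label_vote(X_train, Y_train, x, k=3):
    # Assign a label when it appears on more than half of the k nearest
    # neighbours. ML-KNN proper uses a MAP rule over Bayesian-smoothed
    # neighbour-count statistics instead of this plain vote.
    dists = np.linalg.norm(X_train - x, axis=1)
    nearest = np.argsort(dists)[:k]
    return (Y_train[nearest].mean(axis=0) > 0.5).astype(int)

# Illustrative toy data: two features, two labels.
X_train = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
Y_train = np.array([[1, 0], [1, 0], [0, 1], [0, 1]])
print(knn_label_vote(X_train, Y_train, np.array([0.9, 0.9])))  # -> [0 1]
```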

Multi-Label Image Recognition With Graph Convolutional Networks

This work proposes a multi-label classification model based on Graph Convolutional Network (GCN), and proposes a novel re-weighted scheme to create an effective label correlation matrix to guide information propagation among the nodes in GCN.
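The re-weighting scheme summarized above can be sketched as: estimate conditional co-occurrence probabilities, binarize them with a threshold to drop noisy rare pairings, then re-weight each row so a node keeps most of its own signal. The threshold `tau`, factor `p`, and toy label matrix below are illustrative assumptions, not the paper's values:

```python
import numpy as np

# Hypothetical training-set label matrix (illustrative values).
Y = np.array([
    [1, 0, 1],
    [1, 0, 0],
    [0, 1, 0],
    [1, 0, 1],
    [0, 1, 1],
])

M = Y.T @ Y                       # label co-occurrence counts
N = np.diag(M).astype(float)      # per-label occurrence counts
P = M / N[:, None]                # conditional probabilities P(L_j | L_i)

tau, p = 0.4, 0.2                 # illustrative threshold / re-weight factor
A = (P >= tau).astype(float)      # binarize to drop noisy, rare pairings
np.fill_diagonal(A, 0.0)

# Re-weight each row's neighbours to a total mass of p, keeping 1 - p of
# the mass on the node itself to limit over-smoothing.
row_sums = A.sum(axis=1, keepdims=True)
A_rw = np.divide(p * A, row_sums, out=np.zeros_like(A), where=row_sums > 0)
np.fill_diagonal(A_rw, 1 - p)

# One graph-convolution propagation step over label embeddings H: ReLU(A H W).
H = np.random.default_rng(0).normal(size=(A.shape[0], 4))
W = np.eye(4)                     # identity weights, just to show propagation
H_next = np.maximum(A_rw @ H @ W, 0.0)
```

Each row of the re-weighted matrix sums to one, so propagation mixes a fixed fraction of neighbour information into every label embedding.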

HOT-VAE: Learning High-Order Label Correlation for Multi-Label Classification via Attention-Based Variational Autoencoders

A novel framework for multi-label classification, the High-order Tie-in Variational Autoencoder (HOT-VAE), performs adaptive high-order label correlation learning and outperforms existing state-of-the-art approaches on a bird-distribution dataset.

Multilabel Classification With Group-Based Mapping: A Framework With Local Feature Selection and Local Label Correlation

This article proposes a novel framework with local feature selection and local label correlation, where instances can be clustered into different groups, and the feature selection weights and label correlations can only be shared by instances in the same group.

An extensive experimental comparison of methods for multi-label learning

Binary relevance efficacy for multilabel classification

Some interesting properties of binary relevance (BR) are discussed, chiefly that it produces optimal models for several multi-label loss functions, and the use of synthetic datasets is proposed to better analyze the behavior of multi-label methods in domains with different characteristics.
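Binary relevance itself is simple to sketch: train one independent binary classifier per label column and ignore label dependencies entirely. A minimal illustration with a toy nearest-centroid base learner (the data and function names are illustrative assumptions):

```python
import numpy as np

def fit_centroids(X, y):
    # Toy base learner: nearest-centroid binary classifier (illustrative).
    return X[y == 1].mean(axis=0), X[y == 0].mean(axis=0)

def predict_centroids(model, X):
    pos, neg = model
    return (np.linalg.norm(X - pos, axis=1)
            < np.linalg.norm(X - neg, axis=1)).astype(int)

def fit_binary_relevance(X, Y):
    # One independent binary problem per label; no label dependencies modelled.
    return [fit_centroids(X, Y[:, j]) for j in range(Y.shape[1])]

def predict_binary_relevance(models, X):
    return np.stack([predict_centroids(m, X) for m in models], axis=1)

# Toy data: label 0 depends on the first feature, label 1 on the second.
X = np.array([[0., 0.], [0., 1.], [2., 0.], [2., 1.]])
Y = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
models = fit_binary_relevance(X, Y)
```

Ignoring dependencies is what makes BR cheap and, per the paper above, loss-optimal in several cases; it is also exactly the limitation that the chaining and message-passing approaches in this list address.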

HARAM: A Hierarchical ARAM Neural Network for Large-Scale Text Classification

This work presents a scalable extension of the fuzzy Adaptive Resonance Associative Map (ARAM) neural network that increases classification speed by adding an extra ART layer to cluster learned prototypes into large clusters, significantly reducing classification time.