Multi-Relation Message Passing for Multi-Label Text Classification

@article{Ozmen2022MultiRelationMP,
  title={Multi-Relation Message Passing for Multi-Label Text Classification},
  author={Muberra Ozmen and Hao Zhang and Pengyun Wang and Mark J. Coates},
  journal={ICASSP 2022 - 2022 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)},
  year={2022},
  pages={3583--3587}
}
  • Published 10 February 2022
A well-known challenge associated with the multi-label classification problem is modelling dependencies between labels. Most attempts at modelling label dependencies focus on co-occurrences, ignoring the valuable information that can be extracted by detecting label subsets that rarely occur together. For example, consider customer product reviews; a product probably would not simultaneously be tagged by both "recommended" (i.e., reviewer is happy and recommends the product) and "urgent" (i.e… 
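The abstract's point about label subsets that rarely occur together can be made concrete with a small sketch: compare each label pair's observed co-occurrence count against the count expected if the labels were independent. This is a generic illustration of the negative-dependency signal, not the paper's message-passing method; the toy label matrix and label names ("recommended", "urgent", "positive") are hypothetical.

```python
import numpy as np

# Toy multi-label matrix: rows = samples, columns = labels.
# Hypothetical labels: 0 = "recommended", 1 = "urgent", 2 = "positive".
Y = np.array([
    [1, 0, 1],
    [1, 0, 1],
    [0, 1, 0],
    [1, 0, 0],
    [0, 1, 1],
    [1, 0, 1],
])

n = Y.shape[0]
co = Y.T @ Y                                 # co[i, j] = #samples tagged with both i and j
marginal = Y.mean(axis=0)                    # empirical P(label i)
expected = np.outer(marginal, marginal) * n  # expected co-occurrence under independence

# Ratio < 1 means a pair co-occurs less often than chance predicts;
# here "recommended" (0) and "urgent" (1) never co-occur at all.
ratio = co / np.maximum(expected, 1e-9)
print(ratio[0, 1])  # 0.0 -> strong negative-dependency signal
```

Co-occurrence-only approaches would assign this pair a near-zero (uninformative) weight, whereas the ratio makes the *absence* of co-occurrence an explicit signal, which is the kind of information the paper argues should be exploited.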

GCN-BERT and Memory Network Based Multi-Label Classification for Event Text of the Chinese Government Hotline

  • B. Liu
  • Computer Science
  • IEEE Access, 2022

Comparative experimental results show that the proposed framework outperforms all baselines, and ablation studies demonstrate the effectiveness of each module.
