Corpus ID: 203902768

Self-Paced Multi-Label Learning with Diversity

@inproceedings{Seyedi2019SelfPacedML,
  title={Self-Paced Multi-Label Learning with Diversity},
  author={Seyed Amjad Seyedi and Siamak Ghodsi and Fardin Akhlaghian and Mahdi Jalili and Parham Moradi},
  booktitle={ACML},
  year={2019}
}
The major challenge of learning from multi-label data arises from the overwhelming size of the label space, which makes the problem NP-hard. This difficulty can be alleviated by gradually involving easy-to-hard labels in the learning process. In addition, a diversity-maintenance approach prevents overfitting to a subset of easy labels. In this paper, we propose self-paced multi-label learning with diversity (SPMLD), which aims to cover diverse labels with respect to its learning pace…
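The abstract describes two ingredients: a self-paced regime that admits labels from easy to hard as a pace parameter grows, and a diversity term that spreads the selection across label groups. The following is a minimal, hypothetical sketch of that idea, not the paper's actual SPMLD objective; it uses the rank-based closed-form selection rule from self-paced learning with diversity (Jiang et al., 2014), where within each group the admission threshold shrinks with a label's loss rank, so no single group can dominate the selection. All function and variable names here are illustrative.

```python
import math

def spl_select(losses, lam):
    """Plain hard self-paced weights: admit a label iff its loss < lam.
    As lam grows across iterations, harder labels enter the training set."""
    return [1 if loss < lam else 0 for loss in losses]

def spld_select(losses, groups, lam, gamma):
    """Self-paced selection with a diversity term (illustrative sketch).

    losses : per-label training losses
    groups : group id for each label (e.g. from clustering correlated labels)
    lam    : pace parameter (larger -> more labels admitted)
    gamma  : diversity weight (larger -> more groups represented)

    Within each group, labels are ranked by loss; the admission threshold
    lam + gamma / (sqrt(r) + sqrt(r - 1)) decreases with rank r, so picking
    many labels from the same group becomes progressively harder.
    """
    weights = [0] * len(losses)
    for g in set(groups):
        # labels of this group, easiest (lowest loss) first
        idx = sorted((i for i in range(len(losses)) if groups[i] == g),
                     key=lambda i: losses[i])
        for rank, i in enumerate(idx):
            # rank is 0-based here, so sqrt(rank+1)+sqrt(rank) = sqrt(r)+sqrt(r-1)
            thr = lam + gamma / (math.sqrt(rank + 1) + math.sqrt(rank))
            if losses[i] < thr:
                weights[i] = 1
    return weights

# Two label groups; in each group only the easy label clears its threshold,
# so the selection stays diverse instead of exhausting one group first.
print(spld_select([0.1, 0.5, 0.9, 0.2], [0, 0, 1, 1], lam=0.3, gamma=0.3))
```

In a full alternating scheme, these binary weights would gate the per-label losses when updating the model parameters, after which `lam` is increased and the selection recomputed.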

