Gradient-based Label Binning in Multi-label Classification

@article{Rapp2021GradientbasedLB,
  title={Gradient-based Label Binning in Multi-label Classification},
  author={Michael Rapp and Eneldo Loza Menc{\'i}a and Johannes F{\"u}rnkranz and Eyke H{\"u}llermeier},
  journal={arXiv preprint},
  year={2021},
  volume={abs/2106.11690}
}
In multi-label classification, where a single example may be associated with several class labels at the same time, the ability to model dependencies between labels is considered crucial to effectively optimize non-decomposable evaluation measures, such as the Subset 0/1 loss. The gradient boosting framework provides a well-studied foundation for learning models that are specifically tailored to such a loss function and recent research attests the ability to achieve high predictive accuracy in…
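The Subset 0/1 loss referred to in the abstract is the fraction of examples whose predicted label vector differs from the true label vector in at least one position, which makes it non-decomposable over individual labels. A minimal sketch (the function name is ours, not from the paper):

```python
import numpy as np

def subset_01_loss(y_true, y_pred):
    """Subset 0/1 loss: fraction of examples whose predicted label
    vector differs from the true label vector in at least one label."""
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    # An example counts as wrong if ANY of its labels is mispredicted.
    return float(np.mean(np.any(y_true != y_pred, axis=1)))

# Two of the three predicted label vectors differ from the ground truth,
# so the loss is 2/3.
y_true = [[1, 0, 1], [0, 1, 0], [1, 1, 0]]
y_pred = [[1, 0, 1], [0, 0, 0], [0, 1, 0]]
print(subset_01_loss(y_true, y_pred))  # → 0.666...
```

Note that Hamming loss, by contrast, would average over individual label errors (2 wrong labels out of 9), which is why the two measures call for different modeling strategies.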


References

Showing 1–10 of 28 references
Learning Gradient Boosted Multi-label Classification Rules
This work develops a generalization of the gradient boosting framework to multi-output problems and proposes an algorithm for learning multi-label classification rules that is able to minimize decomposable as well as non-decomposable loss functions.
On label dependence and loss minimization in multi-label classification
It is claimed that two types of label dependence should be distinguished, namely conditional and marginal dependence, and three scenarios in which the exploitation of one of these types of dependence may boost the predictive performance of a classifier are presented.
Classifier Chains for Multi-label Classification
Empirical evaluation over a broad range of multi-label datasets with a variety of evaluation metrics demonstrates the competitiveness of the chaining method against related and state-of-the-art methods, both in terms of predictive performance and time complexity.
Compressed labeling on distilled labelsets for multi-label learning
Theoretically, the recovery bounds of CL are proved, verifying the effectiveness of CL for label compression and the multi-label classification performance improvement brought by the label correlations preserved in DLs.
Group Preserving Label Embedding for Multi-Label Classification
This work studies the embedding of labels together with group information, with the objective of building an efficient multi-label classifier, and compares the proposed method against state-of-the-art algorithms for multi-label learning.
Sparse Local Embeddings for Extreme Multi-label Classification
The SLEEC classifier is developed for learning a small ensemble of local distance-preserving embeddings which can accurately predict infrequently occurring (tail) labels and can make significantly more accurate predictions than state-of-the-art methods, including both embedding-based as well as tree-based methods.
Cost-sensitive label embedding for multi-label classification
The proposed algorithm, cost-sensitive label embedding with multidimensional scaling (CLEMS), approximates the cost information with the distances of the embedded vectors by using the classic multidimensional scaling approach for manifold learning.
Gradient Boosted Decision Trees for High Dimensional Sparse Output
This paper studies gradient boosted decision trees (GBDT) when the output space is high dimensional and sparse, and proposes a new GBDT variant, GBDT-SPARSE, to resolve this problem by employing L0 regularization.
A Review on Multi-Label Learning Algorithms
This paper aims to provide a timely review on this area with emphasis on state-of-the-art multi-label learning algorithms, with relevant analyses and discussions.
Online Boosting Algorithms for Multi-label Ranking
This work designs online boosting algorithms with provable loss bounds for multi-label ranking, and designs an adaptive algorithm that does not require knowledge of the edge of the weak learners and is hence more practical.