Label Distribution Learning

  • Xin Geng
  • Published 7 December 2013
  • Computer Science
  • IEEE Transactions on Knowledge and Data Engineering
Although multi-label learning can deal with many problems involving label ambiguity, it does not fit well those real applications in which the overall distribution of label importance matters. This paper proposes a novel learning paradigm named label distribution learning (LDL) for such applications. The label distribution covers a certain number of labels, representing the degree to which each label describes the instance. LDL is a more general learning framework which includes both… 
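As an illustrative sketch (not code from the paper), a label distribution can be modeled as a vector of description degrees that are non-negative and sum to one, and a predicted distribution can be compared against the true one with standard measures such as Chebyshev distance or KL divergence; all function names below are hypothetical:

```python
import math

def normalize(scores):
    """Convert non-negative relevance scores into a label distribution
    (description degrees that sum to 1)."""
    total = sum(scores)
    return [s / total for s in scores]

def chebyshev(p, q):
    """Chebyshev distance between two label distributions (lower is better)."""
    return max(abs(pi - qi) for pi, qi in zip(p, q))

def kl_divergence(p, q):
    """KL divergence from true distribution p to predicted q (lower is better)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Example: three labels describing one instance with different importance.
true_dist = normalize([0.5, 0.3, 0.2])
pred_dist = normalize([0.4, 0.4, 0.2])
```

Single-label and multi-label learning both fall out as special cases: a single-label example puts all mass on one label, and a multi-label example spreads mass uniformly over its relevant labels.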


Label Enhancement for Label Distribution Learning

This paper proposes a novel LE algorithm called Graph Laplacian Label Enhancement (GLLE), which shows clear advantages over several existing LE algorithms; experimental results on eleven multi-label learning datasets further validate the advantage of GLLE over state-of-the-art multi-label learning approaches.

Label Distribution Learning by Regularized Sample Self-Representation

A regularized sample self-representation (RSSR) approach for LDL is proposed, whereby each label distribution is represented as a linear combination of its relevant features; the approach effectively identifies the predictive label distribution and exhibits good performance under distance and similarity evaluations.
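A minimal sketch of the underlying idea, not the RSSR algorithm itself: score each label as a linear combination of the instance's features, then normalize the scores into a valid label distribution (here with a softmax; the weight matrix `W` is purely hypothetical):

```python
import math

def softmax(z):
    """Numerically stable softmax: turns raw scores into a distribution."""
    m = max(z)
    e = [math.exp(v - m) for v in z]
    s = sum(e)
    return [v / s for v in e]

def predict_distribution(x, W):
    """Linear per-label scores from features, normalized into a label
    distribution (a simplified stand-in for the linear-representation idea)."""
    scores = [sum(wi * xi for wi, xi in zip(row, x)) for row in W]
    return softmax(scores)

# Hypothetical instance with 2 features, predicting over 3 labels.
W = [[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]]
d = predict_distribution([2.0, 1.0], W)
```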

Label distribution learning with label-specific features

This paper proposes a novel LDL algorithm by leveraging label-specific features, where the common features for all labels and specific features for each label are simultaneously learned to enhance the LDL model.

Filling Missing Labels in Label Distribution Learning by Exploiting Label-Specific Feature Selection

This work proposes an incomplete label distribution learning method that fills missing labels by exploiting label-specific feature selection, using sparse learning to obtain the specific features of each class label and the common features shared by all class labels.

Label Distribution Learning with Label Correlations on Local Samples

Two novel label distribution learning algorithms are proposed that exploit label correlations on local samples: the influence of local samples is modeled, and a local correlation vector, built from the clustered local samples, is designed as an additional feature for each instance.

Multi-Label Learning with Weak Label

The WELL (WEak Label Learning) method is proposed, which exploits the correlation between labels by assuming that there is a group of low-rank base similarities, and the appropriate similarities between instances for different labels can be derived from these base similarities.

Leveraging Implicit Relative Labeling-Importance Information for Effective Multi-label Learning

It is shown that effective multi-label learning can be achieved by leveraging implicit relative labeling-importance (RLI) information, which is formalized as a multinomial distribution over the label space and estimated by adapting an iterative label propagation procedure.

Scalable and efficient multi-label classification for evolving data streams

This paper proposes a new experimental framework for learning from and evaluating on multi-label data streams, uses it to study the performance of various methods, and develops a multi-label Hoeffding tree with multi-label classifiers at the leaves.

Classifier chains for multi-label classification

This paper presents a novel classifier chains method that can model label correlations while maintaining acceptable computational complexity, and illustrates the competitiveness of the chaining method against related and state-of-the-art methods, both in terms of predictive performance and time complexity.
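The chaining idea can be sketched in a few lines (a toy illustration, not the paper's implementation): binary classifiers are arranged in a chain, and each classifier receives the original features plus the predictions of all earlier labels, so label correlations can flow down the chain. The `make_stump` helper is a hypothetical stand-in for any trained binary classifier:

```python
def make_stump(weight_vec, threshold=0.0):
    """A toy linear 'classifier': predicts 1 if w.x exceeds the threshold."""
    def clf(x):
        return 1 if sum(w * xi for w, xi in zip(weight_vec, x)) > threshold else 0
    return clf

def chain_predict(x, chain):
    """Classifier chain: the j-th classifier sees the original features
    plus the predictions made for all earlier labels in the chain."""
    preds = []
    for clf in chain:
        # x + preds: the feature vector is augmented with prior predictions.
        preds.append(clf(x + preds))
    return preds

# Hypothetical 2-label chain: the second label depends only on the first
# label's prediction, illustrating how correlations propagate.
chain = [make_stump([1.0]), make_stump([0.0, 1.0])]
```

Prediction cost stays linear in the number of labels, which is why chaining keeps acceptable computational complexity compared with modeling all label subsets jointly.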

Multi-label ensemble based on variable pairwise constraint projection

Multi-Label Learning by Instance Differentiation

Applications to automatic web page categorization, natural scene classification, and gene functional analysis show that the proposed multi-label learning approach, INS-DIF, outperforms several well-established multi-label learning algorithms.

Label-Embedding for Attribute-Based Classification

This work proposes to view attribute-based image classification as a label-embedding problem: each class is embedded in the space of attribute vectors, and introduces a function which measures the compatibility between an image and a label embedding.
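A minimal sketch of such a bilinear compatibility function (the weight matrix, class names, and attribute vectors below are invented for illustration): a class is represented by its attribute embedding, and classification picks the class whose embedding is most compatible with the image features:

```python
def compatibility(x, W, attr):
    """Bilinear compatibility F(x, y) = x^T W phi(y) between image
    features x and a class's attribute embedding attr."""
    Wphi = [sum(W[i][j] * attr[j] for j in range(len(attr)))
            for i in range(len(x))]
    return sum(xi * v for xi, v in zip(x, Wphi))

def classify(x, W, class_attrs):
    """Predict the class whose attribute embedding is most compatible."""
    return max(class_attrs, key=lambda c: compatibility(x, W, class_attrs[c]))

# Hypothetical setup: 2 image features, 2 classes embedded by 2 attributes.
W = [[1.0, 0.0], [0.0, 1.0]]
class_attrs = {"zebra": [1.0, 0.0], "horse": [0.0, 1.0]}
```

Because classes are represented by attribute vectors rather than one-hot indicators, unseen classes can be scored as long as their attributes are known, which is what makes the label-embedding view attractive for zero-shot settings.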

Correlated Label Propagation with Application to Multi-label Learning

A novel framework for multi-label learning termed Correlated Label Propagation (CLP) that explicitly models interactions between labels in an efficient manner and leads to significant gains in precision/recall against standard techniques on two real-world computer vision tasks involving several hundred labels.
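The propagation step behind this family of methods can be sketched as follows (a generic single-label score propagation, not CLP's correlated formulation): each node on a similarity graph repeatedly mixes its seed score with the similarity-weighted average of its neighbors' scores, so labels spread from labeled to unlabeled instances:

```python
def propagate(sim, seed_scores, alpha=0.5, iters=20):
    """Iterative label-score propagation over a similarity graph.
    alpha balances the clamped seed score against neighbor influence."""
    n = len(seed_scores)
    scores = list(seed_scores)
    for _ in range(iters):
        new = []
        for i in range(n):
            total = sum(sim[i][j] for j in range(n) if j != i)
            if total == 0:
                new.append(seed_scores[i])  # isolated node keeps its seed
                continue
            neigh = sum(sim[i][j] * scores[j]
                        for j in range(n) if j != i) / total
            new.append(alpha * seed_scores[i] + (1 - alpha) * neigh)
        scores = new
    return scores

# Hypothetical 3-node chain graph; only node 0 is labeled positive.
sim = [[0, 1, 0], [1, 0, 1], [0, 1, 0]]
scores = propagate(sim, [1.0, 0.0, 0.0])
```

After propagation the node adjacent to the labeled instance receives a higher score than the node two hops away, which is the qualitative behavior propagation-based multi-label methods rely on.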

Learning Classification with Auxiliary Probabilistic Information

This work studies and develops a new framework for the classification learning problem in which, in addition to class labels, the learner is provided with auxiliary information reflecting how strongly the expert feels about each class label.

Multilabel classification via calibrated label ranking

This work proposes an extension of label ranking that incorporates the calibrated scenario and substantially extends the expressive power of existing approaches, suggesting a conceptually novel technique for extending the common learning-by-pairwise-comparison approach to the multilabel scenario, a setting previously not amenable to the pairwise decomposition technique.