Combining instance-based learning and logistic regression for multilabel classification

@article{Cheng2009CombiningIL,
  title={Combining instance-based learning and logistic regression for multilabel classification},
  author={Weiwei Cheng and Eyke H{\"u}llermeier},
  journal={Machine Learning},
  year={2009},
  volume={76},
  pages={211-225}
}
Multilabel classification is an extension of conventional classification in which a single instance can be associated with multiple labels. Recent research has shown that, just like for conventional classification, instance-based learning algorithms relying on the nearest neighbor estimation principle can be used quite successfully in this context. However, since hitherto existing algorithms do not take correlations and interdependencies between labels into account, their potential has not yet…
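Although the abstract above is truncated, its core idea (combining nearest-neighbor information with logistic regression so that dependencies between labels can be captured) lends itself to a brief illustration. The sketch below is only a simplified approximation in the spirit of the approach, referred to elsewhere on this page as IBLR-ML (Instance-based Logistic Regression for Multi-label Classification), and is not the authors' exact formulation: the class name, the use of scikit-learn estimators, and the choice of raw neighbor label counts as the only features are assumptions made for illustration.

# Minimal sketch, not the authors' exact IBLR-ML formulation: for each label,
# the counts of all labels among an instance's k nearest neighbors are used as
# the features of a per-label logistic regression model, which is one simple way
# for interdependencies between labels to enter the prediction.
import numpy as np
from sklearn.neighbors import NearestNeighbors
from sklearn.linear_model import LogisticRegression

class KnnLogisticMultilabel:
    """Hypothetical illustration: k-NN label counts fed into per-label logistic regression."""

    def __init__(self, k=10):
        self.k = k

    def fit(self, X, Y):
        # X: (n_samples, n_features); Y: binary label-indicator matrix (n_samples, n_labels).
        X, Y = np.asarray(X), np.asarray(Y)
        self.nn_ = NearestNeighbors(n_neighbors=self.k).fit(X)
        self.Y_ = Y
        counts = self._neighbor_label_counts(X, training=True)
        # One logistic regression per label, each trained on the neighbor counts of
        # all labels. Assumes every label has both positive and negative training
        # examples; otherwise LogisticRegression raises an error.
        self.models_ = [LogisticRegression(max_iter=1000).fit(counts, Y[:, j])
                        for j in range(Y.shape[1])]
        return self

    def predict(self, X):
        counts = self._neighbor_label_counts(np.asarray(X), training=False)
        return np.column_stack([m.predict(counts) for m in self.models_])

    def _neighbor_label_counts(self, X, training):
        # For training instances, query one extra neighbor and drop the instance itself.
        n_neighbors = self.k + 1 if training else self.k
        _, idx = self.nn_.kneighbors(X, n_neighbors=n_neighbors)
        if training:
            idx = idx[:, 1:]
        return self.Y_[idx].sum(axis=1)  # (n_samples, n_labels) neighbor label counts

Calling KnnLogisticMultilabel(k=10).fit(X_train, Y_train).predict(X_test) yields one binary column per label; because each per-label model sees the neighbor counts of all labels, correlations between labels can influence its prediction, which is the property the abstract emphasizes over earlier purely nearest-neighbor methods.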
Citations

A Dependent Multilabel Classification Method Derived from the k-Nearest Neighbor Rule
This paper describes an original method for multilabel classification problems derived from a Bayesian version of the k-nearest neighbor (k-NN) rule, which takes into account the dependencies between labels.
Multilevel Classification Exploiting Coupled Label Similarity with Feature Selection
The experiments indicate that the proposed CML-kNN with feature selection achieves superior performance compared to the existing CML-kNN method.
Multilabel classification with meta-level features
Control experiments show strong empirical evidence for the strength of the proposed approach, as it significantly outperformed several state-of-the-art methods, including Rank-SVM, ML-kNN and IBLR-ML (Instance-based Logistic Regression for Multi-label Classification), in most cases.
A probabilistic methodology for multilabel classification
A generic methodology that can improve the results obtained by a set of independent probabilistic binary classifiers by using a combination procedure with a classifier trained on the co-occurrences of the labels.
Graded Multilabel Classification by Pairwise Comparisons
This paper proposes to reformulate the problem of multilabel classification in terms of preferences between the labels and their scales, which can then be tackled by learning from pairwise comparisons, and shows that its solution outperforms baseline approaches.
LI-MLC: A Label Inference Methodology for Addressing High Dimensionality in the Label Space for Multilabel Classification
The purpose of this paper is to analyze dimensionality in the label space of multilabel datasets (MLDs) and to present a transformation methodology based on the use of association rules to discover label dependencies, resulting in a statistically significant improvement of performance in some cases.
Efficient pairwise multilabel classification
This thesis presents a framework of efficient and scalable solutions for handling hundreds or thousands of labels despite the quadratic dependency, and focuses particularly on the pairwise decomposition of the original problem, in which a decision function is learned for each possible pair of classes.
An Incremental Decision Tree for Mining Multilabel Data
An incremental decision tree is proposed that reduces learning time by dividing the training data and adopts a k-NN classifier at the leaves to improve classification accuracy; the authors conclude that the algorithm can efficiently learn from multilabel data while maintaining good performance on example-based evaluation metrics.
ABC-based stacking method for multilabel classification
A novel stacking-based ensemble algorithm for multilabel classification, ABC-based stacking, is presented, which uses the artificial bee colony algorithm along with a single-layer artificial neural network to find suitable meta-level classifier configurations.
An efficient probabilistic framework for multi-dimensional classification
This paper proposes a new probabilistic approach that represents class conditional dependencies in an effective yet computationally efficient way, using a special tree-structured Bayesian network model to represent the conditional joint distribution of the class variables given the feature variables.

References

Showing 1-10 of 34 references.
ML-KNN: A lazy learning approach to multi-label learning
Experiments on three different real-world multi-label learning problems, i.e. yeast gene functional analysis, natural scene classification and automatic web page categorization, show that ML-KNN achieves superior performance to some well-established multi-label learning algorithms.
Multilabel Neural Networks with Applications to Functional Genomics and Text Categorization
Applications to two real-world multilabel learning problems, i.e., functional genomics and text categorization, show that the performance of BP-MLL is superior to that of some well-established multilabel learning algorithms.
Collective multi-label classification
Experiments show that the models outperform their single-label counterparts on standard text corpora and improve subset classification error by as much as 40% when multi-labels are sparse.
Discriminative Methods for Multi-labeled Classification
A new technique is presented for combining text features with features indicating relationships between classes, which can be used with any discriminative algorithm; it beats the accuracy of existing methods with statistically significant improvements.
Multi-Instance Multi-Label Learning with Application to Scene Classification
This paper formalizes multi-instance multi-label learning, where each training example is associated with not only multiple instances but also multiple class labels, and proposes the MIMLBOOST and MIMLSVM algorithms, which achieve good performance in an application to scene classification.
Decision trees for hierarchical multi-label classification
HMC trees outperform HSC and SC trees along three dimensions: predictive accuracy, model size, and induction time; it is concluded that HMC trees should definitely be considered in HMC tasks where interpretable models are desired.
Parametric Mixture Models for Multi-Labeled Text
It is shown that the proposed probabilistic generative models, called parametric mixture models (PMMs), could significantly outperform the conventional binary methods when applied to multi-labeled text categorization using real World Wide Web pages.
Instance-Based Learning Algorithms
This paper extends the nearest neighbor algorithm, which has large storage requirements, and describes how those storage requirements can be significantly reduced with, at most, minor sacrifices in learning rate and classification accuracy.
Learning multi-label scene classification
A framework is presented for handling semantic scene classification, where a natural scene may contain multiple objects such that the scene can be described by multiple class labels; the approach appears to generalize to other classification problems of the same nature.
A kernel method for multi-labelled classification
This article presents a Support Vector Machine-like learning system to handle multi-label problems, based on a large margin ranking system that shares many common properties with SVMs.