Cost-Sensitive Learning of Deep Feature Representations From Imbalanced Data

@article{Khan2015CostSensitiveLO,
  title={Cost-Sensitive Learning of Deep Feature Representations From Imbalanced Data},
  author={Salman Hameed Khan and Munawar Hayat and Mohammed Bennamoun and Ferdous Sohel and Roberto B. Togneri},
  journal={IEEE Transactions on Neural Networks and Learning Systems},
  year={2018},
  volume={29},
  number={8},
  pages={3573--3587}
}
Class imbalance is a common problem in real-world object detection and classification tasks. During training, our learning procedure jointly optimizes the class-dependent costs and the neural network parameters. The proposed approach is applicable to both binary and multiclass problems without any modification. Moreover, as opposed to data-level approaches, we do not alter the original data distribution, which results in a lower computational cost during training. We…
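The core idea above can be illustrated with a minimal sketch: scale each sample's loss by a cost attached to its true class, so errors on rare classes contribute more to the gradient. The function name and toy numbers below are illustrative only; the paper's actual method learns the costs jointly with the network rather than fixing them.

```python
import numpy as np

def cost_sensitive_cross_entropy(probs, labels, class_costs):
    """Weighted cross-entropy: each sample's log-loss is scaled by the
    cost assigned to its true class, so mistakes on high-cost (rare)
    classes contribute more to the total loss."""
    eps = 1e-12
    sample_losses = -np.log(probs[np.arange(len(labels)), labels] + eps)
    return float(np.mean(class_costs[labels] * sample_losses))

# Toy example: class 1 is the rare class, so it carries a higher cost.
probs = np.array([[0.9, 0.1],
                  [0.2, 0.8],
                  [0.6, 0.4]])
labels = np.array([0, 1, 1])
uniform = cost_sensitive_cross_entropy(probs, labels, np.array([1.0, 1.0]))
weighted = cost_sensitive_cross_entropy(probs, labels, np.array([1.0, 3.0]))
assert weighted > uniform  # errors on the rare class are penalized more
```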

Learning Deep Representation for Imbalanced Classification

The representation learned by this approach, when combined with a simple k-nearest neighbor (kNN) algorithm, shows significant improvements over existing methods on both high- and low-level vision classification tasks that exhibit imbalanced class distribution.
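The evaluation protocol described above (a learned embedding followed by a plain kNN vote) can be sketched as follows; the 2-D "embedding" here is a toy stand-in for the deep features the paper actually learns.

```python
import numpy as np

def knn_predict(train_feats, train_labels, query, k=3):
    """Classify a query by majority vote among its k nearest training
    features under Euclidean distance."""
    d = np.linalg.norm(train_feats - query, axis=1)
    nearest = train_labels[np.argsort(d)[:k]]
    return int(np.bincount(nearest).argmax())

# Toy 2-D "embedding": two well-separated clusters.
feats = np.array([[0.0, 0.0], [0.1, 0.2], [0.2, 0.1],
                  [5.0, 5.0], [5.1, 4.9]])
labels = np.array([0, 0, 0, 1, 1])
assert knn_predict(feats, labels, np.array([0.05, 0.05])) == 0
assert knn_predict(feats, labels, np.array([5.05, 5.0])) == 1
```

The point of the protocol is that all discriminative power must come from the representation itself, since kNN adds no learned parameters.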

Generative Adversarial Networks for Improving Imbalanced Classification Performance

This work studies the detrimental effects of class imbalance on the classification performance of a hybrid CNN-SVM architecture, and proposes the use of a modified Generative Adversarial Network (GAN) architecture, Wasserstein GAN with Gradient Penalty (WGAN-GP), to generate new data samples.

Deep Imbalanced Learning for Face Recognition and Attribute Prediction

Cluster-based Large Margin Local Embedding (CLMLE), when combined with a simple k-nearest cluster algorithm, shows significant improvements in accuracy over existing methods on both face recognition and face attribute prediction tasks that exhibit imbalanced class distribution.

Review of Methods for Handling Class-Imbalanced in Classification Problems

The article examines the most widely used methods for addressing the class imbalance problem, including data-level, algorithm-level, hybrid, cost-sensitive, and deep learning approaches, along with their advantages and limitations.

Imbalanced Classification via Feature Dictionary-based Minority Oversampling

This work proposes the feature dictionary-based generative model for the oversampling method, and demonstrates that the proposed model achieved the highest top-1 performance on various public fashion datasets.

A Composite Cost-Sensitive Neural Network for Imbalanced Classification

  • Lei Chen, Yuan Zhu
  • Computer Science
    2020 39th Chinese Control Conference (CCC)
  • 2020
A specifically designed cost-sensitive matrix, composed of example-dependent costs and class-dependent costs, is embedded into the loss function to improve classification performance; experiments indicate that the CCS-DNN performs better than other baseline methods.
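A classic way such a class-dependent cost matrix changes behavior is at decision time: instead of predicting the argmax-probability class, predict the class with the lowest expected misclassification cost. This is a generic cost-sensitive decision rule, not the CCS-DNN's specific loss formulation; the matrix values are illustrative.

```python
import numpy as np

# Cost matrix C[true, pred]: zero cost on the diagonal, and
# misclassifying the rare class 1 costs four times as much.
C = np.array([[0.0, 1.0],
              [4.0, 0.0]])

def min_expected_cost_predict(probs, cost_matrix):
    """Pick the class minimizing expected misclassification cost under
    the model posterior, instead of the argmax probability."""
    expected = probs @ cost_matrix   # expected[j] = sum_i p(i) * C[i, j]
    return int(np.argmin(expected))

probs = np.array([0.7, 0.3])
# Plain argmax would predict class 0, but the expected costs are
# [0.3 * 4, 0.7 * 1] = [1.2, 0.7], so the rule predicts class 1.
assert min_expected_cost_predict(probs, C) == 1
```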

Cost-Sensitive Convolution based Neural Networks for Imbalanced Time-Series Classification

An adaptive cost-sensitive learning strategy is proposed that modifies temporal deep learning models, converting deep learning classifiers from cost-insensitive to cost-sensitive to address imbalanced time-series classification (ITSC).

Class-Imbalanced Deep Learning via a Class-Balanced Ensemble

A new loss function is designed that can rectify the bias toward the majority classes by forcing the CNN’s hidden layers and its associated auxiliary classifiers to focus on the samples that have been misclassified by previous layers, thus enabling subsequent layers to develop diverse behavior and fix the errors of previous layers in a batch-wise manner.
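The "focus on samples misclassified by previous layers" idea resembles boosting-style sample reweighting; a minimal sketch of that generic mechanism (not the paper's batch-wise loss itself, and with an illustrative boost factor) is:

```python
import numpy as np

def reweight_misclassified(weights, correct, boost=2.0):
    """Up-weight samples the previous stage got wrong so the next
    stage of the ensemble concentrates on them, then renormalize."""
    w = np.where(correct, weights, weights * boost)
    return w / w.sum()

weights = np.full(4, 0.25)
correct = np.array([True, True, False, True])
new_w = reweight_misclassified(weights, correct)
assert new_w[2] > new_w[0]            # the misclassified sample gains weight
assert abs(new_w.sum() - 1.0) < 1e-9  # weights remain a distribution
```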

Gaussian Affinity for Max-Margin Class Imbalanced Learning

This work introduces the first hybrid loss function that jointly performs classification and clustering in a single formulation, based on an "affinity measure" in Euclidean space. This leads to two benefits: direct enforcement of maximum-margin constraints on classification boundaries, and the flexibility to learn multiple class prototypes to support diversity and discriminability in feature space.
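The affinity measure in question is a Gaussian similarity between a feature vector and a class prototype; a minimal sketch (function name and toy vectors are illustrative, and sigma is a free bandwidth parameter) is:

```python
import numpy as np

def gaussian_affinity(f, w, sigma=1.0):
    """Similarity between feature f and class prototype w in Euclidean
    space: exp(-||f - w||^2 / (2 * sigma^2)), in (0, 1]."""
    return float(np.exp(-np.sum((f - w) ** 2) / (2 * sigma ** 2)))

proto_a = np.array([0.0, 0.0])
proto_b = np.array([3.0, 3.0])
f = np.array([0.2, 0.1])
# A feature near prototype A has higher affinity to A than to B.
assert gaussian_affinity(f, proto_a) > gaussian_affinity(f, proto_b)
```

Because the affinity decays with Euclidean distance, maximizing it for the true class while suppressing it for others simultaneously clusters features around prototypes and separates class boundaries.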
...

References

Showing 1–10 of 72 references

Towards Effective Classification of Imbalanced Data with Convolutional Neural Networks

An algorithmic solution is presented that integrates different methods into a novel approach based on a class-to-class separability score, increasing the performance of cost-sensitive neural networks on poorly separable, imbalanced datasets.

Training deep neural networks on imbalanced data sets

A novel loss function called mean false error, together with its improved version, mean squared false error, is proposed for training deep networks on imbalanced data sets; experiments demonstrate the superiority of the proposed approach over conventional methods in classifying imbalanced data sets with deep neural networks.
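The intuition behind mean false error is that a plain mean over all samples lets the majority class dilute the minority class's errors, whereas averaging per class first gives each class equal weight. A minimal sketch (the paper's exact definition works on per-class squared errors; the numbers here are illustrative):

```python
import numpy as np

def mean_false_error(errors, labels, num_classes):
    """Average sample errors within each class first, then average the
    per-class means, so a minority class weighs as much as the majority."""
    per_class = [errors[labels == c].mean() for c in range(num_classes)]
    return float(np.mean(per_class))

# 9 majority samples with small errors, 1 minority sample with a large one.
errors = np.array([0.1] * 9 + [0.9])
labels = np.array([0] * 9 + [1])
plain_mean = float(errors.mean())            # 0.18: minority error diluted
mfe = mean_false_error(errors, labels, 2)    # 0.5: each class weighs 1/2
assert mfe > plain_mean
```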

Learning classifiers from imbalanced data based on biased minimax probability machine

This paper proposes a novel model named the biased minimax probability machine, which directly controls the worst-case real accuracy of classification of future data to build a biased classifier, and provides a rigorous treatment of imbalanced data.

Class-Boundary Alignment for Imbalanced Dataset Learning

The class-boundary alignment algorithm is proposed to augment SVMs to deal with imbalanced training-data problems posed by many emerging applications (e.g., image retrieval, video surveillance, and gene profiling).

Deeply-Supervised Nets

The proposed deeply-supervised nets (DSN) method simultaneously minimizes classification error while making the learning process of hidden layers direct and transparent, and extends techniques from stochastic gradient methods to analyze the algorithm.

Classification of Imbalanced Data by Combining the Complementary Neural Network and SMOTE Algorithm

The proposed method combines the Synthetic Minority Over-sampling Technique (SMOTE) and the Complementary Neural Network (CMTNN) to handle the problem of classifying imbalanced data, and shows that the combined techniques can improve performance on the class imbalance problem.
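SMOTE's generation step interpolates between a minority sample and one of its nearest minority neighbours; a minimal sketch of that step (the helper name and toy points are illustrative, and a full SMOTE implementation would also handle class selection and batching):

```python
import numpy as np

def smote_sample(minority, rng, k=2):
    """Generate one synthetic minority sample: pick a random minority
    point, then interpolate toward one of its k nearest minority
    neighbours by a random fraction in [0, 1]."""
    i = rng.integers(len(minority))
    d = np.linalg.norm(minority - minority[i], axis=1)
    neighbours = np.argsort(d)[1:k + 1]          # exclude the point itself
    j = rng.choice(neighbours)
    lam = rng.random()
    return minority[i] + lam * (minority[j] - minority[i])

rng = np.random.default_rng(0)
minority = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
s = smote_sample(minority, rng)
# The synthetic point lies on a segment between two existing minority
# points, so each coordinate stays within [0, 1] here.
assert s.shape == (2,) and s.min() >= 0.0 and s.max() <= 1.0
```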

Cost-Aware Pre-Training for Multiclass Cost-Sensitive Deep Learning

A novel cost-aware algorithm is proposed that takes into account the cost information into not only the training stage but also the pre-training stage of deep learning, which allows deep learning to conduct automatic feature extraction with the cost Information effectively.

An iterative method for multi-class cost-sensitive learning

This paper empirically evaluates the performance of the proposed method on benchmark data sets and shows that the method generally achieves better results than representative cost-sensitive learning methods, in terms of predictive performance (cost minimization) and, in many cases, computational efficiency.

CNN Features Off-the-Shelf: An Astounding Baseline for Recognition

A series of experiments conducted for different recognition tasks using the publicly available code and model of the OverFeat network which was trained to perform object classification on ILSVRC13 suggest that features obtained from deep learning with convolutional nets should be the primary candidate in most visual recognition tasks.

A study of the behavior of several methods for balancing machine learning training data

This work performs a broad experimental evaluation involving ten methods, three of them proposed by the authors, for dealing with the class imbalance problem on thirteen UCI data sets, and shows that, in general, over-sampling methods provide more accurate results than under-sampling methods considering the area under the ROC curve (AUC).
...