Corpus ID: 204800400

Decoupling Representation and Classifier for Long-Tailed Recognition

@article{Kang2020DecouplingRA,
  title={Decoupling Representation and Classifier for Long-Tailed Recognition},
  author={Bingyi Kang and Saining Xie and Marcus Rohrbach and Zhicheng Yan and Albert Gordo and Jiashi Feng and Yannis Kalantidis},
  journal={ArXiv},
  year={2020},
  volume={abs/1910.09217}
}
The long-tail distribution of the visual world poses great challenges for deep learning-based classification models in handling the class imbalance problem. Existing solutions usually involve class-balancing strategies, e.g., loss re-weighting, data re-sampling, or transfer learning from head to tail classes, but most of them adhere to the scheme of jointly learning representations and classifiers. In this work, we decouple the learning procedure into representation learning and… 
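
The decoupled scheme makes it possible to adjust the classifier after representation learning; one of the paper's second-stage options is tau-normalization of the classifier weights, w_i' = w_i / ||w_i||^tau. A minimal PyTorch sketch (the tau value and the fc-layer layout are illustrative assumptions):

import torch

def tau_normalize(classifier_weight: torch.Tensor, tau: float = 1.0) -> torch.Tensor:
    """Rescale each class's weight vector by its L2 norm raised to tau.

    classifier_weight: (num_classes, feat_dim) weights of the final linear
    layer learned jointly with the backbone. tau interpolates between the
    joint classifier (tau = 0) and fully norm-balanced boundaries (tau = 1).
    """
    norms = classifier_weight.norm(p=2, dim=1, keepdim=True)  # per-class norms
    return classifier_weight / norms.pow(tau)

# Usage sketch on a trained model with a hypothetical final layer `model.fc`:
# model.fc.weight.data = tau_normalize(model.fc.weight.data, tau=0.7)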

Distributional Robustness Loss for Long-tail Learning

TLDR
This work proposes a new loss based on robustness theory that encourages the model to learn high-quality representations for both head and tail classes, and finds that training with this loss increases recognition accuracy on tail classes while largely maintaining accuracy on head classes.

Feature generation for long-tail classification

TLDR
This paper attempts to generate meaningful features for tail classes by estimating each tail category's distribution, creating calibrated distributions from which additional features are sampled and subsequently used to train the classifier, establishing a new state of the art.
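
As a hedged sketch of the general recipe such feature-generation methods follow (the calibration rule below, borrowing a head class's covariance for a data-poor tail class, is an illustrative assumption, not necessarily the paper's exact procedure):

import numpy as np

def sample_tail_features(tail_feats, head_feats, n_new, seed=0):
    """Sample synthetic tail-class features from a calibrated Gaussian.

    tail_feats: (n_tail, d) the few real features of a tail class.
    head_feats: (n_head, d) features of a statistics-rich head class, used
    to borrow a better-conditioned covariance estimate.
    """
    rng = np.random.default_rng(seed)
    mean = tail_feats.mean(axis=0)           # the tail mean is estimable from few samples
    cov = np.cov(head_feats, rowvar=False)   # the covariance is borrowed from the head
    return rng.multivariate_normal(mean, cov, size=n_new)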

Overcoming Classifier Imbalance for Long-Tail Object Detection With Balanced Group Softmax

  • Yu Li, Tao Wang, Jiashi Feng
  • Computer Science
    2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
  • 2020
TLDR
This work provides the first systematic analysis of the underperformance of state-of-the-art detection models under long-tailed distributions and proposes a novel balanced group softmax (BAGS) module that balances the classifiers within detection frameworks through group-wise training.
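
A much-simplified sketch of group-wise training: classes are partitioned into groups by training-instance count, and each group gets its own softmax with an extra "others" slot. The layout below (one "others" logit per group, uniform loss averaging) is an illustrative assumption; the actual BAGS module adds sampling strategies on top.

import torch
import torch.nn.functional as F

def group_softmax_loss(class_logits, others_logits, targets, groups):
    """class_logits: (B, C); others_logits: (B, G), one 'others' logit per
    group; groups: list of G LongTensors partitioning the C class indices
    by training-instance count."""
    loss = class_logits.new_zeros(())
    for g, idx in enumerate(groups):
        # (B, |group| + 1) logits: the group's classes plus its 'others' slot
        logits_g = torch.cat([class_logits[:, idx], others_logits[:, g:g + 1]], dim=1)
        pos = torch.full_like(targets, idx.numel())   # default target: 'others'
        for j, c in enumerate(idx.tolist()):
            pos[targets == c] = j                     # true class falls in this group
        loss = loss + F.cross_entropy(logits_g, pos)
    return loss / len(groups)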

Self Supervision to Distillation for Long-Tailed Visual Recognition

TLDR
It is shown that soft labels can serve as a powerful solution for incorporating label correlation into a multi-stage training scheme for long-tailed recognition, together with a new distillation-label generation module guided by self-supervision.

Constructing Balance from Imbalance for Long-tailed Image Recognition

TLDR
This work proposes a concise paradigm that progressively adjusts the label space by dividing the head and tail classes, dynamically constructing balance from imbalance to facilitate classification; it boosts the performance of different types of state-of-the-art methods on widely used benchmarks.

Improving Tail-Class Representation with Centroid Contrastive Learning

TLDR
Interpolative centroid contrastive learning (ICCL) is proposed to improve long-tailed representation learning, showing a significant accuracy gain on the iNaturalist 2018 dataset, which has a real-world long-tailed distribution.

Class-Balanced Distillation for Long-Tailed Visual Recognition

TLDR
This work introduces a new training method, referred to as Class-Balanced Distillation (CBD), that leverages knowledge distillation to enhance feature representations and consistently outperforms the state of the art on long-tailed recognition benchmarks such as ImageNet-LT, iNaturalist17, and iNaturalist18.
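
CBD's actual recipe distills feature representations from first-stage teachers while the student is trained with class-balanced sampling; the sketch below shows only the generic temperature-scaled distillation term that such multi-stage schemes build on (the temperature is an assumed value).

import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """KL divergence between softened teacher and student predictions."""
    log_p_student = F.log_softmax(student_logits / temperature, dim=1)
    p_teacher = F.softmax(teacher_logits / temperature, dim=1)
    # the T^2 factor keeps gradient magnitudes comparable across temperatures
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * temperature ** 2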

Long-tailed Recognition by Learning from Latent Categories

TLDR
It is hypothesized that latent features common to the head and tail classes can be used to obtain better feature representations, and a Latent Categories based long-tail Recognition (LCReg) method is introduced that outperforms previous methods and achieves state-of-the-art results.

Improving Calibration for Long-Tailed Recognition

TLDR
Motivated by the fact that the predicted probability distributions over classes are highly related to the number of class instances, this work proposes label-aware smoothing to deal with the different degrees of over-confidence across classes and to improve classifier learning.
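
A hedged sketch of the idea: the smoothing strength applied to class y is a function of its instance count, so that frequent (more over-confident) classes receive stronger smoothing. The linear interpolation below is an illustrative schedule, not necessarily the paper's exact formula.

import torch.nn.functional as F

def label_aware_smoothing_loss(logits, targets, class_counts,
                               eps_head=0.4, eps_tail=0.1):
    """Cross-entropy with per-class label smoothing tied to class frequency.

    class_counts: (C,) training instances per class; head classes (large
    counts) get eps_head, tail classes eps_tail, interpolated in between.
    """
    n = class_counts.float()
    rel = (n - n.min()) / (n.max() - n.min() + 1e-12)   # 0 = rarest, 1 = most frequent
    eps = eps_tail + (eps_head - eps_tail) * rel        # per-class smoothing factor
    eps_y = eps[targets].unsqueeze(1)                   # (B, 1)
    num_classes = logits.size(1)
    one_hot = F.one_hot(targets, num_classes).float()
    soft = one_hot * (1.0 - eps_y) + eps_y / num_classes  # smoothed target distribution
    return -(soft * F.log_softmax(logits, dim=1)).sum(dim=1).mean()
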
...

References

Showing 1-10 of 38 references

Learning Deep Representation for Imbalanced Classification

TLDR
The representation learned by this approach, when combined with a simple k-nearest neighbor (kNN) algorithm, shows significant improvements over existing methods on both high- and low-level vision classification tasks that exhibit imbalanced class distribution.
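
The evaluation protocol this summary describes is simple to reproduce; a minimal scikit-learn sketch, assuming the feature arrays come from a frozen, pre-trained embedding network:

from sklearn.neighbors import KNeighborsClassifier

def knn_eval(train_feats, train_labels, test_feats, test_labels, k=5):
    """Classify frozen embeddings with kNN; returns top-1 accuracy."""
    knn = KNeighborsClassifier(n_neighbors=k)
    knn.fit(train_feats, train_labels)
    return knn.score(test_feats, test_labels)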

Learning Imbalanced Datasets with Label-Distribution-Aware Margin Loss

TLDR
A theoretically-principled label-distribution-aware margin (LDAM) loss, motivated by minimizing a margin-based generalization bound, is proposed; it replaces the standard cross-entropy objective during training and can be combined with prior strategies for training with class imbalance, such as re-weighting or re-sampling.
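
The LDAM margin for class c is proportional to n_c^{-1/4}, so rarer classes get larger margins. A minimal PyTorch sketch; the max_margin and scale hyperparameters follow common practice with cosine-similarity logits and are illustrative assumptions:

import torch
import torch.nn.functional as F

def ldam_loss(logits, targets, class_counts, max_margin=0.5, scale=30.0):
    """Label-distribution-aware margin loss.

    Margins are proportional to n_c^{-1/4}, rescaled so the largest margin
    (on the rarest class) equals max_margin.
    """
    margins = 1.0 / class_counts.float().pow(0.25)   # (C,) larger for rare classes
    margins = margins * (max_margin / margins.max())
    batch_margins = margins[targets]                 # (B,) margin of each true class
    adjusted = logits.clone()
    adjusted[torch.arange(logits.size(0)), targets] -= batch_margins
    return F.cross_entropy(scale * adjusted, targets)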

Cost-Sensitive Learning of Deep Feature Representations From Imbalanced Data

TLDR
This paper proposes a cost-sensitive (CoSen) deep neural network, which can automatically learn robust feature representations for both the majority and minority classes, and shows that the proposed approach significantly outperforms the baseline algorithms.
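
CoSen learns the class-dependent costs jointly with the network parameters; the sketch below shows only the simpler fixed-cost variant (inverse-frequency weights) to illustrate how such costs enter the objective.

import torch.nn.functional as F

def cost_sensitive_ce(logits, targets, class_counts):
    """Cross-entropy weighted by fixed inverse-frequency class costs.

    CoSen instead learns these costs during training; inverse frequency
    is a common hand-crafted stand-in.
    """
    weights = 1.0 / class_counts.float()
    weights = weights * (len(weights) / weights.sum())  # normalize to mean 1
    return F.cross_entropy(logits, targets, weight=weights)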

Learning to Model the Tail

TLDR
Results are presented on image classification datasets (SUN, Places, and ImageNet) tuned for the long-tailed setting that significantly outperform common heuristics such as data resampling or reweighting.

Large-Scale Long-Tailed Recognition in an Open World

TLDR
An integrated OLTR algorithm is developed that maps an image to a feature space such that visual concepts can easily relate to each other based on a learned metric that respects closed-world classification while acknowledging the novelty of the open world.

Deep Imbalanced Learning for Face Recognition and Attribute Prediction

TLDR
Cluster-based Large Margin Local Embedding (CLMLE), when combined with a simple k-nearest cluster algorithm, shows significant improvements in accuracy over existing methods on both face recognition and face attribute prediction tasks that exhibit imbalanced class distribution.

Max-margin Class Imbalanced Learning with Gaussian Affinity

TLDR
This work introduces the first hybrid loss function that jointly performs classification and clustering in a single formulation, based on an affinity measure in Euclidean space that directly enforces maximum-margin constraints on classification boundaries and offers the flexibility to learn multiple class prototypes to support diversity and discriminability in feature space.
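
A hedged sketch of such an affinity measure: class scores are Gaussian kernels of the Euclidean distance between a feature and learnable class prototypes. The single-prototype layout and sigma value are simplifying assumptions; the paper supports multiple prototypes per class.

import torch

def gaussian_affinity(features, prototypes, sigma=1.0):
    """Affinity between features (B, d) and class prototypes (C, d).

    Returns (B, C) scores in (0, 1]; training with cross-entropy on these
    scores jointly shapes classification boundaries and feature clusters.
    """
    sq_dist = torch.cdist(features, prototypes).pow(2)  # (B, C) squared distances
    return torch.exp(-sq_dist / sigma)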

Striking the Right Balance With Uncertainty

TLDR
This paper demonstrates that Bayesian uncertainty estimates directly correlate with the rarity of classes and the difficulty of individual samples, and presents a novel framework for uncertainty-based class-imbalance learning that efficiently utilizes sample and class uncertainty information to learn robust features and more generalizable classifiers.

Exploring the Limits of Weakly Supervised Pretraining

TLDR
This paper presents a unique study of transfer learning with large convolutional networks trained to predict hashtags on billions of social media images, shows improvements on several image classification and object detection tasks, and reports the highest ImageNet-1k single-crop top-1 accuracy to date.

Large Scale Fine-Grained Categorization and Domain-Specific Transfer Learning

TLDR
This work proposes a measure to estimate domain similarity via Earth Mover's Distance and demonstrates that transfer learning benefits from pre-training on a source domain that is similar to the target domain under this measure.
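
A hedged sketch of such a similarity computation using the POT optimal-transport library, representing each domain as a weighted point cloud of per-class feature centroids (the exact featurization in the paper may differ):

import numpy as np
import ot  # POT: Python Optimal Transport

def emd_domain_distance(src_centroids, src_weights, tgt_centroids, tgt_weights):
    """Earth Mover's Distance between two domains.

    Each domain is a weighted point cloud of per-class feature centroids;
    a smaller distance suggests a better source domain for pre-training.
    """
    cost = ot.dist(src_centroids, tgt_centroids, metric="euclidean")  # pairwise costs
    a = np.asarray(src_weights, dtype=np.float64)
    b = np.asarray(tgt_weights, dtype=np.float64)
    return ot.emd2(a / a.sum(), b / b.sum(), cost)  # exact EMD value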