Corpus ID: 233481137

GistNet: a Geometric Structure Transfer Network for Long-Tailed Recognition

@article{Liu2021GistNetAG,
  title={GistNet: a Geometric Structure Transfer Network for Long-Tailed Recognition},
  author={Bo Liu and Haoxiang Li and Hao Kang and Gang Hua and Nuno Vasconcelos},
  journal={ArXiv},
  year={2021},
  volume={abs/2105.00131}
}
The problem of long-tailed recognition, where the number of examples per class is highly unbalanced, is considered. It is hypothesized that the well-known tendency of standard classifier training to overfit to popular classes can be exploited for effective transfer learning. Rather than eliminating this overfitting, e.g., by adopting popular class-balanced sampling methods, the learning algorithm should instead leverage this overfitting to transfer geometric information from popular to low-shot…
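
A minimal sketch of the idea suggested by the abstract, in PyTorch: each class is parameterized as its own center plus a set of displacement vectors shared by all classes, so geometric structure fit on popular classes is reused by low-shot ones. The module name, the displacement parameterization, and the max over constellation points are illustrative assumptions, not the paper's exact formulation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class SharedStructureClassifier(nn.Module):
    # Hypothetical sketch: per-class centers plus displacement
    # vectors shared across ALL classes, so structure learned on
    # popular classes also shapes low-shot class boundaries.
    def __init__(self, num_classes, feat_dim, num_struct=4):
        super().__init__()
        self.centers = nn.Parameter(torch.randn(num_classes, feat_dim))
        self.structure = nn.Parameter(0.01 * torch.randn(num_struct, feat_dim))

    def forward(self, x):
        # constellation per class: center + shared displacements,
        # shape (num_classes, num_struct, feat_dim)
        proto = self.centers.unsqueeze(1) + self.structure.unsqueeze(0)
        sim = torch.einsum('bd,ksd->bks',
                           F.normalize(x, dim=-1),
                           F.normalize(proto, dim=-1))
        return sim.max(dim=-1).values  # best-matching point per class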
1 Citation

Deep Long-Tailed Learning: A Survey
TLDR
A comprehensive survey of recent advances in deep long-tailed learning is provided, highlighting important applications of deep long-tailed learning and identifying several promising directions for future research.

References

SHOWING 1-10 OF 35 REFERENCES
Dynamic Few-Shot Visual Learning Without Forgetting
TLDR
This work proposes to extend an object recognition system with an attention-based few-shot classification weight generator, and to redesign the classifier of a ConvNet model as the cosine similarity function between feature representations and classification weight vectors.
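
The cosine-similarity classifier described in this TLDR can be sketched in a few lines of PyTorch; the scale factor is an assumption standing in for the temperature such classifiers typically learn or fix:

import torch
import torch.nn as nn
import torch.nn.functional as F

class CosineClassifier(nn.Module):
    # Logits are scaled cosine similarities between features and
    # class weight vectors, replacing the usual dot-product layer.
    def __init__(self, feat_dim, num_classes, scale=10.0):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(num_classes, feat_dim))
        self.scale = scale

    def forward(self, x):
        return self.scale * F.linear(F.normalize(x, dim=-1),
                                     F.normalize(self.weight, dim=-1))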
ImageNet: A large-scale hierarchical image database
TLDR
A new database called “ImageNet” is introduced: a large-scale ontology of images built upon the backbone of the WordNet structure, much larger in scale and diversity, and much more accurate, than current image datasets.
Focal Loss for Dense Object Detection
TLDR
This paper proposes to address the extreme foreground-background class imbalance encountered during the training of dense detectors by reshaping the standard cross-entropy loss so that it down-weights the loss assigned to well-classified examples. The resulting Focal Loss focuses training on a sparse set of hard examples and prevents the vast number of easy negatives from overwhelming the detector during training.
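
The reshaped loss this TLDR describes is the standard sigmoid focal loss; a self-contained PyTorch version with the paper's default gamma=2 and alpha=0.25 follows:

import torch
import torch.nn.functional as F

def focal_loss(logits, targets, gamma=2.0, alpha=0.25):
    # targets are float {0, 1}; (1 - p_t)**gamma down-weights
    # well-classified examples so hard examples dominate training.
    ce = F.binary_cross_entropy_with_logits(logits, targets, reduction='none')
    p = torch.sigmoid(logits)
    p_t = p * targets + (1 - p) * (1 - targets)
    alpha_t = alpha * targets + (1 - alpha) * (1 - targets)
    return (alpha_t * (1 - p_t) ** gamma * ce).mean()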
Range Loss for Deep Face Recognition with Long-Tailed Training Data
TLDR
This paper investigates how long-tailed data impact the training of face CNNs and develops a novel loss function, called range loss, to effectively utilize the tail data in the training process; experiments demonstrate the effectiveness of the proposed range loss in overcoming the long-tail effect.
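
A rough sketch of the range loss idea, assuming the commonly cited formulation (harmonic mean of the k largest within-class distances plus a margin on the closest class centers); the k, margin, and equal term weights here are illustrative:

import torch

def range_loss(feats, labels, k=2, margin=10.0):
    # Assumes the batch contains at least two distinct classes.
    intra, centers = 0.0, []
    for c in labels.unique():
        f = feats[labels == c]
        centers.append(f.mean(dim=0))
        if len(f) > 1:
            d = torch.pdist(f)                      # within-class "ranges"
            topk = d.topk(min(k, len(d))).values    # k largest distances
            intra = intra + len(topk) / (1.0 / topk).sum()  # harmonic mean
    inter = torch.clamp(margin - torch.pdist(torch.stack(centers)).min(), min=0)
    return intra + inter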
Deep Metric Learning via Lifted Structured Feature Embedding
TLDR
An algorithm that takes full advantage of the training batches in neural network training by lifting the vector of pairwise distances within the batch to the matrix of pairwise distances, enabling it to learn a state-of-the-art feature embedding by optimizing a novel structured prediction objective on the lifted problem.
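
A compact sketch of the lifted structure objective this TLDR describes: every positive pair is penalized against a smooth maximum over the negatives of both of its endpoints (the margin value is illustrative):

import torch

def lifted_structured_loss(feats, labels, margin=1.0):
    # Assumes the batch contains at least one positive pair.
    d = torch.cdist(feats, feats)                     # full distance matrix
    pos = labels.unsqueeze(0) == labels.unsqueeze(1)  # same-class mask
    neg_sum = (torch.exp(margin - d) * (~pos)).sum(dim=1)
    losses = []
    idx = torch.triu_indices(len(labels), len(labels), offset=1)
    for i, j in zip(idx[0].tolist(), idx[1].tolist()):
        if pos[i, j]:                                 # each positive pair
            J = torch.log(neg_sum[i] + neg_sum[j]) + d[i, j]
            losses.append(torch.clamp(J, min=0) ** 2)
    return torch.stack(losses).mean() / 2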
BBN: Bilateral-Branch Network With Cumulative Learning for Long-Tailed Visual Recognition
TLDR
A unified Bilateral-Branch Network (BBN) is proposed to take care of both representation learning and classifier learning simultaneously, where each branch performs its own duty separately.
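
In the spirit of BBN's cumulative learning, one training step might look like the sketch below; the branch and classifier objects are hypothetical, and the parabolic schedule reflects the paper's idea of shifting attention from representation learning toward classifier re-balancing over training:

import torch.nn.functional as F

def bbn_step(conv_branch, rebal_branch, classifier,
             x_uni, y_uni, x_rev, y_rev, epoch, total_epochs):
    alpha = 1.0 - (epoch / total_epochs) ** 2   # decays toward re-balancing
    f_c = conv_branch(x_uni)                    # uniform-sampling branch
    f_r = rebal_branch(x_rev)                   # reversed-sampling branch
    logits = classifier(alpha * f_c + (1 - alpha) * f_r)
    return alpha * F.cross_entropy(logits, y_uni) + \
           (1 - alpha) * F.cross_entropy(logits, y_rev)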
Decoupling Representation and Classifier for Long-Tailed Recognition
TLDR
It is shown that it is possible to outperform carefully designed losses, sampling strategies, and even complex modules with memory by using a straightforward approach that decouples representation and classification.
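
The straightforward approach in this TLDR corresponds to two-stage training; a sketch of the classifier re-training stage, under assumed model attributes (backbone, fc) and an illustrative learning rate, follows:

import torch
import torch.nn as nn
import torch.nn.functional as F

def retrain_classifier(model, balanced_loader, epochs=10):
    # Stage 2: freeze the representation learned with instance
    # sampling, then retrain only the classifier with
    # class-balanced sampling.
    for p in model.backbone.parameters():
        p.requires_grad = False
    model.fc = nn.Linear(model.fc.in_features, model.fc.out_features)
    opt = torch.optim.SGD(model.fc.parameters(), lr=0.1)
    for _ in range(epochs):
        for x, y in balanced_loader:
            opt.zero_grad()
            F.cross_entropy(model.fc(model.backbone(x)), y).backward()
            opt.step()
    return model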
Deep Representation Learning on Long-Tailed Data: A Learnable Embedding Augmentation Perspective
TLDR
This paper proposes to augment each instance of the tail classes with certain disturbances in the deep feature space, alleviating the distortion of the learned feature space and improving deep representation learning on long-tailed data.
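
A toy version of the feature-space disturbance idea: perturb each tail-class feature with noise scaled by spread statistics borrowed from data-rich classes. The paper's formulation is a learnable angular augmentation; the Gaussian transfer rule and the head_variance lookup here are simplifying assumptions:

import torch

def augment_tail_feature(feat, class_id, head_variance, is_tail, n_aug=4):
    # feat: 1-D feature vector; head_variance: per-class variance
    # statistics estimated from head classes (hypothetical lookup).
    if not is_tail(class_id):
        return feat.unsqueeze(0)
    std = head_variance[class_id].sqrt()
    noise = torch.randn(n_aug, feat.numel(), device=feat.device) * std
    return feat.unsqueeze(0) + noise        # virtual tail-class samples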
Learning From Multiple Experts: Self-paced Knowledge Distillation for Long-tailed Classification
TLDR
This paper proposes a novel self-paced knowledge distillation framework, termed Learning From Multiple Experts (LFME), inspired by the observation that networks trained on less imbalanced subsets of the distribution often yield better performance than their jointly-trained counterparts.
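
The distillation part of LFME can be sketched as a weighted blend of expert soft targets; the fixed expert_weights stand in for the paper's self-paced weighting, which this sketch does not model:

import torch.nn.functional as F

def multi_expert_distill_loss(student_logits, expert_logits_list, targets,
                              expert_weights, T=2.0):
    loss = F.cross_entropy(student_logits, targets)
    for w, e_logits in zip(expert_weights, expert_logits_list):
        # soften both distributions with temperature T
        loss = loss + w * T * T * F.kl_div(
            F.log_softmax(student_logits / T, dim=-1),
            F.softmax(e_logits / T, dim=-1),
            reduction='batchmean')
    return loss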
Long-Tailed Recognition Using Class-Balanced Experts
TLDR
This work addresses the problem of long-tailed recognition wherein the training set is highly imbalanced and the test set is kept balanced, and proposes an ensemble of class-balanced experts that combines the strength of diverse classifiers.
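
At inference time, an ensemble of class-balanced experts might be combined as simply as averaging scores, assuming each expert emits logits over the full label set; the paper's actual fusion of diverse classifiers is more involved:

import torch

def ensemble_predict(experts, x):
    with torch.no_grad():
        return torch.stack([expert(x) for expert in experts]).mean(dim=0)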