Corpus ID: 52290043

Maximum-Entropy Fine-Grained Classification

@inproceedings{Dubey2018MaximumEntropyFC,
  title={Maximum-Entropy Fine-Grained Classification},
  author={Abhimanyu Dubey and Otkrist Gupta and Ramesh Raskar and Nikhil Naik},
  booktitle={Neural Information Processing Systems},
  year={2018}
}
Fine-Grained Visual Classification (FGVC) is an important computer vision problem that involves small diversity within the different classes, and often requires expert annotators to collect data. Utilizing this notion of small visual diversity, we revisit Maximum-Entropy learning in the context of fine-grained classification, and provide a training routine that maximizes the entropy of the output probability distribution for training convolutional neural networks on FGVC tasks. We provide a… 
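
A minimal sketch of the training objective the abstract describes: standard cross-entropy plus a term that rewards higher entropy of the predicted distribution, discouraging over-confident predictions on visually similar fine-grained classes. It assumes a PyTorch classifier; the function name and the entropy_weight coefficient are illustrative, not the authors' exact formulation.

import torch
import torch.nn.functional as F

def max_entropy_loss(logits, targets, entropy_weight=0.1):
    """Cross-entropy minus a weighted entropy bonus on the predicted distribution.

    Subtracting the entropy term is equivalent to maximizing it during training.
    `entropy_weight` is an illustrative hyperparameter, not a value from the paper.
    """
    log_probs = F.log_softmax(logits, dim=-1)
    probs = log_probs.exp()
    # Shannon entropy of the predicted distribution, averaged over the batch.
    entropy = -(probs * log_probs).sum(dim=-1).mean()
    ce = F.cross_entropy(logits, targets)
    return ce - entropy_weight * entropy

# Example: a batch of 4 samples over 200 fine-grained classes (e.g., CUB-200-2011).
logits = torch.randn(4, 200)
targets = torch.randint(0, 200, (4,))
loss = max_entropy_loss(logits, targets)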

Citations

Fine-Grained Visual Classification using Self Assessment Classifier

A Self Assessment Classifier is proposed, which simultaneously leverages the representation of the image and the top-k predicted classes to reassess the classification results, achieving new state-of-the-art results on the CUB-200-2011, Stanford Dogs, and FGVC Aircraft datasets.

Preserving Fine-Grain Feature Information in Classification via Entropic Regularization

This paper introduces an entropy-based regularization to promote more diversity in the feature space of trained models, and empirically demonstrates the efficacy of this methodology in reaching better performance on fine-grained problems.

Exploiting Category Similarity-Based Distributed Labeling for Fine-Grained Visual Classification

A simple yet effective approach named category similarity-based distributed labeling (CSDL) is proposed to tackle the problem of diversity within subcategories of fine-grained visual classification (FGVC).

Label-Smooth Learning for Fine-Grained Visual Categorization

This paper proposes a label-smooth learning method that improves a model's applicability to large categories by maximizing its prediction diversity, and demonstrates comparable or state-of-the-art performance on five benchmark datasets.
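
For reference, a rough sketch of generic label smoothing, the technique this summary alludes to: the one-hot target is mixed with a uniform distribution using a coefficient epsilon. This is the standard formulation, not necessarily the exact scheme of the cited paper.

import torch
import torch.nn.functional as F

def label_smooth_loss(logits, targets, epsilon=0.1):
    """Cross-entropy against smoothed targets: (1 - eps) on the true class,
    eps spread uniformly over all classes. Generic formulation, used in place
    of plain cross-entropy; the cited paper's exact scheme may differ."""
    num_classes = logits.size(-1)
    log_probs = F.log_softmax(logits, dim=-1)
    # Smoothed target distribution: eps/K everywhere, (1 - eps) + eps/K on the true class.
    smooth = torch.full_like(log_probs, epsilon / num_classes)
    smooth.scatter_(-1, targets.unsqueeze(-1), 1.0 - epsilon + epsilon / num_classes)
    return -(smooth * log_probs).sum(dim=-1).mean()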

The Effectiveness of Noise in Data Augmentation for Fine-Grained Image Classification

A simple approach that leverages large amounts of noisy web images for fine-grained image classification, achieving comparable results on benchmark datasets, e.g., CUB-Birds, Stanford Dogs, and Stanford Cars, with only 50 augmented noisy samples per category.

A sparse focus framework for visual fine-grained classification

A Sparse Focus Framework based on a Bilinear Convolutional Neural Network, including a self-focus module and sparse scaling factors, which yields a sparse structure that prevents overfitting while maintaining classification performance.

Progressive Learning of Category-Consistent Multi-Granularity Features for Fine-Grained Visual Classification

This paper proposes a progressive training strategy that effectively fuses features from different granularities, and a consistent block convolution that encourages the network to learn category-consistent features at specific granularities.

Fine-Grained Image Classification Based on Cross-Attention Network

A multi-scale and multi-level ViT model that utilizes the results of the previous layer of ViT to improve the accuracy of fine-grained image classification and is competitive with current mainstream state-of-the-art methods on multiple datasets.

Attention Convolutional Binary Neural Tree for Fine-Grained Visual Categorization

An attention convolutional binary neural tree architecture is presented to address weakly supervised fine-grained visual categorization, using an attention transformer module to force the network to capture discriminative features.

Exploring Vision Transformers for Fine-grained Classification

This work proposes a multi-stage ViT framework for fine-grained image classification tasks, which localizes the informative image regions without requiring architectural changes using the inherent multi-head self-attention mechanism and introduces attention-guided augmentations for improving the model’s capabilities.
...

References

Showing 1-10 of 52 references

Low-Rank Bilinear Pooling for Fine-Grained Classification

This work proposes a classifier co-decomposition that factorizes the collection of bilinear classifiers into a common factor and compact per-class terms and achieves state-of-the-art performance on several public datasets for fine-grained classification trained with only category labels.

The Unreasonable Effectiveness of Noisy Data for Fine-Grained Recognition

This work introduces an alternative approach, leveraging free, noisy data from the web and simple, generic methods of recognition, and demonstrates its efficacy on four fine-grained datasets, greatly exceeding existing state of the art without the manual collection of even a single label.

Generalized Orderless Pooling Performs Implicit Salient Matching

This paper generalizes average and bilinear pooling to “α-pooling”, allowing for learning the pooling strategy during training, and presents a novel way to visualize decisions made by these approaches.

Mining Discriminative Triplets of Patches for Fine-Grained Classification

This work introduces triplets of patches with geometric constraints to improve the accuracy of patch localization, and automatically mine discriminative geometrically-constrained triplets for classification in a patch-based framework that only requires object bounding boxes.

Kernel Pooling for Convolutional Neural Networks

This work demonstrates how to approximate kernels such as Gaussian RBF up to a given order using compact explicit feature maps in a parameter-free manner and proposes a general pooling framework that captures higher order interactions of features in the form of kernels.
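
As an illustration of explicit kernel feature maps, here is a sketch using random Fourier features (Rahimi and Recht) to approximate a Gaussian RBF kernel. This is a different, generic construction shown only for intuition; the cited paper builds its compact feature maps differently and also captures higher-order feature interactions.

import math
import torch

def rff_map(x, dim=512, gamma=1.0, seed=0):
    """Explicit feature map z(x) such that z(x) @ z(y) approximates
    k(x, y) = exp(-gamma * ||x - y||^2)."""
    g = torch.Generator().manual_seed(seed)
    d = x.shape[1]
    # Random frequencies drawn from the Fourier transform of the RBF kernel.
    W = torch.randn(d, dim, generator=g) * math.sqrt(2.0 * gamma)
    b = torch.rand(dim, generator=g) * 2.0 * math.pi
    return math.sqrt(2.0 / dim) * torch.cos(x @ W + b)

x = torch.randn(5, 64)
z = rff_map(x)          # (5, 512) explicit features
k_approx = z @ z.T      # approximates the 5x5 RBF kernel matrix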

Hierarchical Joint CNN-Based Models for Fine-Grained Cars Recognition

A novel approach focused on two main aspects: the most discriminative local feature representations of regions of interest (ROIs), magnified to reveal many details, and the hierarchical relations within the fine-grained categories, which can be simulated by probability formulas.

Picking Deep Filter Responses for Fine-Grained Image Recognition

This paper proposes an automatic fine-grained recognition approach that is free of any object/part annotation at both training and testing stages, and conditionally picks deep filter responses to encode into the final representation, taking into account the importance of the filter responses themselves.

Bilinear CNN Models for Fine-Grained Visual Recognition

We propose bilinear models, a recognition architecture that consists of two feature extractors whose outputs are multiplied using outer product at each location of the image and pooled to obtain an image descriptor.
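
A minimal sketch of the bilinear pooling operation described above, assuming two CNN feature maps with matching spatial size; the shapes and the signed square root plus L2 normalization follow common B-CNN practice, but the names and pipeline details here are illustrative.

import torch

def bilinear_pool(feat_a, feat_b):
    """Outer product of two feature maps at every spatial location,
    sum-pooled over locations into one (Ca x Cb) image descriptor.

    feat_a: (B, Ca, H, W), feat_b: (B, Cb, H, W), e.g. outputs of two CNN streams.
    """
    B, Ca, H, W = feat_a.shape
    Cb = feat_b.shape[1]
    a = feat_a.reshape(B, Ca, H * W)
    b = feat_b.reshape(B, Cb, H * W)
    # Sum of per-location outer products equals a matrix product over locations.
    desc = torch.bmm(a, b.transpose(1, 2)) / (H * W)   # (B, Ca, Cb)
    desc = desc.flatten(1)
    # Signed square root and L2 normalization, as commonly used with B-CNNs.
    desc = torch.sign(desc) * torch.sqrt(desc.abs() + 1e-12)
    return torch.nn.functional.normalize(desc, dim=1)

# Example with two identical 512-channel streams on a 28x28 feature map.
x = torch.randn(2, 512, 28, 28)
descriptor = bilinear_pool(x, x)   # (2, 512*512)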

Improving the Fisher Kernel for Large-Scale Image Classification

In an evaluation involving hundreds of thousands of training images, it is shown that classifiers learned on Flickr groups perform surprisingly well and that they can complement classifiers learned on more carefully annotated datasets.

Very Deep Convolutional Networks for Large-Scale Image Recognition

This work investigates the effect of the convolutional network depth on its accuracy in the large-scale image recognition setting using an architecture with very small convolution filters, which shows that a significant improvement on the prior-art configurations can be achieved by pushing the depth to 16-19 weight layers.
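
A tiny illustrative fragment of the design principle summarized here: stacking 3x3 convolutions builds depth cheaply, since two stacked 3x3 layers cover a 5x5 receptive field with fewer parameters and an extra non-linearity. This is a sketch of one VGG-style stage, not the full 16/19-layer configuration.

import torch
import torch.nn as nn

# One VGG-style stage: two stacked 3x3 convolutions followed by 2x2 max pooling.
def vgg_stage(in_ch, out_ch):
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
        nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, kernel_size=3, padding=1),
        nn.ReLU(inplace=True),
        nn.MaxPool2d(kernel_size=2, stride=2),
    )

stage = vgg_stage(3, 64)
out = stage(torch.randn(1, 3, 224, 224))   # -> (1, 64, 112, 112)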
...