Embarrassingly Simple Binary Representation Learning

@article{Shen2019EmbarrassinglySB,
  title={Embarrassingly Simple Binary Representation Learning},
  author={Yuming Shen and Jie Qin and Jiaxin Chen and Li Liu and Fan Zhu},
  journal={2019 IEEE/CVF International Conference on Computer Vision Workshop (ICCVW)},
  year={2019},
  pages={2883-2892}
}
Recent binary representation learning models usually require sophisticated binary optimization, similarity measures, or even generative models as auxiliaries. However, one may wonder whether these non-trivial components are needed to formulate practical and effective hashing models. In this paper, we answer the above question by proposing an embarrassingly simple approach to binary representation learning. With a simple classification objective, our model only incorporates two additional fully…
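The abstract describes little more than a classification network with two extra fully connected layers. A minimal PyTorch-style sketch of that recipe follows; the layer roles (a hash bottleneck plus a classifier on top of it), the tanh relaxation, and the sign binarization at retrieval time are assumptions for illustration, not details confirmed by the truncated abstract.

```python
import torch
import torch.nn as nn

class SimpleHashHead(nn.Module):
    """Hedged sketch: a hash bottleneck trained with a plain classification loss.

    Assumed details (not taken from the paper): the backbone feature size,
    the tanh relaxation of the codes, and sign() binarization at test time.
    """
    def __init__(self, feat_dim=2048, code_len=64, num_classes=100):
        super().__init__()
        self.hash_fc = nn.Linear(feat_dim, code_len)    # 1st extra FC: continuous codes
        self.cls_fc = nn.Linear(code_len, num_classes)  # 2nd extra FC: classifier on codes

    def forward(self, features):
        z = torch.tanh(self.hash_fc(features))  # relaxed codes in (-1, 1)
        return z, self.cls_fc(z)

    @torch.no_grad()
    def binarize(self, features):
        return torch.sign(self.hash_fc(features))  # {-1, +1} codes for retrieval

# Training reduces to ordinary cross-entropy on the classifier output.
head = SimpleHashHead()
feats = torch.randn(8, 2048)              # backbone features (e.g., from a CNN)
labels = torch.randint(0, 100, (8,))
_, logits = head(feats)
nn.CrossEntropyLoss()(logits, labels).backward()
```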

Citations

Auto-Encoding Twin-Bottleneck Hashing
  • Yuming Shen, Jie Qin, +5 authors L. Shao
  • Computer Science
  • 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
  • 2020
TLDR
This paper proposes an efficient and adaptive code-driven graph, updated by decoding in the context of an auto-encoder, and introduces twin bottlenecks into the framework that exchange crucial information collaboratively.
On Learning Semantic Representations for Million-Scale Free-Hand Sketches
TLDR
A dual-branch CNN-RNN network architecture is proposed to represent sketches, simultaneously encoding both the static and temporal patterns of sketch strokes; two sketch-specific deep models built on it outperform the state-of-the-art competitors.
One Loss for All: Deep Hashing with a Single Cosine Similarity based Learning Objective
TLDR
It is shown that maximizing the cosine similarity between the continuous codes and their corresponding binary orthogonal codes can ensure both hash code discriminativeness and quantization error minimization, leading to a one-loss deep hashing model that removes all the hassles of tuning the weights of various losses.
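The single objective named in this summary can be sketched as follows. The per-class binary orthogonal targets, built here from a Hadamard matrix, and the exact loss form are assumptions for illustration rather than the authors' construction.

```python
import torch
import torch.nn.functional as F
from scipy.linalg import hadamard

code_len, num_classes = 64, 10
# Assumed target construction: rows of a Hadamard matrix give mutually
# orthogonal {-1, +1} codes, one per class.
targets = torch.tensor(hadamard(code_len)[:num_classes], dtype=torch.float32)

def one_loss(continuous_codes, labels):
    """Maximize cosine similarity between continuous codes and their
    class-specific binary orthogonal target codes (hedged sketch)."""
    target_codes = targets[labels]                                   # (B, code_len)
    cos = F.cosine_similarity(continuous_codes, target_codes, dim=1)
    return (1.0 - cos).mean()                                        # minimized as cos -> 1

codes = torch.randn(8, code_len, requires_grad=True)
labels = torch.randint(0, num_classes, (8,))
one_loss(codes, labels).backward()
```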
Shuffle and Learn: Minimizing Mutual Information for Unsupervised Hashing
TLDR
A novel relaxation method called Shuffle and Learn is proposed to tackle code conflicts in unsupervised hashing; it relaxes the code optimization away from local optima and helps generate binary representations that are more discriminative and informative without any annotations.
Leveraging Quadratic Spherical Mutual Information Hashing for Fast Image Retrieval
  • N. Passalis, A. Tefas
  • Computer Science
  • 2020 25th International Conference on Pattern Recognition (ICPR)
  • 2021
TLDR
The proposed deep supervised hashing algorithm is adapted to the needs of large-scale hashing and information retrieval, leading to a novel information-theoretic measure, the Quadratic Spherical Mutual Information (QSMI), which yields significantly better retrieval precision.
Deep Supervised Hashing Using Quadratic Spherical Mutual Information for Efficient Image Retrieval
Several deep supervised hashing techniques have been proposed to allow for extracting compact and efficient neural network representations for various tasks. However, many deep supervised hashing…
Ternary Hashing
TLDR
This work demonstrates that the proposed ternary hashing compares favorably to binary hashing methods, with consistent improvements in retrieval mean average precision (mAP) ranging from 1% to 5.9% on the CIFAR10, NUS-WIDE and ImageNet100 datasets.
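As a rough illustration of the idea (the fixed threshold and the distance used here are assumptions, not the paper's quantizer or metric), ternary codes simply add a zero state to the usual {-1, +1} alphabet:

```python
import torch

def ternarize(z, threshold=0.3):
    """Map continuous codes to {-1, 0, +1}; the fixed threshold is an
    assumption for illustration, not a learned quantizer."""
    codes = torch.zeros_like(z)
    codes[z > threshold] = 1.0
    codes[z < -threshold] = -1.0
    return codes

def ternary_distance(a, b):
    """Count positions where the ternary symbols disagree (a simple
    Hamming-style distance; the paper may define a different metric)."""
    return (a != b).sum(dim=-1)

queries = ternarize(torch.randn(4, 64))
database = ternarize(torch.randn(100, 64))
dists = ternary_distance(queries.unsqueeze(1), database.unsqueeze(0))  # (4, 100)
```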
BiDet: An Efficient Binarized Object Detector
TLDR
This paper proposes a binarized neural network learning method called BiDet that generalizes the information bottleneck (IB) principle to object detection: the amount of information in the high-level feature maps is constrained, while the mutual information between the feature maps and the detection outputs is maximized.
A Decade Survey of Content Based Image Retrieval using Deep Learning
TLDR
This paper presents a comprehensive survey of deep learning based developments in the past decade for content based image retrieval, covering different types of supervision, networks, descriptors, and retrieval.

References

Showing 1–10 of 52 references
Fast Training of Triplet-Based Deep Binary Embedding Networks
TLDR
This paper formulates high-order binary code learning as a multi-label classification problem by explicitly separating learning into two interleaved stages: the original images are mapped to compact binary codes via carefully designed deep convolutional neural networks, and the hashing function fitting is solved by training binary CNN classifiers.
Unsupervised Binary Representation Learning with Deep Variational Networks
TLDR
Conditional auto-encoding variational Bayesian networks are introduced in this work to exploit the feature-space structure of the training data through latent variables; the proposed DVB model estimates the statistics of data representations and thus produces compact binary codes.
Binary Generative Adversarial Networks for Image Retrieval
TLDR
This paper uses binary generative adversarial networks (BGAN) to embed images into binary codes in an unsupervised way, proposing a new sign-activation strategy and a loss function steering the learning process that combines an adversarial loss, a content loss, and a neighborhood structure loss.
SuBiC: A Supervised, Structured Binary Code for Image Search
TLDR
This work proposes a novel method to make deep convolutional neural networks produce supervised, compact, structured binary codes for visual search, outperforming state-of-the-art compact representations based on deep hashing or structured quantization in single- and cross-domain category retrieval, instance retrieval and classification.
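The structured codes can be pictured as block-wise one-hot vectors obtained from per-block softmaxes; this sketch (block count, block size and the hard argmax at inference are assumptions for illustration) shows the relaxed form for training and the binary form for search:

```python
import torch
import torch.nn.functional as F

def structured_codes(logits, num_blocks=8, block_size=32, hard=False):
    """Hedged sketch of block-structured codes: the output is split into blocks,
    each softmax-normalized; at inference every block collapses to a one-hot
    vector (argmax), yielding a sparse binary code with num_blocks ones."""
    blocks = logits.view(logits.size(0), num_blocks, block_size)
    probs = F.softmax(blocks, dim=-1)
    if not hard:
        return probs.view(logits.size(0), -1)             # relaxed codes for training
    one_hot = F.one_hot(probs.argmax(dim=-1), block_size).float()
    return one_hot.view(logits.size(0), -1)               # binary codes for search

x = torch.randn(4, 8 * 32)                   # assumed network outputs
train_code = structured_codes(x)             # soft, differentiable
search_code = structured_codes(x, hard=True)
```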
Relaxation-Free Deep Hashing via Policy Gradient
TLDR
This approach formulates the non-smooth part of the hashing network as sampling with a stochastic policy, so that the retrieval performance degradation caused by the relaxation can be avoided and the differentiation challenge for discrete optimization can be naturally addressed.
Supervised hashing with kernels
TLDR
A novel kernel-based supervised hashing model is proposed that requires a limited amount of supervised information, i.e., similar and dissimilar data pairs, and a feasible training cost to achieve high-quality hashing; it significantly outperforms the state of the art in searching both metric-distance neighbors and semantically similar neighbors.
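The pairwise supervision in this summary amounts to fitting code inner products to a +1/-1 similarity matrix. The sketch below uses a generic gradient-based relaxation rather than the paper's kernel formulation and solver, so treat every detail as an assumption:

```python
import torch

def pairwise_code_loss(codes, sim):
    """Fit the (scaled) code inner products to a matrix of +1 (similar) /
    -1 (dissimilar) pair labels; a hedged, relaxed illustration only."""
    r = codes.size(1)
    inner = codes @ codes.t() / r          # in [-1, 1] for {-1, +1} codes
    return ((inner - sim) ** 2).mean()

n, r = 32, 48
labels = torch.randint(0, 5, (n,))
sim = (labels[:, None] == labels[None, :]).float() * 2 - 1  # +1 similar, -1 dissimilar
z = torch.randn(n, r, requires_grad=True)                   # relaxed codes
pairwise_code_loss(torch.tanh(z), sim).backward()
```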
Zero-Shot Sketch-Image Hashing
TLDR
ZSIH is the first zero-shot hashing work suitable for SBIR and cross-modal search; it forms a generative hashing scheme that reconstructs semantic knowledge representations for zero-shot retrieval.
Deep hashing for compact binary codes learning
TLDR
A deep neural network is developed to seek multiple hierarchical non-linear transformations for learning compact binary codes for large-scale visual search, showing the superiority of the proposed approach over the state of the art.
Stochastic Generative Hashing
TLDR
This paper proposes a novel generative approach that learns hash functions through the Minimum Description Length principle, such that the learned hash codes maximally compress the dataset and can also be used to regenerate the inputs.
Learning to Hash with Binary Deep Neural Network
TLDR
This work proposes deep network models and learning algorithms for unsupervised and supervised binary hashing that incorporate the independence and balance properties in direct and strict forms during learning, and include a similarity-preserving property in the objective function.
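The balance and independence properties named here have simple relaxed forms, sketched below as penalty terms on a batch of continuous codes (the exact constraints, relaxation and weights in the paper may differ):

```python
import torch

def balance_penalty(codes):
    """Balance: each bit should be +1 on roughly half the samples,
    i.e. the per-bit mean over the batch should be close to zero."""
    return codes.mean(dim=0).pow(2).sum()

def independence_penalty(codes):
    """Independence: different bits should be decorrelated,
    i.e. the normalized Gram matrix should be close to the identity."""
    n, r = codes.shape
    gram = codes.t() @ codes / n
    return (gram - torch.eye(r)).pow(2).sum()

codes = torch.tanh(torch.randn(128, 32, requires_grad=True))  # relaxed batch codes
(balance_penalty(codes) + independence_penalty(codes)).backward()
```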