Dual Asymmetric Deep Hashing Learning

@article{Li2019DualAD,
  title={Dual Asymmetric Deep Hashing Learning},
  author={Jinxing Li and Bob Zhang and Guangming Lu and David Zhang},
  journal={IEEE Access},
  year={2019},
  volume={7},
  pages={113372-113384}
}
Owing to its impressive learning capacity, deep learning has achieved remarkable performance in supervised hash function learning. In this paper, we propose a novel asymmetric supervised deep hashing method to preserve the semantic structure among different categories and generate the binary codes simultaneously. Specifically, two asymmetric deep networks are constructed to reveal the similarity between each pair of images according to their semantic labels. Furthermore, since the binary codes in…
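
As a rough illustration of the pairwise, inner-product-based objective sketched in this abstract, the following PyTorch-style snippet shows a generic asymmetric pairwise hashing loss. The function name, the tensor shapes, and the gamma weight are assumptions for illustration, not the authors' implementation.

import torch

def asymmetric_pairwise_loss(U, V, S, gamma=1.0):
    """Illustrative sketch of an asymmetric pairwise hashing objective.

    U: (n, k) real-valued outputs of the first network (e.g. tanh activations).
    V: (m, k) real-valued outputs of the second network.
    S: (n, m) semantic similarity, +1 for same-label pairs, -1 otherwise.
    """
    k = U.shape[1]
    # Cross-network inner products should approach +k for similar pairs
    # and -k for dissimilar pairs once the outputs are nearly binary.
    inner = U @ V.t()
    similarity_loss = ((inner - k * S) ** 2).mean()
    # Quantization penalty pulling the real-valued outputs toward binary codes.
    quant_loss = ((U - torch.sign(U)) ** 2).mean() + ((V - torch.sign(V)) ** 2).mean()
    return similarity_loss + gamma * quant_loss

At retrieval time the binarized codes sign(U) and sign(V) would be used, while training operates on the relaxed real-valued outputs of the two asymmetric networks.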

Relaxed Asymmetric Deep Hashing Learning: Point-to-Angle Matching

TLDR
This article proposes a novel deep supervised hashing method by relaxing the matching between each pair of instances to a point-to-angle form, and introduces an inner product to asymmetrically measure the similarity and dissimilarity between the real-valued output and the binary code.

Semi-Paired Asymmetric Deep Cross-Modal Hashing Learning

TLDR
The proposed semi-paired asymmetric deep cross-modal hashing (SADCH) is a novel asymmetric end-to-end deep neural network model that trains the deep network using query points to improve training efficiency, and directly learns the hash codes of the database.

Deep High-order Asymmetric Supervised Hashing for Image Retrieval

TLDR
This work utilizes a powerful global covariance pooling module based on matrix power normalization to compute second-order statistics of the input images; the module is seamlessly embedded into an asymmetric hashing architecture in an end-to-end manner, leading to more discriminative binary hash codes.
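
The global covariance pooling with matrix power normalization mentioned in this summary can be sketched as follows. This is an illustrative PyTorch-style implementation via eigendecomposition; the paper's actual module may use an iterative solver, and the function name and default exponent alpha = 0.5 are assumptions.

import torch

def covariance_pooling(features, alpha=0.5, eps=1e-5):
    """Illustrative global covariance pooling with matrix power normalization.

    features: (batch, channels, height, width) convolutional feature maps.
    Returns a (batch, channels, channels) matrix-power-normalized covariance.
    """
    b, c, h, w = features.shape
    X = features.reshape(b, c, h * w)
    X = X - X.mean(dim=2, keepdim=True)                       # center per channel
    cov = X @ X.transpose(1, 2) / (h * w - 1)                 # second-order statistics
    cov = cov + eps * torch.eye(c, device=features.device)    # numerical stability
    # Matrix power via eigendecomposition: cov^alpha = Q diag(lambda^alpha) Q^T.
    eigvals, eigvecs = torch.linalg.eigh(cov)
    powered = eigvecs @ torch.diag_embed(eigvals.clamp_min(eps) ** alpha) @ eigvecs.transpose(1, 2)
    return powered

The resulting normalized covariance (or its upper triangle) would then be fed to the hashing layers in place of the usual first-order global average pooling.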

Learning to hash with semantic similarity metrics and empirical KL divergence

TLDR
Efficiency of the methods is demonstrated with semantic image retrieval on the CIFAR-100, ImageNet and Conceptual Captions datasets, using similarities inferred from the WordNet label hierarchy or sentence embeddings.

DCCH: Deep Continuous Center Hashing for Image Retrieval

TLDR
This work proposes a compact hash code learning method named DCCH (Deep Continuous Center Hashing), which learns hash representations with a well-specified loss function that adopts label information and spatial information to improve image hashing and video hashing.

SHREWD: Semantic Hierarchy-based Relational Embeddings for Weakly-supervised Deep Hashing

TLDR
This work builds upon the idea of using semantic hierarchies to form distance metrics between all available sample labels to promote similar distances between the deep neural network embeddings, and introduces an empirical Kullback-Leibler divergence loss term to promote binarization and uniformity of the embeddings.
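
The binarization-and-uniformity regularizer described here can be illustrated with a simple sketch. The formulation below is an assumption in the spirit of the summary, not the paper's exact empirical KL term: it penalizes the per-bit Bernoulli entropy (pushing sigmoid activations toward 0 or 1) and the KL divergence of each bit's mean activation from a balanced Bernoulli(0.5).

import torch

def binarization_uniformity_regularizer(Z, eps=1e-6):
    """Illustrative regularizer for near-binary, bit-balanced embeddings.

    Z: (n, k) embeddings squashed to (0, 1), e.g. with a sigmoid.
    """
    # Binarization: the Bernoulli entropy of each activation is low when the
    # value is close to 0 or 1.
    entropy = -(Z * (Z + eps).log() + (1 - Z) * (1 - Z + eps).log())
    binarization = entropy.mean()
    # Uniformity: each bit's mean activation should stay near 0.5, i.e.
    # KL(Bernoulli(p_bar) || Bernoulli(0.5)) should be small.
    p_bar = Z.mean(dim=0).clamp(eps, 1 - eps)
    uniformity = (p_bar * (2 * p_bar).log() + (1 - p_bar) * (2 * (1 - p_bar)).log()).sum()
    return binarization + uniformity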

Flexible Discrete Multi-view Hashing with Collective Latent Feature Learning

TLDR
An adaptive multi-view analysis dictionary learning model is developed to skillfully combine diverse representations into an established common latent feature space where the complementary properties of different views are well explored based on an automatic multi-view weighting strategy.

References

Deep Asymmetric Pairwise Hashing

TLDR
This work proposes a novel Deep Asymmetric Pairwise Hashing approach (DAPH) for supervised hashing, and devises an efficient alternating algorithm to jointly optimize the asymmetric deep hash functions and high-quality binary codes.

Supervised hashing with kernels

TLDR
A novel kernel-based supervised hashing model is proposed that requires only a limited amount of supervised information, i.e., similar and dissimilar data pairs, and a feasible training cost to achieve high-quality hashing; it significantly outperforms the state of the art in searching both metric-distance neighbors and semantically similar neighbors.

Deep hashing for compact binary codes learning

TLDR
A deep neural network is developed to seek multiple hierarchical non-linear transformations that learn compact binary codes for large-scale visual search; results show the superiority of the proposed approach over the state of the art.

Simultaneous feature learning and hash coding with deep neural networks

TLDR
Extensive evaluations on several benchmark image datasets show that the proposed simultaneous feature learning and hash coding pipeline brings substantial improvements over other state-of-the-art supervised or unsupervised hashing methods.

AMVH: Asymmetric Multi-Valued hashing

TLDR
This work proposes an asymmetric multi-valued hashing method supported by two different non-binary embeddings that not only outperforms existing binary hashing methods in search accuracy, but also retains their query and storage efficiency.

Feature Learning Based Deep Supervised Hashing with Pairwise Labels

TLDR
Experiments show that the proposed deep pairwise-supervised hashing method (DPSH), which performs simultaneous feature learning and hash-code learning for applications with pairwise labels, outperforms other methods and achieves state-of-the-art performance in image retrieval applications.
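
A likelihood-based pairwise loss of the kind DPSH uses can be sketched as follows; the variable names, the eta weight, and the inclusion of self-pairs are simplifications for illustration rather than the paper's exact code.

import torch
import torch.nn.functional as F

def pairwise_likelihood_loss(U, S, eta=0.1):
    """Illustrative negative log-likelihood loss over pairwise labels.

    U: (n, k) real-valued network outputs.
    S: (n, n) pairwise labels, 1 for similar pairs and 0 for dissimilar ones.
    """
    theta = 0.5 * (U @ U.t())
    # -log p(S | theta) with p(s_ij = 1) = sigmoid(theta_ij).
    nll = (F.softplus(theta) - S * theta).mean()
    # Quantization term keeping U close to its binarized codes sign(U).
    quant = ((U - torch.sign(U)) ** 2).mean()
    return nll + eta * quant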

Supervised Discrete Hashing

TLDR
This work proposes a new supervised hashing framework, where the learning objective is to generate the optimal binary hash codes for linear classification, and introduces an auxiliary variable to reformulate the objective such that it can be solved efficiently by employing a regularization algorithm.

Fast Supervised Hashing with Decision Trees for High-Dimensional Data

TLDR
Experiments demonstrate that the proposed method significantly outperforms most state-of-the-art methods in retrieval precision, and is orders of magnitude faster than many of them in terms of training time.

Deep Supervised Hashing for Fast Image Retrieval

TLDR
A novel Deep Supervised Hashing method is proposed to learn compact similarity-preserving binary codes for large-scale image data, using pairs/triplets of images as training inputs and encouraging the output of each image to approximate discrete values.
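
The pairs-based objective with a relaxation toward discrete values described here can be sketched roughly as below; the margin, the alpha weight, and the function name are assumptions for illustration.

import torch

def contrastive_hashing_loss(b1, b2, y, margin=2.0, alpha=0.01):
    """Illustrative pairwise hashing loss with a discrete-value regularizer.

    b1, b2: (n, k) real-valued network outputs for the two images of each pair.
    y: (n,) pair labels, 1 for similar and 0 for dissimilar.
    """
    dist = ((b1 - b2) ** 2).sum(dim=1)
    pull = y * dist                                        # similar pairs: small distance
    push = (1 - y) * torch.clamp(margin - dist, min=0.0)   # dissimilar pairs: pushed beyond the margin
    # Regularizer encouraging each output to approach the discrete values {-1, +1}.
    quant = (b1.abs() - 1).abs().sum(dim=1) + (b2.abs() - 1).abs().sum(dim=1)
    return (0.5 * pull + 0.5 * push + alpha * quant).mean()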

Asymmetric Deep Supervised Hashing

TLDR
This paper proposes a novel deep supervised hashing method, called asymmetric deep supervised hashing (ADSH), for large-scale nearest neighbor search, which treats the query points and database points in an asymmetric way.