# Supervised Hashing with Soft Constraints

@article{Leng2014SupervisedHW,
  title   = {Supervised Hashing with Soft Constraints},
  author  = {Cong Leng and Jian Cheng and Jiaxiang Wu and Xi Sheryl Zhang and Hanqing Lu},
  journal = {Proceedings of the 23rd ACM International Conference on Conference on Information and Knowledge Management},
  year    = {2014}
}

Due to the ability to preserve semantic similarity in Hamming space, supervised hashing has been extensively studied recently. Most existing approaches encourage two dissimilar samples to have maximum Hamming distance. This may lead to an unexpected consequence: two samples that are not necessarily similar end up with the same code simply because both are dissimilar to a third sample. Besides, existing methods treat all labeled pairs with equal importance without considering the semantic gap…
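The failure mode described in the abstract can be illustrated with a toy sketch (hypothetical 4-bit codes, not from the paper): with b bits, the only code at maximum Hamming distance from a given code is its bitwise complement, so two samples both pushed to maximum distance from a shared dissimilar sample are forced onto the same code.

```python
import numpy as np

def hamming(a, b):
    """Hamming distance between two binary codes (arrays of 0/1)."""
    return int(np.sum(a != b))

# With 4-bit codes, the maximum Hamming distance is 4.
x = np.array([0, 0, 0, 0])  # code of a reference sample
y = np.array([1, 1, 1, 1])  # pushed to maximum distance from x
z = np.array([1, 1, 1, 1])  # also pushed to maximum distance from x

# y and z both sit at distance 4 from x, so they collapse onto the
# same code even if they are semantically unrelated to each other.
print(hamming(x, y), hamming(x, z), hamming(y, z))  # 4 4 0
```

This is why soft constraints (rather than a hard maximum-distance target for every dissimilar pair) are attractive.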

#### 9 Citations

Supervised discrete hashing through similarity learning

- Computer Science
- Multimedia Tools and Applications
- 2020

Different from existing supervised hashing methods that learn hash codes via least-squares classification by regressing the hash codes to their corresponding labels, this work leverages the mutual relation between different semantic labels to learn more stable hash codes.

Scalable Supervised Discrete Hashing for Large-Scale Search

- Computer Science
- WWW
- 2018

Scalable Supervised Discrete Hashing is presented, a novel hashing method that bypasses direct optimization over the n-by-n pairwise similarity matrix, adopts a no-relaxation optimization scheme in the learning procedure, and avoids the large quantization error problem.

Sampling based Discrete Supervised Hashing

- 2015

By leveraging semantic (label) information, supervised hashing has demonstrated better accuracy than unsupervised hashing in many real applications. Because the hashing-code learning problem is…

Column Sampling Based Discrete Supervised Hashing

- Computer Science
- AAAI
- 2016

A novel method, called column sampling based discrete supervised hashing (COSDISH), is proposed to directly learn discrete hashing codes from semantic information; it can outperform state-of-the-art methods in real applications such as image retrieval.

Sequential Compact Code Learning for Unsupervised Image Hashing

- Mathematics, Medicine
- IEEE Transactions on Neural Networks and Learning Systems
- 2016

A novel unsupervised framework, termed evolutionary compact embedding (ECE), is introduced to automatically learn task-specific binary hash codes; it can be regarded as an optimization algorithm that combines genetic programming (GP) with a boosting trick.

Error Correcting Input and Output Hashing

- Computer Science, Medicine
- IEEE Transactions on Cybernetics
- 2019

This paper proposes a novel framework, error correcting input and output (EC-IO) coding, which performs class-level and instance-level encoding in a unified mapping space, and presents the hashing model EC-IOH by approximating the mapping space with the Hamming space.

Asymmetric Deep Supervised Hashing

- Mathematics, Computer Science
- AAAI
- 2018

This paper proposes a novel deep supervised hashing method, called asymmetric deep supervised hashing (ADSH), for large-scale nearest neighbor search, which treats the query points and database points in an asymmetric way.

Partial Hash Update via Hamming Subspace Learning

- Computer Science, Medicine
- IEEE Transactions on Image Processing
- 2017

A method for Hamming subspace learning based on a greedy selection strategy and the Distribution Preserving Hamming Subspace learning (DHSL) algorithm are proposed, with a novel loss function designed to improve the speed of online updating and the performance of the hashing algorithm.

Learning Short Binary Codes for Large-scale Image Retrieval

- Computer Science, Medicine
- IEEE Transactions on Image Processing
- 2017

This paper proposes a novel unsupervised hashing approach called min-cost ranking (MCR), designed specifically for learning powerful short binary codes for scalable image retrieval tasks; it achieves performance comparable to state-of-the-art hashing algorithms but with significantly shorter codes, leading to much faster large-scale retrieval.

#### References


Supervised hashing with kernels

- Computer Science
- 2012 IEEE Conference on Computer Vision and Pattern Recognition
- 2012

A novel kernel-based supervised hashing model is proposed which requires a limited amount of supervised information, i.e., similar and dissimilar data pairs, and a feasible training cost to achieve high-quality hashing; it significantly outperforms the state-of-the-art in searching both metric distance neighbors and semantically similar neighbors.

Learning to Hash with Binary Reconstructive Embeddings

- Computer Science, Mathematics
- NIPS
- 2009

An algorithm for learning hash functions based on explicitly minimizing the reconstruction error between the original distances and the Hamming distances of the corresponding binary embeddings is developed.

A General Two-Step Approach to Learning-Based Hashing

- Computer Science
- 2013 IEEE International Conference on Computer Vision
- 2013

This framework decomposes the hashing learning problem into two steps: hash bit learning and hash function learning based on the learned bits. The first step can typically be formulated as a binary quadratic problem, and the second step can be accomplished by training standard binary classifiers.

Comparing apples to oranges: a scalable solution with heterogeneous hashing

- Computer Science
- KDD
- 2013

This paper proposes a novel Relation-aware Heterogeneous Hashing (RaHH), which provides a general framework for generating hash codes of data entities sitting in multiple heterogeneous domains and encodes both homogeneous and heterogeneous relationships between the data entities to design hash functions with improved accuracy.

Distance Metric Learning with Application to Clustering with Side-Information

- Computer Science, Mathematics
- NIPS
- 2002

This paper presents an algorithm that, given examples of similar (and, if desired, dissimilar) pairs of points in ℝⁿ, learns a distance metric over ℝⁿ that respects these relationships.
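The metric learned in this line of work is a Mahalanobis distance parameterized by a positive semidefinite matrix A; a minimal sketch (illustrative function and values, not taken from the paper):

```python
import numpy as np

def mahalanobis(x, y, A):
    """d_A(x, y) = sqrt((x - y)^T A (x - y)) for PSD A.
    A = I recovers Euclidean distance; learning A stretches or
    shrinks directions so that labeled 'similar' pairs end up close."""
    d = x - y
    return float(np.sqrt(d @ A @ d))

x = np.array([1.0, 0.0])
y = np.array([0.0, 1.0])
print(mahalanobis(x, y, np.eye(2)))            # Euclidean: sqrt(2)
print(mahalanobis(x, y, np.diag([4.0, 1.0])))  # weighted: sqrt(5)
```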

Iterative quantization: A procrustean approach to learning binary codes

- Mathematics, Computer Science
- CVPR 2011
- 2011

A simple and efficient alternating minimization scheme for finding a rotation of zero-centered data so as to minimize the quantization error of mapping this data to the vertices of a zero-centered binary hypercube is proposed.
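The alternating scheme summarized above (ITQ) can be sketched as follows; function and variable names are assumptions, and `V` stands for zero-centered, PCA-projected data:

```python
import numpy as np

def itq(V, n_iter=50, seed=0):
    """Minimal ITQ sketch: find an orthogonal rotation R minimizing
    the quantization error ||B - V R||_F^2 with B = sign(V R)."""
    rng = np.random.default_rng(seed)
    c = V.shape[1]
    # Start from a random orthogonal rotation.
    R, _ = np.linalg.qr(rng.standard_normal((c, c)))
    for _ in range(n_iter):
        B = np.sign(V @ R)                 # fix R, update binary codes
        # Fix B, update R via the orthogonal Procrustes solution.
        U, _, Vt = np.linalg.svd(B.T @ V)
        R = Vt.T @ U.T
    return np.sign(V @ R), R
```

Each half-step can only decrease the quantization error, so the alternation converges to a local minimum.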

Minimal Loss Hashing for Compact Binary Codes

- Mathematics, Computer Science
- ICML
- 2011

The formulation is based on structured prediction with latent variables and a hinge-like loss function that is efficient to train for large datasets, scales well to large code lengths, and outperforms state-of-the-art methods.

Similarity estimation techniques from rounding algorithms

- Mathematics, Computer Science
- STOC '02
- 2002

It is shown that rounding algorithms for LPs and SDPs used in the context of approximation algorithms can be viewed as locality sensitive hashing schemes for several interesting collections of objects.
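One scheme from this reference, random-hyperplane hashing for cosine similarity, is easy to sketch (parameter names are illustrative): each bit is the sign of a random projection, and two vectors' bits agree with probability 1 − θ/π, where θ is the angle between them.

```python
import numpy as np

def simhash_signs(X, n_bits=64, seed=0):
    """Random-hyperplane LSH: one bit per random hyperplane,
    set by which side of the hyperplane each row of X falls on."""
    rng = np.random.default_rng(seed)
    H = rng.standard_normal((X.shape[1], n_bits))  # random hyperplanes
    return (X @ H) > 0

# The bit-agreement rate between two vectors estimates their angle.
x = np.array([1.0, 0.0])
y = np.array([1.0, 1.0])  # 45 degrees from x
bx, by = simhash_signs(np.vstack([x, y]), n_bits=10000)
agree = np.mean(bx == by)  # 1 - 45/180 = 0.75 in expectation
```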