Corpus ID: 210472697

Bio-Inspired Hashing for Unsupervised Similarity Search

@inproceedings{Ryali2020BioInspiredHF,
  title={Bio-Inspired Hashing for Unsupervised Similarity Search},
  author={Chaitanya K. Ryali and John J. Hopfield and Leopold Grinberg and Dmitry Krotov},
  booktitle={ICML},
  year={2020}
}
The fruit fly Drosophila's olfactory circuit has inspired a new locality-sensitive hashing (LSH) algorithm, FlyHash. In contrast with classical LSH algorithms that produce low-dimensional hash codes, FlyHash produces sparse high-dimensional hash codes and has also been shown to have superior empirical performance in similarity search compared to classical LSH algorithms. However, FlyHash uses random projections and cannot learn from data. Building on inspiration from FlyHash and the ubiquity of…
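The FlyHash scheme described in the abstract can be sketched as follows: the input is expanded by a sparse random binary projection into a much higher-dimensional space, and a winner-take-all step keeps only the top-k activations as set bits. The dimensions, projection sparsity, and value of k below are illustrative choices, not the paper's exact settings.

```python
import numpy as np

def flyhash(x, proj, k):
    """FlyHash-style LSH: random expansion followed by winner-take-all sparsification."""
    y = proj @ x                           # project into a high-dimensional space
    code = np.zeros(y.shape[0], dtype=np.uint8)
    code[np.argsort(y)[-k:]] = 1           # keep only the k largest activations
    return code

rng = np.random.default_rng(0)
d, m, k = 50, 2000, 64                     # input dim, hash dim, active bits (illustrative)
proj = (rng.random((m, d)) < 0.1).astype(float)  # sparse binary random projection
x = rng.random(d)
code = flyhash(x, proj, k)                 # sparse high-dimensional binary hash code
```

Unlike classical LSH, which compresses inputs into short dense codes, the result here is a long binary vector with exactly k ones; similar inputs tend to share many active bits.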
Procrustean Orthogonal Sparse Hashing
TLDR
It is shown that insect olfaction is structurally and functionally analogous to sparse hashing, and that orthogonality increases the accuracy of sparse hashing.
Can a Fruit Fly Learn Word Embeddings?
TLDR
It is shown that the fruit fly network motif can not only achieve performance comparable to existing NLP methods, but also does so with only a fraction of the computational resources (shorter training time and smaller memory footprint).
Binary Codes Based on Non-Negative Matrix Factorization for Clustering and Retrieval
TLDR
This paper constructs an affinity graph to encode the geometrical structure of the original data, so that the binary code subspace learned via matrix factorization respects that structure.
Algorithmic insights on continual learning from fruit flies
TLDR
This work identifies a two-layer neural circuit in the fruit fly olfactory system that addresses this challenge by uniquely combining sparse coding and associative learning, and shows that this simple and lightweight algorithm significantly boosts continual-learning performance.
High-Dimensional Sparse Cross-Modal Hashing with Fine-Grained Similarity Embedding
TLDR
HSCH, an efficient sparse hashing method, is presented; it not only takes the high-level semantic similarity of the data into consideration, but also properly embeds low-level feature similarity into the learned hash codes.
Parallel Training of Deep Networks with Local Updates
TLDR
This paper investigates how to continue scaling compute efficiently beyond the point of diminishing returns for large batches through local parallelism, a framework that parallelizes training of individual layers in deep networks by replacing global backpropagation with truncated layer-wise backpropagation.

References

SHOWING 1-10 OF 64 REFERENCES
Improving Similarity Search with High-dimensional Locality-sensitive Hashing
TLDR
This work shows theoretically and empirically that this new family of hash functions is locality-sensitive and preserves rank similarity for inputs in any ℓp space, and proposes a multi-probe version of the algorithm that achieves higher performance for the same query time, or that maintains the performance of prior approaches while taking significantly less indexing time and memory.
A neural algorithm for a fundamental computing problem
TLDR
The fly’s olfactory circuit solves this problem using a novel variant of a traditional computer science algorithm (locality-sensitive hashing), which helps illuminate the logic supporting an important sensory function (olfaction) and provides a conceptually new algorithm for solving a fundamental computational problem.
Unsupervised Semantic Deep Hashing
TLDR
The proposed method, unsupervised semantic deep hashing (USDH), uses semantic information preserved in the CNN feature layer to guide the training of the network, improving the usage of each bit in the hash codes via maximum information entropy.
Compressed Hashing
TLDR
This paper introduces a sparse coding scheme, based on the approximation theory of integral operators, that generates sparse representations for high-dimensional vectors and projects the sparse codes into a low-dimensional space by effectively exploiting the Restricted Isometry Property, a key property in compressed sensing theory.
Deep learning of binary hash codes for fast image retrieval
TLDR
This work proposes an effective deep learning framework that generates binary hash codes for fast image retrieval by employing a hidden layer to represent the latent concepts that dominate the class labels in convolutional neural networks.
Semantic Structure-based Unsupervised Deep Hashing
TLDR
This work designs a deep architecture and a pairwise loss function to preserve the semantic structure of the relationships between points in unsupervised settings, and shows that SSDH significantly outperforms current state-of-the-art methods.
Greedy Hash: Towards Fast Optimization for Accurate Hash Coding in CNN
TLDR
This work adopts the greedy principle to tackle this NP-hard problem by iteratively updating the network toward the probable optimal discrete solution in each iteration, and provides a new perspective for visualizing and understanding the effectiveness and efficiency of the algorithm.
Learning Deep Unsupervised Binary Codes for Image Retrieval
TLDR
The proposed DeepQuan model utilizes a deep autoencoder network, where the encoder learns compact representations and the decoder preserves the manifold, and learns the binary codes by minimizing the quantization error through a product quantization technique.
A neural data structure for novelty detection
TLDR
This work found that the fruit fly olfactory circuit evolved a variant of a Bloom filter to assess the novelty of odors, and develops a class of distance- and time-sensitive Bloom filters that outperform prior filters when evaluated on several biological and computational datasets.
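The circuit above is analogized to a Bloom filter, the classic data structure for approximate set membership. A minimal textbook Bloom filter (not the distance- and time-sensitive variants developed in that work) can be sketched as:

```python
import hashlib

class BloomFilter:
    """Classic Bloom filter: no false negatives, small false-positive rate."""

    def __init__(self, m=1024, k=3):
        self.bits = [0] * m   # m-bit array
        self.m, self.k = m, k # k independent hash functions

    def _indices(self, item):
        # Derive k hash indices from salted SHA-256 digests
        for i in range(self.k):
            h = hashlib.sha256(f"{i}:{item}".encode()).hexdigest()
            yield int(h, 16) % self.m

    def add(self, item):
        for idx in self._indices(item):
            self.bits[idx] = 1

    def is_novel(self, item):
        # True if the item was definitely never added before
        return not all(self.bits[idx] for idx in self._indices(item))

bf = BloomFilter()
bf.add("odor-A")
print(bf.is_novel("odor-A"))  # False: seen before
print(bf.is_novel("odor-B"))  # True (with high probability)
```

Novelty detection inverts the usual membership query: an item that misses any of its k bits is guaranteed new, while a fully set pattern is only probably familiar.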
Olfactory representations by Drosophila mushroom body neurons.
TLDR
By comparing activity patterns evoked by the same odors across olfactory receptor neurons and across Kenyon cells (KCs), it is shown that representations of different odors do indeed become less correlated as they progress through the olfactory system.