Publications
Information-theoretic metric learning
TLDR: We present an information-theoretic approach to learning a Mahalanobis distance function that can handle a wide variety of constraints and can optionally incorporate a prior on the distance function.
  • Citations: 1,731 · Influence: 281
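The metric learned in this line of work is a Mahalanobis distance, d_A(x, y) = (x − y)ᵀ A (x − y), parameterized by a positive semidefinite matrix A. A minimal sketch of evaluating such a distance (the example matrix here is an arbitrary illustrative choice, not one produced by the paper's algorithm):

```python
def mahalanobis_sq(x, y, A):
    """Squared Mahalanobis distance (x - y)^T A (x - y) for a PSD matrix A."""
    d = [xi - yi for xi, yi in zip(x, y)]
    # Matrix-vector product A @ d, written out explicitly.
    Ad = [sum(A[i][j] * d[j] for j in range(len(d))) for i in range(len(d))]
    return sum(di * adi for di, adi in zip(d, Ad))

# With A = I this reduces to the squared Euclidean distance.
identity = [[1.0, 0.0], [0.0, 1.0]]
print(mahalanobis_sq([1.0, 2.0], [4.0, 6.0], identity))  # → 25.0
```

Learning then amounts to choosing A (subject to the paper's constraints and prior) rather than fixing it to the identity.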
Adapting Visual Category Models to New Domains
TLDR: We present a method that adapts object models acquired in a particular visual domain to new imaging conditions by learning a transformation that minimizes the effect of domain-induced changes in the feature distribution.
  • Citations: 1,423 · Influence: 269
Learning to Hash with Binary Reconstructive Embeddings
TLDR: We develop an algorithm for learning hash functions based on explicitly minimizing the reconstruction error between the original distances and the Hamming distances of the corresponding binary embeddings.
  • Citations: 740 · Influence: 133
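A sketch of the kind of reconstruction error described above: squared differences between original pairwise distances and the Hamming distances of binary codes. The codes, distances, and the length-normalization of the Hamming distance are all illustrative assumptions, not the paper's exact formulation:

```python
def hamming(a, b):
    """Hamming distance between two equal-length binary codes."""
    return sum(ai != bi for ai, bi in zip(a, b))

def reconstruction_error(orig_dists, codes, pairs):
    """Sum of squared differences between original distances and
    length-normalized Hamming distances over the given index pairs.
    The 1/len(code) scaling is an illustrative convention, not
    necessarily the one used in the paper."""
    err = 0.0
    for (i, j), d in zip(pairs, orig_dists):
        h = hamming(codes[i], codes[j]) / len(codes[i])
        err += (d - h) ** 2
    return err

codes = [[0, 1, 1, 0], [1, 1, 0, 0], [0, 1, 1, 1]]  # hypothetical 4-bit codes
pairs = [(0, 1), (0, 2)]
orig = [0.5, 0.25]  # hypothetical original pairwise distances
print(reconstruction_error(orig, codes, pairs))  # → 0.0
```

Learning the hash functions then means choosing the codes (via the hash parameters) to drive this error down.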
Kernelized locality-sensitive hashing for scalable image search
  • B. Kulis, K. Grauman
  • Mathematics, Computer Science
  • IEEE 12th International Conference on Computer…
  • 1 December 2009
TLDR: We generalize locality-sensitive hashing to accommodate arbitrary kernel functions, making it possible to preserve the algorithm's sub-linear time similarity search guarantees for a wide class of useful similarity functions.
  • Citations: 823 · Influence: 94
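For context, the classical random-hyperplane scheme that this work generalizes assigns each point one bit per random direction (the sign of a dot product), so that nearby points tend to share bits. This sketch shows the standard non-kernelized scheme, not the kernelized construction of the paper:

```python
import random

def lsh_bits(x, hyperplanes):
    """One hash bit per random hyperplane: the sign of the dot product."""
    return [1 if sum(wi * xi for wi, xi in zip(w, x)) >= 0 else 0
            for w in hyperplanes]

random.seed(0)
planes = [[random.gauss(0, 1) for _ in range(3)] for _ in range(8)]
a = lsh_bits([1.0, 0.9, 1.1], planes)
b = lsh_bits([1.0, 1.0, 1.0], planes)    # a nearby point
c = lsh_bits([-1.0, -1.0, 2.0], planes)  # a distant point
# Nearby points tend to agree on more bits than distant ones.
```

The kernelized version replaces the explicit dot product with kernel evaluations, so the same hashing idea applies when only a kernel function is available.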
Weighted Graph Cuts without Eigenvectors A Multilevel Approach
TLDR: We discuss an equivalence between the objective functions used in these seemingly different methods: in particular, a general weighted kernel k-means objective is mathematically equivalent to a weighted graph clustering objective.
  • Citations: 779 · Influence: 82
Kernel k-means: spectral clustering and normalized cuts
TLDR: We show the generality of the weighted kernel k-means objective function, and derive the spectral clustering objective of normalized cut as a special case.
  • Citations: 1,067 · Influence: 81
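Kernel k-means rests on a standard identity: the squared feature-space distance from a point to a cluster mean can be computed from kernel evaluations alone, ‖φ(xᵢ) − m_c‖² = Kᵢᵢ − (2/|c|) Σ_{j∈c} Kᵢⱼ + (1/|c|²) Σ_{j,l∈c} Kⱼₗ. A minimal sketch of that identity (the example data is illustrative):

```python
def dist_to_cluster_sq(i, cluster, K):
    """Squared feature-space distance from point i to the mean of `cluster`,
    computed only from the kernel matrix K (the kernel trick):
    K[i][i] - 2/|c| * sum_j K[i][j] + 1/|c|^2 * sum_{j,l} K[j][l]."""
    n = len(cluster)
    term2 = 2.0 * sum(K[i][j] for j in cluster) / n
    term3 = sum(K[j][l] for j in cluster for l in cluster) / (n * n)
    return K[i][i] - term2 + term3

# With a linear kernel K[i][j] = <x_i, x_j>, this matches plain k-means.
pts = [(0.0,), (2.0,), (10.0,)]
K = [[a[0] * b[0] for b in pts] for a in pts]
print(dist_to_cluster_sq(0, [0, 1], K))  # → 1.0  (mean of {0, 2} is 1)
```

Swapping in a nonlinear kernel changes the geometry without changing the assignment rule, which is what makes the connection to graph cuts and spectral clustering possible.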
What you saw is not what you get: Domain adaptation using asymmetric kernel transforms
TLDR: In this paper, we address the problem of visual domain adaptation for transferring object models from one dataset or visual domain to another.
  • Citations: 621 · Influence: 73
Kernelized Locality-Sensitive Hashing
  • B. Kulis, K. Grauman
  • Mathematics, Medicine
  • IEEE Transactions on Pattern Analysis and Machine…
  • 1 June 2012
TLDR: We generalize locality-sensitive hashing to accommodate arbitrary kernel functions, making it possible to preserve the algorithm's sublinear time similarity search guarantees for a wide class of useful similarity functions.
  • Citations: 304 · Influence: 54
Revisiting k-means: New Algorithms via Bayesian Nonparametrics
TLDR: We show that a Gibbs sampling algorithm for the Dirichlet process mixture approaches a hard clustering algorithm in the limit, and further that the resulting algorithm monotonically minimizes an elegant underlying k-means-like clustering objective that includes a penalty.
  • Citations: 302 · Influence: 45
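The hard-clustering limit described above is often presented as a k-means-like rule in which a point farther than a penalty threshold λ from every existing center seeds a new cluster. A single assignment pass under that reading (the one-pass simplification and variable names are mine; the full algorithm alternates assignment and center updates until convergence):

```python
def dp_means_assign(points, lam):
    """One assignment pass of a DP-means-style rule: put each point in the
    nearest existing cluster unless its squared distance to every center
    exceeds the penalty lam, in which case it seeds a new cluster."""
    centers = [points[0]]
    labels = [0]
    for p in points[1:]:
        d2 = [sum((pi - ci) ** 2 for pi, ci in zip(p, c)) for c in centers]
        k = min(range(len(centers)), key=lambda j: d2[j])
        if d2[k] > lam:          # too far from everything: new cluster
            centers.append(p)
            labels.append(len(centers) - 1)
        else:
            labels.append(k)
    return labels, centers

pts = [(0.0, 0.0), (0.1, 0.0), (5.0, 5.0)]
labels, centers = dp_means_assign(pts, lam=1.0)
print(labels)  # → [0, 0, 1]
```

Unlike k-means, the number of clusters is not fixed in advance; it is controlled indirectly by the penalty λ.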
Semi-supervised graph clustering: a kernel approach
TLDR: We unify vector-based and graph-based semi-supervised clustering algorithms and show that our algorithm is able to outperform current state-of-the-art models on both vector and graph data sets.
  • Citations: 253 · Influence: 38