In this paper we propose a novel method for learning a Mahalanobis distance measure to be used in the KNN classification algorithm. The algorithm directly maximizes a stochastic variant of the leave-one-out KNN score on the training set. It can also learn a low-dimensional linear embedding of labeled data that can be used for data visualization and fast…
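The stochastic leave-one-out score described above can be sketched in a few lines of NumPy. This is a minimal illustration of the objective only, with no gradient-based optimization; the function name `nca_objective` and the toy data are ours, not from the paper.

```python
import numpy as np

def nca_objective(A, X, y):
    """Stochastic leave-one-out KNN score for a linear map A
    (a sketch of the objective described above)."""
    Z = X @ A.T                       # project points into the learned space
    d2 = ((Z[:, None, :] - Z[None, :, :]) ** 2).sum(-1)  # squared distances
    np.fill_diagonal(d2, np.inf)      # a point never selects itself
    P = np.exp(-d2)
    P /= P.sum(axis=1, keepdims=True)  # p_ij: prob. that i picks j as its neighbor
    same = (y[:, None] == y[None, :])  # indicator of matching labels
    return (P * same).sum()            # expected number of correctly classified points

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 5))
y = rng.integers(0, 2, size=20)
A = np.eye(2, 5)                       # low-rank map gives a 2-D embedding
score = nca_objective(A, X, y)         # value lies in [0, len(X)]
```

Maximizing this score over `A` (e.g. by gradient ascent) yields both the metric and, when `A` is low-rank as here, the low-dimensional embedding the abstract mentions.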

- Jacob Goldberger, Shiri Gordon, Hayit Greenspan
- ICCV
- 2003

In this work we present two new methods for approximating the Kullback-Leibler (KL) divergence between two mixtures of Gaussians. The first method is based on matching between the Gaussian elements of the two Gaussian mixture densities. The second method is based on the unscented transform. The proposed methods are utilized for image retrieval tasks.…
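The first, matching-based idea can be illustrated for 1-D mixtures, where the KL divergence between individual Gaussians has a closed form. A sketch under that simplification; `kl_gmm_matching` is an illustrative name, and the paper treats the general multivariate case:

```python
import numpy as np

def kl_gauss(m1, s1, m2, s2):
    """Closed-form KL divergence N(m1, s1^2) || N(m2, s2^2) for 1-D Gaussians."""
    return np.log(s2 / s1) + (s1**2 + (m1 - m2)**2) / (2 * s2**2) - 0.5

def kl_gmm_matching(w_f, mu_f, sd_f, w_g, mu_g, sd_g):
    """Matching-based approximation to KL(f || g) for two 1-D Gaussian
    mixtures: each component of f is paired with its closest component
    of g (an illustrative sketch of the component-matching idea)."""
    total = 0.0
    for a in range(len(w_f)):
        best = min(kl_gauss(mu_f[a], sd_f[a], mu_g[b], sd_g[b])
                   + np.log(w_f[a] / w_g[b]) for b in range(len(w_g)))
        total += w_f[a] * best
    return total

# identical mixtures have zero approximate divergence
kl_same = kl_gmm_matching([0.5, 0.5], [0.0, 3.0], [1.0, 1.0],
                          [0.5, 0.5], [0.0, 3.0], [1.0, 1.0])
```

Unlike Monte Carlo estimation, this approximation is deterministic and costs only one closed-form KL evaluation per component pair.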

- Jonathan Berant, Ido Dagan, Jacob Goldberger
- ACL
- 2011

Extensive knowledge bases of entailment rules between predicates are crucial for applied semantic inference. In this paper we propose an algorithm that utilizes transitivity constraints to learn a globally-optimal set of entailment rules for typed predicates. We model the task as a graph learning problem and suggest methods that scale the algorithm to…
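A toy version of learning a transitivity-consistent rule set: candidate edges are taken greedily by score, and accepting an edge also accepts everything its transitive closure implies. This greedy sketch is ours for illustration only; the paper itself formulates and solves a global optimization problem.

```python
import itertools

def greedy_transitive_graph(scores, threshold=0.0):
    """Greedily build an entailment graph closed under transitivity.
    `scores` maps directed edges (u, v) to entailment-rule scores
    (illustrative names; not the paper's exact procedure)."""
    edges = set()
    for (u, v), s in sorted(scores.items(), key=lambda kv: -kv[1]):
        if s <= threshold or (u, v) in edges:
            continue
        # tentatively add (u, v) and take the transitive closure
        new = set(edges) | {(u, v)}
        changed = True
        while changed:
            changed = False
            for (a, b), (c, d) in itertools.product(list(new), repeat=2):
                if b == c and a != d and (a, d) not in new:
                    new.add((a, d))
                    changed = True
        # keep the edge only if every implied edge also scores well
        if all(scores.get(e, -1.0) > threshold for e in new):
            edges = new
    return edges

scores = {("a", "b"): 2.0, ("b", "c"): 1.5, ("a", "c"): 1.0}
graph = greedy_transitive_graph(scores)  # accepting a->b and b->c forces a->c
```

The key point the sketch captures is that transitivity couples edge decisions: an edge cannot be judged in isolation, which is what makes the global problem hard to scale.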

- Jacob Goldberger, Sam T. Roweis
- NIPS
- 2004

In this paper we propose an efficient algorithm for reducing a large mixture of Gaussians into a smaller mixture while still preserving the component structure of the original model; this is achieved by clustering (grouping) the components. The method minimizes a new, easily computed distance measure between two Gaussian mixtures that can be motivated from…
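A minimal sketch of reducing a mixture by clustering its components: repeatedly merge the pair of 1-D components with the smallest weighted symmetrised-KL cost, using a moment-preserving merge. The cost function and names here are illustrative stand-ins; the paper defines its own distance measure between mixtures.

```python
import numpy as np

def merge_pair(w1, m1, v1, w2, m2, v2):
    """Moment-preserving merge of two weighted 1-D Gaussian components."""
    w = w1 + w2
    m = (w1 * m1 + w2 * m2) / w
    v = (w1 * (v1 + (m1 - m)**2) + w2 * (v2 + (m2 - m)**2)) / w
    return w, m, v

def reduce_gmm(w, mu, var, k):
    """Greedily merge components until only k remain (1-D for brevity;
    an illustrative sketch of component-clustering GMM reduction)."""
    comps = list(zip(w, mu, var))
    def cost(c1, c2):
        def kl(a, b):  # 1-D Gaussian KL between components a and b
            return 0.5 * (np.log(b[2] / a[2])
                          + (a[2] + (a[1] - b[1])**2) / b[2] - 1)
        return (c1[0] + c2[0]) * (kl(c1, c2) + kl(c2, c1))
    while len(comps) > k:
        i, j = min(((i, j) for i in range(len(comps))
                    for j in range(i + 1, len(comps))),
                   key=lambda ij: cost(comps[ij[0]], comps[ij[1]]))
        merged = merge_pair(*comps[i], *comps[j])
        comps = [c for t, c in enumerate(comps) if t not in (i, j)] + [merged]
    return comps

# two tight pairs of components collapse into two merged components
reduced = reduce_gmm([0.25] * 4, [0.0, 0.2, 5.0, 5.3], [0.1] * 4, 2)
```

Because each merge preserves the weight, mean, and variance of the pair it replaces, the reduced mixture keeps the coarse component structure of the original.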

- Hayit Greenspan, Jacob Goldberger, Arnaldo Mayer
- IEEE Transactions on Pattern Analysis and Machine…
- 2003

In this paper, we describe a statistical video representation and modeling scheme. Video representation schemes are needed to segment a video stream into meaningful video-objects, useful for later indexing and retrieval applications. In the proposed methodology, unsupervised clustering via Gaussian mixture modeling extracts coherent space-time regions in…

- Jacob Goldberger, Hagai Aronowitz
- INTERSPEECH
- 2005

This paper proposes a dissimilarity measure between two Gaussian mixture models (GMM). Computing a distance measure between two GMMs that were learned from speech segments is a key element in speaker verification, speaker segmentation and many other related applications. A natural measure between two distributions is the Kullback-Leibler divergence.…

- Lev Faivishevsky, Jacob Goldberger
- ICML
- 2010

In this paper we propose a novel clustering algorithm based on maximizing the mutual information between data points and clusters. Unlike previous methods, we neither assume the data are given in terms of distributions nor impose any parametric model on the within-cluster distribution. Instead, we utilize a non-parametric estimation of the average cluster…
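The flavor of a non-parametric within-cluster quantity can be illustrated with a score built from mean log pairwise distances: tight clusters have small pairwise distances and so a high score. This is a simplified stand-in inspired by MeanNN-style entropy estimators, not the paper's exact estimator or optimization.

```python
import numpy as np

def nic_score(X, labels):
    """Non-parametric compactness score: for each cluster, the negated
    mean log pairwise distance, weighted by cluster size (an illustrative
    stand-in for a non-parametric within-cluster entropy estimate)."""
    score = 0.0
    for c in np.unique(labels):
        pts = X[labels == c]
        n = len(pts)
        if n < 2:
            continue
        d = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
        score -= np.log(d[np.triu_indices(n, k=1)]).mean() * n
    return score

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.1, (10, 2)),   # blob near the origin
               rng.normal(5, 0.1, (10, 2))])  # blob near (5, 5)
true = np.repeat([0, 1], 10)                  # correct partition
mixed = np.tile([0, 1], 10)                   # alternating labels mix the blobs
score_true = nic_score(X, true)
score_mixed = nic_score(X, mixed)
```

A clustering algorithm in this spirit would search over label assignments to maximize such a score, with no parametric model of the within-cluster distribution.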

- Hayit Greenspan, Jacob Goldberger, Lenny Ridel
- Computer Vision and Image Understanding
- 2001

In this paper we describe a probabilistic image matching scheme in which the image representation is continuous and the similarity measure and distance computation are also defined in the continuous domain. Each image is first represented as a Gaussian mixture distribution and images are compared and matched via a probabilistic measure of similarity between…

- Oren Melamud, Ido Dagan, Jacob Goldberger
- HLT-NAACL
- 2015

Context representations are a key element in distributional models of word meaning. In contrast to typical representations based on neighboring words, a recently proposed approach suggests to represent a context of a target word by a substitute vector, comprising the potential fillers for the target word slot in that context. In this work we first propose a…

- Oren Melamud, Jacob Goldberger, Ido Dagan
- CoNLL
- 2016

Context representations are central to various NLP tasks, such as word sense disambiguation, named entity recognition, coreference resolution, and many more. In this work we present a neural model for efficiently learning a generic context embedding function from large corpora, using bidirectional LSTM. With a very simple application of our context… (More)