We show the existence of a Locality-Sensitive Hashing (LSH) family for the angular distance that yields an approximate Near Neighbor Search algorithm with the asymptotically optimal running time…
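The abstract above concerns an optimal LSH family for angular distance; as background, the classical (non-optimal) baseline is random-hyperplane hashing (SimHash), where two vectors collide on a bit with probability 1 − θ/π for angle θ. A minimal sketch, with all names and parameters chosen for illustration:

```python
import random

def make_hyperplane_hash(dim, num_bits, seed=0):
    """Sample `num_bits` random Gaussian hyperplanes; each contributes one sign bit."""
    rng = random.Random(seed)
    planes = [[rng.gauss(0, 1) for _ in range(dim)] for _ in range(num_bits)]

    def h(x):
        # Bit i is 1 iff x lies on the positive side of hyperplane i.
        return tuple(
            1 if sum(p_j * x_j for p_j, x_j in zip(p, x)) >= 0 else 0
            for p in planes
        )

    return h

# Vectors at a small angle agree on most bits; nearly opposite vectors do not.
h = make_hyperplane_hash(dim=3, num_bits=16)
a = (1.0, 0.0, 0.0)
b = (0.99, 0.05, 0.0)   # small angle to a
c = (-1.0, 0.1, 0.0)    # nearly opposite to a
agree = lambda u, v: sum(x == y for x, y in zip(h(u), h(v)))
```

This baseline achieves exponent ρ strictly worse than the optimal family the abstract refers to; it is shown only to make the collision-probability mechanism concrete.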

We study the effectiveness of learning low degree polynomials using neural networks by the gradient descent method. While neural networks have been shown to have great expressive power, and gradient…
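To make the learning setup concrete: the simplest instance of fitting a low-degree polynomial by gradient descent uses a model that is linear in monomial features (a stand-in for the neural networks studied in the abstract, chosen here only so the sketch stays short). All targets and hyperparameters below are illustrative assumptions:

```python
# Target: a degree-2 polynomial of one variable, y = 2x^2 - x + 0.5.
def target(x):
    return 2 * x * x - x + 0.5

# Model: linear in the monomial features (1, x, x^2), trained by plain
# full-batch gradient descent on squared error.
xs = [i / 10 - 1 for i in range(21)]          # grid on [-1, 1]
feats = [(1.0, x, x * x) for x in xs]
ys = [target(x) for x in xs]
w = [0.0, 0.0, 0.0]
lr = 0.1
for _ in range(5000):
    grad = [0.0, 0.0, 0.0]
    for f, y in zip(feats, ys):
        err = sum(wi * fi for wi, fi in zip(w, f)) - y
        for j in range(3):
            grad[j] += 2 * err * f[j] / len(xs)
    w = [wi - lr * gi for wi, gi in zip(w, grad)]
# w converges to roughly (0.5, -1, 2), recovering the target coefficients.
```

The interesting question in the abstract is precisely when a nonlinear network trained the same way succeeds as this convex surrogate does.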

We show an optimal data-dependent hashing scheme for the approximate near neighbor problem. For an n-point dataset in a d-dimensional space our data structure achieves query time O(d ⋅ n^(ρ+o(1))) and…

2006 47th Annual IEEE Symposium on Foundations of…

We investigate the optimality of (1+ε)-approximation algorithms obtained via the dimensionality reduction method. We show that: any data structure for the (1+ε)-approximate nearest neighbor…
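The dimensionality reduction method referenced above is, in its standard form, a Johnson–Lindenstrauss-style random projection: distances between points survive projection to a much lower dimension up to a (1+ε) factor. A minimal sketch (dimensions, seed, and function names are illustrative, not from the paper):

```python
import math
import random

def random_projection(dim, k, seed=1):
    """Project from `dim` to `k` dimensions with a Gaussian matrix.

    Entries are N(0, 1/k), so squared distances are preserved in expectation.
    """
    rng = random.Random(seed)
    R = [[rng.gauss(0, 1) / math.sqrt(k) for _ in range(dim)] for _ in range(k)]

    def f(x):
        return [sum(r_j * x_j for r_j, x_j in zip(row, x)) for row in R]

    return f

f = random_projection(dim=1000, k=200, seed=1)
x = [1.0] * 1000
y = [0.0] * 1000
dist = math.dist(x, y)            # sqrt(1000)
proj_dist = math.dist(f(x), f(y))
ratio = proj_dist / dist          # concentrates near 1 as k grows
```

The abstract's lower bound concerns how small such a target dimension k can be made while keeping (1+ε)-approximate nearest neighbor queries correct.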

Over the last decade, an immense amount of data has become available. From collections of photos, to genetic data, and to network traffic statistics, modern technologies and cheap storage have made…

In this paper we consider the problem of finding the approximate nearest neighbor when the data set points are the substrings of a given text T. Specifically, for a string T of length…
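For context on the substring near-neighbor problem, the trivial exact baseline scans every length-m substring of T and reports the closest one, here under Hamming distance (the distance measure and all names are assumptions for illustration; the paper's algorithm avoids this linear scan):

```python
def nearest_substring(text, pattern):
    """Brute force: return (position, distance) of the Hamming-closest
    length-|pattern| substring of text. Runs in O(|text| * |pattern|)."""
    m = len(pattern)
    best_pos, best_dist = 0, m + 1
    for i in range(len(text) - m + 1):
        d = sum(a != b for a, b in zip(text[i:i + m], pattern))
        if d < best_dist:
            best_pos, best_dist = i, d
    return best_pos, best_dist

pos, dist = nearest_substring("abracadabra", "acad")  # exact match at index 3
```

The point of treating substrings as an implicit dataset is that the n substrings share structure, so one can hope to beat this brute-force bound without materializing all of them.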

We give algorithms for geometric graph problems in modern parallel models such as MapReduce. For example, for the Minimum Spanning Tree (MST) problem over a set of points in the two-dimensional…
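As a sequential reference point for the geometric MST problem above, Prim's algorithm on the complete Euclidean graph runs in O(n²) time; the parallel algorithms in the abstract aim to beat this on large point sets. A minimal sketch (not the paper's MapReduce algorithm):

```python
import math

def mst_weight(points):
    """Prim's algorithm on the complete Euclidean graph (O(n^2) time)."""
    n = len(points)
    in_tree = [False] * n
    best = [math.inf] * n   # cheapest edge connecting each point to the tree
    best[0] = 0.0
    total = 0.0
    for _ in range(n):
        # Pick the cheapest point not yet in the tree.
        u = min((i for i in range(n) if not in_tree[i]), key=best.__getitem__)
        in_tree[u] = True
        total += best[u]
        for v in range(n):
            if not in_tree[v]:
                d = math.dist(points[u], points[v])
                if d < best[v]:
                    best[v] = d
    return total

square = [(0, 0), (1, 0), (0, 1), (1, 1)]  # MST = three unit edges, weight 3
```

A MapReduce version must avoid examining all Θ(n²) pairwise distances; the standard idea is to partition space, solve locally, and merge, which is the regime the abstract addresses.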

The “small scope hypothesis” argues that a high proportion of bugs can be found by testing the program for all test inputs within some small scope. In object-oriented programs, a test input is…
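The small scope hypothesis is easy to demonstrate for a scalar function: enumerate every input within a small bound and check the specification exhaustively. The function and bound below are illustrative assumptions, not from the paper (whose inputs are object graphs rather than integers):

```python
from itertools import product

def clamp(x, lo, hi):
    """Function under test: restrict x to the interval [lo, hi]."""
    return max(lo, min(x, hi))

def check_small_scope(scope=3):
    """Exhaustively check clamp's spec over all integer inputs in [-scope, scope]."""
    vals = range(-scope, scope + 1)
    for x, lo, hi in product(vals, vals, vals):
        if lo <= hi:                      # only well-formed intervals
            r = clamp(x, lo, hi)
            assert lo <= r <= hi          # result lies in the interval
            if lo <= x <= hi:
                assert r == x             # in-range inputs pass through
    return True
```

With scope 3 this checks 7³ = 343 input triples; the hypothesis is that bugs that survive such exhaustive small-scope checks are rare, which is what motivates bounded-exhaustive test generation for object-oriented programs.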