Publications
An elementary proof of a theorem of Johnson and Lindenstrauss
A result of Johnson and Lindenstrauss [13] shows that a set of n points in high-dimensional Euclidean space can be mapped into an O(log n/ε²)-dimensional Euclidean space such that the distance between any two points is preserved up to a factor of (1 ± ε).
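The mapping in this result can be realized by a random Gaussian projection. A minimal sketch (the constant 8 in the target dimension and the parameter values are illustrative assumptions, not the paper's exact bound):

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, eps = 50, 1000, 0.5
# Target dimension k = O(log n / eps^2); the constant 8 is an assumed choice.
k = int(np.ceil(8 * np.log(n) / eps**2))

X = rng.standard_normal((n, d))          # n points in d dimensions
R = rng.standard_normal((d, k)) / np.sqrt(k)  # random projection, scaled so norms are preserved in expectation
Y = X @ R                                 # projected points in k dimensions

# Measure how much pairwise distances were distorted.
def pdist(Z):
    diff = Z[:, None, :] - Z[None, :, :]
    return np.sqrt((diff ** 2).sum(-1))

orig, proj = pdist(X), pdist(Y)
mask = ~np.eye(n, dtype=bool)
ratios = proj[mask] / orig[mask]
print(ratios.min(), ratios.max())  # distortion stays close to 1
```

For a fixed pair of points the squared distance ratio concentrates like a chi-squared variable with k degrees of freedom, which is why k need only grow logarithmically in n.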
Robust Submodular Observation Selection
In many applications, one has to actively select among a set of expensive observations before making an informed decision. For example, in environmental monitoring, we want to select locations to…
Provisioning a virtual private network: a network design problem for multicommodity flow
This work establishes a relation between this collection of network design problems and a variant of the facility location problem introduced by Karger and Minkoff, and provides optimal and approximate algorithms for several variants of the problem, depending on whether the traffic matrix is required to be symmetric.
Bounded geometries, fractals, and low-distortion embeddings
This work considers general doubling metrics as well as more restricted families, such as those arising from trees, from graphs excluding a fixed minor, and from snowflaked metrics; this class contains many families of metrics that occur in applied settings.
An elementary proof of the Johnson-Lindenstrauss Lemma
The Johnson-Lindenstrauss lemma shows that a set of n points in high-dimensional Euclidean space can be mapped down into an O(log n/ε²)-dimensional Euclidean space such that the distance between any two points is approximately preserved.
Constrained Non-monotone Submodular Maximization: Offline and Secretary Algorithms
These ideas are extended to give simple greedy-based constant-factor algorithms for non-monotone submodular maximization subject to a knapsack constraint, and for the (online) secretary setting subject to a uniform matroid or a partition matroid constraint.
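To illustrate the greedy template underlying such algorithms, here is a minimal sketch of greedy maximization for the *monotone* special case of maximum coverage under a cardinality constraint (the paper's non-monotone and knapsack settings require additional machinery; the function and set names are hypothetical):

```python
def greedy_max_coverage(universe_sets, k):
    """Greedily pick k sets to maximize coverage, a monotone submodular
    objective; the classic greedy gives a (1 - 1/e)-approximation."""
    chosen, covered = [], set()
    for _ in range(k):
        # Pick the set with the largest marginal gain over what is covered.
        gain, best = max(
            (len(s - covered), name)
            for name, s in universe_sets.items()
            if name not in chosen
        )
        if gain == 0:
            break
        chosen.append(best)
        covered |= universe_sets[best]
    return chosen, covered

# Toy instance: four candidate sets over the universe {1, ..., 6}.
sets = {"a": {1, 2, 3}, "b": {3, 4}, "c": {4, 5, 6}, "d": {1, 6}}
picked, cov = greedy_max_coverage(sets, 2)
print(picked, cov)
```

Each iteration costs one pass over the candidates, so the sketch runs in O(k · |sets|) marginal-gain evaluations; submodularity is exactly what makes the locally greedy choice globally near-optimal.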
Better Algorithms for Stochastic Bandits with Adversarial Corruptions
A new algorithm is presented whose regret is nearly optimal, substantially improving upon previous work; it can tolerate a significant amount of corruption with virtually no degradation in performance.
Approximate clustering without the approximation
If some c-approximation to the given clustering objective φ is ε-close to the target, then this paper shows that this guarantee can be achieved for any constant c > 1; for the min-sum objective the authors can do this for any constant c > 2.
When LP Is the Cure for Your Matching Woes: Improved Bounds for Stochastic Matchings
A generalization of the stochastic online matching problem is presented that also models preference uncertainty and timeouts of buyers, together with a constant-factor approximation algorithm.
Simpler and better approximation algorithms for network design
Simple and easy-to-analyze randomized approximation algorithms are given for several well-studied NP-hard network design problems, including a simple constant-factor approximation algorithm for the single-sink buy-at-bulk network design problem.