Publications
Detecting high log-densities: an O(n¼) approximation for densest k-subgraph
TLDR
An algorithm that for every ε > 0 approximates the Densest k-Subgraph problem within a ratio of n^(1/4+ε) in time n^(O(1/ε)), and an extension of this algorithm that achieves an O(n^(1/4-ε))-approximation in time 2^(n^(O(ε))).
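For intuition only, here is a minimal greedy-peeling baseline for Densest k-Subgraph in plain NumPy: it repeatedly removes a minimum-degree vertex until k vertices remain. This is a simple heuristic run on a made-up random instance, not the log-density based n^(1/4+ε)-approximation algorithm of the paper.

```python
import numpy as np

def greedy_peel_densest_k(adj, k):
    """Greedy baseline: repeatedly drop a minimum-degree vertex until k remain."""
    alive = set(range(adj.shape[0]))
    while len(alive) > k:
        idx = sorted(alive)
        degs = adj[np.ix_(idx, idx)].sum(axis=1)
        alive.remove(idx[int(np.argmin(degs))])
    idx = sorted(alive)
    return idx, adj[np.ix_(idx, idx)].sum() / 2   # surviving vertices, induced edge count

# Made-up instance: a sparse random graph with a planted dense block.
rng = np.random.default_rng(0)
n, k = 30, 8
A = rng.random((n, n)) < 0.05
A[:k, :k] = rng.random((k, k)) < 0.8              # planted dense part
A = np.triu(A, 1).astype(float)
A = A + A.T                                       # symmetric, no self-loops
print(greedy_peel_densest_k(A, k))
```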
Learning Mixtures of Ranking Models
TLDR
This work presents the first polynomial-time algorithm that provably learns the parameters of a mixture of two Mallows models, making novel use of tensor decomposition techniques to learn the top-k prefix of both rankings.
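As background on the model itself, the sketch below samples from a two-component Mallows mixture via the standard repeated-insertion procedure. The centers, dispersions, and mixing weights are made up, and this only generates data; it is not the paper's tensor-decomposition learning algorithm.

```python
import numpy as np

def sample_mallows(center, phi, rng):
    """Repeated-insertion sampling from a Mallows model with central ranking
    `center` and dispersion phi in (0, 1]."""
    ranking = []
    for i, item in enumerate(center):
        # Inserting at position j among the i+1 slots creates i-j inversions,
        # so the insertion probability is proportional to phi**(i-j).
        weights = phi ** np.arange(i, -1, -1)
        j = rng.choice(i + 1, p=weights / weights.sum())
        ranking.insert(j, item)
    return ranking

rng = np.random.default_rng(1)
centers = [list(range(6)), list(range(5, -1, -1))]   # two central rankings (made up)
phis, mix = [0.5, 0.7], [0.6, 0.4]                   # made-up dispersions and weights
samples = []
for _ in range(5):
    z = rng.choice(2, p=mix)
    samples.append(sample_mallows(centers[z], phis[z], rng))
print(samples)
```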
On Learning Mixtures of Well-Separated Gaussians
TLDR
This work presents a computationally efficient algorithm in d = O(1) dimensions with only Ω(√d) separation, and extends the results to the case where components may have different weights and variances, essentially characterizing the optimal order of separation between components.
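To make the setting concrete, the snippet below generates a spherical Gaussian mixture with well-separated means and fits it with off-the-shelf EM from scikit-learn. The dimension, separation, and sample sizes are arbitrary, and EM is only a baseline here, not the algorithm analyzed in the paper.

```python
import numpy as np
from sklearn.mixture import GaussianMixture   # assumes scikit-learn is installed

rng = np.random.default_rng(2)
d, k, sep, n_per = 10, 4, 6.0, 500                           # made-up dimension/separation
means = rng.standard_normal((k, d))
means *= sep / np.linalg.norm(means, axis=1, keepdims=True)  # scale each mean to norm `sep`
X = np.vstack([m + rng.standard_normal((n_per, d)) for m in means])

gmm = GaussianMixture(n_components=k, covariance_type="spherical", n_init=5)
gmm.fit(X)
print("recovered means (first two coordinates):")
print(np.round(gmm.means_[:, :2], 2))
```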
Smoothed analysis of tensor decompositions
TLDR
This work introduces a smoothed analysis model for studying generative models, develops an efficient algorithm for tensor decomposition in the highly overcomplete case (rank polynomial in the dimension), and shows that tensor products of perturbed vectors are linearly independent in a robust sense.
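As a point of reference, in the undercomplete regime a rank-r tensor can already be decomposed by classical simultaneous diagonalization (Jennrich's algorithm); the sketch below implements that textbook method on a synthetic tensor. It is not the smoothed-analysis algorithm for the overcomplete case developed in the paper.

```python
import numpy as np

def jennrich(T, r):
    """Recover the first factor of T = sum_i a_i ⊗ b_i ⊗ c_i (rank r <= dim)
    by simultaneous diagonalization of two random contractions."""
    rng = np.random.default_rng(0)
    x, y = rng.standard_normal(T.shape[2]), rng.standard_normal(T.shape[2])
    M1 = np.einsum("ijk,k->ij", T, x)          # A diag(C^T x) B^T
    M2 = np.einsum("ijk,k->ij", T, y)          # A diag(C^T y) B^T
    vals, vecs = np.linalg.eig(M1 @ np.linalg.pinv(M2))
    keep = np.argsort(-np.abs(vals))[:r]       # r dominant eigenvectors = columns of A
    return np.real(vecs[:, keep])

rng = np.random.default_rng(3)
d, r = 8, 5
A, B, C = (rng.standard_normal((d, r)) for _ in range(3))
T = np.einsum("ir,jr,kr->ijk", A, B, C)        # rank-r tensor
A_hat = jennrich(T, r)
# Each recovered column should match a true column up to scaling/permutation.
corr = np.abs(np.corrcoef(A.T, A_hat.T)[:r, r:])
print(np.round(corr.max(axis=1), 3))           # close to 1 for every true component
```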
Approximating matrix p-norms
TLDR
It is proved that the problem of computing the q↦p norm of a matrix A, defined for p, q ≥ 1, is NP-hard to approximate to any constant factor when 2 < p ≤ q, and that the problem cannot be approximated to within a factor of 2^((log n)^(1-ε)) for any constant ε > 0, unless NP ⊆ DTIME(n^polylog(n)).
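Since exact computation is hard in this regime, a lower bound on the q→p norm max_{||x||_q = 1} ||Ax||_p can still be obtained heuristically. The sketch below uses a nonlinear power-type iteration with random restarts, a standard local-search heuristic rather than an algorithm from the paper, on a made-up matrix with made-up exponents.

```python
import numpy as np

def qp_norm_lower_bound(A, q, p, iters=200, restarts=20, seed=0):
    """Heuristic lower bound on max_{||x||_q = 1} ||A x||_p (assumes q > 1).
    Every iterate is feasible, so the returned value is always a valid lower bound."""
    rng = np.random.default_rng(seed)
    psi = lambda v, r: np.sign(v) * np.abs(v) ** (r - 1)   # "gradient" map for ||.||_r
    best = 0.0
    for _ in range(restarts):
        x = rng.standard_normal(A.shape[1])
        x /= np.linalg.norm(x, q)
        for _ in range(iters):
            z = A.T @ psi(A @ x, p)
            x = np.sign(z) * np.abs(z) ** (1.0 / (q - 1))  # invert psi for the q-norm
            x /= np.linalg.norm(x, q)
        best = max(best, np.linalg.norm(A @ x, p))
    return best

A = np.random.default_rng(4).standard_normal((15, 15))
print(qp_norm_lower_bound(A, q=3.0, p=2.5))    # made-up sizes and exponents
```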
Learning Communities in the Presence of Errors
TLDR
This work considers graphs generated according to the Stochastic Block Model (SBM) and then modified by an adversary. It gives robust algorithms for partial recovery in the SBM with modeling errors or noise, and shows that these algorithms work not only when the instances come from the SBM, but also when they come from any distribution of graphs that is close to the SBM in Kullback-Leibler divergence.
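For context, the snippet below generates a two-block SBM instance with made-up parameters and runs vanilla spectral recovery (the sign of the second eigenvector of the adjacency matrix). This is exactly the kind of non-robust baseline that adversarial modifications can break; it is not the robust recovery algorithm of the paper.

```python
import numpy as np

rng = np.random.default_rng(5)
n, a, b = 200, 0.10, 0.02                 # made-up SBM parameters: p_in = a, p_out = b
labels = np.repeat([0, 1], n // 2)
P = np.where(labels[:, None] == labels[None, :], a, b)
A = np.triu((rng.random((n, n)) < P).astype(float), 1)
A = A + A.T                               # symmetric adjacency, no self-loops

# Vanilla spectral recovery: sign of the second eigenvector of A.
w, V = np.linalg.eigh(A)
guess = (V[:, -2] > 0).astype(int)
acc = max(np.mean(guess == labels), np.mean(guess != labels))
print("agreement with planted partition:", acc)
```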
Bilu-Linial Stable Instances of Max Cut and Minimum Multiway Cut
TLDR
It is proved that there is no robust polynomial-time algorithm for γ-stable instances of Max Cut when γ < α_SC(n/2), where α_SC is the best approximation factor for Sparsest Cut with non-uniform demands, and it is shown that the standard SDP relaxation for Max Cut is integral if [EQUATION].
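The "standard SDP relaxation" referred to above can be written down directly; the sketch below solves it with cvxpy on a small random graph and checks whether the solution is near rank one. The graph is made up and generically not stable, so integrality is not expected here; the point is only to show the relaxation itself.

```python
import numpy as np
import cvxpy as cp   # assumes cvxpy and its default SDP solver are installed

# Standard Max Cut SDP relaxation:  maximize sum_{(i,j) in E} (1 - X_ij)/2
#                                   subject to X PSD and X_ii = 1.
rng = np.random.default_rng(6)
n = 12
A = np.triu((rng.random((n, n)) < 0.4).astype(float), 1)
A = A + A.T                                           # made-up (unstable) graph

X = cp.Variable((n, n), PSD=True)
prob = cp.Problem(cp.Maximize(cp.sum(cp.multiply(A, 1 - X)) / 4),   # /4: each edge counted twice
                  [cp.diag(X) == 1])
prob.solve()

eig = np.linalg.eigvalsh(X.value)
print("SDP value:", round(prob.value, 3),
      "| top-eigenvalue fraction:", round(eig[-1] / eig.sum(), 3))  # ~1 would mean integral
```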
Beating the random assignment on constraint satisfaction problems of bounded degree
We show that for any odd k and any instance I of the max-kXOR constraint satisfaction problem, there is an efficient algorithm that finds an assignment satisfying at least a 1/2 + Ω(1/√D) fraction of I's constraints, where D is a bound on the degree of the instance.
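The baseline being beaten is a uniformly random assignment, which satisfies each XOR constraint with probability 1/2; the snippet below checks this empirically on a random max-3XOR instance with made-up sizes. It does not implement the paper's Ω(1/√D)-advantage algorithm.

```python
import numpy as np

rng = np.random.default_rng(7)
n, m, k = 50, 300, 3
clauses = rng.integers(0, n, size=(m, k))        # variables appearing in each constraint
rhs = rng.integers(0, 2, size=m)                 # target parity of each constraint

def frac_satisfied(assignment):
    """Fraction of XOR constraints satisfied by a 0/1 assignment."""
    lhs = assignment[clauses].sum(axis=1) % 2
    return np.mean(lhs == rhs)

random_fracs = [frac_satisfied(rng.integers(0, 2, size=n)) for _ in range(200)]
print("average fraction satisfied by random assignments:",
      round(float(np.mean(random_fracs)), 3))    # should be close to 1/2
```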
Approximation algorithms for semi-random partitioning problems
TLDR
A new semi-random model for graph partitioning problems is proposed that captures many properties of real-world instances, together with approximation algorithms that work in a wider range of parameters than most algorithms for previously studied random and semi-random models.
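As a cartoon of what "semi-random" means here, the sketch below plants a balanced partition and then lets a monotone "adversary" add extra edges inside the clusters. This only illustrates the flavor of such models with made-up parameters; it is not the precise model or algorithm from the paper.

```python
import numpy as np

rng = np.random.default_rng(8)
n, p_in, p_out = 100, 0.08, 0.02                      # made-up planted-partition parameters
labels = np.repeat([0, 1], n // 2)
same = labels[:, None] == labels[None, :]
A = np.triu(rng.random((n, n)) < np.where(same, p_in, p_out), 1)

# "Monotone adversary": add extra edges, but only inside the two clusters,
# a change that should only make the planted partition more pronounced.
extra = np.triu(same & (rng.random((n, n)) < 0.05), 1)
A_semi = A | extra
print("edges before / after monotone additions:", int(A.sum()), int(A_semi.sum()))
```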
On Robustness to Adversarial Examples and Polynomial Optimization
TLDR
The main contribution of this work is to exhibit a strong connection between achieving robustness to adversarial examples and a rich class of polynomial optimization problems, thereby making progress on understanding the computational complexity of robust learning.
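A toy instance of the optimization view: for a linear classifier the inner maximization over an ℓ∞ ball has a closed form, while for nonlinear models this inner problem becomes a hard polynomial optimization problem, which is the connection the paper develops. The numbers below are made up.

```python
import numpy as np

rng = np.random.default_rng(9)
d, eps = 20, 0.1
w = rng.standard_normal(d)                 # linear classifier sign(w.x)
x = rng.standard_normal(d)
y = np.sign(w @ x)                         # treat the current prediction as the label

# Worst-case l_inf perturbation of radius eps against a linear model
# is the closed-form delta = -y * eps * sign(w).
delta = -y * eps * np.sign(w)
print("margin before:", round(y * (w @ x), 3),
      "after attack:", round(y * (w @ (x + delta)), 3))
```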
...