Computational Complexity: A Modern Approach
This beginning graduate textbook describes both recent achievements and classical results of computational complexity theory, and can serve as a self-study reference for anyone interested in complexity.
Generalization and equilibrium in generative adversarial nets (GANs) (invited talk)
Generative Adversarial Networks (GANs) have become one of the dominant methods for fitting generative models to complicated real-life data, and have even found unusual uses such as designing good …
A Theoretical Analysis of Contrastive Unsupervised Representation Learning
This framework allows provable guarantees on the performance of the learned representations on an average classification task comprising a subset of the same set of latent classes, and shows that learned representations can reduce (labeled) sample complexity on downstream tasks.
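The contrastive objective studied in this line of work can be sketched as a logistic loss that pushes an anchor's similarity with a positive example above its similarity with a negative one. The function below is a minimal NumPy sketch under that reading, not the paper's exact formulation (which also handles multiple negatives and latent-class structure):

```python
import numpy as np

def contrastive_logistic_loss(f_anchor, f_pos, f_neg):
    """Logistic contrastive loss on (anchor, positive, negative) embeddings.

    Encourages f(x)·f(x+) to exceed f(x)·f(x-).
    All arrays have shape (batch, dim).
    """
    pos_sim = np.sum(f_anchor * f_pos, axis=1)   # similarity to positives
    neg_sim = np.sum(f_anchor * f_neg, axis=1)   # similarity to negatives
    # log(1 + exp(neg - pos)), averaged over the batch
    return np.mean(np.log1p(np.exp(neg_sim - pos_sim)))
```

When the anchor matches its positive and is orthogonal to its negative, the loss drops below log 2 (the value at equal similarities), as the objective intends.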
Harnessing the Power of Infinitely Wide Deep Nets on Small-data Tasks
Results are reported suggesting that neural tangent kernels (NTKs) perform strongly on low-data tasks; comparing the NTK with the finite-width net it was derived from shows that NTK-like behavior starts at lower net widths than theoretical analysis suggests.
An Exponential Learning Rate Schedule for Deep Learning
It is suggested that this is the first time such a rate schedule has been successfully used, let alone for highly successful architectures; as expected, such training rapidly blows up network weights, but the net stays well-behaved due to normalization.
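As a rough illustration, an exponentially growing per-step learning rate can be generated as below. In the paper the growth factor is tied to the momentum and weight-decay hyperparameters of the equivalent standard run; the constants here are purely illustrative:

```python
def exponential_lr_schedule(base_lr, growth, num_steps):
    """Per-step exponentially increasing learning rates: lr_t = base_lr * growth**t.

    `growth` > 1 yields a schedule that blows up the weight norms over training;
    with normalization layers, the effective update direction stays well-behaved.
    """
    return [base_lr * growth ** t for t in range(num_steps)]
```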
InstaHide: Instance-hiding Schemes for Private Distributed Learning
InstaHide, a simple encryption scheme for training images that can be plugged into existing distributed deep-learning pipelines, is introduced; it is efficient, and applying it during training has only a minor effect on test accuracy.
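A minimal sketch of the mixing idea, assuming the commonly described mixup-plus-random-sign-flip form of InstaHide; the actual scheme's coefficient sampling and choice of public dataset differ, so treat this as illustrative only:

```python
import numpy as np

def instahide_encode(private_img, public_imgs, rng):
    """Mix one private image with public images, then flip pixel signs at random.

    Assumption: Dirichlet-sampled mixing weights (summing to 1) stand in for
    whatever coefficient scheme the real method uses.
    """
    k = len(public_imgs) + 1
    lams = rng.dirichlet(np.ones(k))            # random mixing weights, sum to 1
    mix = lams[0] * private_img
    for lam, img in zip(lams[1:], public_imgs):
        mix = mix + lam * img
    signs = rng.choice([-1.0, 1.0], size=private_img.shape)  # pixel-wise mask
    return signs * mix
```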
Towards Strong Nonapproximability Results in the Lovász-Schrijver Hierarchy
It is shown that the relaxations produced by as many as Ω(n) rounds of the LS+ procedure do not allow nontrivial approximation, thus ruling out the possibility that the LS+ approach gives even slightly subexponential approximation algorithms for well-known problems such as MAX-3SAT, hypergraph vertex cover, and set cover.
On the Ability of Neural Nets to Express Distributions
This work takes a first cut at explaining the expressivity of multilayer nets by giving a sufficient criterion for a function to be approximable by a neural network with n hidden layers.
Approximation Algorithms for Geometric TSP
In the Euclidean traveling salesman problem, we are given n nodes in ℝ² (more generally, in ℝᵈ) and desire the minimum-cost salesman tour for these nodes, where the cost of the edge between nodes …
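For concreteness, the objective being minimized can be evaluated with a small helper: the cost of a tour is the sum of the Euclidean lengths of its edges, returning to the starting node (the function name is ours, not from the paper):

```python
import math

def tour_cost(points, tour):
    """Cost of a salesman tour: sum of Euclidean edge lengths, closing the cycle.

    `points` is a list of coordinate tuples; `tour` is a permutation of indices.
    """
    n = len(tour)
    return sum(
        math.dist(points[tour[i]], points[tour[(i + 1) % n]])
        for i in range(n)
    )
```

For example, visiting the corners of the unit square in order gives a tour of cost 4.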