• Publications (sorted by Influence)
A Simple but Tough-to-Beat Baseline for Sentence Embeddings
TLDR
We show that the following completely unsupervised sentence embedding is a formidable baseline: use word embeddings computed with one of the popular methods on an unlabeled corpus such as Wikipedia, represent the sentence by a weighted average of the word vectors, and then modify them slightly using PCA/SVD.
  • Citations: 785 · Influence: 141
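The recipe in this TLDR can be sketched directly: weight each word vector by its smooth inverse frequency a / (a + p(w)), average, then remove the projection onto the first singular vector of the embedding matrix. The function and variable names below are illustrative, not the paper's released code:

```python
import numpy as np

def sif_embeddings(sentences, word_vecs, word_probs, a=1e-3):
    """Sentence embeddings per the TLDR above: a smooth-inverse-frequency
    weighted average of word vectors, followed by removal of the projection
    onto the embedding matrix's first singular vector.
    `sentences`: list of token lists; `word_vecs`: token -> np.ndarray;
    `word_probs`: token -> unigram probability. Names are illustrative."""
    dim = len(next(iter(word_vecs.values())))
    emb = np.zeros((len(sentences), dim))
    for i, sent in enumerate(sentences):
        toks = [w for w in sent if w in word_vecs]
        if not toks:
            continue
        # smooth inverse-frequency weight: a / (a + p(w))
        wts = np.array([a / (a + word_probs.get(w, 0.0)) for w in toks])
        emb[i] = wts @ np.array([word_vecs[w] for w in toks]) / len(toks)
    # subtract the projection on the first right singular vector
    u = np.linalg.svd(emb, full_matrices=False)[2][0]
    return emb - emb @ np.outer(u, u)
```

After the subtraction, every row is orthogonal to the removed common direction, which is the "modify them a bit using PCA/SVD" step.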
Polynomial time approximation schemes for Euclidean traveling salesman and other geometric problems
TLDR
We present a polynomial time approximation scheme for Euclidean TSP in fixed dimensions. Expand
  • Citations: 979 · Influence: 108
The Multiplicative Weights Update Method: a Meta-Algorithm and Applications
TLDR
This project was supported by a David and Lucile Packard Fellowship and NSF grants MSPA-MCS 0528414 and CCR-0205594.
  • Citations: 707 · Influence: 85
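The TLDR above summarizes only the acknowledgments, not the method; for context, here is a minimal sketch of the multiplicative weights update rule the title refers to. The loss matrix, learning rate eta, and the losses-in-[0, 1] convention are illustrative assumptions:

```python
import numpy as np

def multiplicative_weights(loss_matrix, eta=0.1):
    """Multiplicative-weights sketch: keep one weight per expert,
    multiply each weight by (1 - eta * loss) every round, and play the
    normalized weights as a probability distribution over experts.
    Assumes losses in [0, 1]; names and conventions are illustrative."""
    n_rounds, n_experts = loss_matrix.shape
    w = np.ones(n_experts)
    total_loss = 0.0
    for t in range(n_rounds):
        p = w / w.sum()                   # current distribution over experts
        total_loss += p @ loss_matrix[t]  # expected loss this round
        w *= 1.0 - eta * loss_matrix[t]   # penalize experts by their loss
    return total_loss, w
```

The distribution rapidly concentrates on low-loss experts, which is the mechanism behind the many applications the survey covers.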
Fine-Grained Analysis of Optimization and Generalization for Overparameterized Two-Layer Neural Networks
TLDR
This paper analyzes training and generalization for a simple 2-layer ReLU net with random initialization, and provides the following improvements over recent works: (i) using a tighter characterization of training speed than recent papers, an explanation for why training a neural network with random labels leads to slower training, as originally observed in [Zhang et al., 2017].
  • Citations: 342 · Influence: 72
A Practical Algorithm for Topic Modeling with Provable Guarantees
TLDR
We present an algorithm for topic model inference that is both provable and practical and produces results comparable to the best MCMC implementations.
  • Citations: 331 · Influence: 72
On Exact Computation with an Infinitely Wide Neural Net
TLDR
This paper gives the first efficient exact algorithm for computing the extension of NTK to convolutional neural nets, which we call Convolutional NTK, as well as an efficient GPU implementation of this algorithm.
  • Citations: 282 · Influence: 71
Proof verification and hardness of approximation problems
TLDR
The class PCP(f(n), g(n)) consists of all languages L for which there exists a polynomial-time probabilistic oracle machine that uses O(f(n)) random bits, queries O(g(n)) bits of its oracle, and behaves as follows: if x is in L then there exists an oracle y such that the machine accepts for all random choices, but if x is not in L then, for every oracle, the machine rejects with high probability.
  • Citations: 1,330 · Influence: 66
Stronger generalization bounds for deep nets via a compression approach
TLDR
Deep nets generalize well despite having more parameters than the number of training samples.
  • Citations: 323 · Influence: 63
Proof verification and the hardness of approximation problems
TLDR
We show that every language in NP has a probabilistic verifier that checks membership proofs for it using a logarithmic number of random bits and by examining a constant number of bits in the proof.
  • Citations: 1,056 · Influence: 58
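The TLDR above is a statement of the PCP theorem; in the PCP(f(n), g(n)) notation defined in the earlier "Proof verification" entry, it reads (a standard rendering, not a quote from the paper):

```latex
% Logarithmic randomness, constant query complexity:
\[
  \mathsf{NP} = \mathsf{PCP}\bigl(O(\log n),\, O(1)\bigr)
\]
```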
Probabilistic checking of proofs; a new characterization of NP
  • Sanjeev Arora, S. Safra
  • Mathematics, Computer Science
  • Proceedings, 33rd Annual Symposium on…
  • 24 October 1992
TLDR
The authors give a new characterization of NP: the class NP contains exactly those languages L for which membership proofs (a proof that an input x is in L) can be verified probabilistically in polynomial time using a logarithmic number of random bits and a sub-logarithmic number of queries to the proof.
  • Citations: 543 · Influence: 54