We give a new characterization of NP: the class NP contains exactly those languages L for which membership proofs (a proof that an input x is in L) can be verified probabilistically in polynomial time using a logarithmic number of random bits and by reading a sublogarithmic number of bits of the proof.
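In the PCP(f(n), g(n)) notation defined in a later abstract in this list, this characterization can be summarized compactly as follows (our restatement of the claim above, not a formula quoted from the paper):

\[
\mathrm{NP} \;=\; \mathrm{PCP}\bigl(O(\log n),\; o(\log n)\bigr),
\]

i.e., O(log n) random bits and a sublogarithmic number of queries into the proof suffice to capture exactly the languages in NP.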
We present a polynomial time approximation scheme for Euclidean TSP in fixed dimensions. For every fixed c > 1 and given any n nodes in ℝ^2, a randomized version of the scheme finds a (1 + 1/c)-approximation to the optimum traveling salesman tour.
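Written as an inequality (our restatement, using the usual PTAS convention ε = 1/c), the guarantee is that the tour T returned by the scheme satisfies

\[
\operatorname{len}(T) \;\le\; \Bigl(1 + \frac{1}{c}\Bigr)\cdot \mathrm{OPT} \;=\; (1+\varepsilon)\cdot \mathrm{OPT},
\]

where OPT denotes the length of an optimum traveling salesman tour on the given nodes.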
We show that every language in NP has a probabilistic verifier that checks membership proofs for it using a logarithmic number of random bits and by examining a constant number of bits in the proof. If a string is in the language, then there exists a proof such that the verifier accepts with probability 1 (i.e., for every choice of its random string).
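In the PCP notation defined in a later abstract, this result (the PCP theorem) can be summarized as

\[
\mathrm{NP} \;=\; \mathrm{PCP}\bigl(O(\log n),\; O(1)\bigr).
\]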
We give an O(√log n)-approximation algorithm for the sparsest cut, balanced separator, and graph conductance problems. This improves the O(log n)-approximation of Leighton and Rao (1988). We use a well-known semidefinite relaxation with triangle inequality constraints. Central to our analysis is a geometric theorem about projections of point sets in ℝ^d.
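As a rough sketch (our paraphrase; the exact normalization constraint depends on which of the three problems is being relaxed), the semidefinite relaxation with triangle inequality constraints assigns a vector v_i to every vertex i of the graph G = (V, E) and has the form

\[
\min \sum_{\{i,j\}\in E} \|v_i - v_j\|^2
\quad \text{subject to} \quad
\|v_i - v_j\|^2 + \|v_j - v_k\|^2 \;\ge\; \|v_i - v_k\|^2 \;\; \forall\, i,j,k \in V,
\]

together with a normalization such as \(\sum_{i<j} \|v_i - v_j\|^2 \ge \Omega(n^2)\) that rules out the all-zero solution. The triangle inequalities say that the squared distances between the vectors form a metric of negative type.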
The class PCP(f(n), g(n)) consists of all languages L for which there exists a polynomial-time probabilistic oracle machine that uses O(f(n)) random bits, queries O(g(n)) bits of its oracle and behaves as follows: if x ∈ L then there exists an oracle y such that the machine accepts for all random choices, but if x ∉ L then for every oracle y the machine accepts with probability at most 1/2.
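Formally (restating the two conditions above; M denotes the verifier, r its random string, and 1/2 is the conventional soundness bound assumed here):

\[
x \in L \;\Longrightarrow\; \exists\, y:\ \Pr_{r}\bigl[M^{y}(x, r) \text{ accepts}\bigr] = 1,
\qquad
x \notin L \;\Longrightarrow\; \forall\, y:\ \Pr_{r}\bigl[M^{y}(x, r) \text{ accepts}\bigr] \le \tfrac{1}{2}.
\]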
Algorithms in varied fields use the idea of maintaining a distribution over a certain set and use the multiplicative update rule to iteratively change these weights. Their analyses are usually very similar and rely on an exponential potential function. We present a simple meta-algorithm that unifies these disparate algorithms and derives them as simple instantiations of the meta-algorithm.
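A minimal sketch of the multiplicative weights update rule described above, in the standard prediction-with-expert-advice setting with per-round costs in [0, 1] (the function name, the learning rate eta, and the (1 - eta*cost) update variant are illustrative choices, not taken from the paper):

    def multiplicative_weights(n_experts, cost_rounds, eta=0.1):
        # Maintain one positive weight per expert.
        weights = [1.0] * n_experts
        expected_cost = 0.0
        for costs in cost_rounds:                        # costs[i] in [0, 1] for expert i
            total = sum(weights)
            dist = [w / total for w in weights]          # play the normalized distribution
            expected_cost += sum(p * c for p, c in zip(dist, costs))
            # Multiplicative update: shrink expert i's weight by the factor (1 - eta * costs[i]).
            weights = [w * (1.0 - eta * c) for w, c in zip(weights, costs)]
        return expected_cost, weights

    # Example: 3 experts, 2 rounds of costs.
    total_cost, final_weights = multiplicative_weights(3, [[0.0, 1.0, 0.5], [0.2, 0.9, 0.4]])

The sum of the weights plays the role of the exponential potential mentioned above: the standard analysis of this variant bounds the algorithm's expected total cost by roughly (1 + eta) times the total cost of the best single expert plus (ln n_experts)/eta.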
Topic models provide a useful method for dimensionality reduction and exploratory data analysis in large text corpora. Most approaches to topic model learning have been based on a maximum likelihood objective. Efficient algorithms exist that attempt to approximate this objective, but they have no provable guarantees. Recently, algorithms have been introduced that do come with provable guarantees.