• Publications
Computational Complexity
  • S. Vadhan
  • Computer Science, Mathematics
  • Encyclopedia of Cryptography and Security
  • 2005
TLDR
Computational complexity theory provides a foundation for most of modern cryptography, where the aim is to design cryptosystems that are “easy to use” but “hard to break”.
On the (im)possibility of obfuscating programs
TLDR
It is proved that (virtual black-box) obfuscation is impossible, by constructing a family of efficient programs that are unobfuscatable, in the sense that given any efficient program computing the same function as a program in the family, the original program's “source code” can be efficiently reconstructed.
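For context, the obfuscation notion ruled out here is the virtual black-box requirement; in a standard (paraphrased, not quoted) formulation, an obfuscator $\mathcal{O}$ must preserve a program's functionality, incur only polynomial slowdown, and guarantee that for every efficient adversary $A$ there is an efficient simulator $S$ with only oracle access to the program such that

$\bigl|\Pr[A(\mathcal{O}(P)) = 1] - \Pr[S^{P}(1^{|P|}) = 1]\bigr| \le \mathrm{negl}(|P|).$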
Verifiable random functions
TLDR
This work efficiently combines unpredictability and verifiability by extending the Goldreich-Goldwasser-Micali (1986) construction of pseudorandom functions $f_s$ from a secret seed $s$ to provide an NP-proof that the value $f_s(x)$ is indeed correct, without compromising the unpredictability of $f_s$ at any other point for which no such proof was provided.
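As a reminder of the base construction being extended (a standard textbook formulation, not quoted from the paper), the Goldreich-Goldwasser-Micali PRF evaluates a length-doubling pseudorandom generator $G(s) = (G_0(s), G_1(s))$ along the bits of the input:

$f_s(x_1 x_2 \cdots x_n) = G_{x_n}\bigl(G_{x_{n-1}}\bigl(\cdots G_{x_1}(s)\cdots\bigr)\bigr).$

The verifiable random function additionally supplies a publicly checkable proof that the claimed value $f_s(x)$ is correct, as described above.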
Boosting and Differential Privacy
TLDR
This work obtains an $O(\varepsilon^2)$ bound on the expected privacy loss from a single $\varepsilon$-differentially private mechanism, and derives stronger bounds on the expected cumulative privacy loss due to multiple mechanisms, each of which provides $\varepsilon$-differential privacy or one of its relaxations, and each of which operates on (potentially) different, adaptively chosen, databases.
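As a rough indication of the cumulative bound meant here (the standard form of the advanced composition theorem associated with this work; $k$ and $\delta'$ are notation introduced for this sketch), composing $k$ mechanisms that are each $\varepsilon$-differentially private yields, for any $\delta' > 0$,

$\Bigl(\sqrt{2k \ln(1/\delta')}\,\varepsilon + k\varepsilon(e^{\varepsilon} - 1),\ \delta'\Bigr)\text{-differential privacy}.$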
Pseudorandom generators without the XOR Lemma
TLDR
Two different approaches are presented to proving the main result of Impagliazzo and Wigderson that if there exists a decision problem solvable in time $2^{O(n)}$ and having circuit complexity $2^{\Omega(n)}$, then P=BPP.
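As the title suggests, both approaches go through a pseudorandom generator; in the standard hardness-versus-randomness template (a paraphrase, not quoted from the paper), the hard problem is used to construct, for every $n$, a generator

$G : \{0,1\}^{O(\log n)} \to \{0,1\}^{n}$

computable in time $\mathrm{poly}(n)$ whose output fools size-$n$ circuits, after which a BPP computation is derandomized by enumerating all $\mathrm{poly}(n)$ seeds and taking a majority vote.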
On the complexity of differentially private data release: efficient algorithms and hardness results
TLDR
Private data analysis is considered in the setting in which a trusted and trustworthy curator releases to the public a "sanitization" of the data set that simultaneously protects the privacy of the individual contributors of data and offers utility to the data analyst.
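For reference, the privacy notion these hardness results concern is the following (standard notation, not taken from the listing): a randomized mechanism $M$ is $\varepsilon$-differentially private if, for all data sets $D$ and $D'$ differing in the data of a single individual and all sets $S$ of outputs,

$\Pr[M(D) \in S] \le e^{\varepsilon} \cdot \Pr[M(D') \in S].$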
Entropy waves, the zig-zag graph product, and new constant-degree expanders and extractors
TLDR
The main contribution is a new type of graph product, the zig-zag product, which gives the first explicit extractors whose seed length depends (poly)logarithmically only on the entropy deficiency of the source and that extract almost all the entropy of high min-entropy sources.
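In the usual parameters (a standard statement of the product's guarantee, paraphrased rather than quoted), if $G_1$ is an $(N_1, D_1, \lambda_1)$-graph and $G_2$ is a $(D_1, D_2, \lambda_2)$-graph, then their zig-zag product is an

$\bigl(N_1 D_1,\ D_2^{2},\ \lambda_1 + \lambda_2 + \lambda_2^{2}\bigr)\text{-graph},$

which is what allows constant degree and constant spectral gap to be maintained as the graph grows.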
Finite Sample Differentially Private Confidence Intervals
TLDR
These algorithms guarantee finite-sample coverage, as opposed to asymptotic coverage; lower bounds are also proved on the expected size of any differentially private confidence set, showing that the parameters are optimal up to polylogarithmic factors.
Proofs of Retrievability via Hardness Amplification
TLDR
The main insight of this work is a simple connection between PoR schemes and the notion of hardness amplification, which is then used to build nearly optimal PoR codes using state-of-the-art tools from coding and complexity theory.
Computational Differential Privacy
TLDR
This work extends the dense model theorem of Reingold et al. to demonstrate equivalence between two definitions (indistinguishability- and simulatability-based) of computational differential privacy, and presents a differentially-private protocol for computing the distance between two vectors.
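A sketch of the indistinguishability-based definition mentioned here (a standard form; the security parameter $\kappa$ is notation added for this sketch): a mechanism $M$ is $\varepsilon$-indistinguishability-based computationally differentially private if for every probabilistic polynomial-time adversary $A$ and all adjacent data sets $D, D'$,

$\Pr[A(M(D)) = 1] \le e^{\varepsilon} \cdot \Pr[A(M(D')) = 1] + \mathrm{negl}(\kappa).$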
...