New Directions In Testing
Richard J. Lipton
Published in: Distributed Computing And Cryptography
Average-case intractability vs. worst-case intractability
Contracting projected entangled pair states is average-case hard
It is shown that accurately evaluating normalization or expectation values of PEPS is as hard for typical instances as for the special configurations of highest computational hardness.
Permanent is hard to compute even on a good day
We give an exposition of Cai, Pavan and Sivakumar’s result on the hardness of the permanent. They show that, assuming the permanent is hard to compute in the worst case, an algorithm cannot compute the …
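The worst-case-to-average-case connection for the permanent rests on its random-self-reducibility: perm(A + tR) is a polynomial of degree n in t, so answers on random matrices can be interpolated back to any fixed matrix. A minimal sketch of that reduction follows; the function names are illustrative, and the "oracle" here is assumed correct on every query (in the actual results it need only succeed on a fraction of random matrices, which requires error-correcting interpolation on top of this).

```python
import random

def perm(M, p):
    """Permanent of a square matrix over Z_p by naive cofactor-style expansion."""
    n = len(M)
    if n == 1:
        return M[0][0] % p
    total = 0
    for j in range(n):
        minor = [row[:j] + row[j + 1:] for row in M[1:]]  # delete row 0, column j
        total += M[0][j] * perm(minor, p)
    return total % p

def lagrange_at_zero(points, p):
    """Interpolate the polynomial through `points` (pairs (t, y) over Z_p)
    and return its value at t = 0; p must be prime."""
    result = 0
    for i, (ti, yi) in enumerate(points):
        num, den = 1, 1
        for j, (tj, _) in enumerate(points):
            if i != j:
                num = num * (-tj) % p
                den = den * (ti - tj) % p
        result = (result + yi * num * pow(den, p - 2, p)) % p
    return result

def perm_via_random_self_reduction(A, p, oracle):
    """Recover perm(A) mod p from oracle answers on random-looking matrices.
    perm(A + t*R) is a degree-n polynomial in t, and each query A + t*R
    (for fixed nonzero t, random R) is uniformly distributed."""
    n = len(A)
    R = [[random.randrange(p) for _ in range(n)] for _ in range(n)]
    points = []
    for t in range(1, n + 2):  # n+1 distinct points pin down a degree-n polynomial
        M = [[(A[i][j] + t * R[i][j]) % p for j in range(n)] for i in range(n)]
        points.append((t, oracle(M, p)))
    return lagrange_at_zero(points, p)  # evaluating at t = 0 gives perm(A)
```

Using the honest `perm` as the oracle, `perm_via_random_self_reduction(A, p, perm)` agrees with `perm(A, p)` for any A over a prime field with p > n + 1.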
A (de)constructive approach to program checking
This work introduces a novel composition methodology for improving the efficiency of program checkers and designs a variety of program checkers that are provably more efficient, in terms of circuit depth, than the optimal program for computing the function being checked.
Locally random reductions: Improvements and applications
A cryptographic application is given, showing a new way to prove in perfect zero knowledge that committed bits x1, ..., xm satisfy some predicate Q, regardless of the computational complexity of Q.
The power of adaptiveness and additional queries in random-self-reductions
This work looks at relationships between adaptive and nonadaptive random-self-reductions from a structural complexity-theoretic point of view and shows that there exist sets that are adaptively random-self-reducible but not nonadaptively random-self-reducible.
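To make the adaptive/nonadaptive distinction concrete, here is the simplest nonadaptive random-self-reduction: for any linear function f over Z_p, f(x) = f(x + r) - f(r), and both queries are fixed before any answer arrives, with each query uniformly distributed on its own. This is an illustrative sketch, not the construction from the paper; the prime p and function names are assumptions of the example.

```python
import random

P = 1_000_003  # a prime modulus, chosen for this sketch

def f(x, a):
    """The target linear function f(x) = <a, x> mod P (coefficients a hidden)."""
    return sum(ai * xi for ai, xi in zip(a, x)) % P

def nonadaptive_rsr(x, oracle):
    """Recover f(x) from two nonadaptive queries to an oracle for f.
    Each query (x + r and r) is uniformly random by itself, so an oracle
    that is only reliable on random inputs still reveals f(x)."""
    r = [random.randrange(P) for _ in x]
    q1 = [(xi + ri) % P for xi, ri in zip(x, r)]  # query 1: x + r
    q2 = r                                        # query 2: r
    # Both queries were prepared before asking: the reduction is nonadaptive.
    return (oracle(q1) - oracle(q2)) % P
```

An adaptive reduction, by contrast, may let its second query depend on the first answer; the paper shows this extra power is real, i.e. some sets admit only adaptive random-self-reductions.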
Advances in Cryptology - CRYPTO '90
A new type of cryptanalytic attack is developed which can break DES with up to eight rounds in a few minutes on a PC and can break DES with up to 15 rounds faster than an exhaustive search.
List decoding: algorithms and applications
The list-decoding problem, the algorithms that have been developed, and a diverse collection of applications within complexity theory are described.
On the hardness of computing the permanent of random matrices
It is shown that unless the polynomial-time hierarchy collapses to its second level, no polynomial-time algorithm can compute the permanent of every matrix with probability at least 1/3 + n^3/p, nor can it compute the permanent of at least a 4/9 + n^3/√p fraction of the matrices.