How to recycle random bits

  • R. Impagliazzo, D. Zuckerman
  • Published 30 October 1989
  • Computer Science, Mathematics
  • 30th Annual Symposium on Foundations of Computer Science
It is shown that modified versions of the linear congruential generator and the shift register generator are provably good for amplifying the correctness of a probabilistic algorithm. More precisely, if r random bits are needed for a BPP algorithm to be correct with probability at least 2/3, then O(r + k^2) bits are needed to improve this probability to 1 - 2^(-k). A different pseudorandom generator that is optimal, up to a constant factor, in this regard is also presented. It uses only O(r… 
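The O(r + k^2) bound above contrasts with the naive baseline of independent repetition. A minimal sketch of that baseline for intuition (the trial count, constants, and toy algorithm are illustrative, not from the paper):

```python
import random
from collections import Counter

def amplify_by_majority(bpp_algorithm, x, k):
    """Naive amplification: run the algorithm O(k) times on fresh
    random bits and answer with the majority vote.  A Chernoff bound
    drives the error below 2^-k, but each run consumes r fresh bits,
    for O(r*k) bits in total -- the cost the recycled-bit generators
    reduce to O(r + k^2)."""
    t = 18 * k + 1  # odd trial count; the constant is chosen loosely
    votes = Counter(bpp_algorithm(x) for _ in range(t))
    return votes.most_common(1)[0][0]

# Toy BPP-style procedure: returns the correct answer (True) with
# probability 2/3.
def toy_algorithm(x):
    return random.random() < 2 / 3

random.seed(0)
print(amplify_by_majority(toy_algorithm, None, k=10))
```

The point of the recycled-bit generators is that the t trial inputs need not be independent: suitably correlated pseudorandom inputs amplify almost as well while drawing far fewer fresh bits.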

Deterministic extractors for bit-fixing sources and exposure-resilient cryptography

  • Jesse Kamp, D. Zuckerman
  • Mathematics, Computer Science
    44th Annual IEEE Symposium on Foundations of Computer Science, 2003. Proceedings.
  • 2003
An efficient deterministic algorithm which extracts almost-random bits from sources where n^(1/2 + γ) of the n bits are uniformly random and the rest are fixed in advance is given.
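As a toy warm-up to bit-fixing extraction (this is not the paper's construction, which achieves far higher output rates): for an oblivious bit-fixing source, the XOR of all n bits is a perfectly random bit whenever at least one position is uniformly random, since flipping that single position flips the XOR.

```python
from functools import reduce
from operator import xor

def xor_extractor(bits):
    """Extract one bit from an oblivious bit-fixing source: the parity
    of all positions.  If any position is uniform, so is the output."""
    return reduce(xor, bits)

# Source with bits 0 and 2 fixed (to 1 and 0) and bit 1 uniform: as
# the free bit ranges over {0, 1}, the output takes both values.
print([xor_extractor([1, b, 0]) for b in (0, 1)])  # [1, 0]
```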

Impossibility results for recycling random bits in two-prover proof systems

The great success enjoyed by general techniques for recycling random bits in other contexts meets its limits when MIP(2, 1) proof systems are concerned, and parallel repetition using pseudo-random bits cannot reduce the error below a constant, regardless of the nature of the pseudo-random source.

How to Privatize Random Bits

The existence of strong randomized hard functions and pseudo-random generators is shown; it is also shown that, relative to a random oracle, P/poly is not measurable in EXP in the resource-bounded sense, and a very strong separation between sublinear time and AC^0 is obtained.

Randomness-optimal oblivious sampling

  • D. Zuckerman
  • Computer Science, Mathematics
    Random Struct. Algorithms
  • 1997
This work presents the first efficient oblivious sampler that uses an optimal number of random bits, up to an arbitrary constant factor bigger than 1, and gives applications to constructive leader election and reducing randomness in interactive proofs.

Randomness-optimal sampling, extractors, and constructive leader election

A constructive O(log n) round protocol for leader election in the full information model that is resilient against any coalition of size βn, for any constant β < 1/2, is given, along with two applications of these tools.

On the Power of the Randomized Iterate

This paper revisits a technique that was previously used to construct pseudorandom generators from regular one-way functions, and uses the randomized iterate to replace the basic building block of the [HILL99] construction.

Recycling random bits in parallel

  • K. Friedl, Shi-Chun Tsai
  • Computer Science, Mathematics
    Proceedings of the Twenty-Eighth Annual Hawaii International Conference on System Sciences
  • 1995
Shows that r pseudo-random bits can be obtained by concatenating t blocks of r/t pseudo-random bits, where the blocks are generated in parallel. This can be considered as a parallel version of R.

Deterministic amplification of space-bounded probabilistic algorithms

  • Ziv Bar-Yossef, Oded Goldreich, A. Wigderson
  • Computer Science
    Proceedings. Fourteenth Annual IEEE Conference on Computational Complexity (Formerly: Structure in Complexity Theory Conference) (Cat.No.99CB36317)
  • 1999
It is proved that any black-box amplification method that uses O(r) random bits and makes at most p parallel simulations reduces the error to at most ε^O(p).

A pseudorandom generator construction based on randomness extractors and combinatorial designs

This work studies the construction of pseudorandom generators, and uses an observation of Sudan et al. to recast the Impagliazzo-Wigderson construction in terms of weak sources of randomness; such a source is a distribution on binary strings that is "random" in the sense of having high "entropy".

Randomness-Efficient Sampling within NC1

A randomness-efficient averaging sampler that is computable by uniform constant-depth circuits with parity gates is constructed, allowing us to apply a variety of expander-based techniques within NC1.

How to generate cryptographically strong sequences of pseudo random bits

  • M. BlumS. Micali
  • Computer Science, Mathematics
    23rd Annual Symposium on Foundations of Computer Science (sfcs 1982)
  • 1982
A more operative definition of Randomness should be pursued in the light of modern Complexity Theory.

Realistic analysis of some randomized algorithms

  • E. Bach
  • Computer Science, Mathematics
    J. Comput. Syst. Sci.
  • 1991
The results below apply to sequences generated by iteratively applying functions of the form f(x) = αx + β (mod p) to a randomly chosen seed x, and estimate the probability that a predetermined number of trials of an algorithm will fail.
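The iteration analyzed here can be sketched directly; the modulus, multiplier, and increment below are illustrative, not drawn from the paper:

```python
def lcg_stream(seed, a, b, p, n):
    """Yield n successive iterates seed, f(seed), f(f(seed)), ...
    of the affine map f(x) = a*x + b (mod p)."""
    x = seed % p
    for _ in range(n):
        yield x
        x = (a * x + b) % p

# Example with a small prime modulus; a randomized trial would consume
# these iterates in place of truly random numbers.
print(list(lcg_stream(seed=7, a=5, b=3, p=101, n=5)))  # [7, 38, 92, 59, 96]
```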

Deterministic simulation in LOGSPACE

In this paper we show that a wide class of probabilistic algorithms can be simulated by deterministic algorithms. Namely, if there is a test in LOGSPACE so that a random sequence of length (log n)^2 …

Pseudo-random generation from one-way functions

From one-way functions of type (1) or (2) it is shown how to construct pseudo-random generators secure against small circuits or fast algorithms, respectively, and vice-versa.

Randomized algorithms and pseudorandom numbers

This work assumes that a (small) random seed is available to start up a simple pseudorandom number generator which is then used for the randomized algorithm.

Linear Congruential Generators Do Not Produce Random Sequences

This paper discusses the predictability of the sequence given only a constant proportion α of the leading bits of the first few numbers generated, and shows that the rest of the sequence is predictable in polynomial time, almost always.
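As a warm-up much weaker than the result above (which needs only a fraction of the leading bits): given three consecutive *full* outputs of x_{i+1} = a*x_i + b (mod p) with p known, the parameters, and hence the whole sequence, follow directly. The parameters below are illustrative.

```python
def recover_lcg(x0, x1, x2, p):
    """Recover (a, b) of the map x -> a*x + b (mod p) from three
    consecutive outputs; requires x1 != x0 (mod p) so the inverse
    exists (Python 3.8+ for the 3-argument pow with exponent -1)."""
    a = ((x2 - x1) * pow(x1 - x0, -1, p)) % p
    b = (x1 - a * x0) % p
    return a, b

p, a, b = 101, 5, 3
x0 = 7
x1 = (a * x0 + b) % p  # 38
x2 = (a * x1 + b) % p  # 92
print(recover_lcg(x0, x1, x2, p))  # (5, 3)
```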

Universal Classes of Hash Functions
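A hedged sketch of the classic Carter-Wegman universal family h_{a,b}(x) = ((a*x + b) mod p) mod m, with p prime and a ≠ 0: any two distinct keys below p collide with probability at most 1/m over the random draw of (a, b). The prime and bucket count here are illustrative.

```python
import random

P = 2_147_483_647  # Mersenne prime 2^31 - 1, an upper bound on keys

def random_hash(m, rng=random):
    """Draw one hash function from the universal family
    h_{a,b}(x) = ((a*x + b) mod P) mod m, with 1 <= a < P."""
    a = rng.randrange(1, P)
    b = rng.randrange(0, P)
    return lambda x: ((a * x + b) % P) % m

h = random_hash(m=16)
print(h(42), h(1000))  # two bucket indices in range(16)
```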

Dispersers, deterministic amplification, and weak random sources

The use of highly expanding bipartite multigraphs (called dispersers) to greatly reduce the error of probabilistic algorithms at the cost of few additional random bits is treated. Explicit…

Theory and application of trapdoor functions

  • A. Yao
  • Computer Science, Mathematics
    23rd Annual Symposium on Foundations of Computer Science (sfcs 1982)
  • 1982
A new information theory is introduced and the concept of trapdoor functions is studied and applications of such functions in cryptography, pseudorandom number generation, and abstract complexity theory are examined.