Independent Unbiased Coin Flips From a Correlated Biased Source: a Finite State Markov Chain

@inproceedings{Blum1984IndependentUC,
  title={Independent Unbiased Coin Flips From a Correlated Biased Source: a Finite State Markov Chain},
  author={Manuel Blum},
  booktitle={FOCS},
  year={1984}
}
  • M. Blum
  • Published in FOCS 1984
  • Mathematics, Computer Science
von Neumann's trick for generating an absolutely unbiased coin from a biased one is this: 1. Toss the biased coin twice, getting 00, 01, 10, or 11. 2. If 00 or 11 occurs, go back to step 1; else 3. Call 10 an H and 01 a T. Since p[H] = p[1]*p[0] = p[0]*p[1] = p[T], the output is unbiased. Example: 00 10 11 01 01 → H T T. Peter Elias gives an algorithm to generate an independent unbiased sequence of Hs and Ts that nearly achieves the entropy of the one-coin source. His algorithm is excellent, but…
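As a concrete illustration of the trick described in the abstract, here is a minimal sketch in Python; the biased_bit source, its bias parameter, and the function names are hypothetical stand-ins introduced for this example, not anything specified in the paper.

import random

def biased_bit(p_one=0.7):
    # Hypothetical biased source: returns 1 with probability p_one, else 0.
    return 1 if random.random() < p_one else 0

def von_neumann_flip(source=biased_bit):
    # Toss the biased coin twice; on 00 or 11, try again.
    # Otherwise call 10 an H and 01 a T. Both outcomes occur with
    # probability p[1]*p[0], so the output is unbiased.
    while True:
        a, b = source(), source()
        if a != b:
            return 'H' if (a, b) == (1, 0) else 'T'

# Example: generate ten unbiased flips from the biased source.
print(''.join(von_neumann_flip() for _ in range(10)))

In expectation this uses 1/(p[0]*p[1]) biased tosses per unbiased output bit; Elias's algorithm, mentioned in the abstract, improves the rate to nearly the entropy of the source.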
How to turn loaded dice into fair coins
TLDR
A new generalization of von Neumann's algorithm, distinguished by its high level of practicality and amenability to analysis, is described; in an asymptotic sense, the algorithm is shown to extract the full entropy of its input.
Efficient Generation of Random Bits From Finite State Markov Chains
TLDR
This paper generalizes Blum's algorithm to finite Markov chains of arbitrary degree and combines it with Elias's method for efficient generation of unbiased bits, providing the first known algorithm that generates unbiased random bits from an arbitrary finite Markov chain, operates in expected linear time, and achieves the information-theoretic upper bound on efficiency.
Generalizing the Blum-Elias method for generating random bits from Markov chains
  • H. Zhou, Jehoshua Bruck
  • Mathematics, Computer Science
  • 2010 IEEE International Symposium on Information Theory
  • 2010
TLDR
Blum's algorithm is generalized to finite Markov chains of arbitrary degree and combined with Elias's method for efficient generation of unbiased bits, providing the first known algorithm that generates unbiased random bits from an arbitrary finite Markov chain, operates in expected linear time, and achieves the information-theoretic upper bound on efficiency.
Blind-friendly von Neumann's Heads or Tails
TLDR
This paper addresses how to extract uniformly distributed bits of information from a nonuniform source, and studies some probabilities related to biased dice and coins, culminating in an interesting variation of von Neumann's mechanism that can be employed in a more restricted setting where the actual results of the coin tosses are not known to the contestants.
Streaming Algorithms for Optimal Generation of Random Bits
TLDR
This paper presents an algorithm that generates random bit streams from biased coins, uses bounded space, runs in expected linear time, and approaches the information-theoretic upper bound on efficiency.
Randomness-optimal oblivious sampling
TLDR
This work presents the first efficient oblivious sampler that uses an optimal number of random bits, up to an arbitrary constant factor bigger than 1, and gives applications to constructive leader election and reducing randomness in interactive proofs.
Optimal random number generation from a biased coin
TLDR
The model of computation is sufficiently general to encompass virtually all previously known algorithms for this problem, and it is proved that it is impossible to construct an optimal tree algorithm recursively using a model based on the algebraic decision tree.
Generating random bits from an arbitrary source: fundamental limits
TLDR
The fixed-length results of this paper provide an operational characterization of the inf-entropy rate of a source and characterize the liminf of the entropy rate, thereby establishing a pleasing duality with the fundamental limits of source coding.
An Improved Method to Extract Quasi-Random Sequences from Generalized Semi-Random Sources
In this paper, we consider new and general models for imperfect sources of randomness, and show how to obtain quasi-random sequences from such sources. Intuitively, quasi-random sequences are…

References

Towards a strong communication complexity theory or generating quasi-random sequences from two communicating slightly-random sources
TLDR
Santha and Vazirani consider a very general model for imperfect sources of randomness: the slightly random source.
Unbiased bits from sources of weak randomness and probabilistic communication complexity
  • B. Chor, Oded Goldreich
  • Mathematics, Computer Science
  • 26th Annual Symposium on Foundations of Computer Science (sfcs 1985)
  • 1985
TLDR
It is shown that most Boolean functions have linear communication complexity in a very strong sense when used to extract almost unbiased and independent bits from the output of any two independent "probability-bounded" sources.
Information Theory and Reliable Communication
  • R. G. Gallager
  • 1968
Communication Systems and Information Theory. A Measure of Information. Coding for Discrete Sources. Discrete Memoryless Channels and Capacity. The Noisy-Channel Coding Theorem. Techniques for Coding…
Various Techniques Used in Connection with Random Digits
  • J. von Neumann
  • National Bureau of Standards, Applied Math Series
  • 1951