Parallel Algorithms via the Probabilistic Method

@inproceedings{Srivastav2007ParallelAV,
  title={Parallel Algorithms via the Probabilistic Method},
  author={Anand Srivastav and Lasse Kliemann},
  booktitle={Handbook of Parallel Computing},
  year={2007}
}
We give an introduction to the design of parallel algorithms with the probabilistic method. Algorithms of this kind usually possess a randomized sequential counterpart. Parallelization of such algorithms is inherently linked with derandomization, either via the Erdős-Spencer method of conditional probabilities or via exhaustive search in a polynomial-sized sample space. The key notion is the treatment of random variables with various concepts of only limited independence, leading to… 
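As a concrete illustration of the method of conditional probabilities mentioned in the abstract, here is a minimal sketch (not taken from the paper) applied to MAX-CUT: a uniformly random 2-coloring cuts each edge with probability 1/2, so some coloring cuts at least half the edges; fixing the vertices one at a time so that the conditional expectation of the cut never decreases makes this deterministic. The function names are illustrative, not from the source.

```python
def derandomized_max_cut(n, edges):
    """Deterministically 2-color vertices 0..n-1 so >= half the edges are cut."""
    adj = {v: [] for v in range(n)}
    for a, b in edges:
        adj[a].append(b)
        adj[b].append(a)
    side = {}
    for v in range(n):  # fix vertices one at a time
        # Only edges to already-placed neighbors depend on v's choice;
        # putting v on the minority side cuts the majority of those edges,
        # so the conditional expectation of the cut size never decreases.
        ones = sum(1 for u in adj[v] if side.get(u) == 1)
        zeros = sum(1 for u in adj[v] if side.get(u) == 0)
        side[v] = 0 if ones >= zeros else 1
    return side

def cut_size(side, edges):
    return sum(1 for a, b in edges if side[a] != side[b])
```

Each step is a greedy choice over the conditional expectation, which is exactly what makes the pessimistic-estimator framework amenable to parallelization when the estimator decomposes over edges.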

Christian-Albrechts-Universität zu Kiel

Derandomizing local distributed algorithms under bandwidth restrictions

TLDR
This work addresses the congested clique model, which allows all-to-all communication, and gives a deterministic maximal independent set algorithm that runs in O(log² Δ) rounds, where Δ is the maximum degree.

References

Showing 1-10 of 131 references

The probabilistic method yields deterministic parallel algorithms

TLDR
Results obtained by applying the method of conditional probabilities to the set balancing problem, lattice approximation, edge-coloring graphs, random sampling, and combinatorial constructions are presented.

Improved parallel approximation of a class of integer programming problems

We present a method to derandomize RNC algorithms, converting them to NC algorithms. Using it, we show how to approximate a class of NP-hard integer programming problems in NC, to within factors better…

(De)randomized Construction of Small Sample Spaces in NC

TLDR
This paper provides the first parallel (NC) algorithm for constructing a compact distribution that satisfies the constraints up to a small relative error, and suggests new proof techniques which might be useful in general probabilistic analysis.
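Small sample spaces of this kind make the "exhaustive search in a polynomial-sized sample space" route concrete. A standard textbook construction (an assumption here, not this reference's specific one): from k truly random seed bits, the parities over all nonempty subsets of the seed yield 2^k - 1 pairwise-independent uniform bits, so the whole sample space has only 2^k points and can be searched exhaustively in parallel.

```python
from itertools import product

def pairwise_independent_space(k):
    """Return all 2**k sample points; each is a tuple of 2**k - 1 bits
    that are uniform and pairwise independent over the sample space."""
    points = []
    for seed in product([0, 1], repeat=k):
        bits = []
        for mask in range(1, 2 ** k):  # nonempty subsets of seed positions
            parity = 0
            for i in range(k):
                if mask & (1 << i):
                    parity ^= seed[i]
            bits.append(parity)
        points.append(tuple(bits))
    return points
```

Any algorithm whose analysis only needs pairwise independence of its n random bits can thus be derandomized by trying all n + 1 points of the space, one per processor.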

The Complexity of Parallel Search

A parallel algorithmic version of the local lemma

  • N. Alon
  • Computer Science, Mathematics
    Proceedings 32nd Annual Symposium on Foundations of Computer Science
  • 1991
TLDR
The Lovász local lemma (1975) is a tool that enables one to show that certain events hold with positive, though very small, probability; an algorithmic parallel version provides deterministic NC¹ algorithms for various interesting algorithmic search problems.

Splitters and near-optimal derandomization

TLDR
A fairly general method for finding deterministic constructions obeying k-restrictions, which yields structures of size not much larger than the probabilistic bound and implies very efficient derandomization of algorithms in learning, of fixed-subgraph finding algorithms, and of near-optimal ΣΠΣ threshold formulae.

On the parallel complexity of computing a maximal independent set in a hypergraph

TLDR
It is shown that an algorithm proposed by Beame and Luby is in randomized NC for hypergraphs in which the maximum edge size is bounded by a constant, and the upper tail of sums of dependent random variables defined on the edges of a hypergraph is bounded.

Parallel search for maximal independence given minimal dependence

TLDR
A randomized NC algorithm is provided for the case when the size of each minimal dependent set is at most a constant, along with an algorithm that is conjectured to be a randomized NC algorithm for the general case.

Randomized geometric algorithms and pseudo-random generators

  • K. Mulmuley
  • Computer Science, Mathematics
    Proceedings 33rd Annual Symposium on Foundations of Computer Science
  • 1992
TLDR
The author shows that the expected running times of most of the randomized incremental algorithms in computational geometry do not change (up to a constant factor), when the sequence of additions is not truly random but is instead generated using only O(log n) random bits.

On Using Deterministic Functions to Reduce Randomness in Probabilistic Algorithms

  • M. Santha
  • Mathematics, Computer Science
    Inf. Comput.
  • 1987
...