• Corpus ID: 9597382

A Sampling Technique of Proving Lower Bounds for Noisy Computations

  • Chinmoy Dutta and Jaikumar Radhakrishnan
We present a technique for proving lower bounds for noisy computations. This is achieved by a theorem connecting computations on a kind of randomized decision tree with sampling-based algorithms. The approach is surprisingly powerful and applicable to several previously studied models of computation. As a first illustration, we show how all the results of Evans and Pippenger (SIAM J. Computing, 1999) for noisy decision trees, some of which were derived using Fourier analysis, follow… 

Lower Bounds for Noisy Wireless Networks using Sampling Algorithms

A tight lower bound of Ω(N log log N) is shown on the number of transmissions required to compute several functions in a network of N randomly placed sensors, communicating using local transmissions and operating with power near the connectivity threshold.

Average-Case Lower Bounds for Noisy Boolean Decision Trees

A new method is presented for deriving lower bounds on the expected number of queries made by noisy decision trees computing Boolean functions. The method has the feature that expectations are taken with respect to a uniformly distributed random input as well as to the random noise, thus yielding stronger lower bounds.
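To see why averaging over a uniformly distributed input matters, here is a toy sketch (our own illustration with hypothetical helper names, not the paper's method): a naive noisy-OR strategy that majority-votes each bit in turn uses far fewer queries on a uniform random input, where the first 1 tends to appear early, than on the all-zeros worst case, so an average-case lower bound is a strictly stronger statement than a worst-case one.

```python
import random

def noisy_read(bit, p):
    """Return the true bit, flipped with probability p (the noise model)."""
    return bit ^ (random.random() < p)

def noisy_or(x, p, k):
    """Scan for a 1: majority-vote each bit over k noisy reads,
    stopping at the first confident 1.
    Returns (answer, number of noisy queries used)."""
    queries = 0
    for bit in x:
        votes = sum(noisy_read(bit, p) for _ in range(k))
        queries += k
        if votes > k / 2:
            return 1, queries
    return 0, queries

random.seed(1)
n, p, k, trials = 32, 0.1, 9, 5000

# Average query count over a uniformly random input (and the random noise):
# the first 1 appears at an expected position of about 2, so roughly 2k queries.
avg = sum(noisy_or([random.randint(0, 1) for _ in range(n)], p, k)[1]
          for _ in range(trials)) / trials

# The all-zeros input forces (almost always) a full scan of n * k queries.
worst = noisy_or([0] * n, p, k)[1]
print(avg, worst)
```

On a uniform input the expected cost stays near 2k regardless of n, while the worst case grows as n·k; an average-case lower bound rules out even strategies that exploit this gap.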

A tight lower bound for parity in noisy communication networks

We show a tight lower bound of Ω(N log log N) on the number of transmissions required to compute the parity of N bits (with constant error) in a network of N randomly placed sensors, communicating

How Hard is Computing Parity with Noisy Communications?

We show a tight lower bound of Ω(N log log N) on the number of transmissions required to compute the parity of N input bits with constant error in a noisy communication network of N

Reliable computation with noisy circuits and decision trees-a general n log n lower bound

It is shown that the critical number crit(f) of a function f yields a lower bound of Ω(crit(f) log crit(f)) for the noisy circuit size, which implies that almost all n-input Boolean functions have noisy decision tree complexity Θ(n log n) in the static as well as the dynamic case.

Lower bounds for the noisy broadcast problem

It is proved that Gallager's protocol is optimal up to a constant factor in the noisy broadcast model of distributed computation and follows from a lower bound in a new model, the generalized noisy decision tree model, which may be of independent interest.

Computing with Noisy Information

This paper studies the depth of noisy decision trees, in which each node gives the wrong answer with some constant probability, and gives tight bounds for several problems.
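As a concrete illustration of this noise model (our own sketch, with hypothetical names `noisy_query` and `amplified_query`, not code from the paper): each query to an input bit returns the wrong value with some constant probability p < 1/2, and repeating the query and taking a majority vote drives the error down exponentially in the number of repetitions. This repetition overhead is the intuition behind the logarithmic factors in the lower bounds above.

```python
import random

def noisy_query(bit, p):
    """Return the true bit, flipped with probability p (the noise model)."""
    return bit ^ (random.random() < p)

def amplified_query(bit, p, k):
    """Ask the same noisy query k times and take a majority vote,
    reducing the error probability from p to exponentially small in k."""
    votes = sum(noisy_query(bit, p) for _ in range(k))
    return int(votes > k / 2)

def empirical_error(query, trials=20000):
    """Estimate how often a query of a bit whose true value is 1 answers wrongly."""
    return sum(query() != 1 for _ in range(trials)) / trials

random.seed(0)
p = 0.3  # each individual query is wrong with constant probability 0.3
single = empirical_error(lambda: noisy_query(1, p))
boosted = empirical_error(lambda: amplified_query(1, p, 15))
print(single, boosted)
```

With 15 repetitions the observed error drops from roughly p to a few percent; pushing it below 1/n for an n-bit input costs Θ(log n) repetitions per bit, matching the n log n flavour of the bounds.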

Computing in fault tolerance broadcast networks

  • I. Newman
  • Computer Science
    Proceedings. 19th IEEE Annual Conference on Computational Complexity, 2004.
  • 2004
This work presents the first linear-complexity protocols for several classes of Boolean functions, including the OR function, functions that have O(1)-minterm (maxterm) size, functions that have linear-size AC^0 formulae, and some other functions.

Finding OR in a noisy broadcast network

Distributed Symmetric Function Computation in Noisy Wireless Sensor Networks with Binary Data

  • Lei Ying, R. Srikant, G. Dullerud
  • Computer Science
    2006 4th International Symposium on Modeling and Optimization in Mobile, Ad Hoc and Wireless Networks
  • 2006
A wireless sensor network of n sensors is considered, each holding a recorded bit (the sensor's measurement) set to either "0" or "1". It is shown that any algorithm satisfying the performance constraints must have energy usage Ω(n(√(log n/n))^α).