Sampling lower bounds via information theory

@inproceedings{BarYossef2003SamplingLB,
  title={Sampling lower bounds via information theory},
  author={Ziv Bar-Yossef},
  booktitle={STOC},
  year={2003}
}

We present a novel technique, based on the Jensen-Shannon divergence from information theory, to prove lower bounds on the query complexity of sampling algorithms that approximate functions over arbitrary domain and range. Unlike previous methods, our technique does not use a reduction from a decision promise problem. As a result, it gives stronger bounds for functions that possess a large set of inputs, any two of which exhibit a gap in the function value. We demonstrate the technique with new…
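
For context (not from the paper itself): the Jensen-Shannon divergence of two distributions P and Q is JSD(P, Q) = H((P+Q)/2) - (H(P) + H(Q))/2, where H denotes Shannon entropy. Below is a minimal Python sketch of this quantity, with example distributions chosen purely for illustration:

    import numpy as np

    def entropy(p):
        # Shannon entropy in bits; terms with p_i = 0 contribute 0.
        p = np.asarray(p, dtype=float)
        nz = p > 0
        return -np.sum(p[nz] * np.log2(p[nz]))

    def js_divergence(p, q):
        # JSD(P, Q) = H((P+Q)/2) - (H(P) + H(Q))/2.
        p = np.asarray(p, dtype=float)
        q = np.asarray(q, dtype=float)
        m = 0.5 * (p + q)
        return entropy(m) - 0.5 * (entropy(p) + entropy(q))

    # JSD is symmetric and, measured in bits, bounded between 0 and 1.
    print(js_divergence([0.9, 0.1], [0.1, 0.9]))  # ~0.531
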
This paper has 41 citations.

