Estimating the Average of a Lipschitz-Continuous Function from One Sample

@inproceedings{Das2010EstimatingTA,
  title={Estimating the Average of a Lipschitz-Continuous Function from One Sample},
  author={Abhimanyu Das and David Kempe},
  booktitle={ESA},
  year={2010}
}
We study the problem of estimating the average of a Lipschitz-continuous function f defined over a metric space by querying f at only a single point. More specifically, we explore the role of randomness in drawing this sample. Our goal is to find a distribution minimizing the expected estimation error against an adversarially chosen Lipschitz-continuous function. Our work falls into the broad class of estimating aggregate statistics of a function from a small number of carefully chosen samples…
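To make the setup concrete, here is a minimal numerical sketch (not from the paper) for the special case of the unit interval [0, 1] with 1-Lipschitz functions. It compares two one-sample strategies against one illustrative adversarial-style function, the tent f(x) = |x − 1/2|: always querying the midpoint versus drawing the query point uniformly at random. All function and variable names are illustrative assumptions.

```python
import numpy as np

def average(f, n=100_001):
    # Numerical average of f over [0, 1] on a dense grid.
    xs = np.linspace(0.0, 1.0, n)
    return np.mean(f(xs))

def expected_error(f, sample_dist, n_trials=200_000, seed=0):
    # Monte Carlo estimate of E|f(X) - avg(f)| when the single
    # query point X is drawn from sample_dist.
    rng = np.random.default_rng(seed)
    xs = sample_dist(rng, n_trials)
    return np.mean(np.abs(f(xs) - average(f)))

# A 1-Lipschitz function chosen to hurt the midpoint strategy:
# tent centered at 1/2, whose average over [0, 1] is 1/4.
f = lambda x: np.abs(x - 0.5)

# Deterministic "distribution" (point mass at 1/2) vs. uniform sampling.
midpoint = lambda rng, n: np.full(n, 0.5)
uniform  = lambda rng, n: rng.uniform(0.0, 1.0, n)

err_mid = expected_error(f, midpoint)   # f(1/2) = 0, so error is ~0.25
err_uni = expected_error(f, uniform)    # averages out to ~0.125
```

Against this particular f, the uniform distribution halves the expected error of the deterministic midpoint query, illustrating why randomizing the sample point can help against an adversary (which, of course, gets to pick a different f for each distribution).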
