Gaussian discrepancy: a probabilistic relaxation of vector balancing

@article{Chewi2022GaussianDA,
  title={Gaussian discrepancy: a probabilistic relaxation of vector balancing},
  author={Sinho Chewi and P. Dean Gerber and Philippe Rigollet and Paxton Turner},
  journal={Discret. Appl. Math.},
  year={2022},
  volume={322},
  pages={123-141}
}

A note on spherical discrepancy

A non-algorithmic, generalized version of a recent result is proved, asserting that a natural relaxation of the Komlós conjecture from boolean discrepancy to spherical discrepancy is true.

References

Showing 1–10 of 41 references

Discrepancy minimization via a self-balancing walk

Bounds that are tight up to logarithmic factors for online vector balancing against oblivious adversaries are obtained, resolving several questions posed by Bansal, Jiang, Singla, and Sinha (STOC 2020), together with a linear-time algorithm achieving logarithmic bounds for the Komlós conjecture.
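
The self-balancing step is simple enough to state in a few lines. Below is a minimal NumPy sketch, assuming input vectors of Euclidean norm at most 1 arriving online; the function name and the constant 30 in the choice c = Θ(log(nT)) are illustrative assumptions, not the paper's exact tuning.

import numpy as np

def self_balancing_walk(vectors, c, rng=None):
    """Online signing of vectors of norm <= 1, following the self-balancing
    walk of Alweiss, Liu, and Sawhney (sketch). Each sign is +1 with
    probability 1/2 - <w, v>/(2c), biasing the walk back toward the origin;
    for c = Theta(log(nT)) every partial sum stays O(c) in sup-norm w.h.p.
    """
    rng = np.random.default_rng(rng)
    w = np.zeros(len(vectors[0]))
    signs = []
    for v in vectors:
        inner = np.clip(w @ v, -c, c)  # the analysis conditions on |<w,v>| <= c, which holds w.h.p.
        eps = 1 if rng.random() < 0.5 - inner / (2 * c) else -1
        w += eps * v
        signs.append(eps)
    return np.array(signs), w

# illustrative use on random unit vectors
rng = np.random.default_rng(0)
n, T = 50, 1000
vs = rng.standard_normal((T, n))
vs /= np.linalg.norm(vs, axis=1, keepdims=True)
signs, w = self_balancing_walk(vs, c=30 * np.log(n * T))
print(np.abs(w).max())  # polylogarithmic in n and T w.h.p.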

Balancing vectors and Gaussian measures of n -dimensional convex bodies

Let ‖·‖ be the Euclidean norm on ℝ^n and γ_n the (standard) Gaussian measure on ℝ^n with density (2π)^{−n/2} e^{−‖x‖²/2}. It is proved that there is a numerical constant c > 0 with the following property: if K is a convex body in ℝ^n with γ_n(K) ≥ 1/2, then for any vectors u_1, …, u_m with ‖u_i‖ ≤ c there exist signs ε_1, …, ε_m ∈ {−1, 1} such that ε_1 u_1 + ⋯ + ε_m u_m ∈ K.
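
In display form, the completed statement (restated from the standard formulation of Banaszczyk's theorem rather than quoted from the abstract) reads:

\gamma_n(K) \ge \tfrac{1}{2} \quad\text{and}\quad \|u_1\|, \dots, \|u_m\| \le c
\quad\Longrightarrow\quad
\exists\, \varepsilon \in \{-1, 1\}^m : \ \sum_{i=1}^{m} \varepsilon_i u_i \in K.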

The Gram-Schmidt walk: a cure for the Banaszczyk blues

This paper gives an efficient randomized algorithm to find a ±1 combination of the vectors which lies in cK for an absolute constant c > 0, which leads to new efficient algorithms for several problems in discrepancy theory.
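
The algorithm itself is compact. The following NumPy sketch, assuming columns of norm at most 1, follows the paper's pivot rule (largest alive index) and randomized step sizes; it is an illustrative reconstruction, not the authors' reference implementation.

import numpy as np

def gram_schmidt_walk(A, rng=None, tol=1e-9):
    """Sketch of the Gram-Schmidt walk on the columns of A (shape m x n,
    column norms <= 1). Returns a +-1 coloring x; the paper shows A @ x
    is O(1)-subgaussian, matching Banaszczyk's bound constructively."""
    rng = np.random.default_rng(rng)
    m, n = A.shape
    x = np.zeros(n)
    alive = list(range(n))
    while alive:
        p = alive[-1]  # pivot: largest-index alive element
        rest = [i for i in alive if i != p]
        u = np.zeros(n)
        u[p] = 1.0
        if rest:
            # choose the other coefficients so that A @ u is as short as
            # possible, i.e. a_p minus its projection onto span{a_i : i in rest}
            coef, *_ = np.linalg.lstsq(A[:, rest], -A[:, p], rcond=None)
            u[rest] = coef
        # largest steps keeping x + delta*u and x - delta*u inside [-1, 1]^n
        idx = [i for i in alive if abs(u[i]) > tol]
        d_plus = min((1.0 - x[i]) / u[i] if u[i] > 0 else (-1.0 - x[i]) / u[i] for i in idx)
        d_minus = min((x[i] + 1.0) / u[i] if u[i] > 0 else (x[i] - 1.0) / u[i] for i in idx)
        # randomize the direction so that x evolves as a martingale
        delta = d_plus if rng.random() < d_minus / (d_plus + d_minus) else -d_minus
        x += delta * u
        alive = [i for i in alive if abs(x[i]) < 1.0 - tol]  # freeze elements that hit +-1
    return np.sign(x)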

Concentration Inequalities

This text attempts to summarize some of the basic tools used in establishing concentration inequalities, which are at the heart of the mathematical analysis of various problems in machine learning and have made it possible to derive new efficient algorithms.

The Komlós Conjecture Holds for Vector Colorings

Here it is proved that if the columns of A are assigned unit real vectors rather than ±1 entries, then the Komlós conjecture holds with K = 1, which opens the way to proving tighter efficient (polynomial-time computable) upper bounds for the conjecture using semidefinite programming techniques.
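
In symbols, with a_1, …, a_n denoting the columns of A under the usual Komlós normalization (a restatement, not quoted from the paper):

\|a_j\|_2 \le 1 \ (1 \le j \le n)
\quad\Longrightarrow\quad
\exists\ \text{unit vectors } u_1, \dots, u_n : \ \max_{1 \le i \le m} \Big\| \sum_{j=1}^{n} A_{ij}\, u_j \Big\|_2 \le 1.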

Concentration Inequalities: A Nonasymptotic Theory of Independence

  • S. Boucheron, G. Lugosi, P. Massart
  • 2013

High-Dimensional Probability: An Introduction with Applications in Data Science

A random projection of a set T in ℝ^n onto an m-dimensional subspace approximately preserves the geometry of T if m ≳ d(T).
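
For the special case of a finite point set T the guarantee is easy to check numerically (there m ≳ log|T| suffices, the Johnson–Lindenstrauss regime). A small NumPy demonstration, with all dimensions chosen arbitrarily for illustration:

import numpy as np

rng = np.random.default_rng(1)
n, m, N = 500, 40, 100                        # ambient dim, target dim, |T|
T = rng.standard_normal((N, n))               # a finite set T in R^n
P = rng.standard_normal((m, n)) / np.sqrt(m)  # random projection, scaled to preserve norms on average
PT = T @ P.T

orig = np.linalg.norm(T[:, None] - T[None, :], axis=2)    # pairwise distances in R^n
proj = np.linalg.norm(PT[:, None] - PT[None, :], axis=2)  # ... and after projection
mask = orig > 0
print(np.abs(proj[mask] / orig[mask] - 1).max())  # max relative distortion, roughly sqrt(log(N)/m)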

Lectures on Convex Optimization

Yu. Nesterov's graduate textbook on the theory and algorithms of convex optimization, an expanded successor to his Introductory Lectures on Convex Optimization.

An Algorithm for Komlós Conjecture Matching Banaszczyk's Bound

  • N. Bansal, D. Dadush, S. Garg
  • Computer Science, Mathematics
    2016 IEEE 57th Annual Symposium on Foundations of Computer Science (FOCS)
  • 2016
An efficient algorithm is given that finds a coloring with discrepancy O((t log n)^{1/2}), matching the best known non-constructive bound for the problem due to Banaszczyk, and gives an algorithmic O(log^{1/2} n) bound.

Constructive Algorithms for Discrepancy Minimization

  • N. Bansal
  • Mathematics, Computer Science
    2010 IEEE 51st Annual Symposium on Foundations of Computer Science
  • 2010
The main idea in the algorithms is to produce a coloring over time by letting the color of the elements perform a random walk (with tiny increments) starting from 0 until they reach ±1, as in the toy sketch below.
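
As an illustration of this "floating colors" mechanism only: the sketch below uses independent increments, whereas Bansal's algorithm draws correlated Gaussian increments from the solution of a semidefinite program at each step in order to control the discrepancy of every set simultaneously.

import numpy as np

def floating_colors(n, step=0.01, rng=None):
    """Toy version of the rounding mechanism: every element's color performs
    a random walk with tiny increments starting from 0 and is frozen once it
    reaches +-1. (Independent increments shown here for illustration; the
    actual algorithm correlates them via an SDP solution.)"""
    rng = np.random.default_rng(rng)
    x = np.zeros(n)
    alive = np.abs(x) < 1.0
    while alive.any():
        x[alive] += step * rng.choice([-1.0, 1.0], size=alive.sum())
        x = np.clip(x, -1.0, 1.0)  # freeze colors at the boundary
        alive = np.abs(x) < 1.0
    return x  # a +-1 coloring

print(floating_colors(10, rng=0))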