• Corpus ID: 221819534

Distributionally Robust Variance Minimization: Tight Variance Bounds over $f$-Divergence Neighborhoods

@article{Birrell2020DistributionallyRV,
  title={Distributionally Robust Variance Minimization: Tight Variance Bounds over $f$-Divergence Neighborhoods},
  author={Jeremiah Birrell},
  journal={arXiv: Optimization and Control},
  year={2020}
}
  • Jeremiah Birrell
  • Published 19 September 2020
  • Mathematics
  • arXiv: Optimization and Control
Distributionally robust optimization (DRO) is a widely used framework for optimizing objective functionals in the presence of both randomness and model-form uncertainty. A key step in the practical solution of many DRO problems is a tractable reformulation of the optimization over the chosen model ambiguity set, which is generally infinite-dimensional. Previous works have solved this problem in the case where the objective functional is an expected value. In this paper we study objective… 
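To make the setting concrete, here is a minimal sketch of the two formulations suggested by the title and abstract. The notation ($P$ for the baseline model, $D_f$ for the chosen $f$-divergence, $\eta$ for the neighborhood radius, $\ell$ and $g$ for generic functionals) is chosen here for illustration and is not taken from the paper.

```latex
% Illustrative sketch only: notation chosen here, not taken from the paper.
% P is the baseline model, D_f the chosen f-divergence, eta the neighborhood radius.

% Prior work: worst-case expected value over the f-divergence neighborhood
\[
  \inf_{\theta \in \Theta} \;
  \sup_{Q \,:\, D_f(Q \,\|\, P) \le \eta} \mathbb{E}_Q\!\big[\ell(\theta, X)\big]
\]

% This paper: tight bounds on the worst-case variance over the same type of neighborhood
\[
  \sup_{Q \,:\, D_f(Q \,\|\, P) \le \eta} \operatorname{Var}_Q\!\big[g(X)\big]
\]
```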

References

SHOWING 1-10 OF 41 REFERENCES
Kullback-Leibler divergence constrained distributionally robust optimization
TLDR
The main contribution of the paper is to show that KL-divergence-constrained DRO problems are often of the same complexity as their original stochastic programming problems and, thus, that the KL divergence appears to be a good candidate for modeling distribution ambiguities in mathematical programming (a numerical sketch of this dual reformulation appears after the reference list).
Distributionally Robust Convex Optimization
TLDR
A unifying framework for modeling and solving distributionally robust optimization problems is proposed, together with standardized ambiguity sets that contain all distributions with prescribed conic-representable confidence sets and with mean values residing on an affine manifold.
Robust Solutions of Optimization Problems Affected by Uncertain Probabilities
TLDR
The robust counterpart of a linear optimization problem with φ-divergence uncertainty is shown to be tractable for most choices of φ typically considered in the literature, and the results are extended to problems that are nonlinear in the optimization variables.
Data-driven distributionally robust optimization using the Wasserstein metric: performance guarantees and tractable reformulations
TLDR
It is demonstrated that distributionally robust optimization problems over Wasserstein balls can in fact be reformulated as finite convex programs, in many interesting cases even as tractable linear programs.
Recovering Best Statistical Guarantees via the Empirical Divergence-Based Distributionally Robust Optimization
  • H. Lam
  • Computer Science, Mathematics
    Oper. Res.
  • 2019
TLDR
This work investigates the use of distributionally robust optimization as a tractable tool to recover the asymptotic statistical guarantees provided by the Central Limit Theorem, and shows that using empirically defined Burg-entropy divergence balls to construct the DRO can attain such guarantees.
Robust Bounds on Risk-Sensitive Functionals via Rényi Divergence
TLDR
The derived formula characterizes the dependence of risk-sensitive functionals and related quantities determined by tail behavior on perturbations in the underlying distributions, in terms of the Rényi divergence, giving rise to upper and lower bounds that are meaningful for all values of a large deviation scaling parameter.
Distributionally Robust Optimization Under Moment Uncertainty with Application to Data-Driven Problems
TLDR
This paper proposes a model that describes uncertainty in both the distribution form (discrete, Gaussian, exponential, etc.) and moments (mean and covariance matrix) and demonstrates that for a wide range of cost functions the associated distributionally robust stochastic program can be solved efficiently.
Variance Minimization in Stochastic Systems
TLDR
A novel solution approach is developed in this chapter to tackle variance minimization problems by exploiting special features of variance minimization and by adopting convexification and separation schemes.
Distributionally Robust Optimization and Its Tractable Approximations
TLDR
A modular framework is presented for obtaining an approximate solution that is distributionally robust and more flexible than the standard technique of using linear rules.
Distributionally Robust Stochastic Optimization with Wasserstein Distance
Distributionally robust stochastic optimization (DRSO) is an approach to optimization under uncertainty in which, instead of assuming that there is an underlying probability distribution that is… 
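The tractable reformulation highlighted in the KL-divergence reference above can be illustrated numerically. The sketch below relies on the standard dual identity sup over Q with KL(Q||P) ≤ η of E_Q[ℓ] = inf over λ > 0 of λ log E_P[exp(ℓ/λ)] + λη, applied to the empirical distribution of sampled losses; the loss values, the radius, and all function names are illustrative placeholders, not code from the paper or the cited work.

```python
# Minimal numerical sketch (not from the paper) of the KL-constrained DRO
# reformulation: the worst-case expectation over a KL ball of radius eta
# around the empirical distribution reduces to a 1-D minimization over a
# dual variable lam > 0.
import numpy as np
from scipy.special import logsumexp
from scipy.optimize import minimize_scalar

def worst_case_expectation(losses, eta):
    """Upper bound on E_Q[loss] over the KL ball of radius eta
    around the empirical distribution of `losses`."""
    losses = np.asarray(losses, dtype=float)
    n = losses.size

    def dual(lam):
        # lam * log( (1/n) * sum_i exp(losses_i / lam) ) + lam * eta,
        # computed stably via logsumexp.
        return lam * (logsumexp(losses / lam) - np.log(n)) + lam * eta

    res = minimize_scalar(dual, bounds=(1e-6, 1e3), method="bounded")
    return res.fun

# Illustrative usage with synthetic losses.
rng = np.random.default_rng(0)
samples = rng.normal(loc=1.0, scale=2.0, size=1000)
print(worst_case_expectation(samples, eta=0.1))  # robust (worst-case) bound
print(samples.mean())                            # nominal expectation
```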