A Stochastic Derivative-Free Optimization Method with Importance Sampling: Theory and Learning to Control

@inproceedings{Bibi2020ASD,
  title={A Stochastic Derivative-Free Optimization Method with Importance Sampling: Theory and Learning to Control},
  author={Adel Bibi and El Houcine Bergou and Ozan Sener and Bernard Ghanem and Peter Richt{\'a}rik},
  booktitle={AAAI},
  year={2020}
}
  • Adel Bibi, El Houcine Bergou, Ozan Sener, Bernard Ghanem, Peter Richtárik
  • Published in AAAI 2020
  • Computer Science, Mathematics
  • We consider the problem of unconstrained minimization of a smooth objective function in $\mathbb{R}^n$ in a setting where only function evaluations are possible. While importance sampling is one of the most popular techniques used by machine learning practitioners to accelerate the convergence of their models when applicable, there is not much existing theory for this acceleration in the derivative-free setting. In this paper, we propose an importance sampling version of the stochastic three points (STP) method. …

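The truncated abstract points to a three-point zeroth-order scheme driven by a biased sampling distribution over search directions. The following is a minimal Python sketch of such a scheme, assuming coordinate search directions sampled with probabilities proportional to per-coordinate smoothness constants L_i; the name stp_is, the step-size rule, and the assumption that L is known are illustrative choices, not the paper's exact algorithm or its theoretical step sizes.

import numpy as np

def stp_is(f, x0, L, n_iters=1000, rng=None):
    """Three-point zeroth-order descent with importance-sampled coordinates.

    f       : objective; only function evaluations are available
    x0      : initial point in R^n
    L       : per-coordinate smoothness constants (assumed known here)
    n_iters : number of iterations
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float).copy()
    L = np.asarray(L, dtype=float)
    p = L / L.sum()                            # importance-sampling distribution
    fx = f(x)
    for k in range(n_iters):
        i = rng.choice(len(x), p=p)            # biased coordinate pick
        alpha = 1.0 / (L[i] * np.sqrt(k + 1))  # illustrative step size
        e = np.zeros_like(x)
        e[i] = 1.0
        f_plus, f_minus = f(x + alpha * e), f(x - alpha * e)
        # keep the best of the three candidate points x, x + alpha*e, x - alpha*e
        if f_plus < fx and f_plus <= f_minus:
            x, fx = x + alpha * e, f_plus
        elif f_minus < fx:
            x, fx = x - alpha * e, f_minus
    return x, fx

# Example: separable quadratic with heterogeneous curvature,
# where probing steep coordinates more often helps.
L = np.array([1.0, 10.0, 100.0])
f = lambda x: 0.5 * float(np.dot(L * x, x))
x_star, f_star = stp_is(f, np.ones(3), L, n_iters=5000)

The importance-sampling intuition: coordinates with larger L_i change the objective more per unit step, so a distribution biased toward them tends to find a descent direction among the three candidate points in fewer function evaluations than uniform sampling would.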