Corpus ID: 218673981

Gradient Sampling Methods with Inexact Subproblem Solutions and Gradient Aggregation

@article{Curtis2020GradientSM,
  title={Gradient Sampling Methods with Inexact Subproblem Solutions and Gradient Aggregation},
  author={Frank E. Curtis and Minhan Li},
  journal={arXiv: Optimization and Control},
  year={2020}
}
Abstract: Gradient sampling (GS) has proved to be an effective methodology for the minimization of nonsmooth, nonconvex objective functions. The most computationally expensive component of a contemporary GS method is the need to solve a convex quadratic subproblem in each iteration. In this paper, a strategy is proposed that allows the use of inexact solutions of these subproblems, which, as proved in the paper, can be incorporated without the loss of theoretical convergence guarantees. Numerical…
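The abstract's core loop can be illustrated with a minimal sketch: sample gradients in a ball around the current iterate, solve the convex QP (find the minimum-norm element of the convex hull of those gradients), step along its negative, and shrink the sampling radius when near-stationary. This is not the authors' method; it is a generic GS sketch, and the QP is solved only approximately with a few Frank-Wolfe iterations, loosely in the spirit of the paper's inexact-subproblem theme. All names, tolerances, and the test function are illustrative assumptions.

```python
import numpy as np

def min_norm_in_hull(G, iters=200):
    """Approximately solve min_{lam in simplex} ||G @ lam|| via Frank-Wolfe.
    This is an inexact solution of the GS quadratic subproblem."""
    m = G.shape[1]
    lam = np.full(m, 1.0 / m)
    for k in range(iters):
        grad = G.T @ (G @ lam)      # gradient of 0.5 * ||G lam||^2 w.r.t. lam
        j = int(np.argmin(grad))    # best vertex of the simplex
        gamma = 2.0 / (k + 2.0)     # standard Frank-Wolfe step size
        lam = (1.0 - gamma) * lam
        lam[j] += gamma
    return G @ lam                  # approximate minimum-norm subgradient

def gradient_sampling(f, grad_f, x0, eps=0.5, m=20, max_iter=200, seed=0):
    """Generic gradient sampling sketch (illustrative parameters)."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        # Gradients at x and at m random points in an eps-ball around x.
        pts = x + eps * rng.uniform(-1.0, 1.0, size=(m, x.size))
        G = np.column_stack([grad_f(x)] + [grad_f(p) for p in pts])
        g = min_norm_in_hull(G)
        if np.linalg.norm(g) < 1e-3:
            eps *= 0.5              # near-stationary at this radius: shrink it
            if eps < 1e-6:
                break
            continue
        # Simple Armijo backtracking line search along -g.
        t = 1.0
        while f(x - t * g) > f(x) - 1e-4 * t * np.dot(g, g) and t > 1e-12:
            t *= 0.5
        x = x - t * g
    return x

# Nonsmooth, nonconvex test: f(x, y) = |x| + (y^2 - 1)^2, minimized at (0, ±1).
f = lambda z: abs(z[0]) + (z[1] ** 2 - 1.0) ** 2
grad_f = lambda z: np.array([np.sign(z[0]), 4.0 * z[1] * (z[1] ** 2 - 1.0)])
x_star = gradient_sampling(f, grad_f, [1.5, 0.3])
```

The Armijo test guarantees monotone descent, so `f(x_star)` is no worse than the starting value; sampling makes the min-norm subgradient average the two signs of `|x|` near the kink, which is what lets GS certify approximate stationarity at a nonsmooth point.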


References

Publications referenced by this paper (showing 1-10 of 45 references):
  • An adaptive gradient sampling algorithm for non-smooth optimization (40 citations)
  • A feasible SQP-GS algorithm for nonconvex, nonsmooth constrained optimization (12 citations)
  • A Sequential Quadratic Programming Algorithm for Nonconvex, Nonsmooth Constrained Optimization (77 citations)
  • A Robust Gradient Sampling Algorithm for Nonsmooth, Nonconvex Optimization (370 citations)
  • A quasi-Newton algorithm for nonconvex, nonsmooth optimization with global convergence guarantees (28 citations)
  • Convergence of the Gradient Sampling Algorithm for Nonsmooth Nonconvex Optimization (70 citations)
  • A Redistributed Proximal Bundle Method for Nonconvex Optimization (69 citations)
  • A Linearization Algorithm for Nonsmooth Minimization (18 citations)