# Gradient Sampling Methods with Inexact Subproblem Solutions and Gradient Aggregation

@article{Curtis2020GradientSM,
  title   = {Gradient Sampling Methods with Inexact Subproblem Solutions and Gradient Aggregation},
  author  = {Frank E. Curtis and Minhan Li},
  journal = {arXiv: Optimization and Control},
  year    = {2020}
}

Gradient sampling (GS) has proved to be an effective methodology for the minimization of nonsmooth, nonconvex objective functions. The most computationally expensive component of a contemporary GS method is the need to solve a convex quadratic subproblem in each iteration. In this paper, a strategy is proposed that allows the use of inexact solutions of these subproblems, which, as proved in the paper, can be incorporated without the loss of theoretical convergence guarantees. Numerical…
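The computational step the abstract highlights is forming a descent direction from the minimum-norm element of the convex hull of sampled gradients, which requires solving a convex quadratic subproblem at every iteration. Below is a minimal sketch of that idea, not the paper's algorithm: the test function `f`, the Frank–Wolfe inner solver (whose iteration cap `iters` stands in for subproblem inexactness), and all parameter values are illustrative assumptions.

```python
import numpy as np

def f(x):
    # Assumed nonsmooth test function: |x0| + x1^2, minimized at the origin.
    return abs(x[0]) + x[1] ** 2

def grad(x):
    # Gradient where it exists (almost everywhere).
    return np.array([np.sign(x[0]), 2.0 * x[1]])

def min_norm_element(G, iters):
    """Approximately solve min_{lam in simplex} ||G @ lam||^2 by Frank-Wolfe.
    A small `iters` yields an inexact subproblem solution."""
    m = G.shape[1]
    lam = np.full(m, 1.0 / m)
    for k in range(iters):
        g = G @ lam                  # current convex combination of gradients
        scores = G.T @ g             # linearized objective at each vertex
        j = np.argmin(scores)        # best simplex vertex
        step = 2.0 / (k + 2.0)       # standard Frank-Wolfe step size
        e = np.zeros(m)
        e[j] = 1.0
        lam = (1 - step) * lam + step * e
    return G @ lam

def gradient_sampling(x0, eps=0.1, m=10, iters=5, max_iter=200, seed=0):
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        # Sample gradients at the current point and m nearby points
        # (a box neighborhood is used here as a simplification of a ball).
        pts = x + eps * rng.uniform(-1, 1, size=(m, x.size))
        G = np.column_stack([grad(p) for p in [x] + list(pts)])
        d = -min_norm_element(G, iters)   # inexact descent direction
        if np.linalg.norm(d) < 1e-8:
            eps *= 0.5                    # near-stationary: shrink radius
            continue
        t = 1.0                           # Armijo backtracking line search
        while f(x + t * d) > f(x) - 1e-4 * t * np.dot(d, d) and t > 1e-12:
            t *= 0.5
        x = x + t * d
    return x
```

Capping the Frank–Wolfe iterations is one simple way to produce an inexact subproblem solution; the paper's contribution is a principled criterion for how inexact such solutions may be while preserving convergence guarantees.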

