On Graduated Optimization for Stochastic Non-Convex Problems


The graduated optimization approach, also known as the continuation method, is a popular heuristic for solving non-convex problems that has received renewed interest over the last decade. Despite its popularity, little is known about its theoretical convergence guarantees. In this paper we describe a new first-order algorithm based on graduated…
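To make the idea concrete, the following is a minimal sketch of generic graduated optimization (not the paper's specific algorithm): the objective is replaced by a sequence of Gaussian-smoothed surrogates, solved from coarse to fine smoothing, warm-starting each stage from the previous solution. The smoothed gradient is estimated stochastically from function values only, so the whole loop is first-order and stochastic. The objective `f`, the smoothing schedule, and the step size below are illustrative assumptions.

```python
import numpy as np

def smoothed_grad(f, x, sigma, rng, n_samples=100):
    # Monte-Carlo estimate of the gradient of the Gaussian smoothing
    # f_sigma(x) = E[f(x + sigma*u)], u ~ N(0, I), using the identity
    # grad f_sigma(x) = E[(f(x + sigma*u) - f(x)) * u] / sigma
    # (the f(x) baseline reduces the variance of the estimate).
    u = rng.standard_normal((n_samples, x.size))
    vals = np.array([f(x + sigma * ui) - f(x) for ui in u])
    return (vals[:, None] * u).mean(axis=0) / sigma

def graduated_descent(f, x0, sigmas=(1.0, 0.5, 0.1), steps=300, lr=0.01, seed=0):
    # Solve a sequence of progressively less-smoothed problems,
    # warm-starting each stage from the previous stage's iterate.
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for sigma in sigmas:
        for _ in range(steps):
            x = x - lr * smoothed_grad(f, x, sigma, rng)
    return x

# Toy non-convex objective (a hypothetical example): a quadratic bowl
# overlaid with high-frequency ripples that create spurious local minima.
f = lambda x: float(np.dot(x, x) + np.cos(8.0 * np.linalg.norm(x)))
x_star = graduated_descent(f, x0=[2.0])
```

At large `sigma` the ripples are smoothed away and the iterate tracks the underlying bowl; as `sigma` shrinks, the iterate is refined within the basin of the global minimum rather than getting trapped in an outer local minimum, which is the intuition behind the continuation heuristic.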


5 Figures and Tables

