• Corpus ID: 251040598

A Continuous-Time Perspective on Optimal Methods for Monotone Equation Problems

@inproceedings{Lin2022ACP,
  title={A Continuous-Time Perspective on Optimal Methods for Monotone Equation Problems},
  author={Tianyi Lin and Michael I. Jordan},
  year={2022}
}
We study rescaled gradient dynamical systems in a Hilbert space H, where implicit discretization in a finite-dimensional Euclidean space leads to high-order methods for solving monotone equations (MEs). Our framework can be interpreted as a natural generalization of the celebrated dual extrapolation method [Nesterov, 2007] from first order to high order via an appeal to the regularization toolbox of optimization theory [Nesterov, 2021a,b]. More specifically, we establish the existence and uniqueness of…
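To make the abstract's central idea concrete, here is a minimal sketch (not the paper's algorithm) of how implicit discretization of the simplest first-order flow ẋ(t) = −F(x(t)) for a monotone operator F yields an iterative method: backward Euler gives the classical proximal point iteration x_{k+1} = (I + hF)^{-1}(x_k). For an affine monotone operator F(x) = Ax + b each step is a linear solve; the operator, step size, and iteration count below are illustrative choices.

```python
import numpy as np

def proximal_point(A, b, x0, h=1.0, num_steps=200):
    """Backward-Euler discretization of x'(t) = -(A x + b).

    Each implicit step solves (I + h A) x_{k+1} = x_k - h b,
    i.e. applies the resolvent (I + h F)^{-1} of F(x) = A x + b.
    """
    n = len(x0)
    M = np.eye(n) + h * A  # resolvent matrix (I + h A)
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(num_steps):
        x = np.linalg.solve(M, x - h * b)
    return x

# Monotone (here: symmetric positive definite) example with known root
# of F, i.e. A x* = -b, so x* = [1, 2].
A = np.array([[2.0, 0.0], [0.0, 0.5]])
b = np.array([-2.0, -1.0])
x = proximal_point(A, b, x0=np.zeros(2))
# x -> [1, 2]
```

The implicit step is unconditionally stable for monotone F, which is why the continuous-time view is a productive lens: higher-order discretizations of richer (rescaled) dynamics are what lead to the higher-order methods the paper studies.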

References

Showing 1–10 of 153 references

Dual extrapolation and its applications to solving variational inequalities and related problems

  • Y. Nesterov
  • Mathematics, Computer Science
    Math. Program.
  • 2007
TLDR
This paper shows that, with an appropriate step-size strategy, the dual extrapolation method is optimal both for Lipschitz continuous operators and for operators with bounded variation.

Perseus: A Simple High-Order Regularization Method for Variational Inequalities

TLDR
This paper proposes a pth-order method that does not require any binary search scheme and is guaranteed to converge to a weak solution at a global rate of O(ε^{−2/(p+1)}).

A Control-Theoretic Perspective on Optimal High-Order Optimization

TLDR
A control-theoretic perspective on optimal tensor algorithms for minimizing a convex function in a finite-dimensional Euclidean space is provided.

Generalized Optimistic Methods for Convex-Concave Saddle Point Problems

TLDR
This paper distills the underlying idea of optimism to propose a generalized optimistic method, which encompasses the optimistic gradient method as a special case, and develops an adaptive line search scheme to select the step sizes without knowledge of the smoothness coefficients.

Higher-order methods for convex-concave min-max optimization and monotone variational inequalities

TLDR
The results improve upon the iteration complexity of the first-order Mirror Prox method of Nemirovski and the second-order method of Monteiro and Svaiter, and give improved convergence rates for constrained convex-concave min-max problems and monotone variational inequalities with higher-order smoothness.

Finite-Dimensional Variational Inequalities and Complementarity Problems

Contents: Newton Methods for Nonsmooth Equations; Global Methods for Nonsmooth Equations; Equation-Based Algorithms for Complementarity Problems; Algorithms for Variational Inequalities; Interior and …

The Second-order in Time Continuous Newton Method

Let H be a real Hilbert space and Φ : H → ℝ a twice continuously differentiable function whose Hessian is Lipschitz continuous on bounded sets. We study the Newton-like second-order in time…

Newton-Like Dynamics and Forward-Backward Methods for Structured Monotone Inclusions in Hilbert Spaces

TLDR
Time discretization of these dynamics gives algorithms that combine Newton's method with forward-backward methods for solving structured monotone inclusions; a descent minimizing property and weak convergence of the trajectories to equilibria are proved.

Chemical equilibrium systems as numerical test problems

TLDR
A system of nonlinear equations purported to describe the equilibrium of the products of hydrocarbon combustion does not describe the stated physical problem, a fact which invalidates it as a test of solution methods for chemical equilibrium systems.

Fast convex optimization via inertial dynamics with Hessian driven damping

...