# Fast OGDA in continuous and discrete time

```bibtex
@inproceedings{Bo2022FastOI,
  title  = {Fast OGDA in continuous and discrete time},
  author = {Radu Ioan Boţ and Ern{\"o} Robert Csetnek and Dang-Khoa Nguyen},
  year   = {2022}
}
```

In the framework of real Hilbert spaces we study continuous-time dynamics as well as numerical algorithms for the problem of approaching the set of zeros of a single-valued monotone and continuous operator V. The starting point of our investigations is a second-order dynamical system that combines a vanishing damping term with the time derivative of V along the trajectory, which can be seen as an analogue of the Hessian-driven damping in the case where the operator originates from a potential…
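As background for the abstract above, the classical (non-accelerated) OGDA update that the paper builds on can be sketched as follows. This is a minimal illustration, not the paper's fast variant; the operator `V` and step size are toy choices for a bilinear saddle problem, and all names are hypothetical.

```python
import numpy as np

def ogda(V, z0, step=0.1, iters=2000):
    """Classical OGDA sketch: z_{k+1} = z_k - 2*step*V(z_k) + step*V(z_{k-1})."""
    z = np.asarray(z0, dtype=float)
    v_prev = V(z)  # initialize the "optimistic" memory term with V at the start point
    for _ in range(iters):
        v = V(z)
        z = z - 2 * step * v + step * v_prev  # extrapolated gradient step
        v_prev = v
    return z

# Toy monotone operator from the bilinear saddle f(x, y) = x*y: V(x, y) = (y, -x).
# Its unique zero is the origin.
V = lambda z: np.array([z[1], -z[0]])
z_star = ogda(V, [1.0, 1.0])
```

For this bilinear example plain gradient descent-ascent diverges, while the optimistic correction term `step * v_prev` yields convergence to the zero of V.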

## 4 Citations

### Convergence of Proximal Point and Extragradient-Based Methods Beyond Monotonicity: the Case of Negative Comonotonicity

- Computer Science
- 2022

This work provides tight complexity analyses for the Proximal Point, Extragradient, and Optimistic Gradient methods in this setup, closing some questions on their working guarantees beyond monotonicity.

### Continuous-time Analysis for Variational Inequalities: An Overview and Desiderata

- Computer Science, ArXiv
- 2022

An overview of recent progress in the use of continuous-time perspectives in the analysis and design of methods targeting the broad VI problem class is provided and various desiderata for algorithms that apply to general VIs are formulated.

### Solving Constrained Variational Inequalities via an Interior Point Method

- Mathematics, ArXiv
- 2022

We develop an interior-point approach to solve constrained variational inequality (cVI) problems. Inspired by the efficacy of the alternating direction method of multipliers (ADMM) in the…

### Fast Krasnosel'skii-Mann algorithm with a convergence rate of the fixed point iteration of o(1/k)

- Mathematics, ArXiv
- 2022

The Krasnosel'skiĭ-Mann (KM) algorithm is the most fundamental iterative scheme designed to find a fixed point of an averaged operator in the framework of a real Hilbert space, since it lies at the…
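The basic KM iteration that this citation accelerates is simple to state: average the current iterate with its image under a nonexpansive map. A minimal sketch, with a toy nonexpansive operator (a 90-degree rotation) whose only fixed point is the origin; names and parameters are illustrative.

```python
import numpy as np

def krasnoselskii_mann(T, x0, lam=0.5, iters=100):
    """KM iteration: x_{k+1} = (1 - lam) * x_k + lam * T(x_k), lam in (0, 1)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = (1 - lam) * x + lam * T(x)
    return x

# Toy nonexpansive map: rotation by 90 degrees; its only fixed point is 0.
T = lambda x: np.array([-x[1], x[0]])
x_star = krasnoselskii_mann(T, [1.0, 1.0])
```

Iterating `T` alone would just rotate forever; the averaging with weight `lam` is what forces convergence to the fixed point.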

## References

Showing 1-10 of 44 references

### The Connection Between Nesterov's Accelerated Methods and Halpern Fixed-Point Iterations

- Mathematics
- 2022

We derive a direct connection between Nesterov’s accelerated first-order algorithm and the Halpern fixed-point iteration scheme for approximating a solution of a co-coercive equation. We show that…
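The Halpern fixed-point iteration mentioned here anchors every step back toward the starting point with a vanishing weight. A minimal sketch under toy assumptions (the standard anchor weights b_k = 1/(k+2) and the same illustrative rotation operator as a nonexpansive map); all names are hypothetical.

```python
import numpy as np

def halpern(T, x0, iters=1000):
    """Halpern iteration: x_{k+1} = b_k * x_0 + (1 - b_k) * T(x_k), b_k = 1/(k+2)."""
    anchor = np.asarray(x0, dtype=float)
    x = anchor.copy()
    for k in range(iters):
        b = 1.0 / (k + 2)
        x = b * anchor + (1 - b) * T(x)  # pull toward the anchor, then relax
    return x

# Toy nonexpansive map: 90-degree rotation with fixed point at the origin.
T = lambda x: np.array([-x[1], x[0]])
x_star = halpern(T, [1.0, 1.0])
```

Unlike KM averaging, the anchoring term guarantees strong convergence and, as the cited works show, yields the accelerated O(1/k) residual rates that connect to Nesterov's scheme.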

### Exact Optimal Accelerated Complexity for Fixed-Point Iterations

- Mathematics, ICML
- 2022

Despite the broad use of fixed-point iterations throughout applied mathematics, the optimal convergence rate of general fixed-point problems with nonexpansive nonlinear operators has not been…

### Last-Iterate Convergence of Saddle Point Optimizers via High-Resolution Differential Equations

- Mathematics, Computer Science, ArXiv
- 2021

This work adopts a framework studied in fluid dynamics, known as High-Resolution Differential Equations (HRDEs), to design differential equation models for several saddle-point optimization methods, and shows that the HRDE of Optimistic Gradient Descent Ascent (OGDA) exhibits last-iterate convergence for general monotone variational inequalities.

### Halpern-Type Accelerated and Splitting Algorithms For Monotone Inclusions

- Computer Science, Mathematics
- 2021

A new type of accelerated algorithm is developed to solve some classes of maximally monotone equations as well as monotone inclusions: a so-called Halpern-type fixed-point iteration for convex-concave minimax problems, together with a new accelerated Douglas-Rachford (DR) scheme from which a new variant of the alternating direction method of multipliers (ADMM) is derived.

### Extragradient Method: O(1/K) Last-Iterate Convergence for Monotone Variational Inequalities and Connections With Cocoercivity

- Mathematics, AISTATS
- 2022

The first last-iterate O(1/K) convergence rate for EG for monotone and Lipschitz VIPs, without any additional assumptions on the operator, is derived; the rate is given in terms of reducing the squared norm of the operator.
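The extragradient (EG) method this entry analyzes takes a lookahead step before committing to an update. A minimal sketch on the same toy bilinear saddle operator used above; the step size and iteration count are illustrative, not tuned to match the cited rate analysis.

```python
import numpy as np

def extragradient(V, z0, step=0.1, iters=2000):
    """EG: evaluate V at an extrapolated point, then update from the original point."""
    z = np.asarray(z0, dtype=float)
    for _ in range(iters):
        z_half = z - step * V(z)   # prediction (extrapolation) step
        z = z - step * V(z_half)   # correction step using the lookahead gradient
    return z

# Toy monotone operator V(x, y) = (y, -x) from the bilinear saddle f(x, y) = x*y.
V = lambda z: np.array([z[1], -z[0]])
z_star = extragradient(V, [1.0, 1.0])
```

The lookahead evaluation is what distinguishes EG from plain gradient descent-ascent (which cycles or diverges on this example) and from OGDA (which reuses the previous gradient instead of recomputing one).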

### Convergence of iterates for first-order optimization algorithms with inertia and Hessian driven damping

- Computer Science, Mathematics, Optimization
- 2021

In a Hilbert space setting, for convex optimization, we show the convergence of the iterates to optimal solutions for a class of accelerated first-order algorithms. They can be interpreted as…

### Accelerated Algorithms for Smooth Convex-Concave Minimax Problems with O(1/k^2) Rate on Squared Gradient Norm

- Computer Science, ICML
- 2021

This work presents algorithms with accelerated O(1/k^2) last-iterate rates on the squared gradient norm, faster than the existing O(1/k) or slower rates for extragradient, Popov, and gradient descent with anchoring, and establishes optimality of the O(1/k^2) rate through a matching lower bound.

### Fast Extra Gradient Methods for Smooth Structured Nonconvex-Nonconcave Minimax Problems

- Computer Science
- 2021

A two-time-scale EG with anchoring, named fast extragradient (FEG), is proposed; it achieves a fast O(1/k^2) rate on the squared gradient norm for smooth structured nonconvex-nonconcave problems whose saddle-gradient operator satisfies the negative comonotonicity condition.

### Newton-like Inertial Dynamics and Proximal Algorithms Governed by Maximally Monotone Operators

- Computer Science, SIAM J. Optim.
- 2020

The introduction of the Hessian damping in the continuous version of Nesterov's accelerated gradient method provides, by temporal discretization, fast proximal gradient algorithms where the oscilla...

### Continuous Newton-like Inertial Dynamics for Monotone Inclusions

- Mathematics
- 2020

In a Hilbert framework ℌ, we study the convergence properties of a Newton-like inertial dynamical system governed by a general maximally monotone operator A : ℌ → 2^ℌ. When A is equal to the…