Computational Optimal Transport: Complexity by Accelerated Gradient Descent Is Better Than by Sinkhorn's Algorithm
The first algorithm analyzed has a better dependence on $\varepsilon$ in the complexity bound; moreover, it is not specific to entropic regularization and can solve the OT problem with other regularizers.
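For context, the baseline this paper compares against is Sinkhorn's algorithm for entropy-regularized optimal transport. A minimal sketch of that baseline (not the paper's accelerated gradient method) could look like:

```python
import numpy as np

def sinkhorn(a, b, C, eps=0.1, n_iter=200):
    """Entropy-regularized OT via Sinkhorn's matrix-scaling iterations.

    Illustrative sketch: a, b are histograms, C is the cost matrix,
    eps is the entropic regularization strength.
    """
    K = np.exp(-C / eps)                 # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)                # scale to match column marginals
        u = a / (K @ v)                  # scale to match row marginals
    return u[:, None] * K * v[None, :]   # transport plan

# toy example: two uniform histograms on a 3-point grid
a = np.full(3, 1 / 3)
b = np.full(3, 1 / 3)
C = np.abs(np.subtract.outer(np.arange(3.0), np.arange(3.0)))
P = sinkhorn(a, b, C)
```

At convergence the plan's row and column sums match the two input histograms; Sinkhorn's iteration complexity scales as $O(1/\varepsilon^2)$, which is the dependence the paper's accelerated method improves.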
On the Complexity of Approximating Wasserstein Barycenters
- Alexey Kroshnin, N. Tupitsa, D. Dvinskikh, P. Dvurechensky, A. Gasnikov, César A. Uribe
- Computer Science, ICML
- 24 May 2019
The complexity of approximating the Wasserstein barycenter of m discrete measures, or histograms of size n, is studied by contrasting two alternative approaches that use entropic regularization, and a novel proximal-IBP algorithm is proposed which is seen as a proximal gradient method.
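The plain Iterative Bregman Projections scheme that the proposed proximal-IBP method builds on can be sketched as follows; this is an illustrative baseline in the style of Benamou et al., not the paper's algorithm:

```python
import numpy as np

def ibp_barycenter(P, C, weights, eps=1.0, n_iter=1000):
    """Entropic Wasserstein barycenter via plain Iterative Bregman Projections.

    Illustrative sketch. P: (m, n) array of m histograms of size n;
    C: (n, n) cost matrix; weights: barycentric weights summing to 1.
    """
    K = np.exp(-C / eps)                     # Gibbs kernel
    V = np.ones_like(P)                      # scaling variables, one row per measure
    for _ in range(n_iter):
        U = P / (V @ K.T)                    # u_i = p_i / (K v_i)
        # weighted geometric mean of the second marginals gives the barycenter
        q = np.exp(weights @ np.log(U @ K))  # q = prod_i (K^T u_i)^{w_i}
        V = q[None, :] / (U @ K)             # v_i = q / (K^T u_i)
    return q

# toy example: barycenter of two mirror-image histograms on a 4-point grid
grid = np.arange(4.0)
C = (grid[:, None] - grid[None, :]) ** 2
P = np.array([[0.7, 0.1, 0.1, 0.1],
              [0.1, 0.1, 0.1, 0.7]])
q = ibp_barycenter(P, C, np.array([0.5, 0.5]))
```

By symmetry of the two inputs, the computed barycenter is itself symmetric on the grid.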
Optimal Tensor Methods in Smooth Convex and Uniformly Convex Optimization
- A. Gasnikov, P. Dvurechensky, Eduard A. Gorbunov, E. Vorontsova, Daniil Selikhanovych, César A. Uribe
- Computer Science, Mathematics, COLT
- 3 February 2019
A new tensor method is proposed, which closes the gap between the lower and upper iteration complexity bounds for convex optimization problems whose objective function has a Lipschitz-continuous $p$-th order derivative, and it is shown that in practice it is faster than the best known accelerated tensor method.
Mirror Descent and Convex Optimization Problems with Non-smooth Inequality Constraints
- A. Bayandina, P. Dvurechensky, A. Gasnikov, F. Stonyakin, A. Titov
- Computer Science, Mathematics
- 18 October 2017
One focus is to propose a Mirror Descent method with adaptive stepsizes and an adaptive stopping rule for problems whose objective function is not Lipschitz, e.g., a quadratic function.
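A generic mirror descent iteration with a gradient-norm-adaptive stepsize can be sketched as below; this uses the entropic mirror map on the simplex and a simple decaying normalized step, not the paper's exact adaptive rule or stopping criterion:

```python
import numpy as np

def entropic_mirror_descent(grad, x0, n_iter=2000):
    """Mirror descent on the probability simplex with adaptive stepsizes.

    Illustrative sketch: step h_k = 1 / (sqrt(k+1) * ||g_k||_inf), so the
    step adapts to the gradient magnitude rather than a Lipschitz constant.
    Returns the stepsize-weighted average of the iterates.
    """
    x = x0.copy()
    avg = np.zeros_like(x0)
    wsum = 0.0
    for k in range(n_iter):
        g = grad(x)
        h = 1.0 / (np.sqrt(k + 1) * (np.linalg.norm(g, np.inf) + 1e-12))
        x = x * np.exp(-h * g)   # multiplicative (entropic) update
        x /= x.sum()             # KL projection back onto the simplex
        avg += h * x
        wsum += h
    return avg / wsum

# example: minimize the quadratic ||x - c||^2 over the simplex (c is feasible,
# so the minimizer is c itself; the objective is not globally Lipschitz)
c = np.array([0.2, 0.5, 0.3])
x = entropic_mirror_descent(lambda x: 2 * (x - c), np.full(3, 1 / 3))
```

The quadratic objective matches the abstract's point: its gradient grows linearly, so a fixed Lipschitz-based stepsize is unavailable and an adaptive rule is needed.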
Decentralize and Randomize: Faster Algorithm for Wasserstein Barycenters
- P. Dvurechensky, D. Dvinskikh, A. Gasnikov, César A. Uribe, Angelia Nedić
- Computer Science, Mathematics, NeurIPS
- 11 June 2018
A novel accelerated primal-dual stochastic gradient method is developed and applied to the decentralized distributed optimization setting to obtain a new algorithm for the distributed semi-discrete regularized Wasserstein barycenter problem.
Self-concordant analysis of Frank-Wolfe algorithms
- P. Dvurechensky, Shimrit Shtern, Mathias Staudigl, P. Ostroukhov, K. Safin
- Computer Science, Mathematics, ICML
- 11 February 2020
The theory of SC functions is used to provide a new adaptive step size for FW methods and to prove a global convergence rate of O(1/k) after k iterations; if the problem admits a stronger local linear minimization oracle, a novel FW method with a linear convergence rate for SC functions is constructed.
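The basic FW template being adapted here can be sketched as follows; this uses the classic 2/(k+2) step size on the probability simplex for illustration, not the self-concordance-based adaptive rule the paper derives:

```python
import numpy as np

def frank_wolfe(grad, x0, n_iter=500):
    """Frank-Wolfe over the probability simplex (generic sketch).

    The linear minimization oracle (LMO) over the simplex returns the
    vertex with the smallest gradient coordinate; the step size is the
    standard 2/(k+2) schedule, not the paper's adaptive SC-based rule.
    """
    x = x0.copy()
    for k in range(n_iter):
        g = grad(x)
        s = np.zeros_like(x)
        s[np.argmin(g)] = 1.0                # LMO: best vertex of the simplex
        gamma = 2.0 / (k + 2.0)              # classic FW step size
        x = (1 - gamma) * x + gamma * s      # stay inside the feasible set
    return x

# example: minimize ||x - c||^2 over the simplex; c is feasible, so x* = c
c = np.array([0.2, 0.5, 0.3])
x = frank_wolfe(lambda x: 2 * (x - c), np.full(3, 1 / 3))
```

FW is projection-free: every iterate is a convex combination of simplex vertices, so feasibility holds by construction, and the standard analysis gives the O(1/k) rate mentioned in the abstract.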
Stochastic Intermediate Gradient Method for Convex Problems with Stochastic Inexact Oracle
The first method is an extension of the Intermediate Gradient Method proposed by Devolder, Glineur and Nesterov for problems with a deterministic inexact oracle; it can be applied to problems with a composite objective function, handles both deterministic and stochastic inexactness of the oracle, and allows a non-Euclidean setup.
Primal–dual accelerated gradient methods with small-dimensional relaxation oracle
It is demonstrated how, in practice, one can efficiently combine line search and primal-duality by considering a convex optimization problem with a simple structure (for example, a linearly constrained one).
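The line-search ingredient can be illustrated on its own with a plain gradient method; this is a generic Armijo backtracking sketch, not the paper's small-dimensional relaxation oracle or its primal-dual scheme:

```python
import numpy as np

def gd_backtracking(f, grad, x0, n_iter=100):
    """Gradient descent with Armijo backtracking line search (illustrative).

    At each step, the stepsize t is halved until it yields sufficient
    decrease along the negative gradient direction.
    """
    x = x0.copy()
    for _ in range(n_iter):
        g = grad(x)
        t = 1.0
        # Armijo condition: f(x - t*g) <= f(x) - 0.5 * t * ||g||^2
        while f(x - t * g) > f(x) - 0.5 * t * (g @ g):
            t *= 0.5
        x = x - t * g
    return x

# example: strongly convex quadratic f(x) = 0.5 x^T A x - b^T x
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x = gd_backtracking(f, grad, np.zeros(2))
```

On this quadratic the minimizer solves the linear system A x = b, which gives a convenient correctness check for the line search.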
Near Optimal Methods for Minimizing Convex Functions with Lipschitz $p$-th Derivatives
Accelerated Alternating Minimization, Accelerated Sinkhorn's Algorithm and Accelerated Iterative Bregman Projections.
A generic accelerated alternating minimization method and its primal-dual modification for problems with linear constraints are proposed, enjoying a $1/k^2$ convergence rate, where $k$ is the iteration counter; in practice the method converges faster than Sinkhorn's algorithm.
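The building block being accelerated here is plain alternating minimization: minimize exactly over one block of variables while holding the other fixed. A minimal two-block sketch (without the paper's acceleration) is:

```python
def alternating_minimization(n_iter=60):
    """Plain (non-accelerated) alternating minimization, for illustration.

    Minimizes f(x, y) = (x - y)^2 + (x - 3)^2 + (y - 1)^2 by exact
    minimization over one scalar block at a time. Setting each partial
    derivative to zero gives the closed-form block updates below; the
    fixed point is (x, y) = (7/3, 5/3).
    """
    x, y = 0.0, 0.0
    for _ in range(n_iter):
        x = (y + 3.0) / 2.0   # argmin over x with y fixed
        y = (x + 1.0) / 2.0   # argmin over y with x fixed
    return x, y

x, y = alternating_minimization()
```

Sinkhorn's algorithm has exactly this structure (alternating exact minimization over the two dual scaling vectors), which is why an accelerated alternating minimization scheme yields an accelerated Sinkhorn's algorithm.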