Publications
Computational Optimal Transport: Complexity by Accelerated Gradient Descent Is Better Than by Sinkhorn's Algorithm
TLDR
The first algorithm analyzed not only has a better dependence on $\varepsilon$ in the complexity bound, but is also not specific to entropic regularization and can solve the OT problem with other regularizers.
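For context, the entropically regularized OT problem is classically solved by Sinkhorn-type matrix scaling, which is the baseline this paper compares against. Below is a minimal, illustrative sketch of the classical Sinkhorn iteration (not the accelerated gradient method analyzed in the paper); the function name, the regularization parameter gamma, and the iteration budget are chosen purely for illustration.

    import numpy as np

    def sinkhorn(C, r, c, gamma=0.01, n_iter=1000):
        # Entropy-regularized OT between histograms r and c with cost matrix C.
        # Illustrative sketch only: returns an approximate transport plan whose
        # marginals are close to r and c; gamma is the regularization strength.
        K = np.exp(-C / gamma)              # Gibbs kernel
        u = np.ones_like(r)
        for _ in range(n_iter):
            v = c / (K.T @ u)               # rescale to match column marginals
            u = r / (K @ v)                 # rescale to match row marginals
        return u[:, None] * K * v[None, :]  # transport plan diag(u) K diag(v)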
On the Complexity of Approximating Wasserstein Barycenters
TLDR
The complexity of approximating the Wasserstein barycenter of m discrete measures, or histograms of size n, is studied by contrasting two alternative approaches that use entropic regularization, and a novel proximal-IBP algorithm is proposed, which can be seen as a proximal gradient method.
Optimal Tensor Methods in Smooth Convex and Uniformly Convex Optimization
TLDR
A new tensor method is proposed that closes the gap between the lower and upper iteration complexity bounds for convex optimization problems whose objective function has a Lipschitz-continuous $p$-th order derivative, and it is shown that in practice it is faster than the best known accelerated tensor method.
Mirror Descent and Convex Optimization Problems with Non-smooth Inequality Constraints
TLDR
One of its focuses is to propose a Mirror Descent method with adaptive step sizes and an adaptive stopping rule for problems whose objective function is not Lipschitz-continuous, e.g., a quadratic function.
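To illustrate the kind of scheme involved, here is a minimal sketch of mirror descent in the Euclidean setup (so the prox-mapping reduces to plain projection). The step rule eps / ||g||^2 is one common adaptive choice for non-Lipschitz objectives; the function names, this particular rule, and the fixed iteration budget are assumptions for illustration and are not the exact adaptive step size or stopping rule proposed in the paper.

    import numpy as np

    def mirror_descent_adaptive(grad, project, x0, eps=1e-3, n_iter=10_000):
        # Euclidean mirror descent sketch: project is the projection onto the
        # feasible set, grad returns a (sub)gradient at the current point.
        x = x0.copy()
        xs = []
        for _ in range(n_iter):
            g = grad(x)
            gnorm2 = float(g @ g) + 1e-16          # squared (sub)gradient norm
            x = project(x - (eps / gnorm2) * g)    # adaptive step eps / ||g||^2
            xs.append(x.copy())
        return np.mean(xs, axis=0)                 # averaged iterate as output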
Decentralize and Randomize: Faster Algorithm for Wasserstein Barycenters
TLDR
A novel accelerated primal-dual stochastic gradient method is developed and applied to the decentralized distributed optimization setting to obtain a new algorithm for the distributed semi-discrete regularized Wasserstein barycenter problem.
Self-concordant analysis of Frank-Wolfe algorithms
TLDR
The theory of SC functions is used to provide a new adaptive step size for FW methods and to prove a global convergence rate of O(1/k) after k iterations; if the problem admits a stronger local linear minimization oracle, a novel FW method with a linear convergence rate for SC functions is proposed.
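For reference, the basic Frank-Wolfe skeleton that such adaptive step sizes plug into is sketched below with the classical 2/(k+2) step; the self-concordance-based adaptive step and the linearly convergent variant from the paper are not reproduced here, and the names lmo and frank_wolfe are illustrative assumptions.

    import numpy as np

    def frank_wolfe(grad, lmo, x0, n_iter=200):
        # Standard Frank-Wolfe sketch: lmo(g) returns argmin over the feasible
        # set of the linear function <g, s> (the linear minimization oracle).
        x = x0.copy()
        for k in range(n_iter):
            s = lmo(grad(x))            # call the linear minimization oracle
            step = 2.0 / (k + 2)        # classical step size, not the paper's rule
            x = (1 - step) * x + step * s
        return x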
Stochastic Intermediate Gradient Method for Convex Problems with Stochastic Inexact Oracle
TLDR
The first method is an extension of the Intermediate Gradient Method proposed by Devolder, Glineur and Nesterov for problems with a deterministic inexact oracle; it can be applied to problems with a composite objective function, handles both deterministic and stochastic inexactness of the oracle, and allows a non-Euclidean setup.
Primal–dual accelerated gradient methods with small-dimensional relaxation oracle
TLDR
It is demonstrated how, in practice, one can efficiently combine line search with the primal-dual property by considering a convex optimization problem with a simple structure (for example, a linearly constrained one).
Accelerated Alternating Minimization, Accelerated Sinkhorn's Algorithm and Accelerated Iterative Bregman Projections.
TLDR
A generic accelerated alternating minimization method and its primal-dual modification for problems with linear constraints, enjoying a $1/k^2$ convergence rate, where $k$ is the iteration counter, are proposed; in practice the method converges faster than Sinkhorn's algorithm.
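The underlying primitive is block-wise exact minimization. A plain (non-accelerated) Gauss-Seidel loop is sketched below for context; the momentum/extrapolation step that yields the $1/k^2$ rate and the primal-dual modification from the paper are not reproduced, and the function names and calling convention are assumptions for illustration.

    def alternating_minimization(argmin_blocks, x_blocks, n_iter=100):
        # Plain alternating minimization: in each sweep, minimize the objective
        # exactly over one block of variables while the other blocks stay fixed.
        # argmin_blocks[i](x) returns the exact minimizer over block i given the
        # current values of all blocks x.
        x = [b.copy() for b in x_blocks]
        for _ in range(n_iter):
            for i, argmin_i in enumerate(argmin_blocks):
                x[i] = argmin_i(x)      # exact minimization over block i
        return x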