Using the double-smoothing technique and stochastic mirror descent with an inexact oracle, we build an optimal algorithm (up to a multiplicative factor) for two-point gradient-free non-smooth stochastic…
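To make the two-point idea concrete, the sketch below shows the standard randomized two-point estimator of the gradient of a smoothed version of f; the function name, the smoothing radius tau, and the spherical direction sampling are illustrative assumptions, not the paper's exact scheme.

```python
import numpy as np

def two_point_gradient_estimate(f, x, tau=1e-4, rng=None):
    """Randomized two-point zero-order gradient estimate (illustrative sketch).

    Queries a realization of f at the two symmetric points x + tau*e and
    x - tau*e along a direction e drawn uniformly from the unit sphere,
    and returns g = d/(2*tau) * (f(x + tau*e) - f(x - tau*e)) * e, an
    unbiased estimate of the gradient of the spherically smoothed f.
    """
    rng = rng or np.random.default_rng()
    d = x.shape[0]
    e = rng.standard_normal(d)
    e /= np.linalg.norm(e)                    # uniform direction on the unit sphere
    delta = f(x + tau * e) - f(x - tau * e)   # the two oracle calls
    return (d / (2.0 * tau)) * delta * e
```

Smoothing trades bias against variance: a smaller tau reduces the bias of the smoothed surrogate but inflates the variance of the estimate when the oracle is noisy.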
This paper addresses the solution of non-smooth convex and strongly convex optimization problems with functional constraints. The introduced Mirror Descent (MD) method with adaptive stepsizes is…
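As a minimal sketch of how adaptive stepsizes combine with functional constraints in MD methods of this kind (the Euclidean prox, the single constraint g(x) <= 0, and all names are simplifying assumptions, not the paper's exact algorithm):

```python
import numpy as np

def adaptive_md_constrained(x0, f_subgrad, g, g_subgrad, eps, n_steps):
    """Adaptive mirror descent for min f(x) s.t. g(x) <= 0 (Euclidean prox).

    A step is "productive" (uses a subgradient of the objective f) when the
    current point is eps-feasible, and "non-productive" (uses a subgradient
    of the violated constraint g) otherwise.  The stepsize eps / ||s||^2
    adapts to the observed subgradient norms, so no Lipschitz constant has
    to be known in advance.
    """
    x = np.asarray(x0, dtype=float)
    productive = []
    for _ in range(n_steps):
        if g(x) <= eps:                   # productive step on f
            s = f_subgrad(x)
            productive.append(x.copy())
        else:                             # non-productive step on g
            s = g_subgrad(x)
        h = eps / (np.dot(s, s) + 1e-16)  # adaptive stepsize
        x = x - h * s                     # Euclidean mirror step
    # the average of productive iterates is the usual MD output
    return np.mean(productive, axis=0) if productive else x
```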
We consider the problem of minimizing a convex function over a simple set subject to a convex non-smooth inequality constraint, and describe first-order methods for solving such problems in different…
An extension of the mirror descent method, developed for convex stochastic optimization problems, to constrained convex stochastic optimization problems (subject to functional inequality constraints) is…
In this paper we propose a new approach to obtaining a mixing least-squares regression estimate by means of stochastic online mirror descent in a non-Euclidean setup.
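For a concrete instance of the non-Euclidean setup, the hedged sketch below uses the entropic mirror map on the probability simplex (the exponentiated-gradient update) to mix a fixed dictionary of regressors under squared loss; the streaming interface, the learning rate eta, and the names are illustrative assumptions rather than the paper's estimator.

```python
import numpy as np

def exponentiated_gradient_mix(predictions, targets, eta=0.1):
    """Stochastic online mirror descent on the simplex (entropic mirror map).

    predictions: array of shape (T, K); column k holds the forecasts of
                 the k-th regressor in the dictionary.
    targets:     array of shape (T,) with the observed responses.

    At each round the mixture suffers the squared loss, and the weights
    receive the multiplicative update that is exactly the mirror descent
    step for the negative-entropy distance-generating function.
    """
    T, K = predictions.shape
    w = np.full(K, 1.0 / K)                  # uniform initial mixture
    for t in range(T):
        y_hat = predictions[t] @ w           # mixed forecast
        grad = 2.0 * (y_hat - targets[t]) * predictions[t]  # d/dw (y_hat - y)^2
        w = w * np.exp(-eta * grad)          # entropic mirror step
        w /= w.sum()                         # renormalize onto the simplex
    return w
```

The entropic geometry is what makes the setup non-Euclidean: the update multiplies rather than subtracts, which is what typically yields logarithmic dependence on the dictionary size K.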
We propose a primal-dual stochastic mirror descent method for convex optimization problems with functional constraints. We obtain the rate of convergence in terms of the probability of large deviations.
We study non-smooth convex stochastic optimization problems with a two-point zero-order oracle, i.e., at each iteration one can observe the values of the function’s realization at two selected points. …
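Reusing the two-point estimator sketched earlier, a minimal end-to-end loop looks as follows; the Euclidean projection, the decaying stepsize schedule, and the iterate averaging are simplifying assumptions, not the optimal scheme analyzed in the paper.

```python
import numpy as np

def zero_order_minimize(f, project, x0, n_steps, tau=1e-4, step=1e-2, rng=None):
    """Gradient-free projected stochastic descent with a two-point oracle.

    Each iteration spends exactly two oracle calls (inside
    two_point_gradient_estimate, defined in the earlier sketch) and takes
    a projected Euclidean step with a 1/sqrt(k) stepsize; the running
    average of the iterates is returned, as is standard for
    subgradient-type methods.
    """
    rng = rng or np.random.default_rng()
    x = np.asarray(x0, dtype=float)
    avg = np.array(x, copy=True)
    for k in range(1, n_steps + 1):
        g = two_point_gradient_estimate(f, x, tau=tau, rng=rng)
        x = project(x - step / np.sqrt(k) * g)   # projected step
        avg += (x - avg) / (k + 1)               # running average of iterates
    return avg
```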
Mirror Descent (MD) is a well-known method for solving non-smooth convex optimization problems. This paper analyzes a stochastic variant of MD with adaptive stepsizes. Its convergence on average is…