Luba Tetruashvili

We describe a general scheme for solving nonconvex optimization problems, where in each iteration the nonconvex feasible set is approximated by an inner convex approximation. The latter is defined using an upper bound on the nonconvex constraint functions. Under appropriate conditions on this upper bounding convex function, monotone convergence to a KKT point …
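To make the scheme concrete, below is a minimal sketch of one common way to build such an inner convex approximation (my own illustration, not the paper's construction): the feasible set is the exterior of the unit ball, the concave constraint is linearized at the current iterate to obtain a convex upper bound, and each convex subproblem is solved with SciPy. The problem instance, variable names, and solver choice are all assumptions made for the example.

```python
import numpy as np
from scipy.optimize import minimize

# Toy instance: minimize ||x - c||^2 subject to the nonconvex constraint
#     g(x) = 1 - ||x||^2 <= 0   (x must lie outside the unit ball).
# g is concave, so linearizing it at the current iterate y gives a convex
# (affine) upper bound
#     g_bar(x; y) = 1 - ||y||^2 - 2 y^T (x - y) >= g(x),   g_bar(y; y) = g(y),
# and the set {x : g_bar(x; y) <= 0} is an inner convex approximation of the
# true feasible set, so the iterates stay feasible and the objective decreases.

c = np.array([0.3, 0.2])              # objective "target", inside the unit ball
f = lambda x: np.sum((x - c) ** 2)

x = np.array([1.0, 1.0])              # feasible starting point (||x|| >= 1)
for k in range(20):
    y = x.copy()
    # convex subproblem: same objective, affine upper bound of g as constraint
    # (SciPy "ineq" constraints require fun(x) >= 0, hence the sign flip)
    cons = {"type": "ineq",
            "fun": lambda x, y=y: -(1.0 - y @ y - 2.0 * y @ (x - y))}
    x = minimize(f, y, constraints=[cons], method="SLSQP").x
    print(k, f(x), 1.0 - x @ x)       # monotone objective decrease, g(x) <= 0

print("limit point:", x, " projection of c onto the sphere:", c / np.linalg.norm(c))
```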
In this paper we study smooth convex programming problems where the vector of decision variables is split into several blocks of variables. We analyze the block coordinate gradient projection method, in which each iteration consists of performing a gradient projection step with respect to a certain block taken in a cyclic order. A global sublinear rate of convergence …
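The following is a short sketch of the cyclic block coordinate gradient projection idea on a toy instance of my own choosing (least squares over a box); the block sizes, step sizes 1/L_i, and data are illustrative assumptions, not the paper's setting.

```python
import numpy as np

# Toy instance: minimize f(x) = 0.5*||A x - b||^2 over the box [0, 1]^n,
# with the variables split into blocks updated in a cyclic order.
# Each update takes a gradient step in one block (step size 1/L_i, where L_i
# is the Lipschitz constant of that block's partial gradient), then projects
# that block back onto its part of the feasible set.

rng = np.random.default_rng(0)
m, n, block = 40, 12, 3
A, b = rng.standard_normal((m, n)), rng.standard_normal(m)
blocks = [np.arange(i, i + block) for i in range(0, n, block)]
L = [np.linalg.norm(A[:, idx].T @ A[:, idx], 2) for idx in blocks]  # block constants

x = np.zeros(n)
for cycle in range(200):
    for idx, Li in zip(blocks, L):
        grad_i = A[:, idx].T @ (A @ x - b)                 # partial gradient
        x[idx] = np.clip(x[idx] - grad_i / Li, 0.0, 1.0)   # step + block projection

print("objective:", 0.5 * np.sum((A @ x - b) ** 2))
```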
We propose a distributed positioning algorithm to estimate the unknown positions of a number of target nodes, given distance measurements between target nodes, and between target nodes and a number of reference nodes at known positions. Based on a geometric interpretation, we formulate the positioning problem as an implicit convex feasibility problem in …
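As a hedged illustration of the convex-feasibility viewpoint only (not the paper's implicit, distributed algorithm), the sketch below relaxes each range measurement for a single target node to a ball constraint around the corresponding reference node and runs cyclic projections; the anchors, ranges, and slack are invented for the example.

```python
import numpy as np

# Single-target illustration: each range measurement d_i to a reference node a_i
# is relaxed to the convex ball constraint ||x - a_i|| <= d_i, and a cyclic
# projection method is run over these balls to find a point in their intersection.

anchors = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 3.0]])
true_pos = np.array([1.5, 1.0])
d = np.linalg.norm(anchors - true_pos, axis=1) + 0.05   # slightly inflated ranges

def project_onto_ball(x, center, radius):
    v = x - center
    nv = np.linalg.norm(v)
    return x if nv <= radius else center + radius * v / nv

x = np.zeros(2)                                          # arbitrary starting point
for sweep in range(100):
    for a_i, d_i in zip(anchors, d):                     # cyclic projections
        x = project_onto_ball(x, a_i, d_i)

print("feasible estimate:", x, " true position:", true_pos)
```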
We introduce a first-order Mirror-Descent (MD) type algorithm for solving nondifferentiable convex problems having a combination of a simple constraint set X (ball, simplex, etc.) and an additional functional constraint. The method is tuned to exploit the structure of X by employing an appropriate non-Euclidean distance-like function. Convergence results and …
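Below is a simplified sketch in the spirit of such a method, written by me as an assumption-laden toy rather than the paper's exact algorithm: X is the unit simplex, the distance-generating function is the entropy (so the mirror step is a multiplicative update), and the step uses a subgradient of the objective when the functional constraint is nearly satisfied and of the constraint otherwise.

```python
import numpy as np

# Toy problem on the unit simplex with one extra functional constraint:
#     minimize ||x - c||_1   subject to   a^T x <= b,   x in the simplex.
# Entropic mirror step: multiplicative update followed by renormalization,
# which is the Bregman projection onto the simplex for the entropy distance.

n = 5
c = np.array([0.4, 0.1, 0.3, 0.1, 0.1])
a, b = np.array([1.0, 2.0, 0.5, 0.1, 0.3]), 0.6

f = lambda x: np.abs(x - c).sum()       # nondifferentiable convex objective
g = lambda x: a @ x - b                 # functional constraint g(x) <= 0

x = np.full(n, 1.0 / n)
for k in range(1, 2001):
    step = 0.5 / np.sqrt(k)
    sub = np.sign(x - c) if g(x) <= 1e-3 else a   # objective or constraint subgradient
    x = x * np.exp(-step * sub)                   # entropic mirror step
    x /= x.sum()                                  # renormalize onto the simplex

print("f(x) =", f(x), " g(x) =", g(x))
```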
The projected subgradient method for constrained minimization repeatedly interlaces subgradient steps for the objective function with projections onto the feasible region, which is the intersection of closed and convex constraint sets, to regain feasibility. The latter poses a computational difficulty and, therefore, the projected subgradient method is …
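For reference, here is a minimal sketch of the classical projected subgradient method on a set whose projection is cheap (a Euclidean ball); the toy objective and step-size rule are my own choices. When the feasible region is instead an intersection of many constraint sets, the exact projection in each iteration is itself a nontrivial optimization problem, which is the difficulty the abstract refers to.

```python
import numpy as np

# Projected subgradient method for  minimize ||x - c||_1  over the unit ball.

c = np.array([3.0, -2.0])
f = lambda x: np.abs(x - c).sum()          # nondifferentiable convex objective

def project_ball(x, radius=1.0):
    nx = np.linalg.norm(x)
    return x if nx <= radius else radius * x / nx

x = np.zeros(2)
best = np.inf
for k in range(1, 5001):
    sub = np.sign(x - c)                   # a subgradient of f at x
    x = project_ball(x - (1.0 / np.sqrt(k)) * sub)   # subgradient step, then projection
    best = min(best, f(x))

print("best value found:", best)
```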
In this paper, a method for solving constrained convex optimization problems is introduced. The problem is cast equivalently as a parametric unconstrained one, the (single) parameter being the optimal value of the original problem. At each stage of the algorithm, the parameter is updated and the resulting subproblem is only approximately solved. A linear …
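The sketch below shows one common way to realize such a parametric reformulation; it is my own illustration and not necessarily the paper's construction, and it updates the parameter by plain bisection with near-exact inner solves, whereas the method described above updates the parameter differently and solves the subproblems only approximately.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Illustrative scalar problem:  minimize f(x) = (x - 3)^2  subject to  g(x) = x - 1 <= 0.
# Define the parametric value function  H(t) = min_x max(f(x) - t, g(x)).
# Under mild assumptions the optimal value t* of the constrained problem is the
# root of H(t) = 0, so the constrained problem reduces to unconstrained
# subproblems plus a one-dimensional search over the parameter t.

f = lambda x: (x - 3.0) ** 2
g = lambda x: x - 1.0

def H(t):
    inner = minimize_scalar(lambda x: max(f(x) - t, g(x)),
                            bounds=(-10.0, 10.0), method="bounded")
    return inner.fun

lo, hi = 0.0, 25.0                  # bracket known to contain the optimal value
for _ in range(50):
    mid = 0.5 * (lo + hi)
    if H(mid) > 0:                  # parameter below the optimal value
        lo = mid
    else:
        hi = mid

print("estimated optimal value:", 0.5 * (lo + hi), " (true value is 4)")
```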
The present work is an attempt to develop a user-friendly algorithm applicable to all types of transportation situations. The algorithms developed hitherto address only a particular type of transportation problem. Apart from this, these algorithms involve a large number of steps, which makes them complicated to program …
We consider the problem of minimizing the sum of a strongly convex function and a term comprising the sum of extended real-valued proper closed convex functions. We derive the primal representation of dual-based block descent methods and establish a relation between primal and dual rates of convergence, allowing us to compute the efficiency estimates of …
We devise an algorithm for finding the globally optimal solution of the so-called optimal power flow problem (OPF) for a class of power networks with a tree topology, also called radial networks, for which an efficient and reliable algorithm was not previously known. The algorithm we present, called the tree reduction/expansion method, is based on an …