
- Olivier Fercoq, Peter Richtárik
- ArXiv
- 2013

We study the performance of a family of randomized parallel coordinate descent methods for minimizing the sum of nonsmooth and separable convex functions. The problem class includes as a special case L1-regularized L1 regression and the minimization of the exponential loss (“AdaBoost problem”). We assume the input data defining the loss function is…
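As a point of reference for the method family described above, here is a minimal serial sketch of randomized proximal coordinate descent on an L1-regularized least-squares problem. The paper's setting is parallel and covers more general nonsmooth separable losses; the function name and the Lasso instance are illustrative choices, not the paper's exact algorithm.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * |.|
    return np.sign(v) * max(abs(v) - t, 0.0)

def rand_coord_descent(X, y, lam, n_iter=5000, seed=0):
    """Randomized proximal coordinate descent (serial sketch) for
    min_beta 0.5 * ||X beta - y||^2 + lam * ||beta||_1."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    beta = np.zeros(d)
    L = (X ** 2).sum(axis=0)      # coordinate-wise Lipschitz constants ||x_j||^2
    resid = X @ beta - y
    for _ in range(n_iter):
        j = rng.integers(d)
        g = X[:, j] @ resid        # partial gradient along coordinate j
        new = soft_threshold(beta[j] - g / L[j], lam / L[j])
        resid += (new - beta[j]) * X[:, j]  # keep the residual up to date
        beta[j] = new
    return beta
```

Maintaining the residual incrementally makes each update O(n) instead of O(nd), which is the property that makes coordinate methods attractive for the large problems these papers target.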

- Olivier Fercoq, Peter Richtárik
- SIAM Journal on Optimization
- 2015

We propose a new stochastic coordinate descent method for minimizing the sum of convex functions each of which depends on a small number of coordinates only. Our method (APPROX) is simultaneously Accelerated, Parallel and PROXimal; this is the first time such a method is proposed. In the special case when the number of processors is equal to the number of…
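APPROX combines three ingredients: acceleration, parallel coordinate updates, and proximal steps. As a serial, full-gradient reference point for the "accelerated + proximal" part, here is the classical FISTA iteration on the Lasso; APPROX generalizes this pattern to random parallel coordinate updates (this is background, not the paper's algorithm).

```python
import numpy as np

def fista_lasso(X, y, lam, n_iter=200):
    """Classical accelerated proximal gradient (FISTA) for
    min 0.5 * ||X beta - y||^2 + lam * ||beta||_1."""
    n, d = X.shape
    L = np.linalg.norm(X, 2) ** 2          # Lipschitz constant of the gradient
    beta = np.zeros(d)
    z = np.zeros(d)                        # extrapolated point
    t = 1.0
    for _ in range(n_iter):
        grad = X.T @ (X @ z - y)
        v = z - grad / L                   # gradient step at the extrapolated point
        beta_new = np.sign(v) * np.maximum(np.abs(v) - lam / L, 0.0)  # prox step
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
        z = beta_new + ((t - 1) / t_new) * (beta_new - beta)  # momentum
        beta, t = beta_new, t_new
    return beta
```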

- Olivier Fercoq, Alexandre Gramfort, Joseph Salmon
- ICML
- 2015

Screening rules allow solvers to discard irrelevant variables early from the optimization in Lasso problems and their derivatives, making them faster. In this paper, we propose new versions of the so-called safe rules for the Lasso. Based on duality gap considerations, our new rules create safe test regions whose diameters converge to zero, provided that one relies…
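The duality-gap construction above can be sketched concretely: rescale the residual to obtain a dual-feasible point, use the gap to bound the distance to the dual optimum, and discard any feature whose correlation with every point of the resulting sphere stays below one. The function below is an illustrative sketch of such a gap-based sphere test for the Lasso, not the paper's exact code.

```python
import numpy as np

def gap_safe_screen(X, y, lam, beta):
    """Gap-based safe sphere test for min 0.5*||y - X beta||^2 + lam*||beta||_1.
    Returns a boolean mask: True = feature j can be safely discarded."""
    resid = y - X @ beta
    # Dual-feasible point obtained by rescaling the residual
    theta = resid / max(lam, np.max(np.abs(X.T @ resid)))
    primal = 0.5 * resid @ resid + lam * np.sum(np.abs(beta))
    dual = 0.5 * (y @ y) - 0.5 * lam * lam * np.sum((theta - y / lam) ** 2)
    gap = max(primal - dual, 0.0)
    r = np.sqrt(2.0 * gap) / lam           # safe-sphere radius around theta
    scores = np.abs(X.T @ theta) + r * np.linalg.norm(X, axis=0)
    return scores < 1.0
```

As the iterate `beta` approaches the solution the gap, and hence the sphere radius, shrinks to zero, which is the convergence property the abstract emphasizes.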

High dimensional regression benefits from sparsity-promoting regularizations. Screening rules leverage the known sparsity of the solution by ignoring some variables in the optimization, hence speeding up solvers. When the procedure is proven not to discard features wrongly, the rules are said to be safe. In this paper we derive new safe rules for generalized…

Optimising drug delivery in the general circulation, targeted towards cancer cell populations but inevitably also reaching proliferating healthy cell populations, requires designing optimised drug infusion algorithms in a dynamic way, i.e., controlling the growth of both populations simultaneously by the action of the drugs in use, wanted for cancer cells,…

- Zheng Qu, Peter Richtárik, Martin Takác, Olivier Fercoq
- ICML
- 2016

We propose a new algorithm for minimizing regularized empirical loss: Stochastic Dual Newton Ascent (SDNA). Our method is dual in nature: in each iteration we update a random subset of the dual variables. However, unlike existing methods such as stochastic dual coordinate ascent, SDNA is capable of utilizing all local curvature information contained in the…
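The curvature-exploiting update can be illustrated on a strongly convex quadratic: at each step, pick a random subset of coordinates and solve the restricted subproblem exactly using the corresponding principal submatrix of the Hessian. This is only a sketch of the flavor of update SDNA applies to the dual of a regularized empirical-loss problem; the function name and quadratic setting are illustrative assumptions.

```python
import numpy as np

def block_newton_quadratic(A, b, block_size=3, n_iter=200, seed=0):
    """Minimize 0.5 * x^T A x - b^T x (A symmetric positive definite) by
    exactly solving, at each step, the subproblem over a random coordinate
    subset S, i.e. a Newton step using the |S| x |S| submatrix of A."""
    rng = np.random.default_rng(seed)
    d = len(b)
    x = np.zeros(d)
    for _ in range(n_iter):
        S = rng.choice(d, size=block_size, replace=False)
        g = A @ x - b                          # full gradient
        # Exact minimization over the block: zeroes the gradient on S
        x[S] -= np.linalg.solve(A[np.ix_(S, S)], g[S])
    return x
```

Plain coordinate ascent would scale each coordinate of `g[S]` by a scalar stepsize instead; solving the small linear system is what lets the method use all local curvature within the block.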

- Olivier Fercoq, Zheng Qu, Peter Richtárik, Martin Takác
- 2014 IEEE International Workshop on Machine…
- 2014

We propose an efficient distributed randomized coordinate descent method for minimizing regularized non-strongly convex loss functions. The method attains the optimal O(1/k²) convergence rate, where k is the iteration counter. The core of the work is the theoretical study of stepsize parameters. We have implemented the method on Archer - the…

- Frédérique Billy, Jean Clairambault, +4 authors Shoko Saito
- Mathematics and Computers in Simulation
- 2014

We present and analyse in this article a mathematical question with a biological origin, the theoretical treatment of which may have far-reaching implications in the practical treatment of cancers. Starting from biological and clinical observations on cancer cells, tumour-bearing laboratory rodents, and patients with cancer, we ask from a theoretical biology…

- Olivier Fercoq, Marianne Akian, Mustapha Bouhtou, Stéphane Gaubert
- IEEE Trans. Automat. Contr.
- 2013

We study a general class of PageRank optimization problems which consist in finding an optimal outlink strategy for a web site subject to design constraints. We consider both a continuous problem, in which one can choose the intensity of a link, and a discrete one, in which, on each page, there are obligatory links, facultative links and forbidden links. We…
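The object being optimized is the standard PageRank fixed point: the stationary distribution of a damped random walk on the link graph. A minimal power-iteration sketch of that fixed point is below; the outlink-strategy optimization studied in the paper amounts to choosing the rows of the transition matrix for the controlled pages, which this sketch does not implement.

```python
import numpy as np

def pagerank(P, damping=0.85, n_iter=100):
    """Power iteration for PageRank. P[i, j] is the probability of moving
    from page i to page j (rows sum to one); with probability 1 - damping
    the surfer teleports to a uniformly random page."""
    n = P.shape[0]
    pi = np.full(n, 1.0 / n)
    for _ in range(n_iter):
        pi = damping * (pi @ P) + (1 - damping) / n
    return pi
```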

- Olivier Fercoq
- 2013 12th International Conference on Machine…
- 2013

We design a randomised parallel version of AdaBoost based on previous studies on parallel coordinate descent. The algorithm uses the fact that the logarithm of the exponential loss is a function with coordinate-wise Lipschitz continuous gradient, in order to define the step lengths. We provide the proof of convergence for this randomised AdaBoost algorithm…
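The stepsize argument above can be made concrete: for margins bounded by one, each partial derivative of the log of the exponential loss is a weighted average whose Lipschitz constant is at most one, so a unit step is safe. Here is a serial sketch under that assumption; the parallel version, the function name, and the data layout are illustrative, not the paper's exact algorithm.

```python
import numpy as np

def random_adaboost(H, n_iter=500, seed=0):
    """Randomized coordinate descent on log sum_i exp(-(H alpha)_i), where
    H[i, j] = y_i * h_j(x_i) is the margin of weak learner j on example i.
    With |H| <= 1, each coordinate gradient is 1-Lipschitz, so step = 1."""
    rng = np.random.default_rng(seed)
    n, d = H.shape
    alpha = np.zeros(d)
    for _ in range(n_iter):
        j = rng.integers(d)
        m = H @ alpha
        w = np.exp(-(m - m.max()))     # numerically stable example weights
        w /= w.sum()
        g = -(w @ H[:, j])             # partial derivative of the log-loss
        alpha[j] -= g                  # unit step, valid since L_j <= 1
    return alpha
```

The weight update is the familiar AdaBoost reweighting: examples with small margins get large weights, and the chosen coordinate (weak learner) is adjusted against the weighted correlation of its predictions.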