
- Olivier Fercoq, Peter Richtárik
- ArXiv
- 2013

We study the performance of a family of randomized parallel coordinate descent methods for minimizing a sum of nonsmooth, separable convex functions. The problem class includes as special cases L1-regularized L1 regression and the minimization of the exponential loss (the "AdaBoost problem"). We assume the input data defining the loss function is…
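The setting above can be sketched on the Lasso special case. The function and variable names below are illustrative, and the serial-safe stepsize 1/L_j is a simplification of the paper's stepsize analysis, not its exact scheme:

```python
import random

def soft_threshold(v, t):
    """Proximal operator of t*|x| (soft thresholding)."""
    if v > t:
        return v - t
    if v < -t:
        return v + t
    return 0.0

def parallel_cd_lasso(A, b, lam, tau=2, iters=500, seed=0):
    """Randomized parallel coordinate descent sketch for
    min_x 0.5*||Ax - b||^2 + lam*||x||_1.
    Each iteration draws a random block of tau coordinates and
    applies a prox-gradient step to each of them."""
    rng = random.Random(seed)
    m, n = len(A), len(A[0])
    # per-coordinate Lipschitz constants L_j = ||A[:, j]||^2
    L = [sum(A[i][j] ** 2 for i in range(m)) for j in range(n)]
    x = [0.0] * n
    for _ in range(iters):
        # residual Ax - b computed once; tau coordinates then updated from it
        r = [sum(A[i][k] * x[k] for k in range(n)) - b[i] for i in range(m)]
        for j in rng.sample(range(n), tau):
            g = sum(A[i][j] * r[i] for i in range(m))  # partial gradient
            x[j] = soft_threshold(x[j] - g / L[j], lam / L[j])
    return x
```

The key design point mirrored here is that each processor touches only one coordinate's data per step, which is what makes the method cheap to parallelize.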

- Olivier Fercoq, Peter Richtárik
- SIAM Journal on Optimization
- 2015

We propose a new stochastic coordinate descent method for minimizing the sum of convex functions each of which depends on a small number of coordinates only. Our method (APPROX) is simultaneously Accelerated, Parallel and PROXimal; this is the first time such a method is proposed. In the special case when the number of processors is equal to the number of…
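A serial accelerated proximal gradient (FISTA-style) sketch illustrates the "accelerated + proximal" combination that APPROX additionally parallelizes over random coordinate blocks; this is not the authors' code, and the crude Lipschitz bound L = ||A||_F^2 is an assumption for simplicity:

```python
import math

def soft(v, t):
    """Soft thresholding: prox of t*|x|."""
    return v - t if v > t else (v + t if v < -t else 0.0)

def fista_lasso(A, b, lam, iters=1000):
    """Accelerated proximal gradient sketch for
    min_x 0.5*||Ax - b||^2 + lam*||x||_1."""
    m, n = len(A), len(A[0])
    # crude global Lipschitz bound: lambda_max(A^T A) <= ||A||_F^2
    L = sum(A[i][j] ** 2 for i in range(m) for j in range(n))
    x = [0.0] * n
    y = list(x)       # extrapolated point
    t = 1.0           # momentum parameter
    for _ in range(iters):
        r = [sum(A[i][j] * y[j] for j in range(n)) - b[i] for i in range(m)]
        grad = [sum(A[i][j] * r[i] for i in range(m)) for j in range(n)]
        x_new = [soft(y[j] - grad[j] / L, lam / L) for j in range(n)]
        t_new = 0.5 * (1.0 + math.sqrt(1.0 + 4.0 * t * t))
        y = [x_new[j] + ((t - 1.0) / t_new) * (x_new[j] - x[j]) for j in range(n)]
        x, t = x_new, t_new
    return x
```

The momentum update of `t` and the extrapolated point `y` are what yield the accelerated O(1/k²) rate, versus O(1/k) for the plain prox-gradient step.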

- Frédérique Billy, Jean Clairambault, +4 authors Shoko Saito
- Mathematics and Computers in Simulation
- 2014

We present and analyse in this article a mathematical question with a biological origin, the theoretical treatment of which may have far-reaching implications in the practical treatment of cancers. Starting from biological and clinical observations on cancer cells, tumour-bearing laboratory rodents, and patients with cancer, we ask from a theoretical…

- Olivier Fercoq, Alexandre Gramfort, Joseph Salmon
- ICML
- 2015

Screening rules allow one to discard irrelevant variables early from the optimization in Lasso problems, or their derivatives, making solvers faster. In this paper, we propose new versions of the so-called safe rules for the Lasso. Based on duality gap considerations, our new rules create safe test regions whose diameters converge to zero, provided that one…
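A minimal sketch of a gap-based safe sphere test for the Lasso, in the spirit described above: given any primal point, build a dual-feasible point and a sphere whose radius shrinks with the duality gap; a `True` in position j certifies that coefficient j is zero at every optimum. This is illustrative code, not the authors' released implementation:

```python
import math

def gap_safe_screen(X, b, lam, w):
    """Safe screening sketch for min_w 0.5*||Xw - b||^2 + lam*||w||_1."""
    m, n = len(X), len(X[0])
    dot = lambda u, v: sum(a * c for a, c in zip(u, v))
    col = lambda j: [X[i][j] for i in range(m)]
    res = [dot(X[i], w) - b[i] for i in range(m)]       # Xw - b
    corr = [dot(col(j), res) for j in range(n)]         # X^T (Xw - b)
    # dual-feasible point: rescale -res so that ||X^T theta||_inf <= 1
    scale = max(lam, max(abs(c) for c in corr))
    theta = [-ri / scale for ri in res]
    primal = 0.5 * dot(res, res) + lam * sum(abs(wj) for wj in w)
    dual = 0.5 * dot(b, b) - 0.5 * sum((bi - lam * ti) ** 2
                                       for bi, ti in zip(b, theta))
    # sphere radius driven by the duality gap: tight iterates => small sphere
    radius = math.sqrt(2.0 * max(0.0, primal - dual)) / lam
    return [abs(dot(col(j), theta)) + radius * math.sqrt(dot(col(j), col(j))) < 1.0
            for j in range(n)]
```

Because the radius is proportional to the square root of the gap, the test region collapses onto the optimal dual point as the solver converges, which is the "diameters converge to zero" property the abstract mentions.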

- Zheng Qu, Peter Richtárik, Martin Takác, Olivier Fercoq
- ICML
- 2016

We propose a new algorithm for minimizing regularized empirical loss: Stochastic Dual Newton Ascent (SDNA). Our method is dual in nature: in each iteration we update a random subset of the dual variables. However, unlike existing methods such as stochastic dual coordinate ascent, SDNA is capable of utilizing all local curvature information contained in the…
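A randomized block-Newton sketch in the spirit of SDNA, on a plain quadratic rather than the paper's dual formulation: each iteration exactly minimizes over a random pair of coordinates, using the 2x2 curvature block instead of only the diagonal as plain coordinate descent would. All names here are illustrative:

```python
import random

def block_newton_quadratic(M, c, iters=1000, seed=0):
    """Minimize 0.5*x^T M x - c^T x (M symmetric positive definite)
    by exact Newton steps over random coordinate pairs."""
    rng = random.Random(seed)
    n = len(c)
    x = [0.0] * n
    for _ in range(iters):
        i, j = rng.sample(range(n), 2)
        # gradient entries of the quadratic, restricted to {i, j}
        gi = sum(M[i][k] * x[k] for k in range(n)) - c[i]
        gj = sum(M[j][k] * x[k] for k in range(n)) - c[j]
        # Newton step with the inverse of the block [[M_ii, M_ij], [M_ij, M_jj]]
        a, b2, d = M[i][i], M[i][j], M[j][j]
        det = a * d - b2 * b2
        x[i] -= (d * gi - b2 * gj) / det
        x[j] -= (a * gj - b2 * gi) / det
    return x
```

Using the off-diagonal entry `M_ij` is what "all local curvature information" buys: when coordinates are correlated, the paired step corrects for their interaction in one shot.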

- Olivier Fercoq, Zheng Qu, Peter Richtárik, Martin Takác
- 2014 IEEE International Workshop on Machine…
- 2014

We propose an efficient distributed randomized coordinate descent method for minimizing regularized non-strongly convex loss functions. The method attains the optimal O(1/k²) convergence rate, where k is the iteration counter. The core of the work is the theoretical study of stepsize parameters. We have implemented the method on Archer - the…

- Olivier Fercoq, Marianne Akian, Mustapha Bouhtou, Stéphane Gaubert
- IEEE Trans. Automat. Contr.
- 2013

We study a general class of PageRank optimization problems which consist in finding an optimal outlink strategy for a web site subject to design constraints. We consider both a continuous problem, in which one can choose the intensity of a link, and a discrete one, in which each page has obligatory links, facultative links, and forbidden links. We…
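As background for the outlink-optimization problem, the baseline score being optimized is ordinary PageRank; a minimal power-iteration sketch (not the authors' optimization method) shows the quantity the outlink strategy controls:

```python
def pagerank(links, damping=0.85, iters=100):
    """Plain PageRank by power iteration.
    links[i] lists the pages that page i links to."""
    n = len(links)
    r = [1.0 / n] * n
    for _ in range(iters):
        # teleportation mass spread uniformly
        nxt = [(1.0 - damping) / n] * n
        for i, outs in enumerate(links):
            if outs:
                share = damping * r[i] / len(outs)
                for j in outs:
                    nxt[j] += share
            else:
                # dangling page: redistribute its mass uniformly
                for j in range(n):
                    nxt[j] += damping * r[i] / n
        r = nxt
    return r
```

Adding or removing a facultative outlink changes `links` and hence the stationary scores; choosing which facultative links to open, under the per-page constraints, is the decision variable in the discrete problem above.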

- Olivier Fercoq
- 2013 12th International Conference on Machine…
- 2013

We design a randomised parallel version of AdaBoost based on previous studies of parallel coordinate descent. The algorithm uses the fact that the logarithm of the exponential loss is a function with coordinate-wise Lipschitz continuous gradient, in order to define the step lengths. We provide the proof of convergence for this randomised AdaBoost algorithm…
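A serial sketch of the idea: randomized coordinate descent on the log of the exponential loss, with step lengths set from coordinate-wise Lipschitz constants. Here `A[i][j]` plays the role of the margin y_i·h_j(x_i), and the bound L_j = max_i A[i][j]² is one simple valid constant, not necessarily the paper's exact choice:

```python
import math
import random

def exp_loss_log(A, x):
    """log of the average exponential loss, computed stably:
    f(x) = log( (1/m) * sum_i exp(-(Ax)_i) )."""
    z = [sum(A[i][j] * x[j] for j in range(len(x))) for i in range(len(A))]
    mx = max(-zi for zi in z)
    return mx + math.log(sum(math.exp(-zi - mx) for zi in z) / len(A))

def randomized_cd_boost(A, iters=200, seed=0):
    """Randomized coordinate descent on the log-exponential loss."""
    m, n = len(A), len(A[0])
    rng = random.Random(seed)
    # coordinate-wise Lipschitz bound: d^2f/dx_j^2 <= max_i A[i][j]^2
    L = [max(A[i][j] ** 2 for i in range(m)) for j in range(n)]
    x = [0.0] * n
    for _ in range(iters):
        j = rng.randrange(n)
        z = [sum(A[i][k] * x[k] for k in range(n)) for i in range(m)]
        w = [math.exp(-zi) for zi in z]          # per-example weights
        s = sum(w)
        g = -sum(w[i] * A[i][j] for i in range(m)) / s  # df/dx_j
        x[j] -= g / L[j]  # guaranteed-descent step of length 1/L_j
    return x
```

Taking the logarithm of the loss is what makes the coordinate-wise Lipschitz constants available, and each 1/L_j step is guaranteed to decrease the objective, which is what the convergence proof builds on.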