
Proximal gradient descent (PGD) and stochastic proximal gradient descent (SPGD) are popular methods for solving regularized risk minimization problems in machine learning and statistics. In this paper, we propose and analyze an accelerated variant of these methods in the mini-batch setting. This method incorporates two acceleration techniques: one is…
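The abstract above does not spell out the update rule; for orientation, here is a minimal sketch of plain (non-accelerated) proximal gradient descent on an L1-regularized least-squares problem, where the proximal operator of the L1 term is soft-thresholding. The function names and problem instance are illustrative, not taken from the paper.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (componentwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient_descent(A, b, lam, step, iters=500):
    """Minimize 0.5 * ||Ax - b||^2 + lam * ||x||_1 by PGD.

    Each iteration takes a gradient step on the smooth part, then
    applies the prox of the nonsmooth regularizer.
    """
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)                  # gradient of the smooth part
        x = soft_threshold(x - step * grad, step * lam)
    return x
```

For convergence, `step` should be at most the reciprocal of the Lipschitz constant of the smooth part's gradient (here, the largest eigenvalue of `A.T @ A`).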

We propose an optimization method for minimizing finite sums of smooth convex functions. Our method incorporates accelerated gradient descent (AGD) and the stochastic variance reduced gradient (SVRG) in a mini-batch setting. Unlike SVRG, our method can be directly applied to both non-strongly and strongly convex problems. We show that our method achieves…

- Atsushi Nitanda
- 2016

We now prove Proposition 1, which gives the condition for compactness of the sublevel set. Proof. Let B^d(r) and S^{d−1}(r) denote the ball and sphere of radius r, centered at the origin. By an affine transformation, we can assume that X* contains the origin O, X* ⊂ B^d(1), and X* ∩ S^{d−1}(1) = ∅. Then, we have that for all x ∈ S^{d−1}(1), ⟨∇f(x), x⟩ ≥ f(x) − f…
