
Proximal gradient descent (PGD) and stochastic proximal gradient descent (SPGD) are popular methods for solving regularized risk minimization problems in machine learning and statistics. In this paper, we propose and analyze an accelerated variant of these methods in the mini-batch setting. This method incorporates two acceleration techniques: one is… (More)
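As background for the abstract above, the snippet below is a minimal sketch of plain (non-accelerated, full-batch) proximal gradient descent on a lasso-style problem; it is not the paper's accelerated mini-batch method. All names, the soft-thresholding proximal operator, and the example data are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (soft-thresholding), elementwise.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient_descent(grad_f, prox, x0, step, n_iters=100):
    # Iterate x_{k+1} = prox(x_k - step * grad_f(x_k), step).
    x = x0
    for _ in range(n_iters):
        x = prox(x - step * grad_f(x), step)
    return x

# Illustrative regularized risk: min_x 0.5*||Ax - b||^2 + lam*||x||_1
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = A @ np.array([1.0, 0.0, -2.0, 0.0, 0.5])
lam = 0.1
grad_f = lambda x: A.T @ (A @ x - b)
step = 1.0 / np.linalg.norm(A.T @ A, 2)  # 1/L, L = Lipschitz constant of grad_f
x = proximal_gradient_descent(grad_f,
                              lambda v, t: soft_threshold(v, t * lam),
                              np.zeros(5), step, n_iters=500)
```

The stochastic variant (SPGD) would replace `grad_f` with a gradient estimate from a sampled mini-batch; the paper's contribution is accelerating that setting.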

We propose an optimization method for minimizing finite sums of smooth convex functions. Our method incorporates accelerated gradient descent (AGD) and the stochastic variance reduction gradient (SVRG) method in a mini-batch setting. Unlike SVRG, our method can be directly applied to both non-strongly convex and strongly convex problems. We show that our method achieves… (More)

- Atsushi Nitanda
- 2016
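To make the SVRG component of the abstract concrete, here is a minimal sketch of the basic SVRG snapshot/inner-loop structure with mini-batches, without the AGD acceleration the paper adds. Function names, the least-squares example, and the step size are assumptions for illustration.

```python
import numpy as np

def svrg(grads, x0, step, n_epochs=20, inner=None, batch=1, seed=0):
    # grads: list of per-example gradient functions g_i(x); the objective's
    # full gradient is (1/n) * sum_i g_i(x).
    rng = np.random.default_rng(seed)
    n = len(grads)
    inner = inner or n
    x = x0.copy()
    for _ in range(n_epochs):
        x_snap = x.copy()
        full = sum(g(x_snap) for g in grads) / n  # full gradient at snapshot
        for _ in range(inner):
            idx = rng.integers(0, n, size=batch)
            # Variance-reduced mini-batch gradient estimate.
            v = sum(grads[i](x) - grads[i](x_snap) for i in idx) / batch + full
            x = x - step * v
    return x

# Illustrative finite sum: least squares, f_i(x) = 0.5 * (a_i @ x - b_i)^2.
rng = np.random.default_rng(1)
A = rng.standard_normal((50, 3))
x_true = np.array([1.0, -1.0, 2.0])
b = A @ x_true
grads = [lambda x, a=A[i], bi=b[i]: a * (a @ x - bi) for i in range(50)]
x = svrg(grads, np.zeros(3), step=0.05, n_epochs=30, batch=5)
```

Each epoch recomputes one full gradient at the snapshot, so the correction term `grads[i](x) - grads[i](x_snap) + full` is an unbiased, low-variance estimate of the full gradient near the snapshot.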

We now prove Proposition 1, which gives the condition for compactness of the sublevel set. Proof. Let B^d(r) and S^{d−1}(r) denote the ball and sphere of radius r, centered at the origin. By an affine transformation, we can assume that X* contains the origin O, that X* ⊂ B^d(1), and that X* ∩ S^{d−1}(1) = ∅. Then we have that for all x ∈ S^{d−1}(1), (∇f(x), x) ≥ f(x) − f… (More)

Let f be a meromorphic mapping from C^n into a compact complex manifold M. In this paper we give some estimates of the growth of the proximity function m_f(r, D) of f with respect to a divisor D. J. E. Littlewood [2] (cf. Hayman [1]) proved that a meromorphic function g on the complex plane C satisfies lim sup_{r→∞} m_g(r, a) / log T(r, g) ≤ 1/2 for almost all… (More)
