
- Ching-pei Lee, Stephen J. Wright
- Computational Optimization and Applications
- 2019

Successive quadratic approximations, or second-order proximal methods, are useful for minimizing functions that are a sum of a smooth part and a convex, possibly nonsmooth part that promotes…
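The proximal machinery behind such methods can be illustrated with a simple first-order sketch: a proximal-gradient (ISTA) loop for the lasso, where the nonsmooth part is the ℓ1 norm and its prox is soft-thresholding. The objective, step size, and function names here are illustrative assumptions, not the paper's second-order algorithm.

```python
import numpy as np

def soft_threshold(z, tau):
    # Prox of tau * ||.||_1: componentwise shrinkage toward zero.
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def prox_gradient_lasso(A, b, lam, step, iters=500):
    # Minimize 0.5 * ||A x - b||^2 + lam * ||x||_1 by proximal
    # gradient descent (ISTA): a gradient step on the smooth part,
    # followed by the prox of the nonsmooth part.
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)                          # smooth-part gradient
        x = soft_threshold(x - step * grad, step * lam)   # prox step
    return x
```

Second-order proximal methods replace the fixed-step gradient model above with a quadratic approximation of the smooth part, but the prox step plays the same role.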

Variants of the coordinate descent approach for minimizing a nonlinear function are distinguished in part by the order in which coordinates are considered for relaxation. Three common orderings are…

We consider coordinate descent methods on convex quadratic problems, in which exact line searches are performed at each iteration. (This algorithm is identical to Gauss-Seidel on the equivalent…
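The Gauss-Seidel equivalence mentioned in the snippet is easy to see in code: for f(x) = 0.5 xᵀAx − bᵀx, the exact minimizer along coordinate i is x_i = (b_i − Σ_{j≠i} A_ij x_j) / A_ii, which is exactly one Gauss-Seidel update for the linear system Ax = b. This is a minimal sketch with a cyclic ordering; the function name is an assumption.

```python
import numpy as np

def cd_exact_quadratic(A, b, sweeps=100):
    # Cyclic coordinate descent with exact line search for
    # f(x) = 0.5 x^T A x - b^T x, A symmetric positive definite.
    # Each coordinate update is one Gauss-Seidel step for A x = b.
    n = len(b)
    x = np.zeros(n)
    for _ in range(sweeps):
        for i in range(n):
            # (b_i - sum_{j != i} A_ij x_j) / A_ii, written by adding
            # back the diagonal term that A[i] @ x includes.
            x[i] = (b[i] - A[i] @ x + A[i, i] * x[i]) / A[i, i]
    return x
```

For SPD systems, Gauss-Seidel (and hence this coordinate descent loop) converges to the solution of Ax = b, i.e. the minimizer of f.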

- Ching-pei Lee, Stephen J. Wright
- arXiv
- 2017

We propose an approach based on neural networks and the AC power flow equations to identify single- and double-line outages in a power grid using the information from phasor measurement unit sensors…

- Ching-pei Lee, Cong Han Lim, Stephen J. Wright
- KDD
- 2018

We propose a communication- and computation-efficient distributed optimization algorithm using second-order information for solving empirical risk minimization (ERM) problems with a nonsmooth regularization term. Current…

- Huikun Zhang, Spencer S. Ericksen, +8 authors Michael A. Newton
- 2019


It is well known that both gradient descent and stochastic coordinate descent achieve a global convergence rate of O(1/k) in the objective value, when applied to a scheme for minimizing a…
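As a concrete baseline for that rate, plain gradient descent with fixed step 1/L on a smooth convex quadratic decreases the objective monotonically and converges at O(1/k) (linearly, in fact, when the quadratic is strongly convex). The specific quadratic and function name below are illustrative assumptions.

```python
import numpy as np

def gradient_descent_quadratic(A, b, iters=300):
    # Minimize f(x) = 0.5 x^T A x - b^T x by gradient descent with
    # fixed step 1/L, where L = lambda_max(A) is the Lipschitz
    # constant of grad f(x) = A x - b. This step size guarantees
    # monotone decrease of f.
    L = np.linalg.eigvalsh(A).max()
    f = lambda v: 0.5 * v @ A @ v - b @ v
    x = np.zeros(len(b))
    history = [f(x)]
    for _ in range(iters):
        x = x - (A @ x - b) / L   # gradient step
        history.append(f(x))
    return x, history
```

Stochastic coordinate descent replaces the full gradient step with a step along one randomly chosen coordinate, yet achieves the same O(1/k) rate in expectation.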

- Wei-Lin Chiang, Yu-Sheng Li, Ching-pei Lee, Chih-Jen Lin
- SDM
- 2018

For distributed linear classification, L1 regularization is useful because it yields a smaller model. However, because of its non-differentiability, it is more difficult to develop efficient optimization…

Block-coordinate descent (BCD) is a popular method for large-scale regularized optimization problems with block-separable structure. However, existing analyses require either a fixed second-order…