Publications
Spectral Regularization Algorithms for Learning Large Incomplete Matrices
TLDR
Using the nuclear norm as a regularizer, the Soft-Impute algorithm iteratively replaces the missing elements with those obtained from a soft-thresholded SVD, producing a sequence of regularized low-rank solutions for large-scale matrix completion problems.
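As a rough illustration of that iteration, here is a minimal numpy sketch; the function name, zero initialization, and fixed iteration count are assumptions, not the paper's reference implementation:

```python
import numpy as np

def soft_impute(X, mask, lam, n_iters=100):
    """Illustrative Soft-Impute-style loop: repeatedly fill the
    missing entries with the current fit, then soft-threshold the
    singular values of the completed matrix."""
    Z = np.where(mask, X, 0.0)          # start with missing entries at 0
    for _ in range(n_iters):
        filled = np.where(mask, X, Z)   # keep observed entries, impute the rest
        U, s, Vt = np.linalg.svd(filled, full_matrices=False)
        s = np.maximum(s - lam, 0.0)    # soft-threshold the spectrum
        Z = (U * s) @ Vt                # nuclear-norm-regularized low-rank update
    return Z
```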
Best Subset Selection via a Modern Optimization Lens
In the last twenty-five years (1990-2014), algorithmic advances in integer optimization combined with hardware improvements have resulted in an astonishing 200 billion factor speedup in solving Mixed Integer Optimization (MIO) problems.
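To make the underlying problem concrete, here is a brute-force sketch of the cardinality-constrained least squares problem; the paper's point is that modern MIO solvers handle this same problem at scales where enumeration is hopeless (function name and fallback handling are illustrative):

```python
import itertools
import numpy as np

def best_subset(X, y, k):
    """Exhaustive best subset selection, for illustration only:
    enumerate all size-k supports and keep the one with the
    smallest residual sum of squares."""
    n, p = X.shape
    best_rss, best_support = np.inf, None
    for support in itertools.combinations(range(p), k):
        beta, rss, *_ = np.linalg.lstsq(X[:, support], y, rcond=None)
        if rss.size == 0:  # lstsq omits residuals in rank-deficient cases
            rss = np.array([np.sum((y - X[:, support] @ beta) ** 2)])
        if rss[0] < best_rss:
            best_rss, best_support = rss[0], support
    return best_support, best_rss
```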
SparseNet: Coordinate Descent With Nonconvex Penalties
TLDR
The properties of penalties suitable for this approach are characterized, their corresponding threshold functions are studied, and a degrees-of-freedom (df) standardizing reparametrization that assists the pathwise algorithm is described.
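One such threshold function is the MC+ (minimax concave) operator studied in the paper; a minimal sketch, assuming standardized coordinates (the function name is ours):

```python
import numpy as np

def mcp_threshold(b, lam, gamma):
    """Univariate threshold function for the MC+ penalty, gamma > 1.
    gamma -> inf recovers soft thresholding (lasso); gamma -> 1
    approaches hard thresholding (best subset)."""
    soft = np.sign(b) * np.maximum(np.abs(b) - lam, 0.0)
    return np.where(np.abs(b) <= gamma * lam, soft / (1.0 - 1.0 / gamma), b)
```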
Matrix completion and low-rank SVD via fast alternating least squares
TLDR
This article develops an R software package, softImpute, implementing the two approaches for large matrix factorization and completion, and develops a distributed version for very large matrices using the Spark cluster programming environment.
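A hedged numpy sketch of the alternating-least-squares idea, not the softImpute package itself; random initialization, fixed rank, and iteration count are assumptions, and the real package exploits sparse-plus-low-rank structure rather than forming dense matrices:

```python
import numpy as np

def als_complete(X, mask, rank, lam, n_iters=50):
    """Alternate ridge-regression updates of two factors A, B with
    Z ~ A @ B.T, imputing missing entries with the current fit."""
    n, m = X.shape
    rng = np.random.default_rng(0)
    A = rng.standard_normal((n, rank))
    B = rng.standard_normal((m, rank))
    for _ in range(n_iters):
        Z = np.where(mask, X, A @ B.T)   # fill missing entries
        A = Z @ B @ np.linalg.inv(B.T @ B + lam * np.eye(rank))
        B = Z.T @ A @ np.linalg.inv(A.T @ A + lam * np.eye(rank))
    return A, B
```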
Exact Covariance Thresholding into Connected Components for Large-Scale Graphical Lasso
TLDR
For a range of values of λ, this proposal splits a large graphical lasso problem into smaller tractable problems, making it possible to solve an otherwise infeasible large-scale problem.
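A minimal sketch of the screening step, assuming scipy for the connected-components computation (the function name is illustrative):

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import connected_components

def glasso_blocks(S, lam):
    """Threshold the absolute empirical covariances at lambda and
    take connected components; the paper shows these coincide exactly
    with the components of the graphical lasso solution, so each
    block can be solved independently."""
    adj = (np.abs(S) > lam).astype(int)
    np.fill_diagonal(adj, 0)
    n_comp, labels = connected_components(csr_matrix(adj), directed=False)
    return [np.flatnonzero(labels == c) for c in range(n_comp)]
```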
The Graphical Lasso: New Insights and Alternatives
TLDR
This paper explains how GLASSO solves the dual of the graphical lasso penalized likelihood by block coordinate ascent, and proposes similar primal algorithms, P-GLASSO and DP-GLASSO, that also operate by block coordinate descent but with Θ as the optimization target.
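For a quick concrete look at the penalized likelihood being optimized (maximize log det Θ − tr(SΘ) − λ‖Θ‖₁ over precision matrices Θ), one can call scikit-learn's graphical lasso solver; this is not the paper's P-GLASSO/DP-GLASSO code, just a standard reference implementation:

```python
import numpy as np
from sklearn.covariance import graphical_lasso

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
S = np.cov(X, rowvar=False)            # empirical covariance
cov_, prec_ = graphical_lasso(S, alpha=0.1)
# prec_ estimates Theta, the sparse precision matrix targeted
# by the l1-penalized log-likelihood discussed in the paper
print(np.round(prec_, 2))
```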
An Extended Frank-Wolfe Method with "In-Face" Directions, and Its Application to Low-Rank Matrix Completion
TLDR
This work presents an extension of the Frank-Wolfe method designed to induce near-optimal solutions on low-dimensional faces of the feasible region, via a new approach to generating "in-face" directions at each iteration and new choice rules for selecting between in-face and "regular" Frank-Wolfe steps.
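For contrast, here is a sketch of the plain Frank-Wolfe loop over a nuclear-norm ball, whose linear minimization oracle is a single top singular vector pair; the paper's in-face directions augment steps like these (the gradient callback and step-size rule are assumptions):

```python
import numpy as np

def frank_wolfe_nuclear(grad_fn, shape, delta, n_iters=100):
    """Vanilla Frank-Wolfe over {Z : ||Z||_* <= delta}. Each regular
    step moves toward a rank-one vertex, so the iterate's rank grows
    by at most one per iteration."""
    Z = np.zeros(shape)
    for t in range(n_iters):
        G = grad_fn(Z)
        U, s, Vt = np.linalg.svd(G, full_matrices=False)
        V = -delta * np.outer(U[:, 0], Vt[0, :])   # LMO over the ball
        step = 2.0 / (t + 2.0)                     # standard FW step size
        Z = (1 - step) * Z + step * V
    return Z
```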
A Computational Framework for Multivariate Convex Regression and Its Variants
ABSTRACT We study the nonparametric least squares estimator (LSE) of a multivariate convex regression function. The LSE, given as the solution to a quadratic program with O(n²) linear constraints (n being the sample size), …
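A minimal cvxpy sketch of that quadratic program, making the O(n²) constraint structure explicit; this is the naive enumeration whose cost motivates the paper's computational framework:

```python
import cvxpy as cp
import numpy as np

def convex_lse(X, y):
    """Convex regression LSE: fitted values theta_i and subgradients
    xi_i must satisfy theta_j >= theta_i + xi_i' (x_j - x_i) for all
    pairs (i, j), i.e. O(n^2) linear constraints."""
    n, d = X.shape
    theta = cp.Variable(n)
    xi = cp.Variable((n, d))
    cons = [theta[j] >= theta[i] + xi[i] @ (X[j] - X[i])
            for i in range(n) for j in range(n) if i != j]
    prob = cp.Problem(cp.Minimize(cp.sum_squares(y - theta)), cons)
    prob.solve()
    return theta.value, xi.value
```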
Fast Best Subset Selection: Coordinate Descent and Local Combinatorial Optimization Algorithms
TLDR
This paper empirically demonstrates that a family of L_0-based estimators can outperform state-of-the-art sparse learning algorithms on a combination of prediction, estimation, and variable selection metrics under various regimes (e.g., different signal strengths, feature correlations, and numbers of samples and features).
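A hedged sketch of the coordinate descent core for the L0-penalized objective, assuming unit-norm columns; the paper layers local combinatorial moves (coordinate swaps) on top of updates like these:

```python
import numpy as np

def l0_coordinate_descent(X, y, lam0, n_iters=100):
    """Cyclic coordinate descent for
    min 0.5*||y - X b||^2 + lam0*||b||_0, unit-norm columns assumed.
    Each coordinate update is a hard threshold at sqrt(2*lam0)."""
    n, p = X.shape
    beta = np.zeros(p)
    r = y.copy()                          # residual y - X beta
    thresh = np.sqrt(2.0 * lam0)
    for _ in range(n_iters):
        for j in range(p):
            b = X[:, j] @ r + beta[j]     # partial residual coefficient
            new = b if abs(b) > thresh else 0.0
            r += X[:, j] * (beta[j] - new)  # incremental residual update
            beta[j] = new
    return beta
```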
Least quantile regression via modern optimization
We address the Least Quantile of Squares (LQS) regression problem (and in particular Least Median of Squares) using modern optimization methods. We propose a Mixed Integer Optimization (MIO) formulation …
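To fix ideas, the LQS objective for a candidate coefficient vector is just an order statistic of the absolute residuals; a small sketch of evaluating it (the paper's MIO machinery is what makes minimizing this highly nonconvex function tractable):

```python
import numpy as np

def lqs_objective(beta, X, y, q):
    """The q-th smallest absolute residual; q = n // 2 gives the
    Least Median of Squares criterion. Minimizing this over beta
    is the nonconvex problem the paper formulates as an MIO."""
    abs_res = np.sort(np.abs(y - X @ beta))
    return abs_res[q - 1]                 # q-th order statistic
```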