Randomized algorithms for estimating the trace of an implicit symmetric positive semi-definite matrix
We analyze the convergence of randomized trace estimators. Starting in 1989, several algorithms have been proposed for estimating the trace of a matrix by $\frac{1}{M}\sum_{i=1}^{M} \mathbf{z}_i^T A \mathbf{z}_i$, where the $\mathbf{z}_i$ are random vectors.
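As a concrete illustration of the estimator family analyzed here, below is a minimal Hutchinson-style sketch in NumPy. The Rademacher probe distribution and the sample count `num_samples` are illustrative choices, not prescriptions from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def hutchinson_trace(matvec, n, num_samples=2000):
    """Estimate tr(A) as (1/M) * sum_i z_i^T A z_i using Rademacher probes.

    `matvec` applies the implicit matrix A to a vector, so A itself
    is never formed or accessed entrywise.
    """
    total = 0.0
    for _ in range(num_samples):
        z = rng.choice([-1.0, 1.0], size=n)  # Rademacher probe vector
        total += z @ matvec(z)
    return total / num_samples

# Usage: a symmetric PSD matrix whose trace we can check exactly.
n = 50
B = rng.standard_normal((n, n))
A = B @ B.T  # symmetric positive semi-definite
est = hutchinson_trace(lambda v: A @ v, n)
print(est, np.trace(A))
```

The point of the implicit `matvec` interface is that the estimator only needs matrix-vector products, which is what makes it usable when $A$ is available only through an operator.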
Blendenpik: Supercharging LAPACK's Least-Squares Solver
We show that by using a high-quality implementation of one of these techniques, we obtain a solver that performs extremely well in the traditional yardsticks of numerical linear algebra: it is significantly faster than high-performance implementations of existing state-of-the-art algorithms, and it is numerically backward stable.
Faster Subset Selection for Matrices and Applications
We study the following problem of subset selection for matrices: given a matrix $\mathbf{X} \in \mathbb{R}^{n \times m}$ ($m > n$) and a sampling parameter $k$ ($n \le k \le m$), select a subset of $k$ columns such that the pseudoinverse of the sampled matrix has as small a norm as possible.
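A small sketch of the objective, not the paper's algorithm: it evaluates the pseudoinverse spectral norm of sampled column subsets and keeps the best of many random draws, a naive baseline that the paper's deterministic and randomized selection methods are designed to beat:

```python
import numpy as np

rng = np.random.default_rng(1)

def pinv_norm(X_S):
    """Spectral norm of the pseudoinverse = 1 / smallest singular value
    (assumes the sampled matrix has full row rank)."""
    s = np.linalg.svd(X_S, compute_uv=False)
    return 1.0 / s[-1]

# Toy instance: an n x m matrix with m > n, choose k columns (n <= k <= m).
n, m, k = 5, 30, 8
X = rng.standard_normal((n, m))

# Naive baseline: score many random k-subsets and keep the best one.
best_cols, best_val = None, np.inf
for _ in range(500):
    cols = rng.choice(m, size=k, replace=False)
    val = pinv_norm(X[:, cols])
    if val < best_val:
        best_cols, best_val = cols, val

print("best random-subset objective:", best_val)
```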
Random Fourier Features for Kernel Ridge Regression: Approximation Bounds and Statistical Guarantees
An extended abstract of this work appears in the Proceedings of the 34th International Conference on Machine Learning (ICML 2017) [AKM+17].
Quasi-Monte Carlo Feature Maps for Shift-Invariant Kernels
We consider the problem of improving the efficiency of randomized Fourier feature maps to accelerate training and testing speed of kernel methods on large datasets.
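For context, a minimal sketch of the randomized Fourier feature maps being improved here, for the Gaussian kernel (plain Monte Carlo frequencies; the paper's contribution is replacing these with quasi-Monte Carlo sequences, which is not shown):

```python
import numpy as np

rng = np.random.default_rng(2)

def rff_features(X, num_features, sigma=1.0):
    """Map data so that z(x) . z(y) approximates exp(-||x - y||^2 / (2 sigma^2))."""
    d = X.shape[1]
    W = rng.standard_normal((d, num_features)) / sigma  # frequencies drawn from the kernel's spectral density
    b = rng.uniform(0, 2 * np.pi, num_features)         # random phase shifts
    return np.sqrt(2.0 / num_features) * np.cos(X @ W + b)

# Compare the feature-map inner products against the exact Gaussian kernel.
X = rng.standard_normal((20, 3))
Z = rff_features(X, num_features=5000)
K_approx = Z @ Z.T
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K_exact = np.exp(-sq_dists / 2.0)
print("max abs error:", np.abs(K_approx - K_exact).max())
```

The Monte Carlo error decays like $O(1/\sqrt{D})$ in the number of features $D$; quasi-Monte Carlo sequences aim to improve this rate by choosing the frequencies more evenly.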
Efficient and Practical Stochastic Subgradient Descent for Nuclear Norm Regularization
We describe novel subgradient methods for a broad class of matrix optimization problems involving nuclear norm regularization.
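To fix ideas, a sketch of the plain (non-stochastic) subgradient update for a nuclear-norm-regularized problem: a standard subgradient of $\|X\|_*$ is $UV^T$ from the thin SVD. This full-SVD step is exactly the cost the paper's stochastic methods are designed to avoid; the toy objective and step size below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)

def nuclear_norm_subgradient(X):
    """U @ V^T from the thin SVD is a subgradient of the nuclear norm at X."""
    U, _, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ Vt

# Subgradient descent on f(X) = 0.5 * ||X - Y||_F^2 + lam * ||X||_*
Y = rng.standard_normal((10, 8))
lam, step = 0.5, 0.05
X = np.zeros_like(Y)
for _ in range(200):
    grad = (X - Y) + lam * nuclear_norm_subgradient(X)
    X -= step * grad

obj = 0.5 * np.sum((X - Y) ** 2) + lam * np.linalg.svd(X, compute_uv=False).sum()
print("final objective:", obj)
```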
Kernel methods match Deep Neural Networks on TIMIT
In this paper, we develop two algorithmic schemes to address this computational bottleneck in the context of kernel ridge regression.
Faster Kernel Ridge Regression Using Sketching and Preconditioning
Kernel ridge regression is a simple yet powerful technique for nonparametric regression whose computation amounts to solving a linear system. Expand
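A minimal dense-solve sketch of that linear system (kernel choice, regularization $\lambda$, and the scaling $n\lambda$ are illustrative; the paper's contribution is a sketched preconditioner that accelerates an iterative solve for large $n$, which is not shown here):

```python
import numpy as np

rng = np.random.default_rng(4)

def gaussian_kernel(X1, X2, sigma=1.0):
    sq = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq / (2 * sigma ** 2))

# Kernel ridge regression: solve (K + n*lam*I) alpha = y,
# then predict with k(x_test, X) @ alpha.
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)

n, lam = len(X), 1e-3
K = gaussian_kernel(X, X)
alpha = np.linalg.solve(K + n * lam * np.eye(n), y)

X_test = np.linspace(-3, 3, 50)[:, None]
y_pred = gaussian_kernel(X_test, X) @ alpha
mse = np.mean((y_pred - np.sin(X_test[:, 0])) ** 2)
print("test MSE vs true sin:", mse)
```

The dense solve costs $O(n^3)$, which is the bottleneck that motivates sketching-based preconditioning for large training sets.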
Revisiting Asynchronous Linear Solvers: Provable Convergence Rate through Randomization
An extended abstract of this work appears in the proceedings of the 28th IEEE International Parallel & Distributed Processing Symposium. Expand
Approximating Spectral Sums of Large-Scale Matrices using Stochastic Chebyshev Approximations
This article is partially based on preliminary results published in the proceedings of the 32nd International Conference on Machine Learning (ICML 2015).