# Limiting Spectrum of Randomized Hadamard Transform and Optimal Iterative Sketching Methods

```bibtex
@article{Lacotte2020LimitingSO,
  title   = {Limiting Spectrum of Randomized Hadamard Transform and Optimal Iterative Sketching Methods},
  author  = {Jonathan Lacotte and Sifan Liu and Edgar Dobriban and Mert Pilanci},
  journal = {ArXiv},
  year    = {2020},
  volume  = {abs/2002.00864}
}
```

We provide an exact analysis of the limiting spectrum of matrices randomly projected either with the subsampled randomized Hadamard transform, or truncated Haar matrices. We characterize this limiting distribution through its Stieltjes transform, a classical object in random matrix theory, and compute the first and second inverse moments. We leverage the limiting spectrum and asymptotic freeness of random matrices to obtain an exact analysis of iterative sketching methods for solving least…
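The subsampled randomized Hadamard transform (SRHT) analyzed in the abstract applies a random sign flip, a fast Hadamard transform, and uniform row subsampling. A minimal NumPy sketch of this embedding is below; it is an illustrative implementation under standard conventions (S = √(n/m)·P·H·D with n a power of 2), not code from the paper.

```python
import numpy as np

def fwht(a):
    """In-place fast Walsh-Hadamard transform along axis 0 (length must be a power of 2)."""
    n = a.shape[0]
    h = 1
    while h < n:
        for i in range(0, n, 2 * h):
            x = a[i:i + h].copy()
            y = a[i + h:i + 2 * h].copy()
            a[i:i + h] = x + y
            a[i + h:i + 2 * h] = x - y
        h *= 2
    return a

def srht_sketch(A, m, rng):
    """Apply an SRHT embedding S = sqrt(n/m) * P * H * D to the rows of A."""
    n = A.shape[0]
    assert n & (n - 1) == 0, "n must be a power of 2 (pad A with zero rows otherwise)"
    signs = rng.choice([-1.0, 1.0], size=n)              # D: random sign flips
    B = fwht(signs[:, None] * A) / np.sqrt(n)            # H: normalized Hadamard transform
    rows = rng.choice(n, size=m, replace=False)          # P: uniform row subsampling
    return np.sqrt(n / m) * B[rows]
```

In this convention the sketch is unbiased in the sense that E[SᵀS] = I, which is what makes the limiting spectrum of (SA)ᵀ(SA) the relevant object for analyzing sketched least-squares solvers.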

## 8 Citations

### Optimal Randomized First-Order Methods for Least-Squares Problems

- Computer Science, Mathematics · ICML
- 2020

The limiting spectral density of SRHT embeddings is derived, and the resulting algorithm yields the best known complexity for solving least-squares problems, with no condition-number dependence.

### On randomized sketching algorithms and the Tracy–Widom law

- Computer Science · Statistics and Computing
- 2023

It is shown that random matrix theory, in particular the Tracy–Widom law, is useful for describing the operating characteristics of sketching algorithms in the tall-data regime, where the sample size n is much larger than the number of variables d.

### Debiasing Distributed Second Order Optimization with Surrogate Sketching and Scaled Regularization

- Mathematics, Computer Science · NeurIPS
- 2020

This work introduces a new technique for debiasing the local estimates, which leads to both theoretical and empirical improvements in the convergence rate of distributed second order methods.

### Newton-LESS: Sparsification without Trade-offs for the Sketched Newton Update

- Computer Science · NeurIPS
- 2021

It is proved that Newton-LESS enjoys nearly the same problem-independent local convergence rate as Gaussian embeddings, not just up to constant factors but even down to lower order terms, for a large class of optimization tasks.

### Adaptive Newton Sketch: Linear-time Optimization with Quadratic Convergence and Effective Hessian Dimensionality

- Computer Science · ICML
- 2021

A randomized algorithm with a quadratic convergence rate is proposed for convex optimization problems with a self-concordant, composite, strongly convex objective function, based on performing an approximate Newton step using a random projection of the Hessian.

### Lower Bounds and a Near-Optimal Shrinkage Estimator for Least Squares Using Random Projections

- Computer Science, Mathematics · IEEE Journal on Selected Areas in Information Theory
- 2020

An upper bound on the expected error of this shrinkage estimator is derived; the bound is smaller than the error of the classical Gaussian sketch solution for any given data, and the approach extends to other common sketching methods as well.

### Training Quantized Neural Networks to Global Optimality via Semidefinite Programming

- Computer Science · ICML
- 2021

Surprisingly, it is shown that certain quantized NN problems can be solved to global optimality provably in polynomial time in all relevant parameters via tight semidefinite relaxations.

### How to Reduce Dimension With PCA and Random Projections?

- Computer Science · IEEE Transactions on Information Theory
- 2021

This work computes the performance of several popular sketching methods in a general "signal-plus-noise" (or spiked) data model, and finds that signal strength decreases under projection in a delicate way that depends on the structure of the data and the sketching method.

## References

Showing 1–10 of 34 references.

### Improved Matrix Algorithms via the Subsampled Randomized Hadamard Transform

- Computer Science, Mathematics · SIAM J. Matrix Anal. Appl.
- 2013

This article addresses the efficacy, in the Frobenius and spectral norms, of an SRHT-based low-rank matrix approximation technique introduced by Woolfe, Liberty, Rokhlin, and Tygert, and produces several results on matrix operations with SRHTs that may be of independent interest.

### Faster Least Squares Optimization

- Computer Science, Mathematics · ArXiv
- 2019

This work investigates randomized methods for solving overdetermined linear least-squares problems, where the Hessian is approximated based on a random projection of the data matrix, and shows that a fixed subspace embedding with momentum yields the fastest rate of convergence, along with the lowest computational complexity.

### Randomized sketches of convex programs with sharp guarantees

- Computer Science · 2014 IEEE International Symposium on Information Theory
- 2014

This work analyzes RP-based approximations of convex programs, in which the original optimization problem is approximated by the solution of a lower-dimensional problem, and proves that the approximation ratio of this procedure can be bounded in terms of the geometry of the constraint set.

### High-Dimensional Optimization in Adaptive Random Subspaces

- Computer Science, Mathematics · NeurIPS
- 2019

It is shown that an adaptive sampling strategy for the random subspace significantly outperforms the oblivious sampling method; the improvement in the relative error of the solution is tightly characterized in terms of the spectrum of the data matrix, with probabilistic upper bounds provided.

### Finding Structure with Randomness: Probabilistic Algorithms for Constructing Approximate Matrix Decompositions

- Computer Science · SIAM Rev.
- 2011

This work surveys and extends recent research which demonstrates that randomization offers a powerful tool for performing low-rank matrix approximation, and presents a modular framework for constructing randomized algorithms that compute partial matrix decompositions.

### Asymptotically liberating sequences of random unitary matrices

- Mathematics, Computer Science
- 2013

### Randomized Algorithms for Low-Rank Matrix Factorizations: Sharp Performance Bounds

- Computer Science · Algorithmica
- 2014

A novel and rather intuitive analysis of the algorithm for approximating an input matrix with a low-rank element for dimensionality reduction is introduced, yielding sharp estimates and new insights into its performance.

### Asymptotics for Sketching in Least Squares Regression

- Computer Science, Mathematics · NeurIPS
- 2019

The limits of the accuracy loss (for estimation and test error) incurred by popular sketching methods are found, and a separation between methods is shown: SRHT outperforms Gaussian projections.

### Improved Analysis of the subsampled Randomized Hadamard Transform

- Computer Science, Mathematics · Adv. Data Sci. Adapt. Anal.
- 2011

An improved analysis of a structured dimension-reduction map, the subsampled randomized Hadamard transform, is presented, offering optimal constants in the estimate of the number of dimensions required for the embedding.

### Iterative Hessian Sketch: Fast and Accurate Solution Approximation for Constrained Least-Squares

- Computer Science · J. Mach. Learn. Res.
- 2016

This work provides a general lower bound on any randomized method that sketches both the data matrix and vector in a least-squares problem, and presents a new method, the iterative Hessian sketch, which obtains approximations to the original least-squares problem using a projection dimension proportional to the statistical complexity of the least-squares minimizer and a logarithmic number of iterations.
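The iterative Hessian sketch described above replaces the exact Hessian AᵀA with a sketched version (SA)ᵀ(SA) while keeping the gradient exact. A minimal unconstrained-least-squares sketch follows; it uses a Gaussian embedding drawn fresh at each iteration for simplicity, whereas the surveyed paper analyzes SRHT and Haar embeddings.

```python
import numpy as np

def ihs_lstsq(A, b, m, iters=20, seed=0):
    """Iterative Hessian sketch for min_x ||Ax - b||^2 (Gaussian sketch, refreshed each step)."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    for _ in range(iters):
        S = rng.standard_normal((m, n)) / np.sqrt(m)   # fresh sketching matrix
        SA = S @ A                                     # sketched data matrix
        g = A.T @ (b - A @ x)                          # exact gradient of the residual
        x = x + np.linalg.solve(SA.T @ SA, g)          # Newton-like step with sketched Hessian
    return x
```

With a sketch dimension m moderately larger than d, each iteration contracts the error at a rate roughly √(d/m), so a logarithmic number of iterations suffices to match the exact least-squares solution, in line with the guarantee stated above.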