# Compressed Learning: Universal Sparse Dimensionality Reduction and Learning in the Measurement Domain

@inproceedings{Calderbank2009CompressedL, title={Compressed Learning: Universal Sparse Dimensionality Reduction and Learning in the Measurement Domain}, author={Robert Calderbank}, year={2009} }

In this paper, we provide theoretical results showing that compressed learning, learning directly in the compressed domain, is possible. In particular, we provide tight bounds demonstrating that the linear kernel SVM classifier in the measurement domain, with high probability, has true accuracy close to the accuracy of the best linear threshold classifier in the data domain. We show that this is beneficial from both the compressed sensing and the machine learning points of view. Furthermore…
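The abstract's claim can be illustrated numerically. The sketch below is not the paper's construction: it swaps the SVM for an ordinary least-squares linear classifier, assumes a Gaussian measurement matrix, and compares training accuracy on sparse data in the data domain versus the measurement domain.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic s-sparse signals in R^n whose labels come from a linear threshold.
n, m, n_samples, s = 200, 60, 300, 5
w_true = rng.normal(size=n)

X = np.zeros((n_samples, n))
for i in range(n_samples):
    support = rng.choice(n, size=s, replace=False)
    X[i, support] = rng.normal(size=s)
y = np.sign(X @ w_true)

# Gaussian measurement matrix A: R^n -> R^m (the "measurement domain").
A = rng.normal(size=(m, n)) / np.sqrt(m)
X_meas = X @ A.T

# Plain linear least-squares classifier, trained separately in each domain.
def fit_predict(Z, labels):
    w, *_ = np.linalg.lstsq(Z, labels, rcond=None)
    return np.sign(Z @ w)

acc_data = np.mean(fit_predict(X, y) == y)
acc_meas = np.mean(fit_predict(X_meas, y) == y)
print(acc_data, acc_meas)  # the compressed-domain classifier stays competitive
```

The dimensions and the least-squares surrogate are illustrative choices, not the paper's; the point is only that a linear classifier fit on `X_meas` does not collapse relative to one fit on `X`.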

## 168 Citations

### Generalization Error Analysis of Quantized Compressive Learning

- Computer Science, NeurIPS
- 2019

This paper considers the learning problem where the projected data is further compressed by scalar quantization, called quantized compressive learning, and shows that the inner-product estimators have a deep connection to NN and linear classification problems through the variance of their debiased counterparts.
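The quantization step that paper studies can be sketched as follows. This is a hypothetical toy, not their estimator: Gaussian projections of two vectors are passed through a uniform scalar quantizer, and the inner product is estimated directly from the quantized codes.

```python
import numpy as np

rng = np.random.default_rng(5)

# Two unit vectors with inner product near 0.6.
n, m = 300, 4000
x = rng.normal(size=n); x /= np.linalg.norm(x)
z = rng.normal(size=n); z /= np.linalg.norm(z)
y = 0.6 * x + 0.8 * z
y /= np.linalg.norm(y)

# Gaussian projections, then a uniform scalar quantizer on each measurement.
Phi = rng.normal(size=(m, n))
q = lambda v, step=0.5: step * np.round(v / step)

ip_true = x @ y
ip_est = (q(Phi @ x) @ q(Phi @ y)) / m  # plug-in estimate from quantized codes
print(ip_true, ip_est)
```

With a fine enough quantizer step, the quantization noise adds variance but little bias, so the plug-in estimate stays close to the true inner product.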

### 2D compressed learning: support matrix machine with bilinear random projections

- Computer Science, Machine Learning
- 2019

A 2D compressed learning paradigm to learn the SMM classifier in a compressed data domain is considered, and it is shown that the Kronecker product measurement matrices used by KCS satisfy the restricted isometry property (RIP), a property that ensures the learnability of the compressed data.

### From Affine Rank Minimization Solution to Sparse Modeling

- Computer Science, 2017 IEEE Winter Conference on Applications of Computer Vision (WACV)
- 2017

This work tackles feature representation in the context of sparsity and affine rank minimization, leveraging compressed sensing from the learning perspective to answer questions about sparse representation in signal processing and machine learning.

### There and Back Again: A General Approach to Learning Sparse Models

- Computer Science, ArXiv
- 2017

If the original factors are sparse, then their projections are the sparsest solutions to the projected NMF problem, which explains why the method works for NMF and shows an interesting new property of random projections: they can preserve the solutions of non-convex optimization problems such as NMF.

### Finding needles in compressed haystacks

- Computer Science, 2012 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
- 2012

Tight bounds are provided demonstrating that the linear kernel SVM classifier in the measurement domain, with high probability, has true accuracy close to the accuracy of the best linear threshold classifier in the data domain.

### Compressed classification learning with Markov chain samples

- Computer Science, Neural Networks
- 2014

### ECE 543 Project Report: Learning in Compressed Spaces

- Computer Science
- 2019

This work investigated the learning task in the compressed domain induced by a random projection, which relies on properties of the measurement matrix such as preservation of relative geometry.

### Compressed Factorization: Fast and Accurate Low-Rank Factorization of Compressively-Sensed Data

- Computer Science, ICML
- 2019

This work examines the approach of first performing factorization in the compressed domain, and then reconstructing the original high-dimensional factors from the recovered (compressed) factors, and establishes conditions under which this natural approach will provably recover the original factors.

### Invertible Nonlinear Dimensionality Reduction via Joint Dictionary Learning

- Computer Science, LVA/ICA
- 2015

This paper proposes an invertible nonlinear dimensionality reduction method via jointly learning dictionaries in both the original high-dimensional data space and its low-dimensional representation space, which can outperform compressed sensing in task-driven learning problems such as data visualization.

### Compressed sensing and dimensionality reduction for unsupervised learning. (Échantillonnage compressé et réduction de dimension pour l'apprentissage non supervisé)

- Computer Science
- 2014

This thesis proposes a framework for estimating probability density mixture parameters in which the training data is compressed into a fixed-size representation, and suggests the existence of theoretical guarantees for reconstructing signals belonging to models beyond usual sparse models.

## References

Showing 1–10 of 36 references

### Random Projections for Manifold Learning

- Computer Science, Mathematics, NIPS
- 2007

This work rigorously proves that with a small number M of random projections of sample points in ℝ^N belonging to an unknown K-dimensional Euclidean manifold, the intrinsic dimension (ID) of the sample set can be estimated to high accuracy.

### Random Projections of Smooth Manifolds

- Computer Science, Mathematics, Found. Comput. Math.
- 2009

We propose a new approach for nonadaptive dimensionality reduction of manifold-modeled data, demonstrating that a small number of random linear projections can preserve key information about…
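The geometric preservation that result formalizes is easy to probe empirically. The toy below (dimensions chosen arbitrarily) samples points on a circle embedded in a high-dimensional space, applies a Gaussian random projection, and checks that all pairwise distances survive with bounded distortion.

```python
import numpy as np

rng = np.random.default_rng(2)

# Points on a 1-D manifold (a circle) embedded in R^N via a random 2-D plane.
N, M, n_pts = 500, 30, 40
t = np.linspace(0, 2 * np.pi, n_pts, endpoint=False)
basis = np.linalg.qr(rng.normal(size=(N, 2)))[0]   # orthonormal 2-D plane
pts = np.column_stack([np.cos(t), np.sin(t)]) @ basis.T

# Random projection to R^M, scaled so norms are preserved in expectation.
Phi = rng.normal(size=(M, N)) / np.sqrt(M)
proj = pts @ Phi.T

# Compare all pairwise distances before and after projection.
def pdist(Z):
    diff = Z[:, None, :] - Z[None, :, :]
    return np.sqrt((diff ** 2).sum(-1))

D, D_proj = pdist(pts), pdist(proj)
mask = ~np.eye(n_pts, dtype=bool)
ratios = D_proj[mask] / D[mask]
print(ratios.min(), ratios.max())  # distance ratios concentrate near 1
```

This is the Johnson–Lindenstrauss-style phenomenon behind the manifold result: 30 measurements suffice here even though the ambient dimension is 500.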

### Combinatorial Algorithms for Compressed Sensing

- Computer Science, 2006 40th Annual Conference on Information Sciences and Systems
- 2006

The results prove that there exists a single O(k log n) × n measurement matrix such that any such signal can be reconstructed from these measurements, with error at most O(1) times the worst-case error for the class of such signals.

### Construction of a Large Class of Deterministic Sensing Matrices That Satisfy a Statistical Isometry Property

- Computer Science, IEEE Journal of Selected Topics in Signal Processing
- 2010

Simple criteria are provided that guarantee that a deterministic sensing matrix satisfying these criteria acts as a near isometry on an overwhelming majority of k-sparse signals; in particular, most such signals have a unique representation in the measurement domain.

### Compressed Video Sensing

- Computer Science
- 2007

This work considers video streams, reconstructing volumes from a subset of time series; it shows that they are highly compressible in a wavelet basis and concludes that compressed sensing is highly appropriate for representing video streams.

### Sparse representations for image classification: learning discriminative and reconstructive non-parametric dictionaries

- Computer Science
- 2008

A framework for learning optimal dictionaries for simultaneous sparse signal representation and robust classification is introduced, addressing for the first time the explicit incorporation of both reconstruction and discrimination terms in the non-parametric dictionary learning and sparse coding energy.

### Near-Optimal Signal Recovery From Random Projections: Universal Encoding Strategies?

- Computer Science, IEEE Transactions on Information Theory
- 2006

If the objects of interest are sparse in a fixed basis or compressible, then it is possible to reconstruct f to within very high accuracy from a small number of random measurements by solving a simple linear program.
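The "simple linear program" mentioned there is basis pursuit: minimize the ℓ1 norm subject to the measurement constraints. A minimal sketch using scipy's LP solver, with the standard split x = u − v, u, v ≥ 0 (problem sizes are arbitrary):

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(3)

# k-sparse signal in R^n, m Gaussian measurements b = A x.
n, m, k = 60, 35, 4
x_true = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x_true[support] = rng.normal(size=k)

A = rng.normal(size=(m, n)) / np.sqrt(m)
b = A @ x_true

# Basis pursuit: min ||x||_1  s.t.  A x = b, as an LP over (u, v) with x = u - v.
c = np.ones(2 * n)                 # objective: sum(u) + sum(v) = ||x||_1
A_eq = np.hstack([A, -A])          # A(u - v) = b
res = linprog(c, A_eq=A_eq, b_eq=b, bounds=(0, None))
x_hat = res.x[:n] - res.x[n:]

print(np.max(np.abs(x_hat - x_true)))  # recovery error
```

With m comfortably above the k log(n/k) threshold, as here, the LP typically recovers the sparse signal essentially exactly.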

### A Generalized Restricted Isometry Property

- Computer Science
- 2008

A natural consequence of the existence of special measurement matrices which satisfy the so-called Restricted Isometry Property is described: if a matrix satisfies the RIP, then acute angles between sparse vectors are also approximately preserved.
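That angle-preservation consequence is easy to observe on random matrices (this toy uses a Gaussian matrix, which satisfies the RIP with high probability; the dimensions are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(6)

# Angle between two random sparse vectors, before and after measurement.
n, m, k = 400, 100, 5
Phi = rng.normal(size=(m, n)) / np.sqrt(m)

def sparse_vec():
    v = np.zeros(n)
    v[rng.choice(n, size=k, replace=False)] = rng.normal(size=k)
    return v

angle = lambda u, v: np.degrees(
    np.arccos(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))
)
u, v = sparse_vec(), sparse_vec()
print(angle(u, v), angle(Phi @ u, Phi @ v))  # the two angles are close
```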

### A Simple Proof of the Restricted Isometry Property for Random Matrices

- Mathematics
- 2008

We give a simple technique for verifying the Restricted Isometry Property (as introduced by Candès and Tao) for the random matrices that underlie Compressed Sensing. Our approach has two main…
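The concentration phenomenon behind that proof can be checked by sampling: for a Gaussian matrix, the energy of each sparse vector is nearly preserved, so the observed norm ratios cluster around 1. A minimal empirical probe (sizes arbitrary; sampling only bounds the isometry constant from below, it does not certify the RIP):

```python
import numpy as np

rng = np.random.default_rng(4)

# Sample the norm distortion of a Gaussian matrix on random k-sparse vectors.
n, m, k, trials = 400, 80, 6, 2000
Phi = rng.normal(size=(m, n)) / np.sqrt(m)

ratios = np.empty(trials)
for i in range(trials):
    x = np.zeros(n)
    support = rng.choice(n, size=k, replace=False)
    x[support] = rng.normal(size=k)
    ratios[i] = np.linalg.norm(Phi @ x) ** 2 / np.linalg.norm(x) ** 2

delta = max(1 - ratios.min(), ratios.max() - 1)  # empirical isometry constant
print(ratios.mean(), delta)  # mean near 1, delta well below 1
```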