# Adaptive Reduced Rank Regression

```bibtex
@article{Wu2020AdaptiveRR,
  title   = {Adaptive Reduced Rank Regression},
  author  = {Qiong Wu and Felix Ming Fai Wong and Zhenming Liu and Yanhua Li and Varun Kanade},
  journal = {ArXiv},
  year    = {2020},
  volume  = {abs/1905.11566}
}
```

Low rank regression has proven to be useful in a wide range of forecasting problems. However, in settings with a low signal-to-noise ratio, it is known to suffer from severe overfitting. This paper studies the reduced rank regression problem and presents algorithms with provable generalization guarantees. We use adaptive hard rank-thresholding in two different parts of the data analysis pipeline. First, we consider a low rank projection of the data to eliminate the components that are most…
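The first stage described in the abstract — a hard low rank projection of the data before fitting a regression — can be sketched in NumPy. This is only an illustrative first step, not the paper's full two-stage procedure; the rank `k` and all data here are made up for the example.

```python
import numpy as np

def hard_rank_threshold(M, k):
    """Hard rank-thresholding: keep only the top-k singular components of M."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# Toy setup: noisy multivariate regression Y = X B + noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 20))
B_true = rng.normal(size=(20, 5))
Y = X @ B_true + 0.1 * rng.normal(size=(100, 5))

# Step 1: low rank projection of the data (k = 10 is an arbitrary choice here),
# then ordinary least squares on the denoised design matrix.
X_k = hard_rank_threshold(X, k=10)
B_hat = np.linalg.lstsq(X_k, Y, rcond=None)[0]
```

The paper's contribution is choosing such thresholds adaptively with generalization guarantees; this sketch fixes `k` by hand purely to show where the thresholding sits in the pipeline.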


#### One Citation

A Deep Learning Framework for Pricing Financial Instruments

- Computer Science
- ArXiv
- 2019

An integrated deep learning architecture for stock movement prediction that simultaneously leverages all available alpha sources, and a graph-based component that extracts cross-sectional interactions, circumventing the SVD step needed in standard models.

#### References

Showing 1–10 of 75 references

Estimation of (near) low-rank matrices with noise and high-dimensional scaling

- Mathematics, Computer Science
- ICML
- 2010

Simulations show excellent agreement with the high-dimensional scaling of the error predicted by the theory, and illustrate the consequences for a number of specific learning models, including low-rank multivariate or multi-task regression, system identification in vector autoregressive processes, and recovery of low-rank matrices from random projections.

Reduced rank regression via adaptive nuclear norm penalization.

- Mathematics, Medicine
- Biometrika
- 2013

It is shown that the proposed non-convex penalized regression method has a global optimal solution obtained from an adaptively soft-thresholded singular value decomposition.
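An adaptively soft-thresholded SVD of the kind this reference describes can be sketched as follows. The weighting scheme and the parameters `lam` and `gamma` are illustrative assumptions, not the paper's exact penalty; the idea shown is that adaptive weights shrink small singular values aggressively while leaving dominant ones nearly untouched.

```python
import numpy as np

def adaptive_soft_threshold_svd(M, lam, gamma=2.0):
    """Soft-threshold singular values with adaptive weights: larger singular
    values get smaller weights and are shrunk less. lam, gamma are illustrative."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    w = 1.0 / np.maximum(s, 1e-12) ** gamma   # adaptive weights, small for large s
    s_shrunk = np.maximum(s - lam * w, 0.0)   # per-singular-value soft-thresholding
    return U @ np.diag(s_shrunk) @ Vt

# Demo: the weakest direction is shrunk to zero, the dominant one barely moves.
M = np.diag([5.0, 1.0, 0.1])
M_shrunk = adaptive_soft_threshold_svd(M, lam=0.05)
```

With these weights the penalty behaves like a hard rank cut on weak directions but introduces almost no bias on strong ones, which is the intuition behind adaptive nuclear norm penalization.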

Bayesian sparse reduced rank multivariate regression

- Mathematics, Computer Science
- J. Multivar. Anal.
- 2017

A unified sparse and low-rank multivariate regression method that both estimates the coefficient matrix and obtains its credible region for inference; it utilizes the marginal likelihood to determine the regularization hyperparameter, so the method maximizes its posterior probability given the data.

Reduced rank ridge regression and its kernel extensions

- Mathematics, Computer Science
- Stat. Anal. Data Min.
- 2011

A reduced rank ridge regression for multivariate linear regression that combines the ridge penalty with the reduced rank constraint on the coefficient matrix, yielding a computationally straightforward algorithm.

Nuclear norm penalization and optimal rates for noisy low rank matrix completion

- Mathematics
- 2010

This paper deals with the trace regression model where $n$ entries or linear combinations of entries of an unknown $m_1\times m_2$ matrix $A_0$ corrupted by noise are observed. We propose a new…

Reduced-rank regression for the multivariate linear model

- Mathematics
- 1975

The problem of estimating the regression coefficient matrix having known (reduced) rank for the multivariate linear model when both sets of variates are jointly stochastic is discussed. We show that…

On Robustness of Principal Component Regression

- Computer Science, Mathematics
- NeurIPS
- 2019

This work establishes that PCR is equivalent to performing linear regression after pre-processing the covariate matrix via Hard Singular Value Thresholding (HSVT). A surprising implication of this robustness property with respect to noise is that PCR can learn a good predictive model even if the covariates are tactfully transformed to preserve differential privacy.
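The PCR/HSVT equivalence this reference states can be checked numerically in a few lines. The sketch below is my own illustration with made-up data, not the paper's construction: the PCR coefficients (regress on top-`k` principal component scores, map back) coincide with the minimum-norm least squares solution on the HSVT-denoised covariate matrix.

```python
import numpy as np

def hsvt(X, k):
    """Hard Singular Value Thresholding: keep the top-k singular components."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 8))
y = rng.normal(size=50)
k = 3

# PCR: regress y on the scores of the top-k principal components, map back.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
Z = X @ Vt[:k].T                                # top-k component scores
gamma = np.linalg.lstsq(Z, y, rcond=None)[0]
beta_pcr = Vt[:k].T @ gamma

# Equivalent: minimum-norm OLS on the HSVT-denoised covariate matrix.
beta_hsvt = np.linalg.lstsq(hsvt(X, k), y, rcond=None)[0]
```

Both routes produce $V_k S_k^{-1} U_k^\top y$, which is why the two coefficient vectors agree to numerical precision.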

Model Averaging and Dimension Selection for the Singular Value Decomposition

- Mathematics
- 2006

Many multivariate data-analysis techniques for an m × n matrix Y are related to the model Y = M + E, where Y is an m × n matrix of full rank and M is an unobserved mean matrix of rank K < (m ∧ n).…

Optimal selection of reduced rank estimators of high-dimensional matrices

- Mathematics
- 2011

We introduce a new criterion, the Rank Selection Criterion (RSC), for selecting the optimal reduced rank estimator of the coefficient matrix in multivariate response regression models. The…

Low-Rank Graph-Regularized Structured Sparse Regression for Identifying Genetic Biomarkers

- Computer Science, Medicine
- IEEE Transactions on Big Data
- 2017

The experimental results on the Alzheimer’s Disease Neuroimaging Initiative (ADNI) dataset showed that the proposed method could select the important SNPs to more accurately estimate the brain imaging features than the state-of-the-art methods.