Corpus ID: 167217643

Adaptive Reduced Rank Regression

@article{Wu2020AdaptiveRR,
  title={Adaptive Reduced Rank Regression},
  author={Qiong Wu and Felix Ming Fai Wong and Zhenming Liu and Yanhua Li and Varun Kanade},
  journal={ArXiv},
  year={2020},
  volume={abs/1905.11566}
}
Low rank regression has proven to be useful in a wide range of forecasting problems. However, in settings with a low signal-to-noise ratio, it is known to suffer from severe overfitting. This paper studies the reduced rank regression problem and presents algorithms with provable generalization guarantees. We use adaptive hard rank-thresholding in two different parts of the data analysis pipeline. First, we consider a low rank projection of the data to eliminate the components that are most…
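The hard rank-thresholding the abstract refers to can be illustrated with a minimal NumPy sketch: truncate the SVD of a noisy matrix to a target rank to discard the weakest components. The function name `hard_rank_threshold` and the toy rank/noise levels are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def hard_rank_threshold(M, k):
    """Hard rank-thresholding: keep only the top-k singular directions of M."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U[:, :k] @ (s[:k, None] * Vt[:k, :])

rng = np.random.default_rng(0)
signal = rng.standard_normal((50, 2)) @ rng.standard_normal((2, 40))  # rank-2 signal
noisy = signal + 0.01 * rng.standard_normal((50, 40))                 # small additive noise
denoised = hard_rank_threshold(noisy, k=2)
```

With small noise, the rank-2 truncation is closer to the underlying signal than the noisy observation, which is the intuition behind using thresholding to fight overfitting.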
A Deep Learning Framework for Pricing Financial Instruments
An integrated deep learning architecture for stock movement prediction that simultaneously leverages all available alpha sources, with a graph-based component that extracts cross-sectional interactions and circumvents the SVD step needed in standard models.

References

Showing 1–10 of 75 references
Estimation of (near) low-rank matrices with noise and high-dimensional scaling
Simulations show excellent agreement with the high-dimensional scaling of the error predicted by the theory, and illustrate the consequences for a number of specific learning models, including low-rank multivariate or multi-task regression, system identification in vector autoregressive processes, and recovery of low-rank matrices from random projections.
Reduced rank regression via adaptive nuclear norm penalization.
It is shown that the proposed non-convex penalized regression method has a global optimal solution obtained from an adaptively soft-thresholded singular value decomposition.
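An adaptively soft-thresholded SVD in the spirit of this reference can be sketched as follows. The adaptive weight `lam / s**gamma` (shrinking large singular values less than small ones) and the function name are illustrative assumptions, not the paper's exact estimator.

```python
import numpy as np

def adaptive_soft_threshold_svd(M, lam, gamma=2.0):
    """Soft-threshold singular values with adaptive weights lam / s**gamma,
    so large (signal) singular values are shrunk far less than small ones."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    s_shrunk = np.maximum(s - lam / np.maximum(s, 1e-12) ** gamma, 0.0)
    return U @ (s_shrunk[:, None] * Vt)

# Diagonal example with singular values 5 and 1: the large value shrinks
# only by 2/25 = 0.08, while the small one is zeroed out entirely.
M = np.diag([5.0, 1.0])
out = adaptive_soft_threshold_svd(M, lam=2.0)
```

The contrast with plain soft-thresholding (a constant shrinkage for every singular value) is that the adaptive weights leave strong signal directions almost untouched.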
Bayesian sparse reduced rank multivariate regression
A unified sparse and low-rank multivariate regression method that both estimates the coefficient matrix and obtains a credible region for making inference; it utilizes the marginal likelihood to determine the regularization hyperparameter, so the method maximizes its posterior probability given the data.
Reduced rank ridge regression and its kernel extensions
A reduced rank ridge regression for multivariate linear regression is proposed that combines the ridge penalty with a reduced rank constraint on the coefficient matrix, yielding a computationally straightforward algorithm.
Nuclear norm penalization and optimal rates for noisy low rank matrix completion
This paper deals with the trace regression model where $n$ entries or linear combinations of entries of an unknown $m_1\times m_2$ matrix $A_0$ corrupted by noise are observed. We propose a new…
Reduced-rank regression for the multivariate linear model
The problem of estimating the regression coefficient matrix having known (reduced) rank for the multivariate linear model when both sets of variates are jointly stochastic is discussed. We show that…
On Robustness of Principal Component Regression
This work establishes that PCR is equivalent to performing linear regression after pre-processing the covariate matrix via Hard Singular Value Thresholding (HSVT), and establishes a surprising implication of the robustness of PCR with respect to noise: PCR can learn a good predictive model even if the covariates are transformed to preserve differential privacy.
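The HSVT-then-regress equivalence described above can be sketched in a few lines: hard-threshold the covariate matrix to rank k, then run ordinary least squares on the result. The function name `pcr_via_hsvt` and the toy data are illustrative assumptions, not the paper's code.

```python
import numpy as np

def pcr_via_hsvt(X, y, k):
    """Principal component regression expressed as hard singular value
    thresholding of X to rank k, followed by ordinary least squares."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    X_k = U[:, :k] @ (s[:k, None] * Vt[:k, :])      # HSVT step
    beta, *_ = np.linalg.lstsq(X_k, y, rcond=None)  # OLS on denoised covariates
    return beta

rng = np.random.default_rng(0)
W = rng.standard_normal((100, 3)) @ rng.standard_normal((3, 8))  # exactly rank-3 covariates
beta_true = rng.standard_normal(8)
y = W @ beta_true + 0.01 * rng.standard_normal(100)
beta_hat = pcr_via_hsvt(W, y, k=3)
```

Because `X_k` is rank-deficient, `lstsq` returns the minimum-norm coefficient vector; the fitted values nonetheless match the projection of `y` onto the retained subspace.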
Model Averaging and Dimension Selection for the Singular Value Decomposition
Many multivariate data-analysis techniques for an m × n matrix Y are related to the model Y = M + E, where Y is an m × n matrix of full rank and M is an unobserved mean matrix of rank K < (m ∧ n)…
Optimal selection of reduced rank estimators of high-dimensional matrices
We introduce a new criterion, the Rank Selection Criterion (RSC), for selecting the optimal reduced rank estimator of the coefficient matrix in multivariate response regression models. The…
Low-Rank Graph-Regularized Structured Sparse Regression for Identifying Genetic Biomarkers
Experimental results on the Alzheimer's Disease Neuroimaging Initiative (ADNI) dataset showed that the proposed method could select the important SNPs to estimate brain imaging features more accurately than state-of-the-art methods.