Corpus ID: 143423434

Determining the number of factors in a forecast model by a random matrix test: cryptocurrencies

@article{Medina2019DeterminingTN,
  title={Determining the number of factors in a forecast model by a random matrix test: cryptocurrencies},
  author={A. Medina and Graciela González-Farías},
  journal={arXiv: Statistical Finance},
  year={2019}
}
We determine the number of statistically significant factors in a forecast model using a random matrix test. The forecast model is of the Reduced Rank Regression (RRR) type; in particular, we choose a variant that can be seen as Canonical Correlation Analysis (CCA). As empirical data we use cryptocurrencies at hourly frequency, with variable selection carried out by an information-theoretic criterion. The results are consistent with the usual visual inspection, with the…
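
To make the abstract's procedure concrete, the sketch below computes sample canonical correlations between a lagged block and a current block of series and counts how many are significant. It is a minimal illustration only: the simulated data, the one-hour lag structure, and the permutation-based null (used here as a stand-in for the paper's random matrix, i.e. Tracy-Widom, test) are assumptions, not the authors' implementation.

```python
import numpy as np

def canonical_correlations(X, Y):
    """Sample canonical correlations between column blocks X (n x p) and Y (n x q)."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    # After whitening each block via QR, the singular values of Qx.T @ Qy
    # are the sample canonical correlations (cosines of principal angles).
    Qx, _ = np.linalg.qr(X)
    Qy, _ = np.linalg.qr(Y)
    return np.linalg.svd(Qx.T @ Qy, compute_uv=False)

def n_significant_factors(X, Y, n_perm=500, alpha=0.05, seed=0):
    """Count canonical correlations exceeding a permutation null for the largest one.

    The paper tests significance against a random-matrix (Tracy-Widom) null;
    the permutation null below is a simple stand-in for illustration.
    """
    rng = np.random.default_rng(seed)
    cc = canonical_correlations(X, Y)
    null_max = np.array([
        canonical_correlations(X, Y[rng.permutation(len(Y))])[0]
        for _ in range(n_perm)
    ])
    threshold = np.quantile(null_max, 1 - alpha)
    return int(np.sum(cc > threshold)), cc, threshold

# Toy usage: simulated stand-in for hourly log-returns of a few cryptocurrencies;
# predict next-hour returns from current returns.
rng = np.random.default_rng(1)
returns = rng.standard_normal((1000, 6))
X, Y = returns[:-1], returns[1:]
k, cc, thr = n_significant_factors(X, Y)
print(k, np.round(cc, 3), round(thr, 3))
```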

References

Showing 1-10 of 37 references
Reduced-rank regression for the multivariate linear model
The problem of estimating the regression coefficient matrix having known (reduced) rank for the multivariate linear model when both sets of variates are jointly stochastic is discussed. We show that…
Noise Dressing of Financial Correlation Matrices
We show that results from the theory of random matrices are potentially of great interest to understand the statistical structure of the empirical correlation matrices appearing in the study of price…
Universal and Nonuniversal Properties of Cross Correlations in Financial Time Series
We use methods of random matrix theory to analyze the cross-correlation matrix C of price changes of the largest 1000 US stocks for the 2-year period 1994-95. We find that the statistics of most of…
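
As a minimal illustration of the random-matrix approach described in the two references above, the sketch below compares the eigenvalues of an empirical correlation matrix with the Marchenko-Pastur bounds for pure noise; the simulated data and matrix sizes are illustrative assumptions, not the referenced studies' data.

```python
import numpy as np

# Compare eigenvalues of an empirical correlation matrix of standardized returns
# with the Marchenko-Pastur bounds (1 +/- sqrt(p/n))^2 that hold for pure noise.
# The data here are simulated; with real price series one would use log-returns.
rng = np.random.default_rng(0)
n, p = 2000, 50                                    # observations, assets (illustrative sizes)
returns = rng.standard_normal((n, p))

Z = (returns - returns.mean(0)) / returns.std(0)   # standardize each series
C = Z.T @ Z / n                                    # empirical correlation matrix
eigs = np.linalg.eigvalsh(C)

q = p / n
lam_minus, lam_plus = (1 - np.sqrt(q)) ** 2, (1 + np.sqrt(q)) ** 2
print(f"Marchenko-Pastur noise bulk: [{lam_minus:.3f}, {lam_plus:.3f}]")
print("eigenvalues outside the bulk:", eigs[(eigs < lam_minus) | (eigs > lam_plus)])
```
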
Collective behavior of cryptocurrency price changes
Digital assets termed cryptocurrencies are correlated. We analyze cross correlations between price changes of different cryptocurrencies using methods of random matrix theory and minimum spanning…
Distribution of the Estimators for Autoregressive Time Series with a Unit Root
Let $n$ observations $Y_1, Y_2, \ldots, Y_n$ be generated by the model $Y_t = \rho Y_{t-1} + e_t$, where $Y_0$ is a fixed constant and $\{e_t\}_{t=1}^{n}$ is a sequence of independent normal random variables with…
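
For context, a short simulation of this unit-root model, checked with the augmented Dickey-Fuller test from statsmodels; the sample size, coefficients, and seed are arbitrary illustrative choices.

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(0)
n = 500
e = rng.standard_normal(n)

# Y_t = rho * Y_{t-1} + e_t with rho = 1 (unit root) vs. rho = 0.8 (stationary)
for rho in (1.0, 0.8):
    y = np.zeros(n)
    for t in range(1, n):
        y[t] = rho * y[t - 1] + e[t]
    stat, pvalue, *_ = adfuller(y)
    print(f"rho={rho}: ADF statistic={stat:.2f}, p-value={pvalue:.3f}")
```
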
High dimensional statistical inference and random matrices
Multivariate statistical analysis is concerned with observations on several variables which are thought to possess some degree of inter-dependence. Driven by problems in genetics and the social…
Transfer entropy as a log-likelihood ratio.
The result generalizes the equivalence in the Gaussian case of transfer entropy and Granger causality, a statistical notion of causal influence based on prediction via vector autoregression, and establishes a fundamental connection between directed information transfer and causality in the Wiener-Granger sense.
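
A minimal sketch of the Gaussian-case equivalence stated above: transfer entropy from Y to X computed as half the log ratio of restricted and full autoregression residual variances. The single-lag model and the simulated data are illustrative assumptions.

```python
import numpy as np

def gaussian_transfer_entropy(x, y):
    """Transfer entropy T_{Y->X} for jointly Gaussian series with one lag:
    0.5 * log( Var(x_t | x_{t-1}) / Var(x_t | x_{t-1}, y_{t-1}) )."""
    xt, xlag, ylag = x[1:], x[:-1], y[:-1]

    def residual_variance(target, regressors):
        # OLS residual variance with an intercept.
        A = np.column_stack([np.ones(len(target))] + regressors)
        coef, *_ = np.linalg.lstsq(A, target, rcond=None)
        return (target - A @ coef).var()

    v_restricted = residual_variance(xt, [xlag])
    v_full = residual_variance(xt, [xlag, ylag])
    return 0.5 * np.log(v_restricted / v_full)

# Toy check: y drives x, so T_{Y->X} should exceed T_{X->Y}.
rng = np.random.default_rng(0)
y = rng.standard_normal(2000)
x = np.zeros(2000)
for t in range(1, 2000):
    x[t] = 0.5 * x[t - 1] + 0.4 * y[t - 1] + 0.1 * rng.standard_normal()
print(gaussian_transfer_entropy(x, y), gaussian_transfer_entropy(y, x))
```
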
Approximate null distribution of the largest root in multivariate analysis.
  • I. Johnstone
  • The Annals of Applied Statistics
  • 2009
This work describes a simple approximation, based on the Tracy-Widom distribution, that in many cases can be used instead of tables or software, at least for initial screening.
Multivariate analysis and Jacobi ensembles: largest eigenvalue, Tracy-Widom limits and rates of convergence.
It is shown that after centering and scaling, the distribution of the largest eigenvalue of $(A+B)^{-1}B$ is approximated to second order, $O(p^{-2/3})$, by the Tracy-Widom law.
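
To illustrate the quantity involved, the sketch below simulates the largest eigenvalue of $(A+B)^{-1}B$ for two independent Wishart matrices; the dimensions are arbitrary, and the centering and scaling constants from the referenced paper are not reproduced here.

```python
import numpy as np

# Monte Carlo of the largest eigenvalue of (A + B)^{-1} B, the double-Wishart
# (Jacobi) quantity whose centered and scaled law is approximated by Tracy-Widom.
# Dimensions p, m, n below are arbitrary illustrative choices.
rng = np.random.default_rng(0)
p, m, n, reps = 10, 40, 60, 2000

largest = np.empty(reps)
for i in range(reps):
    GA = rng.standard_normal((n, p))
    GB = rng.standard_normal((m, p))
    A, B = GA.T @ GA, GB.T @ GB          # independent Wishart_p(I, n) and Wishart_p(I, m)
    eig = np.linalg.eigvals(np.linalg.solve(A + B, B))
    largest[i] = eig.real.max()          # eigenvalues lie in (0, 1)

print(largest.mean(), np.quantile(largest, 0.95))
```
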
On the distribution of the largest eigenvalue in principal components analysis
Let $x_{(1)}$ denote the square of the largest singular value of an $n \times p$ matrix $X$, all of whose entries are independent standard Gaussian variates. Equivalently, $x_{(1)}$ is the largest principal component…