# Linearly Independent, Orthogonal, and Uncorrelated Variables

```bibtex
@article{Rodgers1984LinearlyIO,
  title={Linearly Independent, Orthogonal, and Uncorrelated Variables},
  author={Joseph Lee Rodgers and W. Alan Nicewander and Larry E. Toothaker},
  journal={The American Statistician},
  year={1984},
  volume={38},
  pages={133--134}
}
```
• Published 1984
• Mathematics
• The American Statistician
Abstract: Linearly independent, orthogonal, and uncorrelated are three terms used to indicate lack of relationship between variables. This short didactic article compares these three terms in both an algebraic and a geometric framework. An example is used to illustrate the differences.
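The distinction the abstract draws can be sketched numerically. Below is a minimal NumPy illustration with made-up vectors (not the example from the paper): one pair is orthogonal yet correlated, another is uncorrelated yet non-orthogonal, and both pairs are linearly independent.

```python
import numpy as np

# Hypothetical vectors, chosen so the three notions come apart.
x = np.array([1.0, 2.0, -1.0])
y = np.array([2.0, -1.0, 0.0])   # orthogonal to x, but correlated with it
z = np.array([5.0, 0.0, 1.0])    # uncorrelated with x, but not orthogonal to it

def orthogonal(a, b):
    """Raw inner product is zero."""
    return np.isclose(a @ b, 0.0)

def uncorrelated(a, b):
    """Inner product of the mean-centered vectors is zero."""
    return np.isclose((a - a.mean()) @ (b - b.mean()), 0.0)

def independent(a, b):
    """Neither vector is a scalar multiple of the other."""
    return np.linalg.matrix_rank(np.column_stack([a, b])) == 2

print(orthogonal(x, y), uncorrelated(x, y))   # True False
print(orthogonal(x, z), uncorrelated(x, z))   # False True
print(independent(x, y), independent(x, z))   # True True
```

Orthogonality and uncorrelatedness coincide only when the vectors already have mean zero; centering is exactly what separates the two notions here.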
## 71 Citations
Orthogonality, uncorrelatedness, and linear independence of vectors
A lot of chemometrics is expressed in terms of vectors in multidimensional space. In this article, we will look at various properties of vectors, namely, linear independence, orthogonality, and …
Foundations of Linear and Generalized Linear Models
Written by a highly experienced author, Foundations of Linear and Generalized Linear Models is a clear and comprehensive guide to the key concepts and results of linear statistical models. The book …
Linear Estimation with Regressor Decomposition
• Mathematics
• 2014
A statistical approach is proposed for estimating the coefficients of a multiple linear regression with regressor decomposition. The estimation formulas and their applicability conditions take into …
Type I, Type II and Type III Sums of Squares†
Whenever factorial designs have equal cell frequencies, the factors are orthogonal. Although orthogonal factors are not required, orthogonal factors permit a unique decomposition of the variance in …
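The claim about equal cell frequencies can be checked directly: in a balanced two-factor design, the mean-centered indicator columns have a zero inner product, and unbalancing the design breaks this. A small sketch with hypothetical 0/1 factor codings (not taken from the article):

```python
import numpy as np

def centered_dot(a, b):
    # Inner product of mean-centered columns; zero means the factor
    # codings are orthogonal (uncorrelated).
    return (a - a.mean()) @ (b - b.mean())

# Balanced 2x2 design: each (A, B) cell appears exactly once.
A = np.array([0.0, 0.0, 1.0, 1.0])
B = np.array([0.0, 1.0, 0.0, 1.0])
print(np.isclose(centered_dot(A, B), 0.0))  # True: balanced, so orthogonal

# Add one extra observation in the (1, 1) cell: cell frequencies are now unequal.
A2 = np.append(A, 1.0)
B2 = np.append(B, 1.0)
print(np.isclose(centered_dot(A2, B2), 0.0))  # False: unbalanced, not orthogonal
```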
Equivalence between Mean Independence and Zero Correlation of the Error Term with any Function of the Covariates
This paper proves that the mean independence of the error term from the covariates in a linear regression model is equivalent to, rather than just a sufficient condition for, the error term being …
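The direction from mean independence to zero correlation can be illustrated numerically. In this hypothetical construction (not from the paper), the error appears as a ± pair with equal frequency at every covariate value, so its conditional mean given x is zero, and its sample covariance with any function of x vanishes exactly:

```python
import numpy as np

# At each covariate value, the error occurs as a +/- pair, so E[error | x] = 0.
x = np.array([1.0, 1.0, 2.0, 2.0, 3.0, 3.0])
e = np.array([0.5, -0.5, 1.0, -1.0, 2.0, -2.0])

def cov(a, b):
    # Sample covariance (biased normalization; only the zero/nonzero sign matters here).
    return np.mean(a * b) - np.mean(a) * np.mean(b)

# Zero covariance holds for any function of x, not just x itself.
for f in (lambda t: t, np.square, np.exp):
    print(np.isclose(cov(f(x), e), 0.0))  # True
```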
Generating spatially-constrained null models for irregularly spaced data using Moran spectral randomization methods
• Mathematics
• 2015
This is the accepted manuscript of an article published by Wiley. The manuscript does not include figures. To view figures, please consult the publisher's version of this article.
Thirteen ways to look at the correlation coefficient
• Mathematics
• 1988
Abstract: In 1885, Sir Francis Galton first defined the term "regression" and completed the theory of bivariate correlation. A decade later, Karl Pearson developed the index that we still use to …
On a scale as a sum of manifest variables.
In this commentary, the basics of identifying a latent structure from measured variables with minimal linear algebra are reviewed, and the technique is demonstrated using Fisher's iris data as an illustration.
Uncorrelated Multilinear Discriminant Analysis With Regularization and Aggregation for Tensor Object Recognition
• Medicine, Computer Science
• IEEE Transactions on Neural Networks
• 2009
The UMLDA aims to extract uncorrelated discriminative features directly from tensorial data by solving a tensor-to-vector projection, and an adaptive regularization procedure is incorporated to enhance performance in the small-sample-size (SSS) scenario.
Significance Testing of Congruence Coefficients: A Good Idea?
Tucker's congruence coefficient is often used to compare the equality of latent structures on a given test for different subgroups. Initial use of the index was subjective; given the same congruence …