Corpus ID: 15971655

Old and New Matrix Algebra Useful for Statistics

@inproceedings{Minka2000OldAN,
  title={Old and New Matrix Algebra Useful for Statistics},
  author={Thomas P. Minka},
  year={2000}
}
The partials with respect to the numerator are laid out according to the shape of Y, while the partials with respect to the denominator are laid out according to the transpose of X. For example, the derivative of a column vector y with respect to a scalar x is a column vector, while the derivative of a scalar y with respect to a column vector x is a row vector (assuming x and y are column vectors; otherwise it is flipped). Each of these derivatives can be tediously computed via partials, but this section shows how they can instead be computed with matrix manipulations. The material is based on Magnus and Neudecker.
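To make the layout convention concrete, here is a minimal worked example consistent with the description above (an illustration added here, not text from the paper): if y is an m-by-1 column vector depending on an n-by-1 column vector x, the rows of dy/dx follow the shape of y and the columns follow the transpose of x, so

\[
\frac{dy}{dx} \in \mathbb{R}^{m \times n},
\qquad
\left[\frac{dy}{dx}\right]_{ij} = \frac{\partial y_i}{\partial x_j},
\qquad
y = Ax \;\Rightarrow\; \frac{dy}{dx} = A .
\]

The two special cases quoted in the abstract follow directly: a vector y and a scalar x give an m-by-1 column, while a scalar y and a vector x give a 1-by-n row.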
Citations

Patterned complex-valued matrix derivatives
A systematic and simple method is proposed for finding the derivative of complex-valued matrix functions that depend on matrix arguments containing patterns. The proposed method is developed …
An extended collection of matrix derivative results for forward and reverse mode algorithmic differentiation
This paper collects a number of matrix derivative results which are very useful in forward and reverse mode algorithmic differentiation (AD). It highlights in particular the remarkable …
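As a hedged illustration of the kind of paired identity such collections contain (a standard matrix-product example, not quoted from this particular paper): for C = AB, the forward-mode perturbation and the reverse-mode adjoints are

\[
\dot{C} = \dot{A}B + A\dot{B},
\qquad
\bar{A} = \bar{C}\,B^{\mathsf T},
\qquad
\bar{B} = A^{\mathsf T}\,\bar{C}.
\]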
Efficient Automatic Differentiation of Matrix Functions
Forward and reverse mode automatic differentiation methods for functions that take a vector argument make derivative computation efficient. However, the determinant and inverse of a matrix are not …
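The difficulty alluded to is that the determinant and the inverse are most naturally differentiated at the level of whole matrices rather than element by element. As a hedged sketch (standard differential identities, not a description of this paper's specific algorithms):

\[
d\log\det A = \operatorname{tr}\!\left(A^{-1}\,dA\right),
\qquad
d\!\left(A^{-1}\right) = -A^{-1}\,(dA)\,A^{-1}.
\]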
Towards a Unified Framework of Matrix Derivatives
This paper establishes a sufficient condition under which not only can the first approach be applied, but the time complexity of results obtained from the second approach can also be reduced, enabling a unified framework of matrix derivatives with applications across science and engineering.
Hessians of scalar functions of complex-valued matrices: A systematic computational approach
  • A. Hjørungnes, D. Gesbert
  • Mathematics, Computer Science
  • 2007 9th International Symposium on Signal Processing and Its Applications
  • 2007
It is shown how the four Hessian matrices of a scalar complex function can be identified from the second-order complex differential of the scalar function.
What is the gradient of a scalar function of a symmetric matrix ?
It is demonstrated that the commonly used relation for the gradient of a real-valued function of a symmetric matrix is inconsistent with the definition of the Fréchet derivative, and it is proved that $G_s = \mathrm{sym}(G)$.
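A brief sketch of why the stated result holds, using $G$ for the unconstrained gradient and $G_s$ for the gradient respecting the symmetry constraint (notation assumed here for illustration): for a symmetric perturbation $dX = dX^{\mathsf T}$,

\[
df = \operatorname{tr}\!\left(G^{\mathsf T}\,dX\right)
   = \operatorname{tr}\!\left(\operatorname{sym}(G)^{\mathsf T}\,dX\right),
\qquad
\operatorname{sym}(G) = \tfrac{1}{2}\!\left(G + G^{\mathsf T}\right),
\]

so the Fréchet derivative restricted to symmetric directions is represented by $G_s = \operatorname{sym}(G)$.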
A note on matrix differentiation
This paper presents a set of rules for matrix differentiation with respect to a vector of parameters, using the flattened representation of derivatives, i.e. in the form of a matrix. We also introduce a …
Vector and Matrix Calculus
  • H. Kamper
  • Theoretical and Mathematical Physics
  • 2018
As explained in detail in [1], there unfortunately exist multiple competing notations concerning the layout of matrix derivatives. This can cause a lot of difficulty when consulting several sources, …
A note on differentiating matrices
This paper presents a set of rules for matrix differentiation with respect to a vector of parameters, using the flattened representation of derivatives, i.e. in the form of a matrix. We also introduce a …
Complex-Valued Matrix Differentiation: Techniques and Key Results
In the framework introduced, the differential of the complex-valued matrix function is used to identify the derivatives of this function, and matrix differentiation results are derived and summarized in tables.

References

Showing 1-10 of 14 references
Matrix Differential Calculus with Applications in Statistics and Econometrics
Preface. Matrices: Basic Properties of Vectors and Matrices. Kronecker Products, the Vec-Operator and the Moore-Penrose Inverse. Miscellaneous Matrix Results. Differentials, the Theory: Mathematical …
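The identification technique that this reference develops, and that the surveyed note builds on, can be sketched as follows (a minimal illustration in notation chosen here, not a quotation from the book): once the first differential of a scalar function $f$ of a matrix $X$ is written in the canonical trace form, the gradient is read off directly,

\[
df = \operatorname{tr}\!\left(A(X)^{\mathsf T}\,dX\right)
\;\Longrightarrow\;
\nabla_X f = A(X).
\]

For example, $f(X) = \operatorname{tr}(X^{\mathsf T} X)$ gives $df = 2\operatorname{tr}(X^{\mathsf T}\,dX)$, hence $\nabla_X f = 2X$; under the transposed layout described in the abstract above, one would report $A(X)^{\mathsf T}$ instead.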
Matrix Algebra Useful for Statistics
Basic Operations. Special Matrices. Determinants. Inverse Matrices. Rank. Canonical Forms. Generalized Inverses. Solving Linear Equations. Partitioned Matrices. Eigenvalues and Eigenvectors. …
An algorithm for associating the features of two images
An algorithm that operates on the distances between features in two related images and delivers a set of correspondences between them; it will recover the feature mappings that result from image translation, expansion or shear deformation even when the displacements of individual features depart slightly from the general trend.
The Theory of Matrices
Volume 2: XI. Complex symmetric, skew-symmetric, and orthogonal matrices: 1. Some formulas for complex orthogonal and unitary matrices. 2. Polar decomposition of a complex matrix. 3. The normal form of …
Tensor Geometry: The Geometric Viewpoint and its Uses
0. Fundamental Not(at)ions. I. Real Vector Spaces. II. Affine Spaces. III. Dual Spaces. IV. Metric Vector Spaces. V. Tensors and Multilinear Forms. VI. Topological Vector Spaces. VII. …
Topics in Matrix Analysis
1. The field of values. 2. Stable matrices and inertia. 3. Singular value inequalities. 4. Matrix equations and Kronecker products. 5. Hadamard products. 6. Matrices and functions.
Linear models of surface and illuminant spectra.
  • D. Marimont, B. Wandell
  • Computer Science, Physics
  • Journal of the Optical Society of America. A, Optics and image science
  • 1992
Low-dimensional linear models used for creating efficient spectral representations for color offer some conceptual simplifications for applications such as printer calibration; they also perform substantially better than principal-components approximations for computer-graphics applications.
Maximum Likelihood Blind Source Separation: A Context-Sensitive Generalization of ICA
The resulting algorithm, called cICA after the Infomax algorithm of Bell and Sejnowski (1995), is able to separate sources in a number of situations where standard methods cannot, including sources with low kurtosis, colored Gaussian sources, and sources which have Gaussian histograms.
Separating Style and Content
Observations are fitted with bilinear models which explicitly represent the two-factor structure, allowing them to solve three general tasks: extrapolation of a new style to unobserved content; classification of content observed in a new style; and translation of new content observed in a new style.
Problems and theorems in linear algebra
Main notations and conventions. Determinants. Linear spaces. Canonical forms of matrices and linear operators. Matrices of special form. Multilinear algebra. Matrix inequalities. Matrices in algebra and …