Modern Multivariate Statistical Techniques

@book{izenman,
  title={Modern Multivariate Statistical Techniques},
  author={Alan J. Izenman},
}
CHAPTER 3 errata:
  • Page 46, line –15: (K × J)-matrix.
  • Page 47, Equation (3.5): −EF should be −EF.
  • Page 49, line –6: R should be <.
  • Page 53, line –7: "see Exercise 3.4" is not relevant here.
  • Page 53, Equation (3.43): the last term on the rhs should be ∂yJ/∂xK.
  • Page 60, Equation (3.98): σ should be σ.
  • Page 61, line 8: (3.106) should be (3.105).
  • Pages 61–62, Equations (3.109), (3.110), and (3.111): the identity matrices have different dimensions; in the top row of each matrix, the identity matrix has dimension…

Linear Dimensionality Reduction

  • A. Franc
  • Mathematics, Computer Science
  • 2022
These notes are an overview of some classical linear methods in Multivariate Data Analysis. This is a good old domain, well established since the 1960s, and refreshed in a timely way as a key step in…
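As a purely illustrative sketch of the classical linear methods these notes survey, the following computes principal component analysis via the SVD of centered data (the data here are synthetic, not from the paper):

```python
import numpy as np

# PCA via SVD of the centered data matrix (hypothetical random data).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))            # 100 observations, 5 variables
Xc = X - X.mean(axis=0)                  # center each column
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T                   # projection onto first two components
explained = s**2 / np.sum(s**2)          # proportion of variance per component
```

The rows of `Vt` are the principal directions; singular values come out in descending order, so the first components carry the most variance.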

General Regression Models

In this chapter the authors consider dependent data but move from the linear models of Chap. 8 to general regression models and, more briefly, nonlinear models and generalized estimating equations.

Factor Analysis

This work considers versions of factor analysis based on matrix decomposition methods, method of moments, and likelihood estimation with normality assumptions based on partial isotropy model and extended to the case of heteroscedastic measurement error.
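A minimal sketch of the covariance structure underlying the orthogonal factor model, Σ = ΛΛ′ + Ψ, with hypothetical loadings chosen so every variable has unit variance (these numbers are illustrative, not from the paper):

```python
import numpy as np

# Orthogonal factor model: Sigma = Lambda Lambda' + Psi,
# with Psi diagonal holding the "uniquenesses" (hypothetical values).
Lambda = np.array([[0.9, 0.0],
                   [0.8, 0.1],
                   [0.1, 0.7],
                   [0.0, 0.8]])          # 4 variables, 2 common factors
Psi = np.diag([0.19, 0.35, 0.50, 0.36])  # uniquenesses
Sigma = Lambda @ Lambda.T + Psi          # implied covariance matrix
```

Each diagonal entry of `Sigma` splits into communality (from `Lambda`) plus uniqueness (from `Psi`); estimation methods differ in how they recover `Lambda` and `Psi` from a sample covariance.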

Introduction and Motivating Examples

It is shown that in many instances, carefully thought out Bayesian and frequentist analyses will provide similar conclusions; however, situations in which one or the other approach may be preferred are also described.

On information plus noise kernel random matrices

This paper considers the case where the data is of the type "information + noise," and shows that the spectral properties of kernel random matrices can be understood from a new kernel matrix, computed only from the signal part of the data, but using (in general) a slightly different kernel.
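To make the "information + noise" setting concrete, here is a hedged sketch that builds a Gaussian kernel matrix from synthetic low-rank signal plus noise (the bandwidth choice and data are assumptions for illustration, not the paper's setup):

```python
import numpy as np

# "Information + noise" data: low-rank signal plus small Gaussian noise,
# and the Gaussian kernel matrix computed from it (hypothetical example).
rng = np.random.default_rng(1)
n, p = 50, 20
signal = rng.normal(size=(n, 3)) @ rng.normal(size=(3, p))  # rank-3 signal
noise = 0.1 * rng.normal(size=(n, p))
X = signal + noise
sq = np.sum(X**2, axis=1)
D2 = sq[:, None] + sq[None, :] - 2 * X @ X.T    # pairwise squared distances
K = np.exp(-D2 / (2 * p))                       # Gaussian kernel, bandwidth ~ p
```

The paper's point is that the spectrum of such a `K` can be approximated by a kernel matrix built from `signal` alone, with a slightly different kernel.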

High-Dimensional Properties of AIC and Cp for Estimation of Dimensionality in Multivariate Models

The AIC and Cp have been proposed for estimation of the dimensionality in some multivariate models. In this paper we consider high-dimensional properties of the criteria in the multivariate linear model…
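As a generic illustration of the AIC criterion these papers study (here applied to choosing a polynomial degree rather than a multivariate dimensionality, purely as a stand-in), AIC under Gaussian errors reduces to n·log(RSS/n) plus twice the parameter count:

```python
import numpy as np

# AIC = n*log(RSS/n) + 2*(number of parameters), used to pick a model
# order on synthetic data (the true model has degree 2).
rng = np.random.default_rng(2)
n = 200
x = np.linspace(-1, 1, n)
y = 1.0 + 2.0 * x - 1.5 * x**2 + 0.1 * rng.normal(size=n)

def aic(degree):
    X = np.vander(x, degree + 1)                 # polynomial design matrix
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    return n * np.log(rss / n) + 2 * (degree + 1)

best = min(range(6), key=aic)                    # degree with smallest AIC
```

Underfitting (degree below 2) inflates RSS far more than the 2-per-parameter penalty, so AIC never selects it; the high-dimensional analyses concern how this trade-off behaves as dimension grows with sample size.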

Reproducing kernel Hilbert spaces

  • L. Rosasco
  • Computer Science
    High-Dimensional Statistics
  • 2019
The concept of “kernels” will provide us with a flexible, computationally feasible method for implementing Regularization, which requires a (possibly large) class of models and a method for evaluating the complexity of each model in the class.
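A minimal sketch of the regularization scheme this entry describes: by the representer theorem, kernel ridge regression in an RKHS reduces to solving a finite linear system in the Gram matrix (synthetic data and a Gaussian kernel are assumed here for illustration):

```python
import numpy as np

# Kernel ridge regression: fit f in an RKHS by solving
# (K + lam*I) alpha = y, then f(x) = sum_i alpha_i k(x, x_i).
rng = np.random.default_rng(3)
x = np.linspace(0, 1, 40)
y = np.sin(2 * np.pi * x) + 0.1 * rng.normal(size=40)

def gauss_kernel(a, b, h=0.1):
    return np.exp(-(a[:, None] - b[None, :]) ** 2 / (2 * h**2))

lam = 1e-3                                        # regularization strength
K = gauss_kernel(x, x)                            # Gram matrix
alpha = np.linalg.solve(K + lam * np.eye(40), y)  # representer coefficients
fhat = K @ alpha                                  # fitted values at training x
```

The penalty `lam` controls the complexity of the fitted function, which is exactly the model-class/complexity trade-off the notes attribute to kernels.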

Layered Multivariate Regression with Its Applications

Multivariate regression is known as a multivariate extension of multiple regression, which explains/predicts the variations in multiple dependent variables by multiple independent variables. Recently,…
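A sketch of plain multivariate (multiple-response) least squares, the baseline this paper extends: a single coefficient matrix B maps p predictors to q responses at once (synthetic data, illustrative only):

```python
import numpy as np

# Multivariate linear regression Y = X B + E, fit by least squares;
# lstsq handles all q response columns simultaneously.
rng = np.random.default_rng(4)
n, p, q = 120, 3, 2
X = rng.normal(size=(n, p))
B_true = np.array([[1.0,  0.0],
                   [0.5, -1.0],
                   [0.0,  2.0]])
Y = X @ B_true + 0.05 * rng.normal(size=(n, q))
B_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)     # (p x q) coefficient matrix
```

Column-by-column, this is identical to running q separate multiple regressions; methods like the paper's add structure linking the responses.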

Introduction to manifold learning

There has been a flurry of research activity on nonlinear manifold learning, which includes Isomap, local linear embedding, Laplacian eigenmaps, Hessian eigenmaps, and diffusion maps, and a brief survey of these new methods is given.

High-dimensional properties of AIC, BIC and Cp for estimation of dimensionality in canonical correlation analysis

This paper is concerned with consistency properties of the dimensionality estimation criteria AIC, BIC and Cp in CCA (Canonical Correlation Analysis) between p variables and q (≤ p) variables, based…

Data Visualization With Multidimensional Scaling

This article discusses methodology for multidimensional scaling (MDS) and its implementation in two software systems, GGvis and XGvis, and shows applications to the mapping of computer usage data, to the dimension reduction of marketing segmentation data, to the layout of mathematical graphs and social networks, and finally to the spatial reconstruction of molecules.
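As background for the MDS entries, here is a sketch of classical (Torgerson) scaling: double-center the squared-distance matrix and take the top eigenpairs to recover a point configuration (the input points are synthetic; the paper's systems use iterative stress minimization rather than this closed form):

```python
import numpy as np

# Classical MDS: from squared distances D2, form B = -1/2 * J D2 J
# (J the centering matrix), then embed via the top eigenpairs of B.
rng = np.random.default_rng(5)
P = rng.normal(size=(10, 2))                     # hypothetical 2-D points
sq = np.sum(P**2, axis=1)
D2 = sq[:, None] + sq[None, :] - 2 * P @ P.T     # squared distances
n = D2.shape[0]
J = np.eye(n) - np.ones((n, n)) / n              # centering matrix
B = -0.5 * J @ D2 @ J                            # double-centered Gram matrix
w, V = np.linalg.eigh(B)
idx = np.argsort(w)[::-1][:2]                    # two largest eigenvalues
coords = V[:, idx] * np.sqrt(w[idx])             # recovered configuration
```

Because the input points genuinely lie in two dimensions, the recovered `coords` reproduce the original pairwise distances exactly (up to rotation and reflection).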

Visualization Methodology for Multidimensional Scaling

These uncertainties will be addressed by the following interactive techniques: (a) algorithm animation, random restarts, and manual editing of configurations, (b) interactive control over parameters…

Statistical Learning from a Regression Perspective

This work presents statistical learning methods, including Classification and Regression Trees (CART), from a regression perspective, showing how they automate parts of the labor-intensive and therefore time-consuming and expensive model-building process used in statistical inference.
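To illustrate the CART building block, here is a hedged sketch of a single regression split: scan candidate thresholds and keep the one minimizing the summed squared error around the two leaf means (synthetic step-function data, not an example from the book):

```python
import numpy as np

# One split of a regression tree: choose the threshold t on x that
# minimizes SSE(left leaf) + SSE(right leaf), each leaf predicting its mean.
rng = np.random.default_rng(6)
x = np.sort(rng.uniform(size=80))
y = np.where(x < 0.4, 1.0, 3.0) + 0.1 * rng.normal(size=80)  # step at 0.4

def split_sse(t):
    sse = 0.0
    for part in (y[x < t], y[x >= t]):
        if part.size:
            sse += np.sum((part - part.mean()) ** 2)
    return sse

candidates = (x[:-1] + x[1:]) / 2        # midpoints between sorted x values
best_t = min(candidates, key=split_sse)  # greedy CART-style split
```

A full tree applies this search recursively within each leaf; the greedy, exhaustive threshold scan is what makes tree growing labor-free but computationally heavy.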