# Fréchet Sufficient Dimension Reduction for Random Objects

```bibtex
@article{Ying2020FrechetSD,
  title   = {Fr\'echet Sufficient Dimension Reduction for Random Objects},
  author  = {Chao Ying and Zhou Yu},
  journal = {arXiv: Statistics Theory},
  year    = {2020}
}
```

In this paper we consider Fréchet sufficient dimension reduction, with responses that are complex random objects in a metric space and high-dimensional Euclidean predictors. We propose a novel approach, the weighted inverse regression ensemble method, for linear Fréchet sufficient dimension reduction. The method is further generalized as a new operator defined on reproducing kernel Hilbert spaces for nonlinear Fréchet sufficient dimension reduction. We provide theoretical guarantees for the new…
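The abstract does not spell out the weighted inverse regression ensemble estimator, but it builds on classical inverse-regression ideas for Euclidean responses. As a point of reference only, here is a minimal sketch of sliced inverse regression (SIR), the scalar-response special case on which inverse-regression-based sufficient dimension reduction rests; the function name and slicing scheme below are illustrative assumptions, not the paper's estimator.

```python
import numpy as np

def sliced_inverse_regression(X, y, n_slices=10, n_directions=1):
    """Classical SIR (Li, 1991): eigen-decompose the covariance of
    slice means of the standardized predictor.  Illustrative only."""
    n, p = X.shape
    mu = X.mean(axis=0)
    # Whitening transform Sigma^{-1/2} via the covariance eigensystem
    vals, vecs = np.linalg.eigh(np.cov(X, rowvar=False))
    inv_sqrt = vecs @ np.diag(vals ** -0.5) @ vecs.T
    Z = (X - mu) @ inv_sqrt
    # Slice the sample by the order of the scalar response
    slices = np.array_split(np.argsort(y), n_slices)
    M = np.zeros((p, p))
    for idx in slices:
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    # Leading eigenvectors of M span the standardized e.d.r. space
    evecs = np.linalg.eigh(M)[1]
    B = evecs[:, ::-1][:, :n_directions]
    return inv_sqrt @ B  # map back to the original predictor scale

rng = np.random.default_rng(0)
X = rng.standard_normal((2000, 5))
beta = np.array([1.0, 1.0, 0.0, 0.0, 0.0]) / np.sqrt(2)
y = (X @ beta) ** 3 + 0.1 * rng.standard_normal(2000)
b_hat = sliced_inverse_regression(X, y)[:, 0]
cosine = abs(b_hat @ beta) / np.linalg.norm(b_hat)
# cosine is close to 1: the estimated direction aligns with beta
```

When the response lives in a general metric space there is no ordering of y to slice by, which is the kind of obstacle a weighted, distance-based construction must address; the details of how the paper does so are beyond this abstract.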

#### 3 Citations

Dimension Reduction and Data Visualization for Fréchet Regression

- Mathematics
- 2021

With the rapid development of data collection techniques, complex data objects that are not in the Euclidean space are frequently encountered in new statistical applications. Fréchet regression model…

Single Index Fréchet Regression

- Mathematics
- 2021

Single index models provide an effective dimension reduction tool in regression, especially for high dimensional data, by projecting a general multivariate predictor onto a direction vector. We…

#### References

Showing 1–10 of 23 references

A general theory for nonlinear sufficient dimension reduction: Formulation and estimation

- Mathematics
- 2013

In this paper we introduce a general theory for nonlinear sufficient dimension reduction, and explore its ramifications and scope. This theory subsumes recent work employing reproducing kernel…

Fréchet analysis of variance for random objects

- Mathematics
- Biometrika
- 2019

Fréchet mean and variance provide a way of obtaining a mean and variance for metric space-valued random variables, and can be used for statistical analysis of data objects that lie in abstract…
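The Fréchet mean generalizes the usual average by minimizing the sum of squared distances in an arbitrary metric space. As a minimal sketch (my own illustration, not code from the Biometrika paper), the sample Fréchet mean of angles on a circle can be found by brute-force search over a candidate grid:

```python
import math

def frechet_mean(points, dist, candidates):
    """Sample Fréchet mean: the candidate minimizing the Fréchet
    function, i.e. the sum of squared distances to the data."""
    return min(candidates, key=lambda c: sum(dist(c, p) ** 2 for p in points))

def circle_dist(a, b):
    """Geodesic (arc-length) distance between angles on the unit circle."""
    d = abs(a - b) % (2 * math.pi)
    return min(d, 2 * math.pi - d)

# Angles clustered around 0 (6.2 rad sits just below 2*pi)
data = [0.1, 0.2, 6.2]
grid = [2 * math.pi * k / 3600 for k in range(3600)]
m = frechet_mean(data, circle_dist, grid)
# m lands near 0, whereas the naive arithmetic mean (~2.17) does not
```

The same argmin definition works for any metric space; the practical difficulty is that the minimization rarely has a closed form, which is where the Fréchet variance and ANOVA machinery comes in.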

On a Projective Resampling Method for Dimension Reduction With Multivariate Responses

- Mathematics
- 2008

Consider the dimension reduction problem where both the response and the predictor are vectors. Existing estimators of this problem take one of the following routes: (1) targeting the part of the…

A global geometric framework for nonlinear dimensionality reduction.

- Medicine
- Science
- 2000

An approach to dimensionality reduction that uses easily measured local metric information to learn the underlying global geometry of a data set; it efficiently computes a globally optimal solution and is guaranteed to converge asymptotically to the true structure.

Sufficient Dimension Reduction via Inverse Regression

- Mathematics
- 2005

A family of dimension-reduction methods, the inverse regression (IR) family, is developed by minimizing a quadratic objective function. An optimal member of this family, the inverse regression…

Statistical Consistency of Kernel Canonical Correlation Analysis

- Mathematics, Computer Science
- J. Mach. Learn. Res.
- 2007

A mathematical proof of the statistical convergence of kernel CCA is given, providing a theoretical justification for the method and giving a sufficient condition for convergence on the regularization coefficient involved in kernel CCA.

Metric spaces and positive definite functions

- Mathematics
- 1938

As p → ∞ we get the space E_m with the distance function max_{i=1,…,m} |x_i − x_i′|. Let, furthermore, l^p stand for the space of real sequences with the series of pth powers of the absolute values convergent.…

Nonlinear dimensionality reduction by locally linear embedding.

- Medicine, Computer Science
- Science
- 2000

Locally linear embedding (LLE) is introduced, an unsupervised learning algorithm that computes low-dimensional, neighborhood-preserving embeddings of high-dimensional inputs and learns the global structure of nonlinear manifolds.

Kernel Methods for Measuring Independence

- Mathematics, Computer Science
- J. Mach. Learn. Res.
- 2005

Two new functionals, the constrained covariance and the kernel mutual information, are introduced to measure the degree of independence of random variables and it is proved that when the RKHSs are universal, both functionals are zero if and only if the random variables are pairwise independent.
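The constrained covariance and kernel mutual information are not reproduced here, but a close cousin, the Hilbert–Schmidt independence criterion (HSIC), shows the shared mechanics: with universal kernels, a functional of centered Gram matrices vanishes exactly under independence. The sketch below is a biased HSIC estimate, offered as an illustration of the kernel-independence idea rather than the paper's exact functionals:

```python
import numpy as np

def rbf_gram(x, sigma=1.0):
    """Gaussian-kernel Gram matrix for a 1-D sample."""
    d2 = (x[:, None] - x[None, :]) ** 2
    return np.exp(-d2 / (2 * sigma ** 2))

def hsic(x, y, sigma=1.0):
    """Biased HSIC estimate trace(K H L H) / n^2, where H is the
    centering matrix.  In population this is zero iff x and y are
    independent, for universal kernels such as the Gaussian."""
    n = len(x)
    H = np.eye(n) - np.ones((n, n)) / n
    K, L = rbf_gram(x, sigma), rbf_gram(y, sigma)
    return np.trace(K @ H @ L @ H) / n ** 2

rng = np.random.default_rng(1)
x = rng.standard_normal(300)
dep = hsic(x, x ** 2)                       # nonlinear dependence
indep = hsic(x, rng.standard_normal(300))   # independent pair
# dep comes out markedly larger than indep
```

A linear correlation test would miss the x versus x² dependence entirely, which is precisely what universal-kernel dependence measures are designed to detect.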

On Certain Metric Spaces Arising From Euclidean Spaces by a Change of Metric and Their Imbedding in Hilbert Space

- Mathematics
- 1937

1. W. A. Wilson ([9]) has recently investigated those metric spaces which arise from a metric space by taking as the new metric a suitable (one-variable) function of the old one. He considered in…