Corpus ID: 220280330

Fréchet Sufficient Dimension Reduction for Random Objects

@article{Ying2020FrechetSD,
  title={Fr\'echet Sufficient Dimension Reduction for Random Objects},
  author={Chao Ying and Zhou Yu},
  journal={arXiv: Statistics Theory},
  year={2020}
}
In this paper we consider Fréchet sufficient dimension reduction with responses that are complex random objects in a metric space and high-dimensional Euclidean predictors. We propose a novel approach, called the weighted inverse regression ensemble method, for linear Fréchet sufficient dimension reduction. The method is further generalized as a new operator defined on reproducing kernel Hilbert spaces for nonlinear Fréchet sufficient dimension reduction. We provide theoretical guarantees for the new…
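The abstract does not spell out the estimator, so the sketch below is only meant to fix ideas. It assumes the weighted inverse regression ensemble candidate matrix takes the moment form E[d(Y, Y')(X − μ)(X' − μ)ᵀ] for an independent copy (X', Y') and the metric d on the response space, with the dimension-reduction directions read off from its leading eigenvectors; the function name wire_directions and all implementation details are hypothetical and not taken from the paper.

```python
# Hypothetical sketch of a weighted-inverse-regression-ensemble style estimator
# for linear Frechet SDR. Assumes the candidate matrix has the moment form
#   M = E[ d(Y, Y') (X - mu)(X' - mu)^T ]
# for an independent copy (X', Y'); this is an illustration, not the authors' code.

import numpy as np


def wire_directions(X, D, n_dirs=2):
    """Estimate SDR directions from predictors X (n x p) and a pairwise
    response-distance matrix D (n x n, with D[i, j] = d(y_i, y_j))."""
    n, p = X.shape
    Xc = X - X.mean(axis=0)          # center the predictors

    # Sample analogue of E[d(Y, Y')(X - mu)(X' - mu)^T] over pairs i != j
    # (the diagonal of D is zero, so the full quadratic form suffices).
    M = Xc.T @ D @ Xc / (n * (n - 1))

    # Leading eigenvectors span the estimated dimension-reduction subspace.
    eigvals, eigvecs = np.linalg.eigh(M)
    order = np.argsort(np.abs(eigvals))[::-1]
    return eigvecs[:, order[:n_dirs]]


# Toy usage: scalar response with the absolute difference as the metric.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)
D = np.abs(y[:, None] - y[None, :])   # d(y_i, y_j) = |y_i - y_j|
B = wire_directions(X, D, n_dirs=1)
print(B.round(2))
```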

Citations

Dimension Reduction and Data Visualization for Fréchet Regression
  • Qi Zhang, Lingzhou Xue, Bing Li
  • Mathematics
  • 2021
With the rapid development of data collection techniques, complex data objects that are not in the Euclidean space are frequently encountered in new statistical applications. Fréchet regression model…
Single Index Fréchet Regression
Single index models provide an effective dimension reduction tool in regression, especially for high dimensional data, by projecting a general multivariate predictor onto a direction vector. We…

References

Showing 1–10 of 23 references
A general theory for nonlinear sufficient dimension reduction: Formulation and estimation
In this paper we introduce a general theory for nonlinear sufficient dimension reduction, and explore its ramifications and scope. This theory subsumes recent work employing reproducing kernel…
Fréchet analysis of variance for random objects
Fréchet mean and variance provide a way of obtaining a mean and variance for metric space-valued random variables, and can be used for statistical analysis of data objects that lie in abstract…
On a Projective Resampling Method for Dimension Reduction With Multivariate Responses
Consider the dimension reduction problem where both the response and the predictor are vectors. Existing estimators of this problem take one of the following routes: (1) targeting the part of the…
A global geometric framework for nonlinear dimensionality reduction.
An approach to solving dimensionality reduction problems that uses easily measured local metric information to learn the underlying global geometry of a data set and efficiently computes a globally optimal solution, and is guaranteed to converge asymptotically to the true structure.
Sufficient Dimension Reduction via Inverse Regression
A family of dimension-reduction methods, the inverse regression (IR) family, is developed by minimizing a quadratic objective function. An optimal member of this family, the inverse regression…
Statistical Consistency of Kernel Canonical Correlation Analysis
A mathematical proof of the statistical convergence of kernel CCA is given, providing a theoretical justification for the method and giving a sufficient condition for convergence on the regularization coefficient involved in kernel CCA.
Metric spaces and positive definite functions
As p → ∞ we get the space E_m with the distance function max_{i=1,…,m} |x_i − x_i′|. Let, furthermore, l^p stand for the space of real sequences with the series of pth powers of the absolute values convergent.
Nonlinear dimensionality reduction by locally linear embedding.
Locally linear embedding (LLE) is introduced, an unsupervised learning algorithm that computes low-dimensional, neighborhood-preserving embeddings of high-dimensional inputs and learns the global structure of nonlinear manifolds.
Kernel Methods for Measuring Independence
Two new functionals, the constrained covariance and the kernel mutual information, are introduced to measure the degree of independence of random variables, and it is proved that when the RKHSs are universal, both functionals are zero if and only if the random variables are pairwise independent.
On Certain Metric Spaces Arising From Euclidean Spaces by a Change of Metric and Their Imbedding in Hilbert Space
1. W. A. Wilson ([9]) has recently investigated those metric spaces which arise from a metric space by taking as its new metric a suitable (one variable) function of the old one. He considered in…