Corpus ID: 236881054

Tail inverse regression for dimension reduction with extreme response

@inproceedings{Aghbalou2021TailIR,
  title={Tail inverse regression for dimension reduction with extreme response},
  author={Anass Aghbalou and François Portier and Anne Sabourin and Chen Zhou},
  year={2021}
}
We consider the problem of dimensionality reduction for the prediction of a target Y ∈ ℝ explained by a covariate vector X ∈ ℝ^p, with particular focus on extreme values of Y, which are of primary concern in risk management. The general purpose is to reduce the dimensionality of the statistical problem through an orthogonal projection onto a lower-dimensional subspace of the covariate space. Inspired by sliced inverse regression (SIR) methods, we develop a novel framework (TIREX, Tail Inverse Regression for EXtreme response). …
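To make the SIR-type construction behind such a tail-oriented estimator concrete, here is a minimal Python sketch of a tail-conditioned sliced inverse regression: it whitens the covariates, keeps only the observations with the largest responses, and extracts leading eigenvectors of a sliced-mean moment matrix. The function name tirex_first_order, the default number of slices, and the top-k tail selection rule are illustrative assumptions, not the authors' exact TIREX algorithm.

```python
import numpy as np

def tirex_first_order(X, y, k_extreme, n_directions, n_slices=5):
    """Sketch of a tail-conditioned SIR-type estimator (illustrative only).

    Keeps the k_extreme observations with the largest responses, slices
    them by response rank, and returns leading eigenvectors of the
    sliced-mean moment matrix, mapped back to the original coordinates.
    Assumes k_extreme >= n_slices and a well-conditioned covariance.
    """
    n, p = X.shape
    # Whiten the covariates, as in classical sliced inverse regression.
    Xc = X - X.mean(axis=0)
    Sigma = Xc.T @ Xc / n
    w, V = np.linalg.eigh(Sigma)
    Sigma_inv_sqrt = V @ np.diag(1.0 / np.sqrt(w)) @ V.T
    Z = Xc @ Sigma_inv_sqrt

    # Condition on the extreme responses: keep the k_extreme largest y's.
    tail = np.argsort(y)[::-1][:k_extreme]

    # SIR-type moment matrix from within-slice means of Z, restricted
    # to the tail sample (slices partition the tail by response rank).
    M = np.zeros((p, p))
    for s in np.array_split(tail, n_slices):
        m = Z[s].mean(axis=0)
        M += (len(s) / k_extreme) * np.outer(m, m)

    # Leading eigenvectors of M estimate the reduced subspace in the
    # whitened scale; undo the whitening to get directions for X.
    _, eigvecs = np.linalg.eigh(M)
    B = Sigma_inv_sqrt @ eigvecs[:, -n_directions:]
    return B / np.linalg.norm(B, axis=0)
```

On simulated data from a single-index model whose effect concentrates in the tail, the column span of the returned B would be compared with the true direction; how the estimate behaves as k_extreme grows slowly with n is precisely the asymptotic regime such tail methods are designed for.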


References

Showing 1–10 of 65 references
Sufficient Dimension Reduction via Inverse Regression
A family of dimension-reduction methods, the inverse regression (IR) family, is developed by minimizing a quadratic objective function. An optimal member of this family, the inverse regression…
Dimension reduction in multivariate extreme value analysis
Non-parametric assessment of extreme dependence structures between an arbitrary number of variables, though quite well-established in dimension 2 and recently extended to moderate dimensions such as…
Tail dimension reduction for extreme quantile estimation
In a regression context where a response variable Y ∈ ℝ is recorded with a covariate X ∈ ℝ^p, two situations can occur simultaneously: (a) we are interested in the tail of the conditional distribution…
Kernel dimension reduction in regression
We present a new methodology for sufficient dimension reduction (SDR). Our methodology derives directly from the formulation of SDR in terms of the conditional independence of the covariate X from…
Dimensionality Reduction for Supervised Learning with Reproducing Kernel Hilbert Spaces
This work treats the problem of dimensionality reduction as that of finding a low-dimensional "effective subspace" of X which retains the statistical relationship between X and Y, and establishes a general nonparametric characterization of conditional independence using covariance operators on a reproducing kernel Hilbert space.
Sliced Inverse Regression for Dimension Reduction
Modern advances in computing power have greatly widened scientists' scope in gathering and investigating information from many variables, information which might have been ignored in the…
Principal component analysis for multivariate extremes
The first-order behavior of multivariate heavy-tailed random vectors above large radial thresholds is ruled by a limit measure in a regular variation framework. For a high-dimensional vector, a…
Sliced inverse regression with regularizations.
The L2 regularization is introduced, and an alternating least-squares algorithm is developed, to enable SIR to work with n < p and highly correlated predictors, and to perform simultaneous reduction estimation and predictor selection.
Investigating Smooth Multiple Regression by the Method of Average Derivatives
Let (x₁, …, xₖ, y) be a random vector, where y denotes a response on the vector x of predictor variables. In this article we propose a technique [termed average derivative estimation (ADE)]…
Dimension Reduction in Regressions Through Cumulative Slicing Estimation
In this paper we offer a complete methodology of cumulative slicing estimation for sufficient dimension reduction. In parallel to the classical slicing estimation, we develop three methods that are…