Backward nested descriptors asymptotics with inference on stem cell differentiation

Stephan F. Huckemann and Benjamin Eltzner, The Annals of Statistics
For sequences of random backward nested subspaces, as occur, say, in dimension reduction for manifold- or stratified-space-valued data, asymptotic results are derived. In fact, we formulate our results more generally for backward nested families of descriptors (BNFDs). Under rather general conditions, asymptotic strong consistency holds. Under additional, still rather general hypotheses, among them the existence of a.s. local twice-differentiable charts, asymptotic joint normality of a BNFD can be…


Applying Backward Nested Subspace Inference to Tori and Polyspheres

In this contribution, applications of bootstrap two-sample tests to the torus and its higher-dimensional generalizations, polyspheres, are illustrated.

Essentials of backward nested descriptors inference

Principal component analysis (PCA) is a popular device for dimension reduction, and its asymptotics are well known. In particular, principal components through the mean span the data with decreasing
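As a linear-space point of comparison for the nested-descriptor methods surveyed here, a minimal NumPy sketch of classical PCA: center the data at the mean, then take eigenvectors of the sample covariance, ordered by decreasing explained variance. (Illustrative only; the synthetic data and variable names are not from any of the cited papers.)

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic data: 200 points in R^3, most variance along the first axis.
X = rng.normal(size=(200, 3)) * np.array([5.0, 1.0, 0.2])

mean = X.mean(axis=0)
# Eigendecomposition of the sample covariance yields the principal components.
cov = np.cov(X - mean, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)   # eigh returns ascending eigenvalues
order = np.argsort(eigvals)[::-1]        # reorder by decreasing variance
components = eigvecs[:, order]
explained_variance = eigvals[order]
```

The first column of `components` then spans the direction of largest variance through the mean, the second the largest variance orthogonal to it, and so on.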

Statistical Methods Generalizing Principal Component Analysis to Non-Euclidean Spaces

Very generally speaking, statistical data analysis builds on descriptors reflecting data distributions. In a linear context, well-studied nonparametric descriptors are means and PCs (principal

A smeary central limit theorem for manifolds with application to high-dimensional spheres

The central limit theorems (CLTs) for generalized Fréchet means (data descriptors assuming values in stratified spaces, such as intrinsic means, geodesics, etc.) on manifolds from the literature are

Data analysis on nonstandard spaces

The task to write on data analysis on nonstandard spaces is quite substantial, with a huge body of literature to cover, from parametric to nonparametrics, from shape spaces to Wasserstein spaces. In

Projected Statistical Methods for Distributional Data on the Real Line with the Wasserstein Metric

We present a novel class of projected methods, to perform statistical analysis on a data set of probability distributions on the real line, with the 2-Wasserstein metric. We focus in particular on
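On the real line the 2-Wasserstein distance admits a closed form via quantile functions; for two equally sized empirical distributions it reduces to the L2 distance between sorted samples. A minimal sketch of that metric (the underlying distance only, not the projected methods of the paper above; names are illustrative):

```python
import numpy as np

def wasserstein2_1d(x, y):
    """2-Wasserstein distance between two equal-size empirical
    distributions on the real line: the L2 distance between their
    sorted samples, i.e. their empirical quantile functions."""
    xs, ys = np.sort(x), np.sort(y)
    return float(np.sqrt(np.mean((xs - ys) ** 2)))

a = np.array([0.0, 1.0, 2.0])
b = np.array([1.0, 2.0, 3.0])   # a shifted by 1
d = wasserstein2_1d(a, b)       # a constant shift by c gives distance c
```

This quantile-function representation is what makes the real-line Wasserstein space amenable to projection-based, essentially linear statistical methods.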

Recent advances in directional statistics

This paper provides a review of the many recent developments in the field since the publication of Mardia and Jupp (1999), still the most comprehensive text on directional statistics, and considers developments for the exploratory analysis of directional data.

Principal component analysis for functional data on Riemannian manifolds and spheres

Functional data analysis on nonlinear manifolds has drawn recent interest. Sphere-valued functional data, which are encountered for example as movement trajectories on the surface of the earth, are

Scaled Torus Principal Component Analysis

Numerical experiments illustrate how ST-PCA can be used to achieve meaningful dimensionality reduction on low-dimensional tori, particularly for the purpose of cluster separation, while two data applications in astronomy and molecular biology show that ST-PCA outperforms existing methods on the investigated datasets.

Measure Dependent Asymptotic Rate of the Mean: Geometrical and Topological Smeariness

We revisit the generalized central limit theorem (CLT) for the Fréchet mean on hyperspheres. It has been found by Eltzner and Huckemann (2019) that for some probability measures, the sample mean
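The Fréchet mean generalizes the expectation to metric spaces by minimizing the expected squared geodesic distance. A brute-force grid-search sketch on the circle S^1 (purely illustrative — actual estimators use gradient or fixed-point iterations, and the grid resolution here is an arbitrary choice):

```python
import numpy as np

def circle_dist(a, b):
    """Geodesic (arc-length) distance between angles a, b on the circle."""
    d = np.abs(a - b) % (2 * np.pi)
    return np.minimum(d, 2 * np.pi - d)

def frechet_mean_circle(angles, grid_size=10000):
    """Minimize the Frechet function F(p) = mean_i d(p, x_i)^2 over a grid."""
    grid = np.linspace(0.0, 2 * np.pi, grid_size, endpoint=False)
    F = np.array([np.mean(circle_dist(g, angles) ** 2) for g in grid])
    return grid[np.argmin(F)]

# Data concentrated near angle 0: the Frechet mean lies close to 0.
data = np.array([-0.1, 0.0, 0.1]) % (2 * np.pi)
m = frechet_mean_circle(data)
```

For spread-out measures on the circle the Fréchet function can be flat near its minimizer, which is exactly the "smeary" phenomenon studied above: the sample mean then converges at a rate slower than n^(-1/2).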



Analysis of principal nested spheres.

Analysis of principal nested spheres provides an intuitive and flexible decomposition of the high-dimensional sphere, and an interesting special case of the analysis results in finding principal geodesics, similar to those from previous approaches to manifold principal component analysis.

The circular SiZer, inferred persistence of shape parameters and application to early stem cell differentiation.

It turns out that only the wrapped Gaussian kernel gives a symmetric, strongly Lipschitz semi-group satisfying "circular" causality, that is, not introducing possibly artificial modes with increasing levels of smoothing.
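The wrapped Gaussian kernel referred to above can be sketched by wrapping the normal density around the circle, truncating the wrap sum at a finite number of terms (an assumption that is harmless for moderate bandwidths; function names and the data below are illustrative, not from the paper):

```python
import numpy as np

def wrapped_gaussian_density(theta, mu, sigma, wraps=10):
    """Wrapped normal density on [0, 2*pi): sum the normal density
    over integer shifts of 2*pi, truncated at +-wraps terms."""
    k = np.arange(-wraps, wraps + 1)
    z = np.asarray(theta - mu, dtype=float)[..., None] + 2 * np.pi * k
    return np.exp(-z**2 / (2 * sigma**2)).sum(axis=-1) / (
        sigma * np.sqrt(2 * np.pi))

def circular_kde(theta_grid, data, sigma):
    """Kernel smoother on the circle using the wrapped Gaussian kernel."""
    return np.mean(
        [wrapped_gaussian_density(theta_grid, x, sigma) for x in data],
        axis=0)

grid = np.linspace(0, 2 * np.pi, 360, endpoint=False)
data = np.array([0.1, 0.2, 6.2])   # angles clustered around 0 (mod 2*pi)
dens = circular_kde(grid, data, sigma=0.5)
```

Increasing `sigma` smooths the density estimate; the causality property discussed above says that with this kernel, smoothing never creates new modes.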

Intrinsic MANOVA for Riemannian Manifolds with an Application to Kendall's Space of Planar Shapes

By determining the asymptotic distributions of respective sample covariances under parallel transport, it is shown that they can be compared by standard MANOVA and can detect height effects that are otherwise not identifiable.

On the meaning of mean shape: manifold stability, locus and the two sample test

Various concepts of mean shape previously unrelated in the literature are brought into relation. In particular, for non-manifolds, such as Kendall’s 3D shape space, this paper answers the question,

Omnibus CLTs for Fréchet means and nonparametric inference on non-Euclidean spaces

Two central limit theorems for sample Fréchet means are derived, both significant for nonparametric inference on non-Euclidean spaces. The first one, Theorem 2.2, encompasses and improves upon most


This article develops nonparametric inference procedures for estimation and testing problems for means on manifolds. A central limit theorem for Fréchet sample means is derived, leading to an

Barycentric Subspaces and Affine Spans in Manifolds

Barycentric subspaces are implicitly defined as the locus of points which are weighted means of k+1 reference points. This locus contains the Fréchet mean, and it is shown that the definition locally defines a submanifold of dimension k and that it generalizes geodesic subspaces in some sense.

Barycentric subspace analysis on manifolds

  • X. Pennec, The Annals of Statistics, 2018
This paper investigates the generalization of Principal Component Analysis (PCA) to Riemannian manifolds. We first propose a new and more general type of family of subspaces in manifolds that we call


The shape-space Σ^k_m, whose points σ represent the shapes of not totally degenerate k-ads in ℝ^m, is introduced as a quotient space carrying the quotient metric. When m = 1, we find that Σ^k_1 = S^(k−2)…

Nested Sphere Statistics of Skeletal Models

A method analogous to principal component analysis, called composite principal nested spheres, will be seen to apply to learning a more efficient collection of modes of object variation, about a new and more representative mean object, than those provided by other representations and other statistical analysis methods.