Corpus ID: 245117977

Moving Up the Cluster Tree with the Gradient Flow

@inproceedings{AriasCastro2021MovingUT,
  title={Moving Up the Cluster Tree with the Gradient Flow},
  author={Ery Arias-Castro and Wanli Qiao},
  year={2021}
}
The paper establishes a strong correspondence between two important clustering approaches that emerged in the 1970s: clustering by level sets, or the cluster tree, as proposed by Hartigan, and clustering by gradient lines, or the gradient flow, as proposed by Fukunaga and Hostetler. It does so by showing that one can move up the cluster tree by following the gradient ascent flow.
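
To make the two viewpoints concrete, here is a minimal sketch (not the authors' construction) on a Gaussian kernel density estimate: a mean-shift-style iteration stands in for the gradient ascent flow, assigning each sample to the mode its ascent path reaches, while thresholding the same estimate at a level lam gives the upper level set whose connected components are the clusters of the cluster tree at that level. The synthetic data, bandwidth, level, and tolerances are illustrative assumptions, not choices from the paper.

# A minimal sketch, assuming a Gaussian kernel density estimate on synthetic
# data; the bandwidth, level, and tolerances below are illustrative choices,
# not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)
# Two well-separated blobs in the plane, so the estimated density has two modes.
X = np.vstack([rng.normal(0.0, 0.3, (100, 2)),
               rng.normal(3.0, 0.3, (100, 2))])
h = 0.5  # kernel bandwidth (fixed by hand, not data-driven)

def kde(points):
    # Gaussian KDE of X evaluated at each row of `points`.
    d2 = ((points[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * h ** 2)).mean(axis=1) / (2 * np.pi * h ** 2)

def ascend(x, max_iter=500, tol=1e-8):
    # Discrete gradient-ascent flow: Gaussian mean-shift iterations that move
    # x toward the mode of the KDE whose basin of attraction contains it.
    for _ in range(max_iter):
        w = np.exp(-((X - x) ** 2).sum(-1) / (2 * h ** 2))
        x_new = (w[:, None] * X).sum(axis=0) / w.sum()
        if np.linalg.norm(x_new - x) < tol:
            break
        x = x_new
    return x

# Gradient-flow clustering: label each sample by the mode its ascent reaches,
# merging limit points that land within a small distance of each other.
modes = [ascend(x) for x in X]
reps, flow_labels = [], []
for m in modes:
    for k, r in enumerate(reps):
        if np.linalg.norm(m - r) < 1e-3:
            flow_labels.append(k)
            break
    else:
        reps.append(m)
        flow_labels.append(len(reps) - 1)

# Cluster-tree view at a single level lam: the upper level set {f >= lam};
# its connected components are the clusters of the tree at that level.
lam = 0.05
in_level_set = kde(X) >= lam

print("modes found by the gradient flow:", len(reps))
print("samples in the level set at", lam, ":", int(in_level_set.sum()))

The connection the abstract alludes to can be seen directly in this picture: the density is non-decreasing along an ascent path, so the path stays inside the connected component of the upper level set it starts in and passes into the nested, higher-level components, which is the sense in which following the flow moves up the cluster tree.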

Fitting a Multi-modal Density by Dynamic Programming
TLDR
A dynamic programming approach to the problem of fitting a probability density function constrained to have a given number of modal intervals, together with several data-driven ways of selecting that number.

References

Showing 1-10 of 71 references
Clustering Algorithms
A Population Background for Nonparametric Density-Based Clustering
TLDR
It is shown that only mild conditions on a sequence of density estimators are needed to ensure that the sequence of modal clusterings they induce is consistent. Two new loss functions, applicable in fact to any clustering methodology, are also presented to evaluate the performance of a data-based clustering algorithm with respect to the ideal population goal.
The estimation of the gradient of a density function, with applications in pattern recognition
TLDR
Applications of gradient estimation to pattern recognition are presented using clustering and intrinsic dimensionality problems, with the ultimate goal of providing further understanding of these problems in terms of density gradients.
A Nonparametric Valley-Seeking Technique for Cluster Analysis
TLDR
A general algorithm for finding the optimum classification with respect to a given criterion is derived; in a particular case, it reduces to repeated application of a straightforward decision rule that behaves as a valley-seeking technique.
An Asymptotic Equivalence between the Mean-Shift Algorithm and the Cluster Tree
TLDR
This paper proposes two ways of obtaining a partition from the cluster tree and shows that both of them reduce to the partition given by the gradient flow under standard assumptions on the sampling density.
DBSCAN: Optimal Rates For Density-Based Cluster Estimation
TLDR
This work presents a computationally efficient, rate-optimal cluster tree estimator based on simple extensions of the popular DBSCAN algorithm, and derives minimax rates for cluster tree estimation for Hölder-smooth densities of arbitrary degree.
Nonparametric estimation of surface integrals on level sets
Surface integrals on density level sets often appear in asymptotic results in nonparametric level set estimation (such as for confidence regions and bandwidth selection). Also, surface integrals can …
...