Corpus ID: 220935660

The Importance of Being Correlated: Implications of Dependence in Joint Spectral Inference across Multiple Networks

@article{Pantazis2020TheIO,
  title={The Importance of Being Correlated: Implications of Dependence in Joint Spectral Inference across Multiple Networks},
  author={Konstantinos Pantazis and A. Athreya and Jes{\'u}s Arroyo and W. Frost and E. Hill and V. Lyzinski},
  journal={arXiv: Methodology},
  year={2020}
}
Spectral inference on multiple networks is a rapidly-developing subfield of graph statistics. Recent work has demonstrated that joint, or simultaneous, spectral embedding of multiple independent network realizations can deliver more accurate estimation than individual spectral decompositions of those same networks. Little attention has been paid, however, to the network correlation that such joint embedding procedures necessarily induce. In this paper, we present a detailed analysis of induced…
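To make the setting concrete, here is a minimal sketch of the kind of joint spectral embedding the abstract refers to: an adjacency spectral embedding (ASE) applied to an omnibus matrix built from two network realizations. This is an illustration of the general technique, not the authors' specific procedure; the graph sizes, dimension `d`, and edge probability below are arbitrary choices for the example.

```python
import numpy as np

def ase(A, d):
    """Adjacency spectral embedding: top-d eigenvectors of A,
    scaled by the square roots of the eigenvalue magnitudes."""
    vals, vecs = np.linalg.eigh(A)
    idx = np.argsort(np.abs(vals))[::-1][:d]  # largest |eigenvalues| first
    return vecs[:, idx] * np.sqrt(np.abs(vals[idx]))

def omnibus_embedding(graphs, d):
    """Joint embedding via the omnibus matrix whose (i, j) block
    is (A_i + A_j) / 2; returns one n-by-d embedding per graph."""
    m = len(graphs)
    n = graphs[0].shape[0]
    M = np.block([[(graphs[i] + graphs[j]) / 2 for j in range(m)]
                  for i in range(m)])
    Z = ase(M, d)
    return [Z[i * n:(i + 1) * n] for i in range(m)]

# Two independent realizations with the same edge probability
rng = np.random.default_rng(0)
n, p = 50, 0.3
A1 = np.triu(rng.binomial(1, p, (n, n)), 1); A1 = A1 + A1.T
A2 = np.triu(rng.binomial(1, p, (n, n)), 1); A2 = A2 + A2.T
X1, X2 = omnibus_embedding([A1.astype(float), A2.astype(float)], d=1)
print(X1.shape, X2.shape)  # each (50, 1)
```

Because both graphs' estimates come from eigenvectors of the single omnibus matrix, the resulting embeddings X1 and X2 are statistically dependent even when A1 and A2 are independent; quantifying that induced correlation is the paper's subject.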
1 Citation

Bias-Variance Tradeoffs in Joint Spectral Embeddings
Latent position models and their corresponding estimation procedures offer a statistically principled paradigm for multiple network inference by translating multiple network analysis problems to…

References

SHOWING 1-10 OF 83 REFERENCES
Model-Based Clustering
A review of work to date in model-based clustering, from the famous paper by Wolfe in 1965 to work that is currently available only in preprint form, and a look ahead to the next decade or so.
K. Levin, V. Lyzinski, Y. Park, Y. Qin, D. L. Sussman, M. Tang, J. T. Vogelstein, and C. E. Priebe. Statistical inference on random dot product graphs: a survey. Journal of Machine Learning Research, 18, 2018.
Spectral clustering and the high-dimensional stochastic blockmodel
Networks or graphs can easily represent a diverse set of data sources that are characterized by interacting units or actors. Social networks, representing people who communicate with each other, are…
A central limit theorem for an omnibus embedding of random dot product graphs
Performing statistical analyses on collections of graphs is of import to many disciplines, but principled, scalable methods for multisample graph inference are few. In this paper, we describe an…
A Limit Theorem for Scaled Eigenvectors of Random Dot Product Graphs
Abstract We prove a central limit theorem for the components of the largest eigenvectors of the adjacency matrix of a finite-dimensional random dot product graph whose true latent positions are…
Joint Embedding of Graphs
The joint embedding method is demonstrated to produce features that lead to state-of-the-art performance in classifying graphs, and a random graph model for multiple graphs that generalizes other classical graph models is proposed.
Bias-Variance Tradeoffs in Joint Spectral Embeddings
Latent position models and their corresponding estimation procedures offer a statistically principled paradigm for multiple network inference by translating multiple network analysis problems to…
Community Detection on Mixture Multi-layer Networks via Regularized Tensor Decomposition
This is the first systematic study of mixture multi-layer networks using tensor decomposition; the TWIST procedure is shown to accurately detect the communities with small misclassification error as the number of nodes and/or the number of layers increases.
Consistency of Spectral Clustering on Hierarchical Stochastic Block Models
We propose a generic network model, based on the Stochastic Block Model, to study the hierarchy of communities in real-world networks, under which the connection probabilities are structured in a…
Consistent detection and optimal localization of all detectable change points in piecewise stationary arbitrarily sparse network-sequences
It is shown that the proposed algorithms can consistently detect (resp. localize) all change points where the change in the expected adjacency matrix is above the minimax detectability threshold, without any a priori assumption about the sparsity of the underlying networks.