Corpus ID: 195767446

Angular separability of data clusters or network communities in geometrical space and its relevance to hyperbolic embedding

@article{Muscoloni2019AngularSO,
  title={Angular separability of data clusters or network communities in geometrical space and its relevance to hyperbolic embedding},
  author={Alessandro Muscoloni and Carlo Vittorio Cannistraci},
  journal={ArXiv},
  year={2019},
  volume={abs/1907.00025}
}
Analysis of 'big data' characterized by high dimensionality, such as word vectors and complex networks, often requires their representation in a geometrical space by embedding. Recent developments in machine learning and network geometry have pointed to the hyperbolic space as a useful framework for representing such data derived from real complex physical systems. In the hyperbolic space, the radial coordinate of the nodes characterizes their hierarchy, whereas the angular distance… 
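
As an illustrative aside (not part of the paper), the hyperbolic distance underlying such embeddings can be computed from the radial and angular coordinates via the hyperbolic law of cosines; the sketch below assumes the standard native-disk representation with curvature -1.

import math

def hyperbolic_distance(r1, theta1, r2, theta2):
    # Hyperbolic distance between two points given in native polar
    # coordinates (r, theta) of the hyperbolic disk with curvature -1,
    # computed with the hyperbolic law of cosines.
    dtheta = math.pi - abs(math.pi - abs(theta1 - theta2) % (2 * math.pi))
    x = (math.cosh(r1) * math.cosh(r2)
         - math.sinh(r1) * math.sinh(r2) * math.cos(dtheta))
    return math.acosh(max(x, 1.0))  # clamp guards against round-off below 1

# Example: a central (hierarchically important) node vs. a peripheral one
print(hyperbolic_distance(0.5, 0.0, 4.0, math.pi / 3))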

Optimisation of the coalescent hyperbolic embedding of complex networks

This work proposes a further optimisation of the angular coordinates in this framework that seems to reduce the logarithmic loss and increase the greedy routing score of the embedding compared to the original version, thereby further improving the quality of the inferred hyperbolic coordinates.
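
For context (a minimal sketch, not the authors' code), the logarithmic loss mentioned here is commonly computed as the negative log-likelihood of the adjacency matrix under a Fermi-Dirac connection probability; R and T below are assumed model parameters, and hyperbolic_distance is the helper sketched after the abstract above.

import math
from itertools import combinations

def log_loss(adj, coords, R, T):
    # Negative log-likelihood ("logarithmic loss") of an embedding, assuming
    # each node pair connects with probability p(d) = 1 / (1 + exp((d - R) / (2*T))),
    # the Fermi-Dirac form used in popularity-similarity hyperbolic models.
    # adj: dict node -> set of neighbours; coords: dict node -> (r, theta).
    loss = 0.0
    for i, j in combinations(sorted(coords), 2):
        d = hyperbolic_distance(*coords[i], *coords[j])
        z = min(max((d - R) / (2.0 * T), -700.0), 700.0)   # avoid overflow
        p = 1.0 / (1.0 + math.exp(z))
        p = min(max(p, 1e-12), 1.0 - 1e-12)                # avoid log(0)
        connected = j in adj.get(i, set()) or i in adj.get(j, set())
        loss -= math.log(p) if connected else math.log(1.0 - p)
    return loss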

Dimension matters when modeling network communities in hyperbolic spaces

It is shown that there is an important qualitative difference between the lowest-dimensional model and its higher-dimensional counterparts with respect to how similarity between nodes restricts connection probabilities, and that considering only one more dimension allows for more realistic and diverse community structures.

Growing hyperbolic networks beyond two dimensions: the generalised popularity-similarity optimisation model

The dPSO model is introduced, a generalisation of the popularity-similarity optimisation model to any arbitrary integer dimension d > 2, and it is shown that the major structural properties of the generated networks can be affected by the dimension of the underlying hyperbolic space in a non-trivial way.

Generalised popularity-similarity optimisation model for growing hyperbolic networks beyond two dimensions

The dPSO model is introduced, a generalisation of the popularity-similarity optimisation model to any arbitrary integer dimension d > 2, and it is shown that the major structural properties of the generated networks can be affected by the dimension of the underlying hyperbolic space in a non-trivial way.

The inherent community structure of hyperbolic networks

This work extracts the communities of the studied networks using well-established community finding methods such as Louvain, Infomap and label propagation; the observed high modularity values indicate that the community structure can become very pronounced under certain conditions.

Modular gateway-ness connectivity and structural core organization in maritime network science

The authors unveil the architecture of a recent global liner shipping network (GLSN) and show that its structure has evolved to be self-organized, with a trade-off between high transportation efficiency and low wiring cost, and that ports' gateway-ness is most highly associated with their economic performance.

References

SHOWING 1-10 OF 46 REFERENCES

Manifold learning and maximum likelihood estimation for hyperbolic network embedding

The Popularity-Similarity (PS) model posits that clustering and hierarchy, properties common to most networks representing complex systems, are the result of an optimisation process in which nodes trade off popularity and similarity.
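
As a toy illustration of that optimisation process (a simplified sketch under the standard PSO growth rules, not the referenced implementation; it reuses the hyperbolic_distance helper sketched above), each new node appears at radius ln(t) with a random angle, older nodes drift outward through popularity fading, and the newcomer links to the m hyperbolically closest nodes.

import math, random

def pso_growth(n, m=2, beta=0.5):
    # Simplified popularity-similarity optimisation growth: node t is born at
    # radius ln(t) with a random angle, earlier nodes drift outward according
    # to popularity fading (controlled by beta), and the new node links to the
    # m hyperbolically closest existing nodes. The full PSO model also has a
    # temperature parameter, omitted here for brevity.
    coords, edges = {}, []
    for t in range(1, n + 1):
        r_new, theta_new = math.log(t), random.uniform(0, 2 * math.pi)
        # popularity fading: r_s(t) = beta * ln(s) + (1 - beta) * ln(t)
        current = {s: (beta * r + (1 - beta) * r_new, th)
                   for s, (r, th) in coords.items()}
        nearest = sorted(current,
                         key=lambda s: hyperbolic_distance(*current[s],
                                                           r_new, theta_new))[:m]
        edges += [(s, t) for s in nearest]
        coords[t] = (r_new, theta_new)
    return coords, edges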

Minimum curvilinear automata with similarity attachment for network embedding and link prediction in the hyperbolic space

It is shown that, according to a mechanism the authors term similarity attachment, the visited-node sequence of a network automaton can efficiently approximate the nodes' angular coordinates in the hyperbolic disk, which in fact represent an ordering of their similarities.

Hyperbolic Geometry of Complex Networks

It is shown that targeted transport processes without global topology knowledge are maximally efficient, according to all efficiency measures, in networks with the strongest heterogeneity and clustering, and that this efficiency is remarkably robust with respect to even catastrophic disturbances and damages to the network structure.
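
A compact sketch of the greedy routing referred to here (a hypothetical helper, reusing hyperbolic_distance from above): each hop forwards to the neighbour hyperbolically closest to the destination, and the route fails on a loop or a dead end.

def greedy_route(adj, coords, source, target, max_hops=None):
    # Greedy routing: at each step forward to the neighbour that is
    # hyperbolically closest to the target; the route fails if it revisits
    # a node, reaches a dead end, or exceeds max_hops.
    # adj: dict node -> set of neighbours; coords: dict node -> (r, theta).
    if max_hops is None:
        max_hops = len(coords)
    path, current, visited = [source], source, {source}
    for _ in range(max_hops):
        if current == target:
            return path                      # success
        neighbours = adj.get(current, set())
        if not neighbours:
            return None                      # dead end -> failure
        nxt = min(neighbours,
                  key=lambda v: hyperbolic_distance(*coords[v], *coords[target]))
        if nxt in visited:
            return None                      # routing loop -> failure
        visited.add(nxt)
        path.append(nxt)
        current = nxt
    return path if current == target else None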

Soft Communities in Similarity Space

It is concluded that the S1 model can be topologically invariant with respect to the soft-community structure and that hidden degrees must depend on angular coordinates, and a method to estimate them is proposed.

Latent Geometry Inspired Graph Dissimilarities Enhance Affinity Propagation Community Detection in Complex Networks

The results demonstrate that the latent-geometry-inspired dissimilarity measures designed by the authors enable affinity propagation to equal or outperform current state-of-the-art methods for community detection.

Network Mapping by Replaying Hyperbolic Growth

HyperMap, a simple method to map a given real network to its hyperbolic space, is presented; it has remarkable predictive power: using the resulting map, missing links in the Internet can be predicted with high precision, outperforming popular existing methods.

Mercator: uncovering faithful hyperbolic embeddings of complex networks

The results suggest that mixing machine learning and maximum likelihood techniques in a model-dependent framework can boost the meaningful mapping of complex networks.

Coalescent embedding in the hyperbolic space unsupervisedly discloses the hidden geometry of the brain

The present study represents the first evidence of brain networks' angular coalescence in the hyperbolic space, opening a completely new perspective, possibly towards the realization of latent geometry network markers for evaluation of brain disorders and pathologies.

Link prediction with hyperbolic geometry

It is found that, while there exists a multitude of competitive methods to predict obvious, easy-to-predict links, for which hyperbolic link prediction is rarely the best though often competitive, it is the best when the task is to predict less obvious missing links that are really hard to predict.
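
To make the link-prediction setting concrete (an assumed, simplified scoring scheme reusing the hyperbolic_distance helper from above, not the referenced method), non-adjacent node pairs can be ranked by their hyperbolic distance, with the closest pairs proposed as the most likely missing links.

from itertools import combinations

def rank_missing_links(adj, coords, top_k=10):
    # Score every non-adjacent node pair by the hyperbolic distance between
    # the embedded nodes and return the top_k closest pairs as candidate
    # missing links (smaller distance = more likely link under a
    # popularity-similarity connection probability).
    # adj: dict node -> set of neighbours; coords: dict node -> (r, theta).
    candidates = []
    for i, j in combinations(sorted(coords), 2):
        if j not in adj.get(i, set()) and i not in adj.get(j, set()):
            candidates.append((hyperbolic_distance(*coords[i], *coords[j]), i, j))
    candidates.sort()
    return [(i, j) for _, i, j in candidates[:top_k]]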