A new model for learning in graph domains

@inproceedings{Gori2005ANM,
  title={A new model for learning in graph domains},
  author={Marco Gori and Gabriele Monfardini and Franco Scarselli},
  booktitle={Proceedings of the 2005 IEEE International Joint Conference on Neural Networks},
  year={2005},
  volume={2},
  pages={729-734}
}
In several applications the information is naturally represented by graphs. ... GNNs extend recursive neural networks and can be applied to most practically useful kinds of graphs, including directed, undirected, labelled and cyclic graphs. A learning algorithm for GNNs is proposed, and some experiments are discussed which assess the properties of the model.
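The model sketched in the abstract computes a state for each node by repeatedly applying a transition function to the node's label and its neighbours' states until the iteration settles at a fixed point. A minimal, hypothetical sketch of that propagation loop is below; the transition function here (a small linear map followed by tanh) is an illustrative stand-in chosen to be contractive, not the paper's trained network, and the graph and labels are made up.

```python
import numpy as np

rng = np.random.default_rng(0)

# A small directed graph given as an adjacency list: node -> neighbours.
graph = {0: [1, 2], 1: [2], 2: [0], 3: [2]}
labels = rng.normal(size=(4, 3))   # one 3-d label per node (made up)
W = 0.1 * rng.normal(size=(3, 3))  # small weights keep the update contractive
U = 0.1 * rng.normal(size=(3, 3))

def propagate(states):
    """One synchronous update of all node states from labels and neighbours."""
    new = np.zeros_like(states)
    for n, neigh in graph.items():
        agg = sum((states[m] for m in neigh), np.zeros(3))  # sum neighbour states
        new[n] = np.tanh(labels[n] @ W + agg @ U)
    return new

# Iterate until the states stop changing, i.e. an approximate fixed point.
x = np.zeros((4, 3))
for _ in range(100):
    x_next = propagate(x)
    if np.max(np.abs(x_next - x)) < 1e-6:
        break
    x = x_next
```

Node outputs would then be computed from these converged states by a second (output) network; that part is omitted here.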

Citations

An Investigation Into Graph Neural Networks
GCN models can successfully process graph data and outperform traditional fully connected networks by 4% in classification accuracy; owing to their high interpretability, new architectures and libraries, and strong performance, there has been a dramatic increase in applications and research on GNNs as a graph-analysis tool.
A Modular Framework for Unsupervised Graph Representation Learning
A hyperparameter study identifies a particularly strong method of representation learning for the tasks of link prediction and node classification; following recent advances in optimal transport for machine learning, a method is proposed to learn node representations using Wasserstein spaces.
Artificial Neural Networks for Processing Graphs with Application to Image Understanding: A Survey
This chapter proposes a survey of neural network models able to process structured information, with a particular focus on architectures tailored to image understanding applications.
An In-depth Analysis of Graph Neural Networks for Semi-supervised Learning
A thorough experiment on several prominent GCN-related models, including GAT, AGNN, Co-Training GCN and Stochastic GCN, finds that different models have advantages in different scenarios, depending on training-set size, graph structure and dataset.
Graph Partition Neural Networks for Semi-Supervised Classification
Experimental results indicate that GPNNs are superior or comparable to state-of-the-art methods on a wide variety of datasets for graph-based semi-supervised classification, and can achieve similar performance to standard GNNs with fewer propagation steps.
Dynamic Graph Convolutional Networks
Learning Graph Representations
This paper discusses graph convolutional neural networks, graph autoencoders and spatio-temporal graph neural networks, and how lower-dimensional representations of graphs can be learned using these methods.
Machine Learning on Graphs: A Model and Comprehensive Taxonomy
A comprehensive taxonomy of representation learning methods for graph-structured data is proposed, aiming to unify several disparate bodies of work, provide a solid foundation for understanding the intuition behind these methods, and enable future research in the area.
Graph neural networks for classification: models and applications
A new architecture is proposed, with a self-attentive mechanism that allows a graph neural network to attend over its own input in the context of graph classification, with an example application to Parkinson's disease classification.
...

References

Showing 1-10 of 12 references
Recursive processing of cyclic graphs
M. Bianchini, M. Gori, F. Scarselli. Proceedings of the 2002 International Joint Conference on Neural Networks (IJCNN'02), 2002.
A methodology is proposed which allows us to map any cyclic directed graph into a "recursive-equivalent" tree; the computational power of recursive networks is definitively established, also clarifying the underlying limitations of the model.
A recursive neural network model for processing directed acyclic graphs with labeled edges
A new recursive neural network model is proposed which processes directed acyclic graphs (DAGs) with labeled edges, relaxing the positional constraint and the correlated maximum-outdegree limit; the results show that the new RNN model outperforms the standard RNN architecture while using a smaller number of free parameters.
A general framework for adaptive processing of data structures
The framework described in this paper is an attempt to unify adaptive models such as artificial neural nets and belief nets for processing structured information, where relations between data variables are expressed by directed acyclic graphs and both numerical and categorical values coexist.
Supervised neural networks for the classification of structures
It is shown that neural networks can, in fact, represent and classify structured patterns, and that all the supervised networks developed for the classification of sequences can, on the whole, be generalized to structures.
Relating Chemical Structure to Activity: An Application of the Neural Folding Architecture
The main objective of this paper is to demonstrate that the folding architecture (FA) can be successfully applied to approximate quantitative structure-activity relationships (QSARs), which play an important role in the drug design process.
Logo Recognition by Recursive Neural Networks
This paper proposes recognizing logo images using an adaptive model referred to as a recursive artificial neural network, which captures both the topological structure of the logo and the continuous values pertaining to each node in the contour-tree representation of the logo image.
Face Spotting in Color Images using Recursive Neural Networks
This paper proposes a novel approach to the face-localization problem using recursive neural networks, which assumes a graph-based representation of images combining structural and sub-symbolic visual features.
Inductive Inference of Tree Automata by Recursive Neural Networks
Recurrent neural networks are powerful learning machines capable of processing sequences, but they can also conveniently process general data structures such as trees and graphs, opening the door to a number of very interesting, previously unexplored applications.
Generalization of back-propagation to recurrent neural networks
Pineda. Physical Review Letters, 1987.
An adaptive neural network with asymmetric connections is introduced that bears a resemblance to the master/slave network of Lapedes and Farber but is architecturally simpler.
An Introduction to Metric Spaces and Fixed Point Theory
Contents include metric spaces, metric contraction principles, hyperconvex spaces, "normal" structures in metric spaces, and Banach spaces.
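This reference matters to the paper because the GNN state iteration is guaranteed to converge when the global transition function is a contraction, so Banach's fixed-point theorem applies: a map with Lipschitz constant below 1 on a complete metric space has a unique fixed point, reached by iterating from any starting point. A toy illustration, using an arbitrary contraction map chosen for this sketch (not an example from the book):

```python
import math

def T(x):
    # |T'(x)| = 0.5*|sin(x)| <= 0.5 < 1, so T is a contraction on the reals.
    return 0.5 * math.cos(x)

x = 10.0            # any starting point converges to the same fixed point
for _ in range(50):
    x = T(x)

# x now (numerically) solves x = 0.5*cos(x), the unique fixed point of T
```

The same argument, applied to the vector of all node states, is what lets the GNN compute its states by simple relaxation regardless of the initial state.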
...