An end-to-end graph convolutional kernel support vector machine

@article{Corcoran2020AnEG,
  title={An end-to-end graph convolutional kernel support vector machine},
  author={Padraig Corcoran},
  journal={Applied Network Science},
  year={2020},
  volume={5},
  pages={1--15}
}
  • Published 29 February 2020
  • Computer Science, Mathematics
A novel kernel-based support vector machine (SVM) for graph classification is proposed. The SVM feature space mapping consists of a sequence of graph convolutional layers, which generates a vector space representation for each vertex, followed by a pooling layer which generates a reproducing kernel Hilbert space (RKHS) representation for the graph. The use of an RKHS offers the ability to implicitly operate in this space using a kernel function without the computational complexity of explicitly…
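The pipeline the abstract describes — graph convolutional layers producing vertex representations, a pooling layer producing a graph-level representation, and an implicit kernel on top — can be sketched roughly as follows. This is an illustration only, not the paper's method: the abstract does not specify the layer normalisation, pooling operator, or kernel, so the Kipf–Welling-style propagation, mean pooling, and Gaussian kernel below are all assumptions.

```python
import numpy as np

def graph_conv(A, H, W):
    # One graph convolutional layer (normalisation assumed, Kipf-Welling style):
    # H' = ReLU(D^{-1/2} (A + I) D^{-1/2} H W)
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    return np.maximum(d_inv_sqrt @ A_hat @ d_inv_sqrt @ H @ W, 0.0)

def graph_embedding(A, X, W):
    # Vertex features -> conv layer -> mean pooling: one vector per graph.
    return graph_conv(A, X, W).mean(axis=0)

def rbf_kernel(u, v, gamma=1.0):
    # Gaussian kernel on the pooled representations; its feature map lives
    # in an RKHS, which a kernel SVM can then use implicitly via the Gram
    # matrix (e.g. an SVM trained on a precomputed kernel).
    return np.exp(-gamma * np.sum((u - v) ** 2))

# Toy usage: kernel value between a triangle graph and a path graph.
X = np.eye(3)                                  # one-hot vertex features
W = np.array([[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]])
tri  = np.array([[0, 1, 1], [1, 0, 1], [1, 1, 0]], dtype=float)
path = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
u = graph_embedding(tri, X, W)
v = graph_embedding(path, X, W)
k = rbf_kernel(u, v)
```

The resulting Gram matrix over a set of graphs could be handed to any kernel SVM solver; the paper's contribution, per the abstract, is training this mapping end-to-end rather than fixing it in advance.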
1 Citation
Prediction by Soft Computing, Planning, and Strategy Building of Aquatic Catch: Chilika Lagoon, Odisha, India
Introduction: The Chilika lagoon in south Odisha, India was ecologically degraded from 1985 onwards by reduction of its aquatic (fish + prawn + shrimp) catches along with reduction in salinity, …
