An end-to-end graph convolutional kernel support vector machine

@article{Corcoran2020AnEG,
  title={An end-to-end graph convolutional kernel support vector machine},
  author={Padraig Corcoran},
  journal={Applied Network Science},
  year={2020},
  volume={5},
  pages={1--15}
}
  • Published 29 February 2020
  • Computer Science, Mathematics
A novel kernel-based support vector machine (SVM) for graph classification is proposed. The SVM feature space mapping consists of a sequence of graph convolutional layers, which generates a vector space representation for each vertex, followed by a pooling layer which generates a reproducing kernel Hilbert space (RKHS) representation for the graph. The use of an RKHS offers the ability to operate implicitly in this space using a kernel function, without the computational complexity of explicitly…
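The pipeline the abstract describes can be illustrated with a minimal sketch: graph convolutions produce vertex embeddings, pooling produces one vector per graph, and an SVM operates on a precomputed kernel between those vectors. Note this is a simplified stand-in, not the paper's method: it uses fixed-weight normalized-adjacency propagation and mean pooling instead of the jointly trained layers and RKHS pooling, and all names (`conv_layer`, `graph_embedding`, `rbf_gram`) are hypothetical.

```python
import numpy as np
from sklearn.svm import SVC

def conv_layer(A, X):
    """One fixed-weight graph convolution: average each vertex's
    features with its neighbours' (A is the adjacency matrix)."""
    A_hat = A + np.eye(A.shape[0])            # add self-loops
    D_inv = np.diag(1.0 / A_hat.sum(axis=1))  # degree normalisation
    return D_inv @ A_hat @ X

def graph_embedding(A, X, layers=2):
    """Stacked convolutions followed by mean pooling, giving one
    vector per graph (a crude stand-in for the paper's RKHS pooling)."""
    for _ in range(layers):
        X = conv_layer(A, X)
    return X.mean(axis=0)

def rbf_gram(embs, gamma=1.0):
    """Precomputed RBF kernel matrix between graph embeddings."""
    sq = ((embs[:, None, :] - embs[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

# Toy data: triangles (class 0) vs. paths (class 1), degree features.
tri = np.array([[0, 1, 1], [1, 0, 1], [1, 1, 0]], dtype=float)
path = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
graphs = [tri, path, tri, path]
labels = [0, 1, 0, 1]
embs = np.stack([graph_embedding(A, A.sum(1, keepdims=True))
                 for A in graphs])

# The SVM sees only the Gram matrix, i.e. it works in the kernel's
# implicit feature space without materialising it.
clf = SVC(kernel="precomputed").fit(rbf_gram(embs), labels)
print(clf.predict(rbf_gram(embs)))
```

Passing `kernel="precomputed"` to scikit-learn's `SVC` is what makes the "implicit" operation concrete: only pairwise kernel values between graphs are needed, never an explicit feature expansion.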
1 Citation
Prediction by Soft Computing, Planning, and Strategy Building of Aquatic Catch: Chilika Lagoon, Odisha, India
Introduction: The Chilika lagoon in south Odisha, India was ecologically degraded from 1985 onwards by reduction of its aquatic (fish + prawn + shrimp) catches along with reduction in salinity…

References

SHOWING 1-10 OF 68 REFERENCES
Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond
Learning with Kernels provides an introduction to SVMs and related kernel methods, covering the concepts necessary to enable a reader equipped with some basic mathematical knowledge to enter the world of machine learning using theoretically well-founded yet easy-to-use kernel algorithms.
On Graph Classification Networks, Datasets and Baselines
It is shown that, despite the great complexity of these models, competitive performance is achieved by the simplest of models (structure-blind MLP, single-layer GCN, and fixed-weight GCN), and it is proposed that these be included as baselines in future work.
Neural Message Passing for Quantum Chemistry
Using MPNNs, state-of-the-art results are demonstrated on an important molecular property prediction benchmark, and it is argued that future work should focus on datasets with larger molecules or more accurate ground-truth labels.
Convolution kernels on discrete structures
  • Technical report, 1999
Deep Learning on Graphs: A Survey
This survey comprehensively reviews the different types of deep learning methods on graphs, dividing existing methods into five categories based on their model architectures and training strategies: graph recurrent neural networks, graph convolutional networks, graph autoencoders, graph reinforcement learning, and graph adversarial methods.
  • J Mach Learn Res, 2020
A Fair Comparison of Graph Neural Networks for Graph Classification
By comparing GNNs with structure-agnostic baselines, the authors provide convincing evidence that, on some datasets, structural information has not yet been exploited. This work can contribute to the development of the graph learning field by providing a much-needed grounding for rigorous evaluations of graph classification models.
  • Artificial Intelligence and Statistics, Florida, 2020
GraKeL: A Graph Kernel Library in Python
GraKeL is a library that unifies several graph kernels into a common framework and can be naturally combined with scikit-learn's modules to build a complete machine learning pipeline for tasks such as graph classification and clustering.