Integrating bi-directional contexts in a generative kernel for trees

Abstract

Context is essential when evaluating an atomic piece of information within an articulated structured sample, and different contexts capture different structural information. This paper introduces a generative kernel that easily and effectively combines the structural information captured by generative tree models with different contextual capabilities. The proposed approach exploits the idea of hidden-state multisets to realize a tree encoding that accounts both for the summarized information on the path leading to a node (a top-down context) and for the information on how substructures are composed to form the subtree rooted at that node (a bottom-up context). A thorough experimental analysis shows that the bi-directional approach, incorporating both top-down and bottom-up contexts, yields superior performance with respect to either unidirectional context alone, achieving state-of-the-art results on challenging tree classification benchmarks.
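As a rough illustration of the hidden-state multiset idea described above (a sketch, not the paper's actual inference procedure), one can imagine that a top-down and a bottom-up hidden tree Markov model each assign a hidden state to every node of a tree. A tree can then be encoded as the concatenation of the two resulting state histograms, and a simple kernel can be computed on those encodings. All function names, the linear kernel choice, and the example state assignments below are hypothetical:

```python
from collections import Counter

def multiset_histogram(states, num_states):
    # Encode a multiset of per-node hidden states as a fixed-length
    # count vector: entry s counts how many nodes were assigned state s.
    counts = Counter(states)
    return [counts.get(s, 0) for s in range(num_states)]

def bidirectional_features(td_states, bu_states, num_states):
    # Concatenate the top-down and bottom-up multiset histograms,
    # so the encoding reflects both directions of context.
    return (multiset_histogram(td_states, num_states)
            + multiset_histogram(bu_states, num_states))

def kernel(x, y):
    # A simple linear kernel on the concatenated histograms
    # (stand-in for the paper's generative kernel).
    return sum(a * b for a, b in zip(x, y))

# Hypothetical per-node state assignments for two small trees.
x = bidirectional_features([0, 1, 1], [2, 0, 1], num_states=3)
y = bidirectional_features([0, 1, 2], [2, 2, 1], num_states=3)
print(kernel(x, y))  # similarity of the two bi-directional encodings
```

The point of the sketch is only that the two unidirectional encodings are combined into a single representation on which any standard kernel machine (e.g. an SVM) can operate.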

DOI: 10.1109/IJCNN.2014.6889768

4 Figures and Tables

Cite this paper

@article{Bacciu2014IntegratingBC,
  title   = {Integrating bi-directional contexts in a generative kernel for trees},
  author  = {Davide Bacciu and Alessio Micheli and Alessandro Sperduti},
  journal = {2014 International Joint Conference on Neural Networks (IJCNN)},
  year    = {2014},
  pages   = {4145-4151}
}