A semantic matching energy function for learning with multi-relational data


Large-scale relational learning is becoming crucial for handling the huge amounts of structured data generated daily in application domains ranging from computational biology and information retrieval to natural language processing. In this paper, we present a new neural network architecture designed to embed multi-relational graphs into a flexible continuous vector space in which the original data is kept and enhanced. The network is trained to encode the semantics of these graphs so as to assign high probabilities to plausible components. We empirically show that it reaches competitive performance in link prediction on standard benchmark datasets as well as on data from a real-world knowledge base (WordNet). In addition, we show how our method can be applied to perform word-sense disambiguation in the context of open-text semantic parsing, where the goal is to learn to assign a structured meaning representation to almost any sentence of free text, demonstrating that it can scale up to tens of thousands of nodes and thousands of relation types.
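The abstract describes scoring components of a multi-relational graph, i.e. (left entity, relation, right entity) triples, with embeddings combined by a learned energy function. A minimal sketch of that idea is below; the specific linear combination (relation and entity embeddings mixed by learned matrices, then scored by a dot product) and all names, dimensions, and random parameters are illustrative assumptions, not the paper's exact parameterization or trained weights.

```python
import numpy as np

rng = np.random.default_rng(0)

d = 10           # embedding dimension (illustrative choice)
n_entities = 5   # toy knowledge graph size
n_relations = 2

# Every entity and every relation type gets its own embedding vector;
# in a real system these would be learned, here they are random.
E = rng.normal(size=(n_entities, d))
R = rng.normal(size=(n_relations, d))

# Learned linear maps (assumed here, randomly initialized) that combine
# the left argument with the relation, and the right argument with the
# relation, before the two sides are matched by a dot product.
Wl1, Wl2 = rng.normal(size=(d, d)), rng.normal(size=(d, d))
Wr1, Wr2 = rng.normal(size=(d, d)), rng.normal(size=(d, d))
bl, br = np.zeros(d), np.zeros(d)

def energy(lhs: int, rel: int, rhs: int) -> float:
    """Energy of the triple (lhs, rel, rhs); lower = more plausible."""
    g_left = Wl1 @ E[lhs] + Wl2 @ R[rel] + bl
    g_right = Wr1 @ E[rhs] + Wr2 @ R[rel] + br
    return -float(g_left @ g_right)

# Link prediction: rank all candidate right-hand entities for (0, rel=1, ?)
# from lowest (most plausible) to highest energy.
ranked = sorted(range(n_entities), key=lambda r: energy(0, 1, r))
print("ranked candidates:", ranked)
```

Training such a model would adjust the embeddings and matrices (e.g. with a ranking loss against corrupted triples) so that observed triples receive lower energy than implausible ones; the sketch only shows the scoring side.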

DOI: 10.1007/s10994-013-5363-6
