struc2vec: Learning Node Representations from Structural Identity
This paper presents struc2vec, a novel and flexible framework for learning latent representations of the structural identity of nodes, which improves performance on classification tasks that depend more on structural identity.
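The structural-identity signal behind struc2vec can be illustrated with a small sketch: compare nodes by the sorted degree sequences of their k-hop rings rather than by where they sit in the graph. This is a hypothetical illustration using networkx, not the paper's implementation; the full method additionally builds a multilayer similarity graph and learns embeddings via random walks and skip-gram.

```python
# Minimal sketch of the structural-similarity signal behind struc2vec:
# two nodes are structurally similar if the sorted degree sequences of
# their k-hop rings are similar, regardless of where they sit in the graph.
import networkx as nx

def ring_degree_sequences(G, node, max_hops=2):
    """Sorted degree sequence of the nodes exactly k hops away, for each k."""
    lengths = nx.single_source_shortest_path_length(G, node, cutoff=max_hops)
    sequences = {}
    for k in range(max_hops + 1):
        ring = [v for v, d in lengths.items() if d == k]
        sequences[k] = sorted(G.degree(v) for v in ring)
    return sequences

G = nx.barbell_graph(5, 2)            # two cliques joined by a short path
print(ring_degree_sequences(G, 0))    # a node inside the left clique
print(ring_degree_sequences(G, 11))   # its mirror image in the right clique
# The two printouts match: the nodes are far apart but structurally identical.
```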
Ranking Generated Summaries by Correctness: An Interesting but Challenging Application for Natural Language Inference
- Tobias Falke, Leonardo F. R. Ribeiro, Prasetya Ajie Utama, Ido Dagan, Iryna Gurevych
- Computer Science · ACL
- 27 May 2019
This paper evaluates summaries produced by state-of-the-art models via crowdsourcing and shows that factual errors occur frequently, in particular with more abstractive models, which leads to an interesting downstream application for entailment models.
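One way to picture the downstream application mentioned above is to rerank candidate summaries by how strongly an off-the-shelf NLI model judges them to be entailed by the source. The sketch below is a hypothetical setup, not the paper's exact pipeline; the `roberta-large-mnli` checkpoint and the toy examples are assumptions for illustration.

```python
# Hypothetical sketch: rerank candidate summaries by the entailment
# probability an off-the-shelf NLI model assigns given the source document.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

nli_name = "roberta-large-mnli"  # assumed checkpoint; any NLI model would do
tokenizer = AutoTokenizer.from_pretrained(nli_name)
model = AutoModelForSequenceClassification.from_pretrained(nli_name)
model.eval()

def entailment_score(premise: str, hypothesis: str) -> float:
    inputs = tokenizer(premise, hypothesis, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = model(**inputs).logits
    probs = logits.softmax(dim=-1).squeeze(0)
    # For roberta-large-mnli the label order is contradiction/neutral/entailment.
    return probs[2].item()

source = "The company reported a loss of $2 million in 2019."
candidates = [
    "The company lost money in 2019.",
    "The company earned $2 million in 2019.",
]
ranked = sorted(candidates, key=lambda s: entailment_score(source, s), reverse=True)
print(ranked[0])  # the faithful summary should rank first
```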
Investigating Pretrained Language Models for Graph-to-Text Generation
- Leonardo F. R. Ribeiro, Martin Schmitt, Hinrich Schütze, Iryna Gurevych
- Computer Science · NLP4CONVAI
- 16 July 2020
It is suggested that the PLMs benefit from similar facts seen during pretraining or fine-tuning, such that they perform well even when the input graph is reduced to a simple bag of node and edge labels.
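As a rough illustration of the "bag of node and edge labels" condition mentioned above, the graph input can be flattened into an unordered set of labels before being fed to a fine-tuned seq2seq PLM. This is a hypothetical sketch; the function name and the toy graph are illustrative, not the paper's code.

```python
# Hypothetical sketch: reduce a labeled graph to a "bag of node and edge
# labels" string, discarding all structural information.
def graph_to_bag_of_labels(nodes, edges):
    """nodes: {node_id: label}; edges: list of (source_id, edge_label, target_id)."""
    labels = list(nodes.values()) + [rel for _, rel, _ in edges]
    return " ".join(sorted(labels))  # sorting removes any trace of graph order

nodes = {"n1": "Albert Einstein", "n2": "Ulm"}
edges = [("n1", "birth place", "n2")]
print(graph_to_bag_of_labels(nodes, edges))
# -> "Albert Einstein Ulm birth place"
# The paper's observation is that a fine-tuned PLM often still produces a
# reasonable sentence even from this impoverished input.
```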
Enhancing AMR-to-Text Generation with Dual Graph Representations
This work proposes a novel graph-to-sequence model that encodes different but complementary perspectives of the structural information contained in the AMR graph, learning parallel top-down and bottom-up representations of nodes that capture contrasting views of the graph.
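A minimal way to picture "dual" top-down and bottom-up node representations is to run the same message-passing layer once over the original edge direction and once over the reversed edges, then concatenate the two views per node. This sketch is an assumption-laden simplification in plain PyTorch, not the paper's architecture.

```python
# Hypothetical sketch of dual node representations: one pass over the original
# edges (top-down) and one over the reversed edges (bottom-up), concatenated.
import torch
import torch.nn as nn

class DirectionalLayer(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.linear = nn.Linear(dim, dim)

    def forward(self, h, edges):
        # edges: list of (src, tgt) indices; each node averages its in-neighbours
        agg = torch.zeros_like(h)
        count = torch.zeros(h.size(0), 1)
        for src, tgt in edges:
            agg[tgt] += h[src]
            count[tgt] += 1
        agg = agg / count.clamp(min=1)
        return torch.relu(self.linear(agg) + h)  # residual connection

dim, num_nodes = 8, 4
h = torch.randn(num_nodes, dim)
edges = [(0, 1), (1, 2), (1, 3)]              # top-down direction
reversed_edges = [(t, s) for s, t in edges]   # bottom-up direction

top_down, bottom_up = DirectionalLayer(dim), DirectionalLayer(dim)
dual = torch.cat([top_down(h, edges), bottom_up(h, reversed_edges)], dim=-1)
print(dual.shape)  # (4, 16): two complementary views per node
```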
Modeling Global and Local Node Contexts for Text Generation from Knowledge Graphs
- Leonardo F. R. Ribeiro, Yue Zhang, Claire Gardent, Iryna Gurevych
- Computer Science · Transactions of the Association for Computational Linguistics
- 29 January 2020
This work combines both encoding strategies, proposing novel neural models that encode an input graph using both global and local node contexts in order to learn better contextualized node embeddings.
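One simple way to realize global and local node contexts is to let one attention layer attend over all nodes while a second one is masked to the graph neighbourhood, and to concatenate the two contexts. The sketch below is a hypothetical simplification of that idea, not the paper's model.

```python
# Hypothetical sketch: global attention over all nodes plus local attention
# restricted to graph neighbours, concatenated into one node representation.
import torch
import torch.nn as nn

num_nodes, dim = 5, 16
h = torch.randn(1, num_nodes, dim)                       # one graph, batch of 1

adj = torch.zeros(num_nodes, num_nodes, dtype=torch.bool)
for s, t in [(0, 1), (1, 2), (2, 3), (3, 4)]:
    adj[s, t] = adj[t, s] = True
adj |= torch.eye(num_nodes, dtype=torch.bool)            # keep self-loops

global_attn = nn.MultiheadAttention(dim, num_heads=4, batch_first=True)
local_attn = nn.MultiheadAttention(dim, num_heads=4, batch_first=True)

global_ctx, _ = global_attn(h, h, h)                     # every node sees every node
local_ctx, _ = local_attn(h, h, h, attn_mask=~adj)       # only graph neighbours
node_states = torch.cat([global_ctx, local_ctx], dim=-1)
print(node_states.shape)                                 # (1, 5, 32)
```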
Structural Adapters in Pretrained Language Models for AMR-to-Text Generation
This work empirically shows the benefits of explicitly encoding graph structure into PLMs with StructAdapt, which outperforms the state of the art on two AMR-to-text datasets while training only 5.1% of the PLM parameters.
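The adapter idea can be sketched as a small bottleneck module whose inner transformation aggregates over graph neighbours, inserted into a frozen PLM layer so that only the adapter weights are trained. This is a minimal sketch under those assumptions, not the StructAdapt code.

```python
# Hypothetical sketch of a structure-aware adapter: a bottleneck module that
# does message passing over graph tokens, added to a frozen PLM layer.
import torch
import torch.nn as nn

class GraphAdapter(nn.Module):
    def __init__(self, hidden_dim, bottleneck_dim):
        super().__init__()
        self.down = nn.Linear(hidden_dim, bottleneck_dim)
        self.graph = nn.Linear(bottleneck_dim, bottleneck_dim)
        self.up = nn.Linear(bottleneck_dim, hidden_dim)

    def forward(self, hidden, adj):
        # hidden: (num_tokens, hidden_dim); adj: (num_tokens, num_tokens), row-normalised
        z = torch.relu(self.down(hidden))
        z = torch.relu(self.graph(adj @ z))   # message passing inside the bottleneck
        return hidden + self.up(z)            # residual keeps the PLM representation

tokens, hidden_dim = 6, 32
hidden = torch.randn(tokens, hidden_dim)
adj = torch.eye(tokens)                       # placeholder adjacency over graph tokens
adapter = GraphAdapter(hidden_dim, bottleneck_dim=8)
print(adapter(hidden, adj).shape)             # (6, 32)
```

Because the PLM itself stays frozen, only the small down/graph/up matrices contribute trainable parameters, which is how a figure like 5.1% of the PLM parameters becomes possible.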
Common Sense or World Knowledge? Investigating Adapter-Based Knowledge Injection into Pretrained Transformers
- Anne Lauscher, Olga Majewska, Leonardo F. R. Ribeiro, Iryna Gurevych, N. Rozanov, Goran Glavaš
- Computer Science · DEELIO
- 24 May 2020
A deeper analysis reveals that the adapter-based models substantially outperform BERT on inference tasks that require the type of conceptual knowledge explicitly present in ConceptNet and its corresponding Open Mind Common Sense corpus.
Ranking lawyers using a social network induced by legal cases
- Leonardo F. R. Ribeiro, Daniel R. Figueiredo
- Law · Journal of the Brazilian Computer Society
- 4 April 2017
This study suggests that the network structure induced by lawyers contains useful information concerning their effectiveness within the community.
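A hypothetical sketch of how such a network could be built and ranked: link two lawyers whenever they appear in the same case and score them with a centrality measure such as PageRank. The case data, names, and the choice of PageRank are illustrative assumptions; the paper's actual ranking method may differ.

```python
# Hypothetical sketch: rank lawyers by centrality in a network induced by
# shared legal cases (toy data; PageRank chosen only for illustration).
import itertools
import networkx as nx

cases = {
    "case_1": ["alice", "bob"],
    "case_2": ["alice", "carol"],
    "case_3": ["alice", "bob", "dave"],
}

G = nx.Graph()
for lawyers in cases.values():
    for a, b in itertools.combinations(lawyers, 2):
        w = G.get_edge_data(a, b, {"weight": 0})["weight"]
        G.add_edge(a, b, weight=w + 1)          # weight = number of shared cases

scores = nx.pagerank(G, weight="weight")
for lawyer, score in sorted(scores.items(), key=lambda x: -x[1]):
    print(f"{lawyer}: {score:.3f}")
```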
Modeling Graph Structure via Relative Position for Better Text Generation from Knowledge Graphs
- Martin Schmitt, Leonardo F. R. Ribeiro, Philipp Dufter, Iryna Gurevych, Hinrich Schütze
- Computer Science · arXiv
- 16 June 2020
The Graformer is evaluated on two graph-to-text generation benchmarks, the AGENDA dataset and the WebNLG challenge dataset, where it achieves strong performance while using significantly fewer parameters than other approaches.
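The "relative position" idea of modeling graph structure can be sketched by using the shortest-path length between two nodes as their relative position, which can then index a learned bias added to self-attention scores. The sketch below is a hypothetical, undirected simplification using networkx, not the Graformer implementation.

```python
# Hypothetical sketch of graph-derived relative positions: clipped shortest-path
# distances between all node pairs, usable as indices into learned attention biases.
import networkx as nx

def relative_position_matrix(edges, num_nodes, max_dist=4):
    G = nx.Graph(edges)
    G.add_nodes_from(range(num_nodes))
    dist = dict(nx.all_pairs_shortest_path_length(G))
    rel = [[min(dist[i].get(j, max_dist + 1), max_dist + 1) for j in range(num_nodes)]
           for i in range(num_nodes)]
    return rel  # entry (i, j): clipped graph distance; max_dist + 1 marks "unreachable"

print(relative_position_matrix([(0, 1), (1, 2), (2, 3)], num_nodes=4))
# [[0, 1, 2, 3], [1, 0, 1, 2], [2, 1, 0, 1], [3, 2, 1, 0]]
```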
Smelting Gold and Silver for Improved Multilingual AMR-to-Text Generation
- Leonardo F. R. Ribeiro, Jonas Pfeiffer, Yue Zhang, Iryna Gurevych
- Computer Science · EMNLP
- 8 September 2021
It is found that combining both complementary sources of information further improves multilingual AMR-to-text generation and surpasses the previous state of the art for German, Italian, Spanish, and Chinese by a large margin.