Generating Natural Language Question-Answer Pairs from a Knowledge Graph Using a RNN Based Question Generation Model

@inproceedings{Khapra2017GeneratingNL,
  title={Generating Natural Language Question-Answer Pairs from a Knowledge Graph Using a RNN Based Question Generation Model},
  author={Mitesh M. Khapra and Dinesh Raghu and Sachindra Joshi and Sathish Reddy},
  booktitle={Conference of the European Chapter of the Association for Computational Linguistics},
  year={2017}
}
In recent years, knowledge graphs such as Freebase, which capture facts about entities and the relationships between them, have been used actively for answering factoid questions. […] Key Method: To generate such QA pairs, we first extract a set of keywords from the entities and relationships expressed in a triple stored in the knowledge graph. From each such set, we use a subset of keywords to generate a natural language question that has a unique answer.
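The keyword-extraction step described above can be sketched in a few lines. This is a minimal, hypothetical illustration of turning a triple into a keyword set and a template question; the paper itself uses an RNN-based generation model, not the template shown here, and the triple and function names are assumptions for illustration only.

```python
# Hypothetical sketch of the keyword-extraction step (NOT the paper's
# RNN model): split a (subject, relation, object) triple into keywords,
# then compose a question whose unique answer is the triple's object.

def keywords_from_triple(triple):
    """Split each element of a knowledge-graph triple into keywords."""
    subject, relation, obj = triple
    # Freebase-style names such as "place_of_birth" split on underscores.
    return {
        "subject": subject.split("_"),
        "relation": relation.split("_"),
        "answer": obj,
    }

def template_question(keywords):
    """Compose a naive template question from a subset of the keywords."""
    relation_phrase = " ".join(keywords["relation"])
    subject_phrase = " ".join(keywords["subject"])
    return f"What is the {relation_phrase} of {subject_phrase}?"

triple = ("Barack_Obama", "place_of_birth", "Honolulu")
kw = keywords_from_triple(triple)
print(template_question(kw))
# What is the place of birth of Barack Obama?  (answer: kw["answer"])
```

In the paper, the template step is replaced by a learned RNN decoder; the sketch only shows where the keyword sets come from.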

Citations

Generating Complex Questions from Knowledge Graphs with Query Graphs

  • Zimu Wang
  • Computer Science
    2022 IEEE 10th International Conference on Information, Communication and Networks (ICICN)
  • 2022
This paper proposes a novel framework for the KGQG task consisting of two stages, query graph construction and graph-to-question generation, and validates the framework on two complicated datasets designed for complex question answering.

Calculating Question Similarity is Enough: A New Method for KBQA Tasks

The major novelty lies in the knowledge-enhanced T5 (kT5) model, which generates natural language QA pairs from knowledge graph triples so that QA can be solved directly by retrieving from the synthetic dataset.

PathQG: Neural Question Generation from Facts

This paper presents a novel task of question generation given a query path in a knowledge graph constructed from the input text, formulates query representation learning as a sequence labeling problem for identifying the facts that form a query, and employs an RNN-based generator for question generation.

Toward Subgraph Guided Knowledge Graph Question Generation with Graph Neural Networks

This work proposes a bidirectional Graph2Seq model to encode the KG subgraph and enhances the RNN decoder with a node-level copying mechanism that allows directly copying node attributes from the input graph into the output question.

Thematic Question Generation over Knowledge Bases

This paper developed a template-based approach to question generation that supports complex questions over binary and n-ary statements, along with an approach that reverts templates used for question answering, allowing the import of more than 2000 templates.

Natural Answer Generation with QA Pairs Using Sequence to Sequence Model

This paper applies an LSTM-based sequence-to-sequence model to Chinese natural answer generation, building a model that learns from question-answer pairs and generates natural answer sentences.

Addressing Semantic Drift in Question Generation for Semi-Supervised Question Answering

This paper proposes two semantics-enhanced rewards obtained from downstream question paraphrasing and question answering tasks to regularize the QG model to generate semantically valid questions, and proposes a QA-based evaluation method which measures the model’s ability to mimic human annotators in generating QA training data.

Generating Questions from Wikidata Triples

This work revisits KBQG using pre-training, a new (triple, question) dataset, and question type information, and shows that this approach outperforms previous work in both standard and zero-shot settings.

Semantic Understanding of Natural Language Stories for Near Human Question Answering

It is shown that translating stories into knowledge graphs in RDF, and then restating the natural language questions into SPARQL to answer queries can be successful if the RDF graph is augmented with an ontology and an inference engine.

Meta-CQG: A Meta-Learning Framework for Complex Question Generation over Knowledge Bases

A meta-trained generator acquires universal and transferable meta-knowledge and quickly adapts to long-tailed samples from a few of the most related training samples; a self-supervised graph retriever is designed to learn distributed representations for samples, with contrastive learning leveraged to improve the learned representations.
...

References

Showing 1-10 of 31 references

Natural language question answering over RDF: a graph data driven approach

A semantic query graph is proposed to model the query intention of the natural language question in a structural way; based on this, RDF Q/A is reduced to a subgraph matching problem, and the ambiguity of the question is resolved at the time when matches of the query are found.

Question Answering over Linked Data Using First-order Logic

This work formulates the knowledge for resolving ambiguities in the three main steps of QALD (phrase detection, phrase-to-semantic-item mapping, and semantic item grouping) as first-order logic clauses in a Markov Logic Network.

Automation of Question Generation From Sentences

A system is considered that automates question generation from a sentence, generating all possible questions whose answers the sentence contains.

Automatically Generating Questions from Queries for Community-based Question Answering

Experimental results show that the precision of the 1-best and 5-best generated questions is 67% and 61%, respectively, outperforming a baseline method that directly retrieves questions for queries in a cQA site search engine.

Generating Quiz Questions from Knowledge Graphs

This work proposes an approach to generate natural language questions from knowledge graphs such as DBpedia and YAGO by selecting a query answer, generating a SPARQL query that has the answer as its sole result, and then verbalizing the query as a question.

Robust question answering over the web of linked data

This system translates user questions into an extended form of structured SPARQL queries, with text predicates attached to triple patterns, based on a novel optimization model cast into an integer linear program for joint decomposition and disambiguation of the user question.

From query to question in one click: suggesting synthetic questions to searchers

This work introduces a learning-based approach that improves not only the relevance of the suggested questions to the original query, but also their grammatical correctness, and puts a special emphasis on increasing the diversity of suggestions via a novel diversification mechanism.

Automatic Question Generation from Queries

This paper proposes using user-generated questions along with search engine query logs to create a question generation shared task that aims to automatically generate questions given a query.

Good Question! Statistical Ranking for Question Generation

This work uses manually written rules to perform a sequence of general purpose syntactic transformations to turn declarative sentences into questions, which are ranked by a logistic regression model trained on a small, tailored dataset consisting of labeled output from the system.

Semantic Parsing via Paraphrasing

This paper presents two simple paraphrase models, an association model and a vector space model, and trains them jointly from question-answer pairs, improving state-of-the-art accuracies on two recently released question-answering datasets.