Learning Knowledge Graphs for Question Answering through Conversational Dialog

@inproceedings{Hixon2015LearningKG,
  title={Learning Knowledge Graphs for Question Answering through Conversational Dialog},
  author={Ben Hixon and Peter Clark and Hannaneh Hajishirzi},
  booktitle={North American Chapter of the Association for Computational Linguistics},
  year={2015}
}
We describe how a question-answering system can learn about its domain from conversational dialogs. Our system learns to relate concepts in science questions to propositions in a fact corpus, stores new concepts and relations in a knowledge graph (KG), and uses the graph to solve questions. We are the first to acquire knowledge for question-answering from open, natural language dialogs without a fixed ontology or domain model that predetermines what users can say. Our relation-based strategies… 
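The abstract outlines a pipeline: relate concepts mentioned in a question to propositions in a fact corpus, store the relations elicited in dialog in a knowledge graph, and use the graph to answer questions. As a rough illustration only (the class and method names below, such as DialogKG and add_relation, are hypothetical and not taken from the paper), a dialog-built graph could be kept as a simple adjacency structure:

```python
from collections import defaultdict

class DialogKG:
    """Toy knowledge graph populated from dialog turns.

    Hypothetical sketch of the idea in the abstract, not the authors' system.
    """

    def __init__(self):
        # subject -> relation -> set of objects
        self.edges = defaultdict(lambda: defaultdict(set))

    def add_relation(self, subject, relation, obj):
        """Store a relation elicited from the user, e.g. after asking
        'How is an iron nail related to metal?'."""
        self.edges[subject][relation].add(obj)

    def related(self, concept):
        """All (relation, object) pairs stored for a concept."""
        return [(rel, obj)
                for rel, objs in self.edges[concept].items()
                for obj in objs]

    def connects(self, concept_a, concept_b):
        """True if a stored relation directly links the two concepts,
        a crude stand-in for relating question concepts to propositions."""
        return any(concept_b in objs for objs in self.edges[concept_a].values())


kg = DialogKG()
kg.add_relation("iron nail", "is a kind of", "metal object")
kg.add_relation("metal object", "conducts", "electricity")

print(kg.related("iron nail"))                     # [('is a kind of', 'metal object')]
print(kg.connects("metal object", "electricity"))  # True
```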

Look before you Hop: Conversational Question Answering over Knowledge Graphs Using Judicious Context Expansion

CONVEX is an unsupervised method that answers incomplete follow-up questions over a knowledge graph by maintaining conversation context from the entities and predicates seen so far and automatically inferring the missing or ambiguous pieces of each follow-up question.
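As a loose sketch of this context-expansion idea (the names ConversationContext and complete are invented here for illustration; CONVEX itself expands a graph neighborhood rather than filling flat slots), a follow-up question could be completed from earlier turns like this:

```python
class ConversationContext:
    """Keep entities and predicates from earlier turns and use them to
    complete under-specified follow-up questions (illustrative only)."""

    def __init__(self):
        self.entities = []    # most recent first
        self.predicates = []

    def observe(self, entities, predicates):
        self.entities = list(entities) + self.entities
        self.predicates = list(predicates) + self.predicates

    def complete(self, entities, predicates):
        """Fill a missing entity or predicate from the running context."""
        entity = entities[0] if entities else (self.entities[0] if self.entities else None)
        predicate = predicates[0] if predicates else (self.predicates[0] if self.predicates else None)
        return entity, predicate


ctx = ConversationContext()
ctx.observe(["Breaking Bad"], ["created_by"])   # "Who created Breaking Bad?"
# Follow-up "When did it first air?" mentions no entity explicitly:
print(ctx.complete([], ["first_aired"]))        # ('Breaking Bad', 'first_aired')
```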

Knowledge Graph Completion-based Question Selection for Acquiring Domain Knowledge through Dialogues

Two modifications to KGC training are presented: creating pseudo-entities from substrings of entity names, so that entities whose names share a substring become connected, and limiting the range of negative sampling; results suggest that a model trained with these modifications can avoid asking questions with incorrect content.
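A minimal sketch of the first modification, under the simplifying assumption that shared word tokens stand in for shared substrings (the function name and triple format below are hypothetical, not the paper's procedure):

```python
def add_pseudo_entities(entity_names, min_len=4):
    """Introduce a pseudo-entity for each sufficiently long token of an
    entity name, so entities whose names share that token become connected
    through it (illustrative simplification of the substring idea)."""
    triples = []
    for name in entity_names:
        for token in name.lower().split():
            if len(token) >= min_len:
                triples.append((name, "has_substring", f"sub::{token}"))
    return triples


# "solar panel" and "solar cell" become linked through the pseudo-entity sub::solar
for triple in add_pseudo_entities(["solar panel", "solar cell", "battery"]):
    print(triple)
```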

Combining Natural Logic and Shallow Reasoning for Question Answering

This work extends the breadth of inferences afforded by natural logic to include relational entailment and meronymy, and trains an evaluation function, akin to those used in game playing, to judge the expected truth of candidate premises on the fly.

Discovering Knowledge Graph Schema from Short Natural Language Text via Dialog

This work proposes a dialog strategy that aims to elicit the schema in as short a dialog as possible, and shows that this significantly reduces dialog complexity while engaging the expert in meaningful dialog.

Question Answering When Knowledge Bases are Incomplete

A typology of missing information in knowledge bases is formalized, and a dataset based on the Spider KB question answering dataset is presented; experiments show that simple baselines fail to detect most of the unanswerable questions.
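A simple coverage-style check of the kind such baselines might use would flag a question as unanswerable whenever one of its entities or relations is absent from the knowledge base; the function below is a hypothetical illustration of that idea, not a baseline from the paper:

```python
def is_answerable(question_entities, question_relations, kb_entities, kb_relations):
    """Naive coverage check: answerable only if every mentioned entity and
    relation exists in the knowledge base (hypothetical baseline sketch)."""
    return (set(question_entities) <= set(kb_entities)
            and set(question_relations) <= set(kb_relations))


kb_entities = {"France", "Paris"}
kb_relations = {"capital_of"}

print(is_answerable(["France"], ["capital_of"], kb_entities, kb_relations))     # True
print(is_answerable(["France"], ["population_of"], kb_entities, kb_relations))  # False
```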

Open Information Extraction from Question-Answer Pairs

NeurON, a system for extracting tuples from question-answer pairs, combines distributed representations of a question and an answer to generate knowledge facts; evaluation on two real-world datasets demonstrates that NeurON can find a significant number of new and interesting facts to extend a knowledge base compared to state-of-the-art OpenIE methods.

TION METHOD FOR USE IN NATURAL DIALOGUE

An end-to-end multi-stream deep learning architecture is proposed that learns unified embeddings for query-response pairs, leveraging contextual information from memory networks and syntactic information from Graph Convolution Networks applied over their dependency parses.

Augmenting Topic Aware Knowledge-Grounded Conversations with Dynamic Built Knowledge Graphs

This is the first attempt to dynamically build knowledge graphs between chatting topics to assist dialog topic management during a conversation; results show that the model can properly schedule conversational topics and pick suitable knowledge to generate informative responses compared to several strong baselines.

Dialog-based Language Learning

This work studies dialog-based language learning, where supervision is given naturally and implicitly in the response of the dialog partner during the conversation, and shows that a novel model incorporating predictive lookahead is a promising approach for learning from a teacher's response.

Interactive Factual Knowledge Learning in Dialogues

This paper focuses on open-world knowledge base completion via user interactions, which enables the proposed system to serve as an engine for learning new knowledge during dialogues; experiments show the effectiveness of the proposed approach.
...

References

Showing 1-10 of 33 references

Learning situated knowledge bases through dialog

This work builds a system in a domain that provides information on events (seminar talks), soliciting situational information from its users to augment its knowledge base (covering an academic field), and finds that the solicited knowledge is consistent and useful and provides reliable information to users.

Probabilistic enrichment of knowledge graph entities for relation detection in conversational understanding

This paper proposes new methods for assigning weights to semantic graphs that reflect common usage types of the entities and their relations, and shows that all weighting methods result in better performance than the unweighted version of the semantic knowledge graph.
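As a rough, hypothetical illustration of usage-based weighting (the estimator below simply normalizes observation counts per entity; the paper's probabilistic enrichment methods are more involved):

```python
from collections import Counter

def weight_entity_relations(observed):
    """Weight each (entity, relation) pair by its relative frequency among
    the observations for that entity (illustrative sketch only)."""
    pair_counts = Counter(observed)
    entity_counts = Counter(entity for entity, _ in observed)
    return {(entity, rel): count / entity_counts[entity]
            for (entity, rel), count in pair_counts.items()}


logs = [("Avatar", "directed_by"), ("Avatar", "directed_by"), ("Avatar", "release_year")]
print(weight_entity_relations(logs))
# directed_by receives a higher weight than release_year for the same entity
```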

Modeling Biological Processes for Reading Comprehension

This paper focuses on a new reading comprehension task that requires complex reasoning over a single document, and demonstrates that answering questions via predicted structures substantially improves accuracy over baselines that use shallower representations.

Bootstrapping Semantic Parsers from Conversations

This paper introduces a loss function to measure how well potential meanings match the conversation, and induces a weighted CCG grammar that could be used to automatically bootstrap the semantic analysis component in a complete dialog system.

Knowledge Acquisition Strategies for Goal-Oriented Dialog Systems

This paper proposes knowledge acquisition strategies for a dialog agent, shows their effectiveness, and demonstrates that the acquired knowledge subsequently contributes to task completion.

Open Dialogue Management for Relational Databases

Evaluation of the system with simulated users shows that users with realistically limited domain knowledge have dialogues nearly as efficient as those of users with complete domain knowledge.

Commonsense Reasoning in and Over Natural Language

It is concluded that the flexibility of natural language makes it a highly suitable representation for achieving practical inferences over text, such as context finding, inference chaining, and conceptual analogy.

Generating Recommendation Dialogs by Extracting Information from User Reviews

This work presents a framework for generating and ranking fine-grained, highly relevant questions from user-generated reviews, and releases a new sentiment lexicon with 1329 adjectives for the restaurant domain.

Toward an Architecture for Never-Ending Language Learning

This work proposes an approach and a set of design principles for an intelligent computer agent that runs forever and describes a partial implementation of such a system that has already learned to extract a knowledge base containing over 242,000 beliefs.