The Never-Ending Language Learner (NELL) is described: it achieves some of the desired properties of a never-ending learner, and lessons learned from its development are discussed.
Learning a Neural Semantic Parser from User Feedback
- Srini Iyer, Ioannis Konstas, Alvin Cheung, J. Krishnamurthy, Luke Zettlemoyer
- Computer Science, ACL
- 27 April 2017
We present an approach to rapidly and easily build natural language interfaces to databases for new domains, whose performance improves over time based on user feedback, and requires minimal…
Neural Semantic Parsing with Type Constraints for Semi-Structured Tables
A new semantic parsing model is presented for answering compositional questions on semi-structured Wikipedia tables with state-of-the-art accuracy; type constraints and entity linking are shown to be valuable components to incorporate in neural semantic parsers.
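The idea of type constraints can be illustrated with a minimal sketch: when building a logical form, only tokens whose type matches the current "hole" are considered. The function names, types, and constants below are hypothetical toy examples, not the paper's actual grammar.

```python
# Toy type-constrained candidate generation for logical-form construction.
# Each function maps an argument type to a return type (hypothetical grammar).
FUNCS = {
    "max":    ("list<num>", "num"),   # max : list<num> -> num
    "column": ("str", "list<num>"),   # column : str -> list<num>
}
CONSTS = {'"Population"': "str"}      # a string constant naming a table column

def candidates(hole_type):
    """Return only the tokens that can legally fill a hole of `hole_type`,
    i.e. functions returning that type or constants of that type."""
    out = [name for name, (_, ret) in FUNCS.items() if ret == hole_type]
    out += [const for const, t in CONSTS.items() if t == hole_type]
    return out

# Building a logical form of type num: the type system forces the
# derivation max(column("Population")) one well-typed step at a time.
print(candidates("num"))        # only "max" returns a num
print(candidates("list<num>"))  # only "column" returns a list<num>
print(candidates("str"))        # only the string constant fits
```

A neural decoder would score these candidates with a learned model; the constraint simply zeroes out ill-typed choices before scoring.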
Incorporating Vector Space Similarity in Random Walk Inference over Knowledge Bases
- Matt Gardner, P. Talukdar, J. Krishnamurthy, Tom Michael Mitchell
- Computer Science, EMNLP
- 1 October 2014
A new technique is presented for combining KB relations and surface text into a single graph representation that is much more compact than the graphs used in prior work, and a method is described for incorporating vector space similarity into random walk inference over KBs.
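The core intuition can be sketched in a few lines: a random walk follows a sequence of relations through a graph, and vector space similarity lets a step also traverse edges whose relation embedding is close to the requested one. The entities, relations, embeddings, and threshold below are invented toy values, not the paper's data or model.

```python
from collections import defaultdict

def cosine(u, v):
    """Cosine similarity between two vectors."""
    num = sum(a * b for a, b in zip(u, v))
    den = (sum(a * a for a in u) ** 0.5) * (sum(b * b for b in v) ** 0.5)
    return num / den if den else 0.0

# Toy graph mixing KB edges and a surface-text edge ("located_in").
edges = defaultdict(list)  # head entity -> list of (relation, tail entity)
edges["Pittsburgh"].append(("city_in_state", "Pennsylvania"))
edges["Pennsylvania"].append(("state_in_country", "USA"))
edges["Boston"].append(("located_in", "Massachusetts"))
edges["Massachusetts"].append(("state_in_country", "USA"))

# Hypothetical relation embeddings; "located_in" sits near "city_in_state".
rel_vec = {
    "city_in_state":    [1.0, 0.1],
    "located_in":       [0.9, 0.2],
    "state_in_country": [0.0, 1.0],
}

def follow(node, relation, threshold=0.8):
    """Follow `relation` from `node`, also accepting edges whose relation
    embedding is sufficiently similar (the vector-space relaxation)."""
    return [t for r, t in edges[node]
            if r == relation or cosine(rel_vec[r], rel_vec[relation]) >= threshold]

def walk_path(start, path):
    """All entities reachable by following a sequence of relations."""
    frontier = {start}
    for rel in path:
        frontier = {t for n in frontier for t in follow(n, rel)}
    return frontier

# Exact path from Pittsburgh; from Boston it succeeds only because
# "located_in" is similar enough to "city_in_state".
print(walk_path("Pittsburgh", ["city_in_state", "state_in_country"]))  # {'USA'}
print(walk_path("Boston", ["city_in_state", "state_in_country"]))      # {'USA'}
```

In the actual method, such path types become features whose weights are learned; the sketch only shows how similarity relaxes exact relation matching during the walk.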
Weakly Supervised Training of Semantic Parsers
This work presents a method for training a semantic parser using only a knowledge base and an unlabeled text corpus, without any individually annotated sentences, and demonstrates the recovery of rich semantic structure by extracting logical forms from natural language queries against Freebase.
Jointly Learning to Parse and Perceive: Connecting Natural Language to the Physical World
This paper introduces Logical Semantics with Perception (LSP), a model for grounded language acquisition that learns to map natural language statements to their referents in a physical environment and finds that LSP outperforms existing, less expressive models that cannot represent relational language.
Instructable Intelligent Personal Agent
An intelligent personal agent is presented that users can teach, solely through natural language interaction, to perform new action sequences in response to new commands, demonstrating the potential of natural language instruction as a significant, under-explored paradigm for machine learning.
Semantic Parsing to Probabilistic Programs for Situated Question Answering
P3 is presented, a novel situated question answering model that treats semantic parses as probabilistic programs which execute nondeterministically and whose possible executions represent environmental uncertainty; this allows the model to use background knowledge and global features of the question/environment interpretation while retaining efficient approximate inference.
Vector Space Semantic Parsing: A Framework for Compositional Vector Space Models
We present vector space semantic parsing (VSSP), a framework for learning compositional models of vector space semantics. Our framework uses Combinatory Categorial Grammar (CCG) to define a…
Task-Oriented Dialogue as Dataflow Synthesis
- Jacob Andreas, J. Bufe, Alexander Zotov
- Computer Science, Transactions of the Association for Computational…
- 1 September 2020
An approach to task-oriented dialogue is described in which dialogue state is represented as a dataflow graph; this representation enables the expression and manipulation of complex user intents, and explicit metacomputation makes these intents easier for learned models to predict.
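The dataflow idea can be sketched with a toy graph: each turn adds nodes, and a later turn can refer back to an earlier computation instead of re-parsing it from scratch. The utterances, node structure, and calendar stub below are invented for illustration and are not the paper's actual representation or API.

```python
# Minimal dataflow-graph sketch of dialogue state (hypothetical example).
class Node:
    """A dataflow node: a function applied to the values of its dependencies."""
    def __init__(self, fn, *deps):
        self.fn, self.deps = fn, deps

    def value(self):
        # Recursively evaluate dependencies, then apply this node's function.
        return self.fn(*(d.value() for d in self.deps))

# Turn 1: "When is my meeting with Alice?" (calendar lookup stubbed out)
alice   = Node(lambda: "Alice")
meeting = Node(lambda person: {"with": person, "time": "10:00"}, alice)
time1   = Node(lambda m: m["time"], meeting)

# Turn 2: "Move it an hour later." — the pronoun "it" resolves to the
# existing `meeting` node, a reference into the graph rather than a new parse.
def shift_one_hour(m):
    hours, minutes = m["time"].split(":")
    return {**m, "time": f"{int(hours) + 1}:{minutes}"}

moved = Node(shift_one_hour, meeting)

print(time1.value())                               # 10:00
print(Node(lambda m: m["time"], moved).value())    # 11:00
```

The point of the graph structure is exactly this reuse: complex intents become small graph edits referencing prior nodes, which is the kind of regularity a learned model can predict more easily than a full fresh program per turn.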