Semantic Parsing: Syntactic assurance to target sentence using LSTM Encoder CFG-Decoder
@article{Luz2018SemanticPS,
  title={Semantic Parsing: Syntactic assurance to target sentence using LSTM Encoder CFG-Decoder},
  author={Fabiano Ferreira Luz and Marcelo Finger},
  journal={ArXiv},
  year={2018},
  volume={abs/1807.07108}
}
Semantic parsing can be defined as the process of mapping natural language sentences into a machine-interpretable, formal representation of their meaning. Semantic parsing using LSTM encoder-decoder neural networks has become a promising approach. However, such automated translation of natural language does not provide grammaticality guarantees for the generated sentences; such a guarantee is particularly important for practical cases, where a database query can cause critical errors if the…
References
Language to Logical Form with Neural Attention
- Computer ScienceACL
- 2016
This paper presents a general method based on an attention-enhanced encoder-decoder model that encodes input utterances into vector representations and generates their logical forms by conditioning the output sequences or trees on the encoding vectors.
Deterministic Statistical Mapping of Sentences to Underspecified Semantics
- Computer ScienceIWCS
- 2011
The particular choice of algorithms used means that the trained mapping is deterministic (in the sense of deterministic parsing), paving the way for large-scale text-to-semantic mapping.
Learning to Map Sentences to Logical Form: Structured Classification with Probabilistic Categorial Grammars
- Computer ScienceUAI
- 2005
A learning algorithm is described that takes as input a training set of sentences labeled with expressions in the lambda calculus and induces a grammar for the problem, along with a log-linear model that represents a distribution over syntactic and semantic analyses conditioned on the input sentence.
Recursive Neural Networks for Learning Logical Semantics
- Computer ScienceArXiv
- 2014
This work evaluates whether each of two classes of neural model can correctly learn relationships such as entailment and contradiction between pairs of sentences, and finds that the plain RNN achieves only mixed results on all three experiments, whereas the stronger RNTN model generalizes well in every setting and appears capable of learning suitable representations for natural language logical inference.
Learning Phrase Representations using RNN Encoder–Decoder for Statistical Machine Translation
- Computer ScienceEMNLP
- 2014
Qualitatively, the proposed RNN Encoder-Decoder model learns a semantically and syntactically meaningful representation of linguistic phrases.
Using Multiple Clause Constructors in Inductive Logic Programming for Semantic Parsing
- Computer ScienceECML
- 2001
Preliminary results demonstrate that combining different learning methods in inductive logic programming (ILP), allowing the learner to produce more expressive hypotheses than any individual learner alone, is a promising approach.
Querix: A Natural Language Interface to Query Ontologies Based on Clarification Dialogs
- Computer Science
- 2006
This paper presents Querix, a domain-independent natural language interface for the Semantic Web that allows queries in natural language and asks the user for clarification in case of ambiguities.
Neural Machine Translation by Jointly Learning to Align and Translate
- Computer ScienceICLR
- 2015
It is conjectured that the use of a fixed-length vector is a bottleneck in improving the performance of this basic encoder-decoder architecture, and it is proposed to extend this by allowing a model to automatically (soft-)search for parts of a source sentence that are relevant to predicting a target word, without having to form these parts as a hard segment explicitly.
Using Linguistic Analysis to Translate Arabic Natural Language Queries to SPARQL
- Computer ScienceArXiv
- 2015
This paper presents a domain-independent approach to translate Arabic NL queries to SPARQL by leveraging linguistic analysis and uses a language parser to extract NPs and the relations from Arabic parse trees and match them to the underlying ontology.
SPARQL as a Foreign Language
- Computer ScienceSEMANTiCS Posters&Demos
- 2017
Preliminary results show that Neural SPARQL Machines are a promising approach for Question Answering on Linked Data, as they can deal with known problems such as vocabulary mismatch and perform graph pattern composition.