Global Inference for Sentence Compression: An Integer Linear Programming Approach
TLDR
This work shows how previous formulations of sentence compression can be recast as ILPs, and extends these models with novel global constraints to infer globally optimal compressions in the presence of linguistically motivated constraints.
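The global-inference idea behind this line of work can be illustrated with a toy example: each word gets a binary keep/drop variable, an objective scores candidate compressions, and linguistic constraints rule out incoherent outputs. Below is a minimal sketch in which exhaustive 0/1 search stands in for an ILP solver, and the sentence, scores, and constraints are all hypothetical illustrations, not the paper's actual model:

```python
from itertools import product

# Toy sentence with per-word relevance scores (hypothetical values).
words  = ["the", "cat", "quickly", "chased", "the", "mouse"]
scores = [0.1,   0.9,   0.3,       1.0,      0.1,   0.8]

# Linguistically motivated constraints over the binary keep/drop vector x:
#  - if the verb (index 3) is kept, its argument heads (indices 1 and 5)
#    must also be kept
#  - keep at least half the words
def feasible(x):
    if x[3] and not (x[1] and x[5]):
        return False
    if sum(x) < len(x) // 2:
        return False
    return True

# Global inference: maximise total relevance minus a per-word length
# penalty, subject to the constraints.  Enumerating all 2^n assignments
# stands in for calling an ILP solver on the same objective.
best = max(
    (x for x in product([0, 1], repeat=len(words)) if feasible(x)),
    key=lambda x: sum((s - 0.4) * xi for s, xi in zip(scores, x)),
)
compression = [w for w, keep in zip(words, best) if keep]
print(" ".join(compression))  # → "cat chased mouse"
```

In a real ILP formulation the same objective and constraints would be handed to an off-the-shelf solver, which scales to realistic sentence lengths where exhaustive search does not.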
Driving Semantic Parsing from the World’s Response
TLDR
This paper develops two novel learning algorithms that predict complex structures while relying only on a binary feedback signal from the context of an external world, and reformulates the semantic parsing problem to reduce the model's dependency on syntactic patterns, allowing the parser to scale better with less supervision.
Incremental Integer Linear Programming for Non-projective Dependency Parsing
TLDR
This work presents an approach that solves the problem incrementally, thus avoiding the creation of intractable integer linear programs, and shows that adding linguistically motivated constraints yields a significant improvement over the state of the art.
Discourse Constraints for Document Compression
TLDR
A discourse-informed model which is capable of producing document compressions that are coherent and informative is presented, inspired by theories of local coherence and formulated within the framework of integer linear programming.
Modelling Compression with Discourse Constraints
TLDR
A discourse-informed model which is capable of producing document compressions that are coherent and informative is presented, inspired by theories of local coherence and formulated within the framework of Integer Linear Programming.
Models for Sentence Compression: A Comparison across Domains, Training Requirements and Evaluation Measures
TLDR
This paper provides a novel comparison between a supervised constituent-based and a weakly supervised word-based compression algorithm, and examines how these models port to different domains (written vs. spoken text).
Constraint-Based Sentence Compression: An Integer Programming Approach
TLDR
This work develops an integer programming formulation to infer globally optimal compressions in the face of linguistically motivated constraints, and shows that such a formulation allows for relatively simple, knowledge-lean compression models that do not require parallel corpora or large-scale resources.
An NLP Curator (or: How I Learned to Stop Worrying and Love NLP Pipelines)
TLDR
This paper presents Curator, an NLP management framework designed to address common problems and inefficiencies in building NLP process pipelines, and Edison, an NLP data structure library in Java that provides streamlined interaction with Curator and offers a range of useful supporting functionality.
Confidence Driven Unsupervised Semantic Parsing
TLDR
It is argued that a semantic parser can be trained effectively without annotated data, and an unsupervised learning algorithm is introduced that takes a self-training approach driven by confidence estimation.