Be Consistent! Improving Procedural Text Comprehension using Label Consistency

@article{Du2019BeCI,
  title={Be Consistent! Improving Procedural Text Comprehension using Label Consistency},
  author={Xinya Du and Bhavana Dalvi Mishra and Niket Tandon and Antoine Bosselut and Wen-tau Yih and Peter Clark and Claire Cardie},
  journal={ArXiv},
  year={2019},
  volume={abs/1906.08942}
}
Abstract

Our goal is procedural text comprehension, namely tracking how the properties of entities (e.g., their location) change with time given a procedural text (e.g., a paragraph about photosynthesis, or a recipe). [...] We present a new learning framework that leverages label consistency during training, allowing a consistency bias to be built into the model. Evaluation on a standard benchmark dataset for procedural text, ProPara (Dalvi et al., 2018), shows that our approach significantly improves prediction…
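The label-consistency idea described in the abstract can be pictured as a training-time regularizer that penalizes disagreement between predictions that should carry the same label. Below is a minimal PyTorch sketch of that intuition; it assumes two input "views" (e.g., the same entity/step appearing in related paragraphs) that should receive the same state-change label, and all names (consistency_training_step, batch_a, batch_b, alpha) are hypothetical illustrations, not the paper's actual implementation.

import torch
import torch.nn.functional as F

def consistency_training_step(model, batch_a, batch_b, labels, alpha=0.5):
    # Hypothetical training step: supervised loss on one view plus a
    # consistency penalty tying the two views' predicted distributions.
    logits_a = model(batch_a)   # (batch, num_state_labels)
    logits_b = model(batch_b)

    # Standard supervised loss on the labelled view.
    task_loss = F.cross_entropy(logits_a, labels)

    # Consistency bias: symmetric KL between the two softmax distributions.
    log_p_a = F.log_softmax(logits_a, dim=-1)
    log_p_b = F.log_softmax(logits_b, dim=-1)
    consistency = 0.5 * (
        F.kl_div(log_p_a, log_p_b.exp(), reduction="batchmean")
        + F.kl_div(log_p_b, log_p_a.exp(), reduction="batchmean")
    )

    return task_loss + alpha * consistency

The symmetric KL term is just one plausible way to encode "be consistent"; the weight alpha trades off fitting the gold labels against agreeing across related predictions.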
