
AMR-to-Text Generation with Cache Transition Systems

@article{Jin2019AMRtoTextGW,
  title={AMR-to-Text Generation with Cache Transition Systems},
  author={Lisa Jin and Daniel Gildea},
  journal={ArXiv},
  year={2019},
  volume={abs/1912.01682}
}
Text generation from AMR involves emitting sentences that reflect the meaning of their AMR annotations. Neural sequence-to-sequence models have been used successfully to decode strings from flattened graphs (e.g., produced by depth-first or random traversal). Such models often rely on attention-based decoders to map AMR nodes to English token sequences. Instead of linearizing AMR, we directly encode its graph structure and delegate traversal to the decoder. To enforce a sentence-aligned graph…
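To illustrate the flattening that the abstract says sequence-to-sequence baselines consume, here is a minimal sketch (not from the paper) of depth-first linearization of a toy AMR graph into a token sequence; the graph encoding and the `linearize` helper are assumptions for illustration only:

```python
def linearize(graph, node, visited=None):
    """Depth-first traversal emitting node labels and edge labels."""
    if visited is None:
        visited = set()
    visited.add(node)
    tokens = [graph["nodes"][node]]
    for edge_label, child in graph["edges"].get(node, []):
        tokens.append(edge_label)
        if child in visited:
            # Re-entrant node (shared argument): emit its label only,
            # without descending again.
            tokens.append(graph["nodes"][child])
        else:
            tokens.extend(linearize(graph, child, visited))
    return tokens

# Toy AMR for "The boy wants to go":
# (w / want-01 :ARG0 (b / boy) :ARG1 (g / go-01 :ARG0 b))
amr = {
    "nodes": {"w": "want-01", "b": "boy", "g": "go-01"},
    "edges": {"w": [(":ARG0", "b"), (":ARG1", "g")],
              "g": [(":ARG0", "b")]},
}
print(linearize(amr, "w"))
# → ['want-01', ':ARG0', 'boy', ':ARG1', 'go-01', ':ARG0', 'boy']
```

Note how the re-entrant node `b` (the boy is both the wanter and the goer) appears twice in the flat sequence; losing this sharing is exactly the kind of structural information the paper's direct graph encoding aims to preserve.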
