Corpus ID: 243833061

A Syntax-Guided Grammatical Error Correction Model with Dependency Tree Correction

@article{Wan2021ASG,
  title={A Syntax-Guided Grammatical Error Correction Model with Dependency Tree Correction},
  author={Zhaohong Wan and Xiaojun Wan},
  journal={ArXiv},
  year={2021},
  volume={abs/2111.03294}
}
Grammatical Error Correction (GEC) is the task of detecting and correcting grammatical errors in sentences. Recently, neural machine translation systems have become popular approaches to this task. However, these methods make no use of syntactic knowledge, which plays an important role in the correction of grammatical errors. In this work, we propose a syntax-guided GEC model (SG-GEC) which adopts the graph attention mechanism to utilize the syntactic knowledge of dependency trees. Considering…
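As a rough editorial illustration of the mechanism named in the abstract (not the authors' SG-GEC code), the sketch below runs one graph-attention step in which each token may attend only to itself and its dependency-tree neighbors; the projection matrices, dimensions, and toy tree are all assumptions.

```python
# A toy dependency-masked graph attention step (not the authors' code).
import torch
import torch.nn.functional as F

def dependency_graph_attention(h, edges, W_q, W_k, W_v):
    """h: (n, d) token states; edges: (head, dependent) index pairs."""
    n, d = h.shape
    # Adjacency mask from the dependency tree: undirected edges plus
    # self-loops, so each token attends to itself and its neighbors.
    mask = torch.full((n, n), float("-inf"))
    mask[torch.arange(n), torch.arange(n)] = 0.0
    for head, dep in edges:
        mask[head, dep] = 0.0
        mask[dep, head] = 0.0
    q, k, v = h @ W_q, h @ W_k, h @ W_v
    scores = q @ k.T / d ** 0.5 + mask   # non-neighbors get -inf
    attn = F.softmax(scores, dim=-1)     # rows normalize over neighbors only
    return attn @ v

# Toy example: 4 tokens, dependency tree rooted at token 1.
torch.manual_seed(0)
d = 8
h = torch.randn(4, d)
edges = [(1, 0), (1, 2), (2, 3)]
W_q, W_k, W_v = (torch.randn(d, d) * 0.1 for _ in range(3))
print(dependency_graph_attention(h, edges, W_q, W_k, W_v).shape)  # (4, 8)
```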

Citations

SynGEC: Syntax-Enhanced Grammatical Error Correction with a Tailored GEC-Oriented Parser

This work proposes a syntax-enhanced grammatical error correction (GEC) approach that incorporates dependency syntactic information into the encoder of GEC models; it consistently and substantially outperforms strong baselines and achieves competitive performance.

CSynGEC: Incorporating Constituent-based Syntax for Grammatical Error Correction with a Tailored GEC-Oriented Parser

This work proposes an extended constituent-based syntax scheme to accommodate errors in ungrammatical sentences and investigates two ways of integrating constituent-based and dependency-based syntax for GEC, finding that the former method improves recall over using a single standalone syntax formalism while the latter improves precision.

References

Showing 1-10 of 43 references

Grammatical error correction using neural machine translation

This paper presents the first study using neural machine translation (NMT) for grammatical error correction (GEC). We propose a two-step approach to handle the rare word problem in NMT, which has been…

Weakly Supervised Grammatical Error Correction using Iterative Decoding

An approach to Grammatical Error Correction (GEC) that effectively exploits models trained on large amounts of weakly supervised bitext, together with an iterative decoding strategy tailored to the loosely supervised nature of the Wikipedia training corpus.
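A minimal sketch of the iterative decoding idea, assuming a hypothetical single-pass correction function `correct_once`: the model is re-applied to its own output until no further edits are proposed or a round limit is reached.

```python
# Iterative decoding sketch: re-apply a single-pass corrector until the
# output stops changing. `correct_once` is a hypothetical stand-in for
# any one-shot GEC model.
def iterative_decode(sentence, correct_once, max_rounds=5):
    for _ in range(max_rounds):
        corrected = correct_once(sentence)
        if corrected == sentence:  # fixed point: no more edits proposed
            return corrected
        sentence = corrected
    return sentence

# Toy one-pass "model" for demonstration.
def toy_correct_once(s):
    return s.replace("a apple", "an apple", 1).replace("eated", "ate", 1)

print(iterative_decode("He eated a apple .", toy_correct_once))
# -> He ate an apple .
```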

A Neural Grammatical Error Correction System Built On Better Pre-training and Sequential Transfer Learning

This work first generates erroneous versions of large unannotated corpora using a realistic noising function; these are subsequently used to pre-train Transformer models, which are then adapted to the domain and style of the test set.
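The sketch below illustrates the general shape of such a noising function: corrupting clean sentences with word-level edits to produce synthetic (errorful, clean) training pairs. The edit types and rates here are illustrative assumptions, not the paper's recipe.

```python
# Sketch of a word-level noising function for synthetic pre-training
# data; the edit types and rates are illustrative assumptions.
import random

def noise_sentence(tokens, p=0.1, rng=random):
    noisy = []
    for tok in tokens:
        r = rng.random()
        if r < p:                        # simulate a deletion error
            continue
        elif r < 2 * p:                  # simulate a duplication error
            noisy += [tok, tok]
        elif r < 3 * p and noisy:        # simulate a word-order error
            noisy.insert(-1, tok)
        else:
            noisy.append(tok)
    return noisy

random.seed(1)
clean = "the cat sat on the mat".split()
print(noise_sentence(clean), "->", clean)  # synthetic (errorful, clean) pair
```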

Improving Grammatical Error Correction via Pre-Training a Copy-Augmented Architecture with Unlabeled Data

This paper proposes a copy-augmented architecture for the GEC task, which copies the unchanged words from the source sentence to the target sentence, and fully pre-trains the sequence-to-sequence model with unlabeled data.
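A minimal sketch of a copy mechanism in this spirit (pointer-generator style, not the paper's exact implementation): the final output distribution mixes the decoder's generation distribution with a copy distribution scattered from the attention over source tokens.

```python
# Sketch of a copy mechanism: mix the generation distribution with a
# copy distribution over source tokens via a gate g (assumed shapes).
import torch

def copy_augmented_dist(p_gen_vocab, copy_attn, src_token_ids, vocab_size, g):
    """copy_attn: (src_len,) decoder attention over source positions."""
    p_copy = torch.zeros(vocab_size)
    # Scatter attention mass onto the vocabulary ids of the source tokens.
    p_copy.index_add_(0, src_token_ids, copy_attn)
    return g * p_gen_vocab + (1 - g) * p_copy

vocab_size = 10
p_gen = torch.softmax(torch.randn(vocab_size), dim=0)
src_ids = torch.tensor([3, 7, 7, 2])         # source token ids
attn = torch.softmax(torch.randn(4), dim=0)  # attention over 4 source tokens
mixed = copy_augmented_dist(p_gen, attn, src_ids, vocab_size, g=0.6)
print(mixed.sum())  # ~1.0: the mixture is still a valid distribution
```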

Corpora Generation for Grammatical Error Correction

It is demonstrated that neural GEC models trained using either type of corpus give similar performance, and systematic analysis is presented that compares the two approaches to data generation and highlights the effectiveness of ensembling.

Approaching Neural Grammatical Error Correction as a Low-Resource Machine Translation Task

The combined effects of adding source-side noise, domain-adaptation techniques, a GEC-specific training objective, transfer learning with monolingual data, and ensembling of independently trained GEC models and language models result in neural GEC models that outperform the previously best neural GEC systems.

A Multilayer Convolutional Encoder-Decoder Neural Network for Grammatical Error Correction

By ensembling multiple models and incorporating an N-gram language model and edit features via rescoring, this method becomes the first neural approach to outperform the state-of-the-art statistical machine translation-based approach in terms of both grammaticality and fluency.
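As a small illustration of rescoring in general (not this paper's feature set), the sketch below re-ranks n-best candidates by a weighted sum of the decoder score, a language-model score, and an edit-count feature; the features and weights are assumptions.

```python
# Sketch of n-best rescoring with assumed features and weights.
def rescore(candidates, weights=(1.0, 0.5, -0.1)):
    """candidates: list of (sentence, decoder_score, lm_score, n_edits)."""
    w_dec, w_lm, w_edit = weights
    def total(c):
        _, dec, lm, edits = c
        return w_dec * dec + w_lm * lm + w_edit * edits
    return max(candidates, key=total)[0]

nbest = [
    ("He ate an apple .", -1.2, -5.0, 2),    # fluent, liked by the LM
    ("He eated an apple .", -1.0, -9.0, 1),  # fewer edits, poor LM score
]
print(rescore(nbest))  # -> He ate an apple .
```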

Better Evaluation for Grammatical Error Correction

This work presents a novel method for evaluating grammatical error correction: an algorithm for efficiently computing the sequence of phrase-level edits between a source sentence and a system hypothesis that achieves the highest overlap with the gold-standard annotation.
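The sketch below extracts phrase-level edits between a source sentence and a hypothesis using Python's difflib as a simplified stand-in; the actual algorithm additionally searches for the edit sequence that maximizes overlap with the gold annotation.

```python
# Word-level edit extraction via difflib (a simplified stand-in).
from difflib import SequenceMatcher

def extract_edits(source, hypothesis):
    src, hyp = source.split(), hypothesis.split()
    edits = []
    for op, i1, i2, j1, j2 in SequenceMatcher(a=src, b=hyp).get_opcodes():
        if op != "equal":  # keep only spans where the hypothesis differs
            edits.append((i1, i2, " ".join(src[i1:i2]), " ".join(hyp[j1:j2])))
    return edits

print(extract_edits("He eated a apple .", "He ate an apple ."))
# -> [(1, 3, 'eated a', 'ate an')]
```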

GECToR – Grammatical Error Correction: Tag, Not Rewrite

In this paper, we present a simple and efficient GEC sequence tagger using a Transformer encoder. Our system is pre-trained on synthetic data and then fine-tuned in two stages: first on errorful corpora, and then on a combination of errorful and error-free parallel corpora.
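A minimal sketch of the tag-then-apply idea with a small assumed tag inventory (GECToR's full inventory also includes grammatical transformation tags): the model predicts one edit tag per source token, and a deterministic function realizes the tags.

```python
# Applying per-token edit tags (assumed minimal tag set).
def apply_tags(tokens, tags):
    out = []
    for tok, tag in zip(tokens, tags):
        if tag == "$KEEP":
            out.append(tok)
        elif tag == "$DELETE":
            continue
        elif tag.startswith("$REPLACE_"):
            out.append(tag[len("$REPLACE_"):])
        elif tag.startswith("$APPEND_"):
            out += [tok, tag[len("$APPEND_"):]]  # keep token, append word
    return out

tokens = ["He", "eated", "a", "apple", "."]
tags = ["$KEEP", "$REPLACE_ate", "$REPLACE_an", "$KEEP", "$KEEP"]
print(" ".join(apply_tags(tokens, tags)))  # -> He ate an apple .
```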

Neural Grammatical Error Correction Systems with Unsupervised Pre-training on Synthetic Data

This work proposes a simple and surprisingly effective unsupervised synthetic error generation method based on confusion sets extracted from a spellchecker to increase the amount of training data.
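A minimal sketch of confusion-set corruption, with a tiny hand-written dictionary standing in for the spellchecker-derived confusion sets the paper extracts.

```python
# Confusion-set corruption of clean text into synthetic training errors.
import random

CONFUSION_SETS = {  # tiny hand-written stand-in for the paper's sets
    "their": ["there", "they're"],
    "than": ["then"],
    "advice": ["advise"],
}

def corrupt(tokens, p=0.5, rng=random):
    return [
        rng.choice(CONFUSION_SETS[tok])
        if tok in CONFUSION_SETS and rng.random() < p else tok
        for tok in tokens
    ]

random.seed(0)
clean = "their advice was better than mine".split()
print(corrupt(clean), "->", clean)  # synthetic (errorful, clean) pair
```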