Grammatical Error Correction through Round-Trip Machine Translation

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review

Standard

Grammatical Error Correction through Round-Trip Machine Translation. / Kementchedjhieva, Yova; Søgaard, Anders.

Findings of the Association for Computational Linguistics: EACL 2023. Association for Computational Linguistics (ACL), 2023. p. 2208-2215.

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review

Harvard

Kementchedjhieva, Y & Søgaard, A 2023, Grammatical Error Correction through Round-Trip Machine Translation. in Findings of the Association for Computational Linguistics: EACL 2023. Association for Computational Linguistics (ACL), pp. 2208-2215, 17th Conference of the European Chapter of the Association for Computational Linguistics, EACL 2023 - Findings of EACL 2023, Dubrovnik, Croatia, 02/05/2023. https://doi.org/10.18653/v1/2023.findings-eacl.165

APA

Kementchedjhieva, Y., & Søgaard, A. (2023). Grammatical Error Correction through Round-Trip Machine Translation. In Findings of the Association for Computational Linguistics: EACL 2023 (pp. 2208-2215). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2023.findings-eacl.165

Vancouver

Kementchedjhieva Y, Søgaard A. Grammatical Error Correction through Round-Trip Machine Translation. In Findings of the Association for Computational Linguistics: EACL 2023. Association for Computational Linguistics (ACL). 2023. p. 2208-2215 https://doi.org/10.18653/v1/2023.findings-eacl.165

Author

Kementchedjhieva, Yova ; Søgaard, Anders. / Grammatical Error Correction through Round-Trip Machine Translation. Findings of the Association for Computational Linguistics: EACL 2023. Association for Computational Linguistics (ACL), 2023. pp. 2208-2215

Bibtex

@inproceedings{10d4eca5a2bb4cda86ee5a23bbcf5b46,
title = "Grammatical Error Correction through Round-Trip Machine Translation",
abstract = "Machine translation (MT) operates on the premise of an interlingua which abstracts away from surface form while preserving meaning. A decade ago the idea of using round-trip MT to guide grammatical error correction was proposed as a way to abstract away from potential errors in surface forms (Madnani et al., 2012). At the time, it did not pan out due to the low quality of MT systems of the day. Today much stronger MT systems are available so we re-evaluate this idea across five languages and models of various sizes. We find that for extra large models input augmentation through round-trip MT has little to no effect. For more {\textquoteleft}workable{\textquoteright} model sizes, however, it yields consistent improvements, sometimes bringing the performance of a base or large model up to that of a large or xl model, respectively. The round-trip translation comes at a computational cost though, so one would have to determine whether to opt for a larger model or for input augmentation on a case-by-case basis.",
author = "Yova Kementchedjhieva and Anders S{\o}gaard",
year = "2023",
doi = "10.18653/v1/2023.findings-eacl.165",
language = "English",
pages = "2208--2215",
booktitle = "Findings of the Association for Computational Linguistics: EACL 2023",
publisher = "Association for Computational Linguistics (ACL)",
address = "United States",
note = "17th Conference of the European Chapter of the Association for Computational Linguistics, EACL 2023 - Findings of EACL 2023 ; Conference date: 02-05-2023 Through 06-05-2023",

}

RIS

TY - GEN

T1 - Grammatical Error Correction through Round-Trip Machine Translation

AU - Kementchedjhieva, Yova

AU - Søgaard, Anders

PY - 2023

Y1 - 2023

N2 - Machine translation (MT) operates on the premise of an interlingua which abstracts away from surface form while preserving meaning. A decade ago the idea of using round-trip MT to guide grammatical error correction was proposed as a way to abstract away from potential errors in surface forms (Madnani et al., 2012). At the time, it did not pan out due to the low quality of MT systems of the day. Today much stronger MT systems are available so we re-evaluate this idea across five languages and models of various sizes. We find that for extra large models input augmentation through round-trip MT has little to no effect. For more ‘workable’ model sizes, however, it yields consistent improvements, sometimes bringing the performance of a base or large model up to that of a large or xl model, respectively. The round-trip translation comes at a computational cost though, so one would have to determine whether to opt for a larger model or for input augmentation on a case-by-case basis.

AB - Machine translation (MT) operates on the premise of an interlingua which abstracts away from surface form while preserving meaning. A decade ago the idea of using round-trip MT to guide grammatical error correction was proposed as a way to abstract away from potential errors in surface forms (Madnani et al., 2012). At the time, it did not pan out due to the low quality of MT systems of the day. Today much stronger MT systems are available so we re-evaluate this idea across five languages and models of various sizes. We find that for extra large models input augmentation through round-trip MT has little to no effect. For more ‘workable’ model sizes, however, it yields consistent improvements, sometimes bringing the performance of a base or large model up to that of a large or xl model, respectively. The round-trip translation comes at a computational cost though, so one would have to determine whether to opt for a larger model or for input augmentation on a case-by-case basis.

U2 - 10.18653/v1/2023.findings-eacl.165

DO - 10.18653/v1/2023.findings-eacl.165

M3 - Article in proceedings

SP - 2208

EP - 2215

BT - Findings of the Association for Computational Linguistics: EACL 2023

PB - Association for Computational Linguistics (ACL)

T2 - 17th Conference of the European Chapter of the Association for Computational Linguistics, EACL 2023 - Findings of EACL 2023

Y2 - 2 May 2023 through 6 May 2023

ER -

ID: 381561609
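
The abstract above describes the paper's core mechanism: augmenting the input to a grammatical error correction model with a round-trip machine translation of the source sentence, so the model sees both the possibly erroneous surface form and an error-abstracted paraphrase. The sketch below is only an illustration of that idea under stated assumptions; the `translate` callable, the choice of pivot language, and the `[SEP]` concatenation scheme are hypothetical placeholders, not the specific MT systems, languages, or input format used in the paper.

```python
# A minimal sketch of input augmentation via round-trip machine translation.
# `translate` is a hypothetical stand-in for any MT system exposing
# translate(text, source=..., target=...) -> str; it is an assumption,
# not the setup evaluated in the paper.
from typing import Callable

Translator = Callable[..., str]


def round_trip(text: str, translate: Translator,
               source: str = "en", pivot: str = "de") -> str:
    """Translate `text` into a pivot language and back into the source
    language, yielding a paraphrase that abstracts away from surface errors."""
    pivot_text = translate(text, source=source, target=pivot)
    return translate(pivot_text, source=pivot, target=source)


def augment_input(text: str, translate: Translator, sep: str = " [SEP] ") -> str:
    """Concatenate the original (possibly erroneous) input with its
    round-tripped version, so a GEC model can condition on both."""
    return text + sep + round_trip(text, translate)
```

The augmented string would then be passed to the GEC model in place of the original input; as the abstract notes, whether the extra translation cost is worthwhile depends on model size, since extra-large models see little to no benefit.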