Word Order Does Matter: (And Shuffled Language Models Know It)

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review

Standard

Word Order Does Matter : (And Shuffled Language Models Know It). / Abdou, Mostafa; Ravishankar, Vinit; Kulmizev, Artur; Søgaard, Anders.

ACL 2022 - 60th Annual Meeting of the Association for Computational Linguistics, Proceedings of the Conference (Long Papers). ed. / Smaranda Muresan; Preslav Nakov; Aline Villavicencio. Association for Computational Linguistics (ACL), 2022. p. 6907-6919.

Harvard

Abdou, M, Ravishankar, V, Kulmizev, A & Søgaard, A 2022, Word Order Does Matter: (And Shuffled Language Models Know It). in S Muresan, P Nakov & A Villavicencio (eds), ACL 2022 - 60th Annual Meeting of the Association for Computational Linguistics, Proceedings of the Conference (Long Papers). Association for Computational Linguistics (ACL), pp. 6907-6919, 60th Annual Meeting of the Association for Computational Linguistics, ACL 2022, Dublin, Ireland, 22/05/2022. https://doi.org/10.18653/v1/2022.acl-long.476

APA

Abdou, M., Ravishankar, V., Kulmizev, A., & Søgaard, A. (2022). Word Order Does Matter: (And Shuffled Language Models Know It). In S. Muresan, P. Nakov, & A. Villavicencio (Eds.), ACL 2022 - 60th Annual Meeting of the Association for Computational Linguistics, Proceedings of the Conference (Long Papers) (pp. 6907-6919). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2022.acl-long.476

Vancouver

Abdou M, Ravishankar V, Kulmizev A, Søgaard A. Word Order Does Matter: (And Shuffled Language Models Know It). In Muresan S, Nakov P, Villavicencio A, editors, ACL 2022 - 60th Annual Meeting of the Association for Computational Linguistics, Proceedings of the Conference (Long Papers). Association for Computational Linguistics (ACL). 2022. p. 6907-6919. https://doi.org/10.18653/v1/2022.acl-long.476

Author

Abdou, Mostafa ; Ravishankar, Vinit ; Kulmizev, Artur ; Søgaard, Anders. / Word Order Does Matter : (And Shuffled Language Models Know It). ACL 2022 - 60th Annual Meeting of the Association for Computational Linguistics, Proceedings of the Conference (Long Papers). editor / Smaranda Muresan ; Preslav Nakov ; Aline Villavicencio. Association for Computational Linguistics (ACL), 2022. pp. 6907-6919

Bibtex

@inproceedings{87f8cfdac2e1492a818f00fb2df6dc3f,
title = "Word Order Does Matter: (And Shuffled Language Models Know It)",
abstract = "Recent studies have shown that language models pretrained and/or fine-tuned on randomly permuted sentences exhibit competitive performance on GLUE, putting into question the importance of word order information. Somewhat counter-intuitively, some of these studies also report that position embeddings appear to be crucial for models' good performance with shuffled text. We probe these language models for word order information and investigate what position embeddings learned from shuffled text encode, showing that these models retain information pertaining to the original, naturalistic word order. We show this is in part due to a subtlety in how shuffling is implemented in previous work - before rather than after subword segmentation. Surprisingly, we find even Language models trained on text shuffled after subword segmentation retain some semblance of information about word order because of the statistical dependencies between sentence length and unigram probabilities. Finally, we show that beyond GLUE, a variety of language understanding tasks do require word order information, often to an extent that cannot be learned through fine-tuning.",
author = "Mostafa Abdou and Vinit Ravishankar and Artur Kulmizev and Anders S{\o}gaard",
note = "Publisher Copyright: {\textcopyright} 2022 Association for Computational Linguistics.; 60th Annual Meeting of the Association for Computational Linguistics, ACL 2022 ; Conference date: 22-05-2022 Through 27-05-2022",
year = "2022",
doi = "10.18653/v1/2022.acl-long.476",
language = "English",
pages = "6907--6919",
editor = "Smaranda Muresan and Preslav Nakov and Aline Villavicencio",
booktitle = "ACL 2022 - 60th Annual Meeting of the Association for Computational Linguistics, Proceedings of the Conference (Long Papers)",
publisher = "Association for Computational Linguistics (ACL)",
address = "United States",

}
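
The subtlety the abstract highlights - shuffling before rather than after subword segmentation - is easy to gloss over. The sketch below is a minimal, hypothetical illustration of the two orderings, not the authors' code: a toy splitter stands in for a real subword tokenizer such as BPE. Shuffling before segmentation permutes whole words, so each word's subword pieces stay adjacent and leak local order information; shuffling after segmentation scatters the pieces themselves.

import random

def toy_subword_segment(word):
    """Hypothetical stand-in for a BPE tokenizer: split words longer than 4 characters."""
    if len(word) <= 4:
        return [word]
    return [word[:4], "##" + word[4:]]

def shuffle_before_segmentation(sentence, rng):
    """Shuffle whole words first, then segment: each word's subword pieces stay adjacent."""
    words = sentence.split()
    rng.shuffle(words)
    return [piece for w in words for piece in toy_subword_segment(w)]

def shuffle_after_segmentation(sentence, rng):
    """Segment first, then shuffle the subword tokens themselves."""
    pieces = [piece for w in sentence.split() for piece in toy_subword_segment(w)]
    rng.shuffle(pieces)
    return pieces

rng = random.Random(0)
s = "shuffled language models retain word order information"
print(shuffle_before_segmentation(s, rng))  # 'shuf' and '##fled' remain adjacent
print(shuffle_after_segmentation(s, rng))   # subword pieces are scattered independently

With a fixed seed, the first ordering always keeps pairs like 'shuf'/'##fled' together while the second separates them; that residual adjacency is one example of the word order signal that models trained on "shuffled" text can still exploit.
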

RIS

TY - GEN

T1 - Word Order Does Matter: (And Shuffled Language Models Know It)

T2 - 60th Annual Meeting of the Association for Computational Linguistics, ACL 2022

AU - Abdou, Mostafa

AU - Ravishankar, Vinit

AU - Kulmizev, Artur

AU - Søgaard, Anders

N1 - Publisher Copyright: © 2022 Association for Computational Linguistics.

PY - 2022

Y1 - 2022

N2 - Recent studies have shown that language models pretrained and/or fine-tuned on randomly permuted sentences exhibit competitive performance on GLUE, putting into question the importance of word order information. Somewhat counter-intuitively, some of these studies also report that position embeddings appear to be crucial for models' good performance with shuffled text. We probe these language models for word order information and investigate what position embeddings learned from shuffled text encode, showing that these models retain information pertaining to the original, naturalistic word order. We show this is in part due to a subtlety in how shuffling is implemented in previous work - before rather than after subword segmentation. Surprisingly, we find that even language models trained on text shuffled after subword segmentation retain some semblance of information about word order because of the statistical dependencies between sentence length and unigram probabilities. Finally, we show that beyond GLUE, a variety of language understanding tasks do require word order information, often to an extent that cannot be learned through fine-tuning.

AB - Recent studies have shown that language models pretrained and/or fine-tuned on randomly permuted sentences exhibit competitive performance on GLUE, putting into question the importance of word order information. Somewhat counter-intuitively, some of these studies also report that position embeddings appear to be crucial for models' good performance with shuffled text. We probe these language models for word order information and investigate what position embeddings learned from shuffled text encode, showing that these models retain information pertaining to the original, naturalistic word order. We show this is in part due to a subtlety in how shuffling is implemented in previous work - before rather than after subword segmentation. Surprisingly, we find that even language models trained on text shuffled after subword segmentation retain some semblance of information about word order because of the statistical dependencies between sentence length and unigram probabilities. Finally, we show that beyond GLUE, a variety of language understanding tasks do require word order information, often to an extent that cannot be learned through fine-tuning.

UR - http://www.scopus.com/inward/record.url?scp=85137729006&partnerID=8YFLogxK

U2 - 10.18653/v1/2022.acl-long.476

DO - 10.18653/v1/2022.acl-long.476

M3 - Article in proceedings

AN - SCOPUS:85137729006

SP - 6907

EP - 6919

BT - ACL 2022 - 60th Annual Meeting of the Association for Computational Linguistics, Proceedings of the Conference (Long Papers)

A2 - Muresan, Smaranda

A2 - Nakov, Preslav

A2 - Villavicencio, Aline

PB - Association for Computational Linguistics (ACL)

Y2 - 22 May 2022 through 27 May 2022

ER -