Analogy Training Multilingual Encoders

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review

Standard

Analogy Training Multilingual Encoders. / Garneau, Nicolas; lwp876, lwp876; Sandholm, Anders; Ruder, Sebastian; Vulić, Ivan; Søgaard, Anders.

Proceedings of the Thirty-Fifth AAAI Conference on Artificial Intelligence (AAAI-21). AAAI Press, 2021. p. 12884-12892. (Proceedings of the AAAI Conference on Artificial Intelligence; No. 14, Vol. 35).


Harvard

Garneau, N, lwp876, L, Sandholm, A, Ruder, S, Vulić, I & Søgaard, A 2021, Analogy Training Multilingual Encoders. in Proceedings of the Thirty-Fifth AAAI Conference on Artificial Intelligence (AAAI-21). AAAI Press, Proceedings of the AAAI Conference on Artificial Intelligence, no. 14, vol. 35, pp. 12884-12892, 35th AAAI Conference on Artificial Intelligence, Virtual, 02/02/2021.

APA

Garneau, N., lwp876, L., Sandholm, A., Ruder, S., Vulić, I., & Søgaard, A. (2021). Analogy Training Multilingual Encoders. In Proceedings of the Thirty-Fifth AAAI Conference on Artificial Intelligence (AAAI-21) (pp. 12884-12892). AAAI Press. (Proceedings of the AAAI Conference on Artificial Intelligence; Vol. 35, No. 14).

Vancouver

Garneau N, lwp876 L, Sandholm A, Ruder S, Vulić I, Søgaard A. Analogy Training Multilingual Encoders. In Proceedings of the Thirty-Fifth AAAI Conference on Artificial Intelligence (AAAI-21). AAAI Press. 2021. p. 12884-12892. (Proceedings of the AAAI Conference on Artificial Intelligence; No. 14, Vol. 35).

Author

Garneau, Nicolas; lwp876, lwp876; Sandholm, Anders; Ruder, Sebastian; Vulić, Ivan; Søgaard, Anders. / Analogy Training Multilingual Encoders. Proceedings of the Thirty-Fifth AAAI Conference on Artificial Intelligence (AAAI-21). AAAI Press, 2021. pp. 12884-12892. (Proceedings of the AAAI Conference on Artificial Intelligence; No. 14, Vol. 35).

Bibtex

@inproceedings{8a20b420203a4b8c869b87a9ba5cc621,
title = "Analogy Training Multilingual Encoders",
abstract = "Language encoders encode words and phrases in ways that capture their local semantic relatedness, but are known to be globally inconsistent. Global inconsistency can seemingly be corrected for, in part, by leveraging signals from knowledge bases, but previous results are partial and limited to monolingual English encoders. We extract a large-scale multilingual, multi-word analogy dataset from Wikidata for diagnosing and correcting for global inconsistencies, and then implement a four-way Siamese BERT architecture for grounding multilingual BERT (mBERT) in Wikidata through analogy training. We show that analogy training not only improves the global consistency of mBERT, as well as the isomorphism of language-specific subspaces, but also leads to consistent gains on downstream tasks such as bilingual dictionary induction and sentence retrieval.",
author = "Nicolas Garneau and lwp876 lwp876 and Anders Sandholm and Sebastian Ruder and Ivan Vuli{\'c} and Anders S{\o}gaard",
year = "2021",
language = "English",
series = "Proceedings of the AAAI Conference on Artificial Intelligence",
publisher = "AAAI Press",
number = "14",
volume = "35",
pages = "12884--12892",
booktitle = "Proceedings of the Thirty-Fifth AAAI Conference on Artificial Intelligence (AAAI-21)",
note = "35th AAAI Conference on Artificial Intelligence ; Conference date: 02-02-2021 Through 09-02-2021",

}
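
The abstract above describes a four-way Siamese BERT architecture that grounds mBERT in Wikidata through analogy training, but the record carries no implementation details. The following is a minimal Python sketch of what such an objective could look like, assuming mean-pooled mBERT phrase embeddings and a cosine margin loss over relation offsets; embed, analogy_loss, and the margin value are illustrative choices, not the paper's actual method.

import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

# Shared ("Siamese") encoder: all four analogy slots a : b :: c : d
# are embedded with the same mBERT weights.
tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
encoder = AutoModel.from_pretrained("bert-base-multilingual-cased")

def embed(phrases):
    # Mean-pool mBERT token states into one vector per (multi-word) phrase.
    batch = tokenizer(phrases, padding=True, truncation=True, return_tensors="pt")
    hidden = encoder(**batch).last_hidden_state            # (n, seq, dim)
    mask = batch["attention_mask"].unsqueeze(-1).float()   # (n, seq, 1)
    return (hidden * mask).sum(1) / mask.sum(1)

def analogy_loss(a, b, c, d, margin=0.2):
    # Hypothetical objective: for a true analogy, the relation offset
    # b - a should align with d - c (cosine similarity close to 1).
    r1 = F.normalize(embed(b) - embed(a), dim=-1)
    r2 = F.normalize(embed(d) - embed(c), dim=-1)
    return F.relu(margin - (r1 * r2).sum(-1)).mean()

# Toy Wikidata-style analogy (capital-of relation); gradients reach the
# shared encoder, nudging mBERT toward global consistency.
loss = analogy_loss(["Paris"], ["France"], ["Tokyo"], ["Japan"])
loss.backward()

A real training loop would also sample negative quadruples from the analogy dataset; the margin form above is just one common contrastive choice.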

RIS

TY - GEN

T1 - Analogy Training Multilingual Encoders

AU - Garneau, Nicolas

AU - lwp876, lwp876

AU - Sandholm, Anders

AU - Ruder, Sebastian

AU - Vulić, Ivan

AU - Søgaard, Anders

PY - 2021

Y1 - 2021

N2 - Language encoders encode words and phrases in ways that capture their local semantic relatedness, but are known to be globally inconsistent. Global inconsistency can seemingly be corrected for, in part, by leveraging signals from knowledge bases, but previous results are partial and limited to monolingual English encoders. We extract a large-scale multilingual, multi-word analogy dataset from Wikidata for diagnosing and correcting for global inconsistencies, and then implement a four-way Siamese BERT architecture for grounding multilingual BERT (mBERT) in Wikidata through analogy training. We show that analogy training not only improves the global consistency of mBERT, as well as the isomorphism of language-specific subspaces, but also leads to consistent gains on downstream tasks such as bilingual dictionary induction and sentence retrieval.

AB - Language encoders encode words and phrases in ways that capture their local semantic relatedness, but are known to be globally inconsistent. Global inconsistency can seemingly be corrected for, in part, by leveraging signals from knowledge bases, but previous results are partial and limited to monolingual English encoders. We extract a large-scale multilingual, multi-word analogy dataset from Wikidata for diagnosing and correcting for global inconsistencies, and then implement a four-way Siamese BERT architecture for grounding multilingual BERT (mBERT) in Wikidata through analogy training. We show that analogy training not only improves the global consistency of mBERT, as well as the isomorphism of language-specific subspaces, but also leads to consistent gains on downstream tasks such as bilingual dictionary induction and sentence retrieval.

M3 - Article in proceedings

T3 - Proceedings of the AAAI Conference on Artificial Intelligence

SP - 12884

EP - 12892

BT - Proceedings of the Thirty-Fifth AAAI Conference on Artificial Intelligence (AAAI-21)

PB - AAAI Press

Y2 - 2 February 2021 through 9 February 2021

ER -

ID: 300671526