Are All Good Word Vector Spaces Isomorphic?

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review

Standard

Are All Good Word Vector Spaces Isomorphic? / Vulic, Ivan; Ruder, Sebastian; Søgaard, Anders.

Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP). Association for Computational Linguistics, 2020. p. 3178–3192.


Harvard

Vulic, I, Ruder, S & Søgaard, A 2020, Are All Good Word Vector Spaces Isomorphic? in Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP). Association for Computational Linguistics, pp. 3178–3192, The 2020 Conference on Empirical Methods in Natural Language Processing, 16/11/2020. https://doi.org/10.18653/v1/2020.emnlp-main.257

APA

Vulic, I., Ruder, S., & Søgaard, A. (2020). Are All Good Word Vector Spaces Isomorphic? In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP) (pp. 3178–3192). Association for Computational Linguistics. https://doi.org/10.18653/v1/2020.emnlp-main.257

Vancouver

Vulic I, Ruder S, Søgaard A. Are All Good Word Vector Spaces Isomorphic? In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP). Association for Computational Linguistics. 2020. p. 3178–3192 https://doi.org/10.18653/v1/2020.emnlp-main.257

Author

Vulic, Ivan ; Ruder, Sebastian ; Søgaard, Anders. / Are All Good Word Vector Spaces Isomorphic?. Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP). Association for Computational Linguistics, 2020. pp. 3178–3192

Bibtex

@inproceedings{c8be969880684d2ca2919e300e61af21,
  title = "Are All Good Word Vector Spaces Isomorphic?",
  abstract = "Existing algorithms for aligning cross-lingual word vector spaces assume that vector spaces are approximately isomorphic. As a result, they perform poorly or fail completely on non-isomorphic spaces. Such non-isomorphism has been hypothesised to result from typological differences between languages. In this work, we ask whether non-isomorphism is also crucially a sign of degenerate word vector spaces. We present a series of experiments across diverse languages which show that variance in performance across language pairs is not only due to typological differences, but can mostly be attributed to the size of the monolingual resources available, and to the properties and duration of monolingual training (e.g. “under-training”).",
  author = "Ivan Vulic and Sebastian Ruder and Anders S{\o}gaard",
  year = "2020",
  doi = "10.18653/v1/2020.emnlp-main.257",
  language = "English",
  pages = "3178--3192",
  booktitle = "Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)",
  publisher = "Association for Computational Linguistics",
  note = "The 2020 Conference on Empirical Methods in Natural Language Processing, EMNLP 2020 ; Conference date: 16-11-2020 Through 20-11-2020",
  url = "http://2020.emnlp.org",
}

RIS

TY - GEN

T1 - Are All Good Word Vector Spaces Isomorphic?

AU - Vulic, Ivan

AU - Ruder, Sebastian

AU - Søgaard, Anders

PY - 2020

Y1 - 2020

N2 - Existing algorithms for aligning cross-lingual word vector spaces assume that vector spaces are approximately isomorphic. As a result, they perform poorly or fail completely on non-isomorphic spaces. Such non-isomorphism has been hypothesised to result from typological differences between languages. In this work, we ask whether non-isomorphism is also crucially a sign of degenerate word vector spaces. We present a series of experiments across diverse languages which show that variance in performance across language pairs is not only due to typological differences, but can mostly be attributed to the size of the monolingual resources available, and to the properties and duration of monolingual training (e.g. “under-training”).

AB - Existing algorithms for aligning cross-lingual word vector spaces assume that vector spaces are approximately isomorphic. As a result, they perform poorly or fail completely on non-isomorphic spaces. Such non-isomorphism has been hypothesised to result from typological differences between languages. In this work, we ask whether non-isomorphism is also crucially a sign of degenerate word vector spaces. We present a series of experiments across diverse languages which show that variance in performance across language pairs is not only due to typological differences, but can mostly be attributed to the size of the monolingual resources available, and to the properties and duration of monolingual training (e.g. “under-training”).

U2 - 10.18653/v1/2020.emnlp-main.257

DO - 10.18653/v1/2020.emnlp-main.257

M3 - Article in proceedings

SP - 3178

EP - 3192

BT - Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)

PB - Association for Computational Linguistics

T2 - The 2020 Conference on Empirical Methods in Natural Language Processing

Y2 - 16 November 2020 through 20 November 2020

ER -

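The abstract refers to alignment algorithms that assume the two monolingual vector spaces are approximately isomorphic. The standard supervised baseline in this line of work is orthogonal Procrustes: given embeddings for a seed dictionary of translation pairs, find the rotation that best maps one space onto the other. A minimal sketch on synthetic data (not code from the paper; `procrustes_align` and the synthetic spaces are illustrative, with the "target" space constructed to be exactly isomorphic so the rotation is recoverable):

```python
import numpy as np

def procrustes_align(X, Y):
    """Orthogonal Procrustes: the rotation W minimising ||X @ W - Y||_F.

    X, Y are (n_pairs, dim) matrices of embeddings for a seed dictionary
    of translation pairs. The closed-form solution is W = U @ Vt, where
    U, S, Vt is the SVD of X.T @ Y.
    """
    U, _, Vt = np.linalg.svd(X.T @ Y)
    return U @ Vt

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 50))                  # "source-language" vectors
R, _ = np.linalg.qr(rng.normal(size=(50, 50)))   # a hidden orthogonal map
Y = X @ R                                        # perfectly isomorphic "target" space

W = procrustes_align(X, Y)
print(np.allclose(X @ W, Y, atol=1e-6))
```

When the spaces are only approximately isomorphic (the realistic case the paper studies), `X @ W` no longer matches `Y` exactly, and the residual `||X @ W - Y||` is one way to quantify the degree of non-isomorphism.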