Efficient Document Embeddings via Self-Contrastive Bregman Divergence Learning

Publication: Contribution to book/anthology/report › Conference contribution in proceedings › Research › peer-reviewed

Documents

  • Full text

    Publisher's published version, 378 KB, PDF document

Learning high-quality document embeddings is a fundamental problem in natural language processing (NLP), information retrieval (IR), recommendation systems, and search engines. Despite recent advances in transformer-based models that produce sentence embeddings with self-contrastive learning, encoding long documents (thousands of words) remains challenging with respect to both efficiency and quality. Therefore, we train Longformer-based document encoders using a state-of-the-art unsupervised contrastive learning method (SimCSE). Furthermore, we complement the baseline method (a siamese neural network) with additional convex neural networks based on functional Bregman divergence, aiming to enhance the quality of the output document representations. We show that, overall, the combination of a self-contrastive siamese network and our proposed neural Bregman network outperforms the baselines in two linear classification settings on three long-document topic classification tasks from the legal and biomedical domains.
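As a rough illustration of the self-contrastive objective referenced in the abstract, the sketch below shows a standard unsupervised SimCSE-style loss: each document is encoded twice with dropout active, the two encodings form a positive pair, and the other documents in the batch serve as in-batch negatives. The function name, pooling choice, and temperature value are illustrative assumptions, not the paper's released implementation, and the Bregman-divergence network described above is not reproduced here.

    # Minimal sketch of an unsupervised SimCSE-style self-contrastive loss.
    # Two dropout-perturbed encodings of the same document are treated as a
    # positive pair; other documents in the batch act as negatives.
    # Illustrative only -- not the authors' released code.
    import torch
    import torch.nn.functional as F

    def simcse_loss(z1: torch.Tensor, z2: torch.Tensor, temperature: float = 0.05) -> torch.Tensor:
        """z1, z2: (batch, dim) embeddings of the same documents under two dropout masks."""
        z1 = F.normalize(z1, dim=-1)
        z2 = F.normalize(z2, dim=-1)
        sim = z1 @ z2.T / temperature                       # cosine similarity matrix, (batch, batch)
        labels = torch.arange(sim.size(0), device=z1.device)  # positives lie on the diagonal
        return F.cross_entropy(sim, labels)

    # Usage: encode each batch twice with dropout enabled, e.g.
    #   loss = simcse_loss(encoder(batch), encoder(batch))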

Original language: English
Title: Findings of the Association for Computational Linguistics, ACL 2023
Publisher: Association for Computational Linguistics (ACL)
Publication date: 2023
Pages: 12181-12190
ISBN (electronic): 9781959429623
DOI
Status: Published - 2023
Event: 61st Annual Meeting of the Association for Computational Linguistics, ACL 2023 - Toronto, Canada
Duration: 9 Jul 2023 - 14 Jul 2023

Conference

Conference: 61st Annual Meeting of the Association for Computational Linguistics, ACL 2023
Country: Canada
City: Toronto
Period: 09/07/2023 - 14/07/2023
Sponsors: Bloomberg Engineering, et al., Google Research, Liveperson, Meta, Microsoft

Bibliographic note

Funding Information:
Mina Rezaei and Bernd Bischl were supported by the Bavarian Ministry of Economic Affairs, Regional Development and Energy through the Center for Analytics – Data – Applications (ADA-Center) within the framework of BAYERN DIGITAL II (20-3410-2-9-8). M. R. and B. B. were supported by the German Federal Ministry of Education and Research (BMBF) Munich Center for Machine Learning (MCML). This work was also partly funded by the Innovation Fund Denmark (IFD).

Publisher Copyright:
© 2023 Association for Computational Linguistics.

ID: 373548719