Neural Speed Reading Audited

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review

Standard

Neural Speed Reading Audited. / Søgaard, Anders.

Findings of the Association for Computational Linguistics: EMNLP 2020. Association for Computational Linguistics, 2020. p. 148–153.

Harvard

Søgaard, A 2020, Neural Speed Reading Audited. in Findings of the Association for Computational Linguistics: EMNLP 2020. Association for Computational Linguistics, pp. 148–153, The 2020 Conference on Empirical Methods in Natural Language Processing, 16/11/2020. https://doi.org/10.18653/v1/2020.findings-emnlp.14

APA

Søgaard, A. (2020). Neural Speed Reading Audited. In Findings of the Association for Computational Linguistics: EMNLP 2020 (pp. 148–153). Association for Computational Linguistics. https://doi.org/10.18653/v1/2020.findings-emnlp.14

Vancouver

Søgaard A. Neural Speed Reading Audited. In Findings of the Association for Computational Linguistics: EMNLP 2020. Association for Computational Linguistics. 2020. p. 148–153. https://doi.org/10.18653/v1/2020.findings-emnlp.14

Author

Søgaard, Anders. / Neural Speed Reading Audited. Findings of the Association for Computational Linguistics: EMNLP 2020. Association for Computational Linguistics, 2020. pp. 148–153

Bibtex

@inproceedings{280239088a954fb885a991aa56555304,
title = "Neural Speed Reading Audited",
abstract = "Several approaches to neural speed reading have been presented at major NLP and machine learning conferences in 2017–20; i.e., “human-inspired” recurrent network architectures that learn to “read” text faster by skipping irrelevant words, typically optimizing the joint objective of minimizing classification error rate and FLOPs used at inference time. This paper reflects on the meaningfulness of the speed reading task, showing that (a) better and faster approaches to, say, document classification, already exist, which also learn to ignore part of the input (I give an example with 7% error reduction and a 136x speed-up over the state of the art in neural speed reading); and that (b) any claims that neural speed reading is “human-inspired”, are ill-founded.",
author = "Anders S{\o}gaard",
year = "2020",
doi = "10.18653/v1/2020.findings-emnlp.14",
language = "English",
pages = "148–153",
booktitle = "Findings of the Association for Computational Linguistics: EMNLP 2020",
publisher = "Association for Computational Linguistics",
note = "The 2020 Conference on Empirical Methods in Natural Language Processing, EMNLP 2020 ; Conference date: 16-11-2020 Through 20-11-2020",
url = "http://2020.emnlp.org",

}
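
The abstract above describes the objective that the audited speed readers typically optimize: a classification loss plus a penalty on inference-time FLOPs, with a learned gate that skips tokens. The following is a minimal, illustrative PyTorch sketch of one such joint objective; the names SkimRNN, joint_loss, and flops_weight are invented here for exposition and are not the implementation from this paper or from the systems it audits.

# Illustrative sketch only: a soft skip-gate RNN trained with the joint
# objective described in the abstract (classification error + a FLOPs
# proxy). SkimRNN, joint_loss, and flops_weight are assumed names.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SkimRNN(nn.Module):
    def __init__(self, vocab_size, embed_dim, hidden_dim, num_classes):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.cell = nn.GRUCell(embed_dim, hidden_dim)
        self.gate = nn.Linear(embed_dim + hidden_dim, 1)
        self.classify = nn.Linear(hidden_dim, num_classes)

    def forward(self, tokens):  # tokens: (batch, seq_len) of word ids
        batch, seq_len = tokens.shape
        h = torch.zeros(batch, self.cell.hidden_size)
        read_probs = []
        for t in range(seq_len):
            x = self.embed(tokens[:, t])
            # Probability of "reading" (updating on) this token.
            p_read = torch.sigmoid(self.gate(torch.cat([x, h], dim=-1)))
            h = p_read * self.cell(x, h) + (1.0 - p_read) * h  # soft skip
            read_probs.append(p_read)
        return self.classify(h), torch.cat(read_probs, dim=-1)

def joint_loss(logits, labels, read_probs, flops_weight=0.1):
    # Cross-entropy plus the expected fraction of tokens read, a
    # differentiable stand-in for FLOPs used at inference time.
    return F.cross_entropy(logits, labels) + flops_weight * read_probs.mean()

In the audited systems the skip decision is typically discrete (sampled and trained with REINFORCE or a straight-through estimator); the soft relaxation above merely keeps the sketch end-to-end differentiable.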

RIS

TY - GEN

T1 - Neural Speed Reading Audited

AU - Søgaard, Anders

PY - 2020

Y1 - 2020

N2 - Several approaches to neural speed reading have been presented at major NLP and machine learning conferences in 2017–20; i.e., “human-inspired” recurrent network architectures that learn to “read” text faster by skipping irrelevant words, typically optimizing the joint objective of minimizing classification error rate and FLOPs used at inference time. This paper reflects on the meaningfulness of the speed reading task, showing that (a) better and faster approaches to, say, document classification, already exist, which also learn to ignore part of the input (I give an example with 7% error reduction and a 136x speed-up over the state of the art in neural speed reading); and that (b) any claims that neural speed reading is “human-inspired”, are ill-founded.

AB - Several approaches to neural speed reading have been presented at major NLP and machine learning conferences in 2017–20; i.e., “human-inspired” recurrent network architectures that learn to “read” text faster by skipping irrelevant words, typically optimizing the joint objective of minimizing classification error rate and FLOPs used at inference time. This paper reflects on the meaningfulness of the speed reading task, showing that (a) better and faster approaches to, say, document classification, already exist, which also learn to ignore part of the input (I give an example with 7% error reduction and a 136x speed-up over the state of the art in neural speed reading); and that (b) any claims that neural speed reading is “human-inspired”, are ill-founded.

U2 - 10.18653/v1/2020.findings-emnlp.14

DO - 10.18653/v1/2020.findings-emnlp.14

M3 - Article in proceedings

SP - 148

EP - 153

BT - Findings of the Association for Computational Linguistics: EMNLP 2020

PB - Association for Computational Linguistics

T2 - The 2020 Conference on Empirical Methods in Natural Language Processing

Y2 - 16 November 2020 through 20 November 2020

ER -
