Publication: The Application of Transformer Model Architecture for the Dependency Parsing Task
Date
2021
Authors
Chernyshov, A.
Klimov, V.
Balandina, A.
Shchukin, B.
Journal Title
Journal ISSN
Volume Title
Publisher
Abstract
© 2020 Elsevier B.V. All rights reserved. In this paper, the authors explore the advantages of applying an attention-based neural network to the natural-language dependency-parsing task. They describe the architecture and present a comparison between the attention-based neural network and long short-term memory (LSTM) neural networks on the dependency-parsing task.
Description
Keywords
Citation
The Application of Transformer Model Architecture for the Dependency Parsing Task / Chernyshov, A. [et al.] // Procedia Computer Science. - 2021. - Vol. 190. - P. 142-145. - DOI: 10.1016/j.procs.2021.06.018