Publication:
Methods for Speeding Up the Retraining of Neural Networks

dc.contributor.author: Varykhanov, S. S.
dc.contributor.author: Sinelnikov, D. M.
dc.contributor.author: Odintsev, V. V.
dc.contributor.author: Rovnyagin, M. M.
dc.contributor.author: Mingazhitdinova, E. F.
dc.date.accessioned: 2024-12-25T13:47:51Z
dc.date.available: 2024-12-25T13:47:51Z
dc.date.issued: 2022
dc.description.abstract: © 2022 IEEE. Nowadays, machine learning is widespread and becoming more complex, and developing and debugging neural networks is increasingly time-consuming. Distributed solutions are often used to speed up the learning process, but they do not solve the problem of retraining a model from scratch when learning fails. This paper presents a new approach to training models on large datasets that can save time and resources during development. The approach splits the model's learning process into separate layers; each layer can be modified and reused by the layers that follow. The implementation of this approach is based on transfer learning and distributed machine learning techniques. To create reusable network layers, it is proposed to use the methods for automating code parallelization on hybrid computing systems described in the article. These methods include: tracking readiness and dependencies in the data, speculative execution at the kernel level, and creating a DSL.
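The abstract's core idea, training a model layer by layer and reusing already-trained layers so a failed run need not restart from scratch, can be sketched roughly as follows. This is a minimal illustration in plain Python with hypothetical names (`train_layer`, `train_model`, `checkpoints`), not the paper's actual implementation:

```python
# Sketch of layer-wise training with reuse: each layer is trained in turn;
# once a layer is trained its weights are frozen and cached, so a failed run
# only retrains the layers after the last good checkpoint.

checkpoints = {}  # layer index -> frozen weights


def train_layer(index, weights):
    # Stand-in for a real optimisation loop; here we just nudge the weights.
    return [w + 0.1 for w in weights]


def train_model(num_layers, init=0.0, fail_after=None):
    """Train layers sequentially, reusing any cached checkpoints."""
    layers = []
    for i in range(num_layers):
        if i in checkpoints:                 # reuse a previously trained layer
            layers.append(checkpoints[i])
            continue
        if fail_after is not None and i >= fail_after:
            raise RuntimeError(f"training failed at layer {i}")
        trained = train_layer(i, [init] * 4)
        checkpoints[i] = trained             # freeze and cache this layer
        layers.append(trained)
    return layers


# First run fails at layer 2; layers 0 and 1 are already checkpointed,
# so the retry only has to train layer 2.
try:
    train_model(3, fail_after=2)
except RuntimeError:
    pass
model = train_model(3)
```

In a real system the cached layers would be trained network weights (as in transfer learning), and the per-layer work would be distributed across the hybrid computing nodes the paper describes.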
dc.format.extent: P. 478-481
dc.identifier.citation: Methods for Speeding Up the Retraining of Neural Networks / Varykhanov, S.S. [et al.] // Proceedings of the 2022 Conference of Russian Young Researchers in Electrical and Electronic Engineering, ElConRus 2022. - 2022. - P. 478-481. - 10.1109/ElConRus54750.2022.9755557
dc.identifier.doi: 10.1109/ElConRus54750.2022.9755557
dc.identifier.uri: https://www.doi.org/10.1109/ElConRus54750.2022.9755557
dc.identifier.uri: https://www.scopus.com/record/display.uri?eid=2-s2.0-85129552110&origin=resultslist
dc.identifier.uri: https://openrepository.mephi.ru/handle/123456789/27904
dc.relation.ispartof: Proceedings of the 2022 Conference of Russian Young Researchers in Electrical and Electronic Engineering, ElConRus 2022
dc.title: Methods for Speeding Up the Retraining of Neural Networks
dc.type: Conference Paper
dspace.entity.type: Publication
relation.isAuthorOfPublication: 63daac74-f574-41a3-9976-ec77973171dd
relation.isAuthorOfPublication: bea7e106-69a4-4bc0-b6f5-8d54ccf94616
relation.isAuthorOfPublication.latestForDiscovery: 63daac74-f574-41a3-9976-ec77973171dd
relation.isOrgUnitOfPublication: 010157d0-1f75-46b2-ab5b-712e3424b4f5
relation.isOrgUnitOfPublication.latestForDiscovery: 010157d0-1f75-46b2-ab5b-712e3424b4f5