CHAOTIC STATES OF A MULTILAYER NEURAL NETWORK

Serhiy Sveleba, I. Katerynchuk, I. Kunyo, I. Karpa, O. Semotyuk, Ya. Shmygelsky, N. Sveleba, V. Kunyo

Abstract


The influence of the learning rate η on the learning process of a multilayer neural network was studied. The network was implemented in Python, and the learning rate at which the best learning is observed was determined. To analyze the impact of the learning rate on the learning process, the logistic map, which exhibits period doubling, was used as a model. It is shown that the learning error function undergoes bifurcation processes that lead to a chaotic state at η > 0.8. The optimal value of the learning rate, at which the doubling of the number of local minima sets in, is determined. Increasing the number of hidden layers and the number of neurons in each layer does not radically change the bifurcation diagram of the logistic map, and hence the optimal value of the learning rate. It is shown that increasing the number of hidden layers, as well as the number of neurons in them, at higher learning rates is accompanied by the appearance of a chaotic state, in which no learning takes place. Using the Fourier transform of the error function, the bifurcation process as a function of the learning rate was investigated. On the basis of this research, a program for determining the optimal learning rate of a neural network was developed. The article includes the program code for a multilayer neural network capable of building a bifurcation diagram, and a program that determines the presence or absence of learning, taking into account the optimal learning rate and the error value.
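As an illustration of the period-doubling behaviour referred to above, the following minimal Python sketch (not the authors' program; all names and parameter values are assumptions) plots the bifurcation diagram of the logistic map x_{n+1} = r·x_n(1 − x_n), the model function the article uses to describe how the learning process changes as the learning rate grows.

```python
import numpy as np
import matplotlib.pyplot as plt

def logistic_bifurcation(r_min=2.5, r_max=4.0, n_r=1000,
                         n_transient=500, n_keep=100):
    """Collect long-run iterates of the logistic map over a range of r."""
    rs = np.linspace(r_min, r_max, n_r)
    points_r, points_x = [], []
    for r in rs:
        x = 0.5
        # discard the transient so only the attractor remains
        for _ in range(n_transient):
            x = r * x * (1.0 - x)
        # keep the attractor points: 1, 2, 4, ... values, or a chaotic band
        for _ in range(n_keep):
            x = r * x * (1.0 - x)
            points_r.append(r)
            points_x.append(x)
    return points_r, points_x

r_vals, x_vals = logistic_bifurcation()
plt.plot(r_vals, x_vals, ',k', alpha=0.25)
plt.xlabel('control parameter r (playing the role of the learning rate)')
plt.ylabel('long-run x')
plt.title('Period doubling and onset of chaos in the logistic map')
plt.show()
```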

Key words: multilayer neural network, optimal learning rate, bifurcations, chaotic states.
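The abstract also mentions a program that uses the Fourier transform of the error function to decide whether learning takes place. The sketch below is a rough illustration of that idea, not the published code: it classifies a per-epoch error sequence from the spectrum of its tail, and the function name and threshold values are assumptions chosen for the example.

```python
import numpy as np

def classify_error_dynamics(errors, n_transient=200, peak_ratio=0.05):
    """Rough classification of a per-epoch error sequence via its FFT.

    One dominant spectral peak suggests a periodic regime (bifurcated but
    regular learning); many comparable peaks or a broadband spectrum
    suggests a chaotic regime in which no learning takes place.
    Thresholds are illustrative, not taken from the article.
    """
    tail = np.asarray(errors[n_transient:], dtype=float)
    tail = tail - tail.mean()                 # remove the constant component
    spectrum = np.abs(np.fft.rfft(tail))
    if spectrum.max() == 0:
        return 'converged'                    # error settled to a constant
    significant = np.sum(spectrum > peak_ratio * spectrum.max())
    return 'periodic' if significant <= 4 else 'chaotic'

# usage with a synthetic error sequence generated by the logistic map
r, x = 3.9, 0.5
errs = []
for _ in range(1000):
    x = r * x * (1.0 - x)
    errs.append(x)
print(classify_error_dynamics(errs))          # expected: 'chaotic' for r = 3.9
```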



DOI: http://dx.doi.org/10.30970/eli.16.3
