Comparative Evaluation of Various Activation Functions in the Recurrent Neurons of the LRPNN
Nikolay Dukov, Todor Ganchev

Editor(s) Ivaïlo M. Mladenov, Vladimir Pulov, Akira Yoshioka

Geom. Integrability & Quantization, 21: 127-137 (2020) DOI: 10.7546/giq-21-2020-127-137

Abstract

In the present work, we investigate the behavior of the Locally Recurrent Probabilistic Neural Network (LRPNN) with different activation functions in the recurrent-layer neurons. Specifically, we evaluate the performance of the modified activation function proposed here, which belongs to the family of Rectified Linear Units (ReLU), and compare it with other ReLU-based functions, the traditional sigmoid activation function, and the Swish and E-Swish activation functions. Furthermore, we investigate the efficiency of a training procedure that simultaneously adjusts the spread factor σ and the weights in the recurrent layer of the LRPNN. This training helps in coping with practical tasks, such as the recognition of Parkinson's condition from speech signals, which operate with a limited amount of training data.
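For reference, a minimal sketch of the standard definitions of the activation functions named in the abstract is given below (the specific ReLU-based modification proposed in the paper is not detailed here, and the E-Swish scaling factor beta is shown with an illustrative value only):

```python
import numpy as np

def sigmoid(x):
    """Traditional logistic sigmoid activation."""
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    """Standard Rectified Linear Unit: max(0, x)."""
    return np.maximum(0.0, x)

def swish(x):
    """Swish activation: x * sigmoid(x)."""
    return x * sigmoid(x)

def e_swish(x, beta=1.5):
    """E-Swish activation: beta * x * sigmoid(x); beta is a fixed
    scaling factor (value here is illustrative, not from the paper)."""
    return beta * x * sigmoid(x)
```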

Information

Published: 1 January 2020
First available in Project Euclid: 14 October 2020

Digital Object Identifier: 10.7546/giq-21-2020-127-137

Rights: Copyright © 2020 Institute of Biophysics and Biomedical Engineering, Bulgarian Academy of Sciences

PROCEEDINGS ARTICLE
11 PAGES
