Abstract
In the present work, we investigate the behavior of the Locally Recurrent Probabilistic Neural Network (LRPNN) with different activation functions in the neurons of its recurrent layer. Specifically, we evaluate the performance of the modified activation function proposed here, which belongs to the family of Rectified Linear Units (ReLU), and compare it with other ReLU-based functions, with the traditional sigmoid activation function, and with the Swish and E-Swish activation functions. Furthermore, we investigate the efficiency of a training procedure that simultaneously adjusts the spread factor sigma and the weights in the recurrent layer of the LRPNN. This training facilitates coping with practical tasks, such as the recognition of Parkinson's condition from speech signals, which must be handled with a limited amount of training data.
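For reference, the sketch below (Python/NumPy, an illustration rather than the implementation used in the paper) lists the standard definitions of the baseline activation functions the study compares against: the sigmoid, ReLU, Swish, and E-Swish. The modified ReLU-type function proposed in the work is defined in the paper body and is not reproduced here, and the value beta = 1.375 used for E-Swish is only an illustrative choice from the range suggested in the E-Swish literature.

```python
import numpy as np

def sigmoid(x):
    """Standard logistic sigmoid: 1 / (1 + exp(-x))."""
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    """Rectified Linear Unit: max(0, x)."""
    return np.maximum(0.0, x)

def swish(x, beta=1.0):
    """Swish activation: x * sigmoid(beta * x)."""
    return x * sigmoid(beta * x)

def e_swish(x, beta=1.375):
    """E-Swish activation: beta * x * sigmoid(x); beta here is illustrative."""
    return beta * x * sigmoid(x)

if __name__ == "__main__":
    # Compare the responses of the baseline activations on a small grid.
    xs = np.linspace(-3.0, 3.0, 7)
    for name, f in [("sigmoid", sigmoid), ("ReLU", relu),
                    ("Swish", swish), ("E-Swish", e_swish)]:
        print(name, np.round(f(xs), 3))
```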
Information
Digital Object Identifier: 10.7546/giq-21-2020-127-137