Luca Parisi
hyper-sinh: An Accurate and Reliable Activation Function from Shallow to Deep Learning
Despite recent developments of activation functions for Machine Learning (ML)-based classifiers, such as the m-arcsinh (Parisi, 2020) for…
5 min read · May 31, 2021
Luca Parisi
QReLU and m-QReLU activation functions: Solving the ‘dying ReLU’ problem for Deep Learning at Scale
The Rectified Linear Unit (ReLU) activation function (AF) has been extensively applied in deep neural networks for image classification, in…
3 min read · May 9, 2021