Comparative Study of Convolution Neural Network’s Relu and Leaky-Relu Activation Functions
A. Dubey, Vanita Jain
2019 · DOI: 10.1007/978-981-13-6772-4_76
Lecture Notes in Electrical Engineering · 218 Citations
TLDR
This paper uses the rectified linear unit (ReLU) and Leaky-ReLU activation functions for the inner CNN layers and a softmax activation function for the output layer, and analyzes their effect on the MNIST dataset.
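The comparison described above can be sketched in code. The following is a minimal illustration, not the paper's exact architecture: the layer sizes, optimizer, and training schedule below are assumptions chosen for brevity. It builds two otherwise-identical Keras CNNs that differ only in the inner-layer activation (ReLU vs. Leaky-ReLU), each ending in a softmax output layer, and compares their test accuracy on MNIST.

```python
# Sketch of the ReLU vs. Leaky-ReLU comparison on MNIST.
# Architecture and hyperparameters are illustrative assumptions,
# not taken from the paper.
import tensorflow as tf
from tensorflow.keras import layers, models


def build_cnn(make_activation):
    """Build a small CNN; make_activation returns a fresh activation layer."""
    model = models.Sequential([
        layers.Input(shape=(28, 28, 1)),
        layers.Conv2D(32, 3), make_activation(),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3), make_activation(),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128), make_activation(),
        layers.Dense(10, activation="softmax"),  # softmax output layer
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model


# Load and normalize MNIST (28x28 grayscale digits, 10 classes).
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train = x_train[..., None] / 255.0
x_test = x_test[..., None] / 255.0

# Train one model per activation; the negative slope of Leaky-ReLU is
# left at the Keras default, since the paper's value is not given here.
for name, make_activation in [("ReLU", layers.ReLU),
                              ("Leaky-ReLU", layers.LeakyReLU)]:
    model = build_cnn(make_activation)
    model.fit(x_train, y_train, epochs=1, batch_size=128, verbose=0)
    _, acc = model.evaluate(x_test, y_test, verbose=0)
    print(f"{name}: test accuracy = {acc:.4f}")
```

In this setup the only varied factor is the hidden-layer activation, so any accuracy difference between the two runs can be attributed to ReLU versus Leaky-ReLU rather than to architecture or training differences.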
