Difference between Leaky ReLU and ReLU activation function?

Leaky ReLU is a modification of the standard ReLU activation function designed to keep units from becoming saturated at 0. ReLU outputs zero for all negative inputs (a slope of 0 there), so a neuron whose inputs stay negative receives no gradient and can stop learning (the "dying ReLU" problem). Leaky ReLU has the same form as ReLU for positive inputs, but gives negative inputs a small nonzero slope (commonly 0.01), so a small gradient still flows through them.
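A minimal NumPy sketch of the two functions (the slope value `alpha=0.01` is a common default, not a fixed part of the definition):

```python
import numpy as np

def relu(x):
    # ReLU: 0 for negative inputs, identity for positive inputs
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Leaky ReLU: small slope alpha for negative inputs instead of 0
    return np.where(x > 0, x, alpha * x)

x = np.array([-2.0, -0.5, 0.0, 1.0, 3.0])
print(relu(x))        # negatives map to 0
print(leaky_relu(x))  # negatives map to -0.02 and -0.005
```

Note that the two functions agree everywhere on the positive side; they differ only in how negative inputs are handled.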