Activation function

===Tanh===
The [[tanh]] function is the hyperbolic tangent, which maps any real input value to a value between -1 and 1. Because its output varies smoothly across this range, it can be useful in [[regression]] problems where the output may take on a range of values.
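A minimal sketch of this mapping in Python (the NumPy dependency and the example inputs here are illustrative assumptions, not part of the article):

<syntaxhighlight lang="python">
import numpy as np

# tanh squashes any real input into the open interval (-1, 1)
def tanh(x):
    return np.tanh(x)

x = np.array([-5.0, -1.0, 0.0, 1.0, 5.0])
print(tanh(x))  # approximately [-0.9999 -0.7616  0.  0.7616  0.9999]
</syntaxhighlight>

Note how even large inputs saturate near -1 or 1 rather than growing without bound.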


===ReLU===
The [[rectified linear unit]] (ReLU) function is a popular activation function that maps negative input values to 0 and passes positive input values through unchanged. Because its gradient does not saturate for positive inputs, ReLU works well in deep neural networks, helping to mitigate the vanishing gradient problem.
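A minimal sketch of ReLU in Python (again assuming NumPy; the function name and example values are illustrative):

<syntaxhighlight lang="python">
import numpy as np

# ReLU: negative inputs become 0, positive inputs pass through unchanged
def relu(x):
    return np.maximum(0.0, x)

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(relu(x))  # [0.  0.  0.  0.5 2. ]
</syntaxhighlight>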


===Softmax===
The [[softmax]] function is a popular activation function used in the [[output layer]] of neural networks for [[multi-class classification]] problems. It maps a vector of raw scores into a probability distribution over the output [[classes]]: each output lies in the interval (0, 1), and the outputs sum to 1.
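A minimal sketch of softmax in Python (NumPy, the max-subtraction stability trick, and the example scores are illustrative assumptions):

<syntaxhighlight lang="python">
import numpy as np

# Softmax: turn a vector of raw scores into a probability distribution
def softmax(z):
    z = z - np.max(z)        # shift by the max for numerical stability
    exp_z = np.exp(z)
    return exp_z / np.sum(exp_z)

scores = np.array([2.0, 1.0, 0.1])
probs = softmax(scores)
print(probs)        # approximately [0.659 0.242 0.099]
print(probs.sum())  # 1.0
</syntaxhighlight>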


==Explore Like I'm 5 (ELI5)==
Activation functions are like special glasses that help computers see better. They adjust pictures, sounds, and other inputs, a bit like changing their hue or brightness, which makes it easier for the computer to tell what the picture or sound is and how best to process it. Different glasses are used for different tasks, like seeing colors or finding the loudest noise.