Activation functions are essential components of neural networks. They determine the output of a neuron based on its input, enabling the network to learn complex patterns. This article explains how to calculate activation functions step-by-step.
Understanding Activation Functions
Activation functions transform the input signals into outputs that can be used by subsequent layers. Common functions include sigmoid, tanh, and ReLU. Each has unique properties that affect the learning process.
Calculating the Sigmoid Function
The sigmoid function is defined as f(x) = 1 / (1 + e^(-x)). To calculate it:
- Input the value of x.
- Calculate the exponential of -x.
- Add 1 to this exponential.
- Divide 1 by the result to get the output.
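The steps above can be sketched in a few lines of Python, using the standard library's `math.exp` for the exponential:

```python
import math

def sigmoid(x):
    # Step 1-2: compute e^(-x)
    exp_neg_x = math.exp(-x)
    # Step 3: add 1 to the exponential
    denominator = 1.0 + exp_neg_x
    # Step 4: divide 1 by the result
    return 1.0 / denominator
```

For example, sigmoid(0) evaluates to 0.5, since e^0 = 1 and 1 / (1 + 1) = 0.5.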
Calculating the ReLU Function
The Rectified Linear Unit (ReLU) is simple: f(x) = max(0, x). To compute:
- Input the value of x.
- If x is greater than 0, output x.
- If x is less than or equal to 0, output 0.
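A direct Python translation of this rule is a one-line conditional:

```python
def relu(x):
    # f(x) = max(0, x): pass positive inputs through, clamp the rest to 0
    return x if x > 0 else 0
```

Equivalently, the built-in `max(0, x)` gives the same result.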
Calculating the Tanh Function
The hyperbolic tangent function is f(x) = (e^x - e^(-x)) / (e^x + e^(-x)). To compute:
- Calculate e^x and e^(-x).
- Subtract e^(-x) from e^x.
- Add e^x and e^(-x).
- Divide the difference by the sum to get the output.
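These steps map directly onto a small Python function; the result can be cross-checked against the standard library's `math.tanh`:

```python
import math

def tanh(x):
    # Step 1: compute e^x and e^(-x)
    exp_pos = math.exp(x)
    exp_neg = math.exp(-x)
    # Steps 2-4: divide the difference by the sum
    return (exp_pos - exp_neg) / (exp_pos + exp_neg)
```

Note that for very large |x| the direct formula can overflow in `math.exp`; in practice, library implementations such as `math.tanh` handle those extremes for you.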