Softmax Activation Function
Softmax is similar to the sigmoid function in that it returns values between 0 and 1. The difference is that softmax normalizes each exponentiated input by the sum of all exponentiated inputs, so the outputs form a probability distribution over the inputs.
Mathematically, it is represented as: $$ \mathrm{softmax}(x_i) = \frac{e^{x_i}}{\sum_{j=1}^{k} e^{x_j}} $$
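To make the formula concrete, here is a minimal sketch of it in plain PyTorch tensor operations; the function name naive_softmax is our own, chosen for illustration:

import torch

def naive_softmax(x: torch.Tensor) -> torch.Tensor:
    # Exponentiate each element, then divide by the sum of all exponentials
    exps = torch.exp(x)
    return exps / exps.sum()

logits = torch.tensor([1.0, 2.0, 3.0])
print(naive_softmax(logits))        # tensor([0.0900, 0.2447, 0.6652])
print(naive_softmax(logits).sum())  # tensor(1.)

The larger the input relative to the others, the larger its share of the probability mass, which is why softmax is commonly used as the final layer in classifiers.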
Implementation in PyTorch
Below, we apply the softmax activation function in PyTorch and inspect its output.
import torch

# Initialize softmax over dimension 1 (the class dimension)
softmax = torch.nn.Softmax(dim=1)

# A batch of one sample with five random raw scores (logits)
x_values = torch.randn(1, 5)

# Apply softmax to turn the logits into probabilities
y_values = softmax(x_values)

print(y_values, y_values.sum())
Running this prints the five probabilities. Note that each value lies between 0 and 1 and that the sum of the output equals 1.
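The dim argument deserves a brief note: it selects the dimension along which the probabilities are normalized. A small sketch (with made-up logits) illustrating the difference between dim=1 and dim=0:

import torch

# Hypothetical 2x3 batch of logits, values chosen for illustration
x = torch.tensor([[1.0, 2.0, 3.0],
                  [4.0, 5.0, 6.0]])

# dim=1 normalizes across each row, so every row sums to (numerically) 1
row_probs = torch.nn.Softmax(dim=1)(x)
print(row_probs.sum(dim=1))

# dim=0 normalizes down each column, so every column sums to (numerically) 1
col_probs = torch.nn.Softmax(dim=0)(x)
print(col_probs.sum(dim=0))

For a batch of shape (batch_size, num_classes), dim=1 is the usual choice, since each sample's class scores should be normalized independently.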