Leaky Parametric ReLU Activation Function
The Leaky ReLU is a modified Rectified Linear Unit (ReLU) activation function that accounts for the zero outputs ReLU produces for negative inputs. In the parametric variant (PReLU), we learn a parameter $\alpha$, a leaky coefficient applied to negative values.
The mathematical representation is: $$ \begin{align} f(x) = \left\{ \begin{array}{cc} \alpha x & x < 0 \\ x & x \geq 0 \\ \end{array} \right. \end{align} $$
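As a quick check of this definition, here is a minimal sketch that applies the piecewise formula directly with torch.where; the fixed coefficient alpha=0.25 and the helper name leaky_relu_manual are our own choices for illustration (0.25 happens to match PyTorch's default PReLU initialization).

import torch

def leaky_relu_manual(x: torch.Tensor, alpha: float = 0.25) -> torch.Tensor:
    # f(x) = alpha * x for x < 0, and x for x >= 0
    return torch.where(x < 0, alpha * x, x)

print(leaky_relu_manual(torch.tensor([-2.0, -0.5, 0.0, 3.0])))
# tensor([-0.5000, -0.1250,  0.0000,  3.0000])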
Implementation in PyTorch
Below, we implement the parametric (leaky) ReLU activation function in PyTorch using torch.nn.PReLU and visualize the output.
import torch
import matplotlib.pyplot as plt
# PReLU with a single learnable leak coefficient (initialized to 0.25 by default)
p_relu = torch.nn.PReLU(num_parameters=1)
x_inputs = torch.arange(-10., 10., 1)
# Detach before plotting: PReLU's output carries gradient tracking
y_outputs = p_relu(x_inputs).detach()
plt.figure(figsize=(9,6))
plt.plot(x_inputs, y_outputs)
plt.title("Leaky ReLU Function, x[-10, 10]")
plt.show()
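Because the leak coefficient is a module parameter, it is trained along with the rest of the network. Below is a minimal sketch of inspecting it (stored as p_relu.weight) and nudging it with a toy gradient step; the zero target and squared-error loss are illustrative only, not part of any real training setup.

# The leak coefficient is an ordinary learnable parameter of the module
print(p_relu.weight)  # tensor([0.2500], requires_grad=True) by default

# Toy gradient step: alpha receives gradients like any other parameter
optimizer = torch.optim.SGD(p_relu.parameters(), lr=0.1)
target = torch.zeros_like(x_inputs)              # illustrative target
loss = ((p_relu(x_inputs) - target) ** 2).mean() # illustrative loss
loss.backward()
optimizer.step()
print(p_relu.weight)  # alpha has moved away from its initial value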