ReLU Activation
ReLU (Rectified Linear Unit) is an activation function used in neural networks. It outputs the input directly when the input is positive and zero otherwise, i.e. f(x) = max(0, x), introducing non-linearity into the model while remaining computationally cheap. Because its gradient is 1 for positive inputs, it helps mitigate the vanishing gradient problem, and it is widely used in deep learning architectures due to its simplicity and effectiveness.
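As a minimal sketch (assuming NumPy is available), ReLU can be written as an element-wise maximum against zero:

```python
import numpy as np

def relu(x: np.ndarray) -> np.ndarray:
    """Rectified Linear Unit: returns x where x > 0, else 0."""
    return np.maximum(0, x)

# Example: negative inputs are zeroed, positive inputs pass through unchanged.
x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))  # [0.  0.  0.  1.5 3. ]
```

In practice, deep learning frameworks provide ReLU as a built-in layer or function; the sketch above only illustrates the element-wise rule itself.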