Concept
ReLU (Rectified Linear Unit)
ReLU, or Rectified Linear Unit, is an activation function used in neural networks that outputs the input directly if it is positive and zero otherwise. It is favored for its simplicity and for its effectiveness in mitigating the vanishing gradient problem, which enables deeper networks to be trained efficiently.
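Formally, ReLU computes f(x) = max(0, x). The short NumPy sketch below is an illustrative example, not part of the original page; the names relu and relu_grad are chosen here for clarity. It shows the activation and its subgradient:

import numpy as np

def relu(x):
    # Element-wise ReLU: pass positive inputs through, clamp negatives to zero.
    return np.maximum(0.0, x)

def relu_grad(x):
    # Subgradient of ReLU: 1 for positive inputs, 0 otherwise
    # (the value at exactly x = 0 is a convention; 0 is used here).
    return (x > 0).astype(x.dtype)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))       # [0.  0.  0.  1.5 3. ]
print(relu_grad(x))  # [0. 0. 0. 1. 1.]

Because the gradient is exactly 1 for positive inputs, gradients do not shrink as they pass through many ReLU layers, which is the property behind the vanishing-gradient benefit described above.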
Relevant Degrees