ReLU
ReLU, or Rectified Linear Unit, is an activation function used in neural networks that outputs the input directly if it is positive and zero otherwise, i.e. f(x) = max(0, x). It helps mitigate the vanishing gradient problem and accelerates the convergence of stochastic gradient descent compared to the sigmoid and tanh functions.
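A minimal sketch of ReLU and its derivative using NumPy; the function names relu and relu_grad and the sample inputs are illustrative, not part of any specific library API.

```python
import numpy as np

def relu(x):
    # Element-wise ReLU: passes positive values through unchanged, zeroes out the rest
    return np.maximum(0, x)

def relu_grad(x):
    # Derivative of ReLU: 1 for positive inputs, 0 elsewhere.
    # The gradient stays 1 for active units, which is why ReLU avoids the
    # shrinking gradients of saturating functions like sigmoid and tanh.
    return (x > 0).astype(x.dtype)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))       # [0.  0.  0.  1.5 3. ]
print(relu_grad(x))  # [0. 0. 0. 1. 1.]
```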