The exploding gradient problem occurs in deep neural networks when large error gradients accumulate during backpropagation, producing extremely large updates to the model's weights. These oversized updates can make training unstable and prevent the model from converging, which is why techniques such as gradient clipping are used to mitigate the issue.
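As an illustration, the sketch below shows norm-based gradient clipping inside a PyTorch training loop. The model, the dummy data, and the `max_norm=1.0` threshold are hypothetical choices made only for demonstration; the key call is `torch.nn.utils.clip_grad_norm_`, applied after `backward()` and before the optimizer step.

```python
import torch
import torch.nn as nn

# Hypothetical toy model, optimizer, and data, purely for illustration.
model = nn.Sequential(nn.Linear(10, 64), nn.ReLU(), nn.Linear(64, 1))
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.MSELoss()

x = torch.randn(32, 10)  # dummy inputs
y = torch.randn(32, 1)   # dummy targets

for step in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()

    # Gradient clipping: rescale all gradients so their combined L2 norm
    # never exceeds max_norm, preventing extremely large weight updates.
    torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0)

    optimizer.step()
```

Clipping by global norm preserves the direction of the gradient while bounding its magnitude, so updates stay well-behaved even when individual gradients spike.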