Adam Optimizer
The Adam Optimizer is an adaptive learning rate optimization algorithm designed for training deep neural networks, combining the advantages of two other extensions of stochastic gradient descent, namely AdaGrad and RMSProp. It computes individual adaptive learning rates for different parameters from estimates of first and second moments of the gradients, making it particularly effective for problems with sparse gradients and non-stationary objectives.
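A minimal sketch of the update rule in NumPy, assuming the commonly cited default hyperparameters (lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8); the function name adam_step and the toy quadratic objective below are illustrative, not taken from the text above.

```python
import numpy as np

def adam_step(param, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update for a single parameter array.

    m and v are running estimates of the first and second moments of the
    gradient; t is the 1-based step count used for bias correction.
    """
    m = beta1 * m + (1 - beta1) * grad          # first-moment (mean) estimate
    v = beta2 * v + (1 - beta2) * grad ** 2     # second-moment (uncentered variance) estimate
    m_hat = m / (1 - beta1 ** t)                # bias-corrected first moment
    v_hat = v / (1 - beta2 ** t)                # bias-corrected second moment
    param = param - lr * m_hat / (np.sqrt(v_hat) + eps)  # per-parameter adaptive step
    return param, m, v

# Toy usage: minimize f(w) = ||w||^2, whose gradient is 2w.
w = np.array([1.0, -2.0, 3.0])
m = np.zeros_like(w)
v = np.zeros_like(w)
for t in range(1, 201):
    grad = 2 * w
    w, m, v = adam_step(w, grad, m, v, t, lr=0.1)
print(w)  # converges toward [0, 0, 0]
```

Because the step size for each parameter is scaled by the square root of its own second-moment estimate, parameters with consistently small or rare gradients receive proportionally larger updates, which is why Adam copes well with sparse gradients.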