Concept
Step size, also known as learning rate, is a key hyperparameter in optimization algorithms that determines how far each update moves toward a minimum of the loss function. Choosing an appropriate step size is essential for the convergence and stability of the learning process: too large a step size can cause the updates to overshoot the minimum, while too small a step size results in slow convergence. A small sketch of both failure modes follows below.
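In plain gradient descent the update is x <- x - step_size * f'(x). The following minimal Python sketch, using the quadratic loss f(x) = x**2 purely for illustration (the function, starting point, and step sizes are assumptions, not from the original), shows slow convergence, fast convergence, and divergence from overshooting:

```python
def gradient_descent(step_size, x0=5.0, n_steps=50):
    """Minimize the illustrative loss f(x) = x**2 (gradient 2*x)
    from x0 using a fixed step size."""
    x = x0
    for _ in range(n_steps):
        x -= step_size * 2 * x  # update: x <- x - step_size * f'(x)
    return x

# For f(x) = x**2 each update scales x by (1 - 2*step_size), so any
# step size above 1.0 overshoots and diverges, while a very small one
# shrinks the error only slightly per step.
for lr in (0.01, 0.1, 1.1):
    print(f"step size {lr}: x after 50 steps = {gradient_descent(lr):.4g}")
```

With these assumed values, step size 0.01 still leaves x far from the minimum after 50 steps (slow convergence), 0.1 drives it close to zero, and 1.1 grows without bound (overshooting).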