Adaptive step size refers to a family of techniques in numerical optimization and machine learning in which the step size, or learning rate, is adjusted dynamically during training to improve convergence and avoid overshooting. These techniques navigate the error surface more efficiently by taking larger steps in flat regions and smaller steps in steep ones, improving both the stability and the speed of the optimization process.
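As one minimal sketch of this idea (not a specific method from the text above), the following Python/NumPy snippet scales each parameter's step by the inverse root of a running average of squared gradients, in the spirit of RMSProp-style updates: the effective step shrinks where gradients are large (steep regions) and grows where they are small (flat regions). The function name, hyperparameters, and the quadratic example are illustrative assumptions.

```python
import numpy as np

def adaptive_sgd(grad_fn, x0, base_lr=0.1, decay=0.9, eps=1e-8, steps=200):
    """Gradient descent with a per-parameter adaptive step size.

    The effective step is base_lr / (sqrt(avg_sq) + eps), so it is
    smaller where recent gradients have been large (steep regions)
    and larger where they have been small (flat regions).
    """
    x = np.asarray(x0, dtype=float)
    avg_sq = np.zeros_like(x)  # running average of squared gradients
    for _ in range(steps):
        g = grad_fn(x)
        avg_sq = decay * avg_sq + (1 - decay) * g**2
        x -= base_lr * g / (np.sqrt(avg_sq) + eps)
    return x

# Illustrative use: a poorly scaled quadratic f(x) = 0.5 * (x1^2 + 100 * x2^2),
# where a single fixed learning rate would be too small for x1 or too large for x2.
grad = lambda x: np.array([1.0, 100.0]) * x
print(adaptive_sgd(grad, x0=[5.0, 5.0]))  # moves toward the minimum at [0, 0]
```

In this sketch the steep coordinate (gradient scaled by 100) automatically receives a much smaller effective step than the flat coordinate, which is the behavior the paragraph above describes.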