AdaBoost, short for Adaptive Boosting, is an ensemble learning technique that combines many weak learners (typically shallow decision trees, or "stumps") into a single strong learner. It trains the learners sequentially: after each round, misclassified training instances receive larger weights, so the next learner concentrates on the examples its predecessors got wrong. The final prediction is a weighted vote of all the learners, which typically improves accuracy over any individual weak learner.
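The reweighting loop described above can be sketched from scratch with decision stumps as the weak learners. This is a minimal illustration, not a production implementation: the function names (`train_adaboost`, `predict`) and the exhaustive stump search are choices made here for clarity, and labels are assumed to be in {-1, +1}.

```python
import numpy as np

def train_adaboost(X, y, n_rounds=10):
    """Minimal AdaBoost sketch with threshold stumps; labels y must be in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)            # start with uniform instance weights
    ensemble = []                       # list of (alpha, feature, threshold, polarity)
    for _ in range(n_rounds):
        best, best_err = None, np.inf
        # exhaustive search over stumps: feature, threshold, polarity
        for f in range(X.shape[1]):
            for t in np.unique(X[:, f]):
                for p in (1, -1):
                    pred = np.where(X[:, f] >= t, p, -p)
                    err = w[pred != y].sum()   # weighted training error
                    if err < best_err:
                        best_err, best = err, (f, t, p)
        err = max(best_err, 1e-10)             # guard against division by zero
        alpha = 0.5 * np.log((1 - err) / err)  # this learner's vote weight
        f, t, p = best
        pred = np.where(X[:, f] >= t, p, -p)
        # upweight misclassified instances, downweight correct ones, renormalize
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()
        ensemble.append((alpha, f, t, p))
    return ensemble

def predict(ensemble, X):
    """Weighted vote of all stumps; sign of the total score is the class."""
    score = np.zeros(len(X))
    for alpha, f, t, p in ensemble:
        score += alpha * np.where(X[:, f] >= t, p, -p)
    return np.sign(score)

# A 1-D example no single stump can classify: the positives sit in an interval.
X = np.array([[0], [1], [2], [3], [4], [5]], dtype=float)
y = np.array([-1, -1, 1, 1, -1, -1])
ensemble = train_adaboost(X, y, n_rounds=3)
print(predict(ensemble, X))   # matches y: boosted stumps carve out the interval
```

The example highlights the key property: each individual stump misclassifies at least two points, but after three rounds of reweighting the weighted vote separates the interval [2, 3] from the rest perfectly.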