Cross-entropy is a measure from information theory that quantifies the difference between two probability distributions. It is widely used as a loss function in machine learning, particularly for classification tasks, where minimizing it pushes a model's predicted class probabilities toward the true labels.
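As a minimal sketch of the idea, the snippet below implements cross-entropy between a true distribution `p` and a predicted distribution `q` as H(p, q) = -Σ p(x)·log q(x), in nats. The function name, the small `eps` clamp (to avoid log(0)), and the example distributions are illustrative choices, not from any particular library:

```python
import math

def cross_entropy(p, q, eps=1e-12):
    """H(p, q) = -sum_x p(x) * log q(x), measured in nats."""
    return -sum(pi * math.log(max(qi, eps)) for pi, qi in zip(p, q))

# One-hot true label (class 1 of 3) vs. two candidate predictions.
true_dist = [0.0, 1.0, 0.0]
confident = [0.05, 0.90, 0.05]   # puts most mass on the correct class
uncertain = [0.30, 0.40, 0.30]   # spreads mass across classes

print(cross_entropy(true_dist, confident))  # ≈ 0.105 (-log 0.90)
print(cross_entropy(true_dist, uncertain))  # ≈ 0.916 (-log 0.40)
```

With a one-hot `p`, the sum collapses to -log q(true class), so the loss is small when the model assigns high probability to the correct class and grows as that probability shrinks; this is exactly the behavior a classifier's training objective exploits.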