Linear time, written O(n), describes algorithms whose running time grows in direct proportion to the size n of the input: doubling the input roughly doubles the work. Because the cost scales predictably, linear-time algorithms remain practical on large datasets, in contrast to higher-complexity algorithms such as those running in quadratic time.
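As a quick illustration, here is a minimal sketch in Python (the function name `find_max` and the sample data are only illustrative): it scans a list once, so the number of comparisons grows in direct proportion to the list's length, which is the hallmark of O(n) behaviour.

```python
def find_max(values):
    """Return the largest element by scanning the list exactly once."""
    largest = values[0]
    for v in values[1:]:   # each element is examined once
        if v > largest:
            largest = v
    return largest          # total work is proportional to len(values)

print(find_max([7, 3, 12, 5, 9]))  # prints 12
```

If the list were twice as long, the loop would perform roughly twice as many comparisons, matching the linear growth described above.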