Linear time complexity, denoted O(n), describes an algorithm whose running time grows in direct proportion to the size of the input: doubling the number of elements roughly doubles the execution time. This makes it the natural cost for operations that must examine each element exactly once, such as a single pass over a list.
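As a minimal sketch of this idea (the function name `find_max` is illustrative, not taken from any particular library), the single-pass maximum search below touches each element exactly once, so its running time scales linearly with the length of the input list.

```python
def find_max(values):
    """Return the largest element by scanning the list once: O(n) time."""
    if not values:
        raise ValueError("values must be non-empty")
    largest = values[0]
    for v in values[1:]:  # one comparison per element -> work grows linearly with input size
        if v > largest:
            largest = v
    return largest

print(find_max([3, 7, 2, 9, 4]))  # prints 9
```

Because the loop body does a constant amount of work per element, a list of 1,000,000 items takes roughly 1,000 times longer to process than a list of 1,000 items, which is exactly the proportional growth that O(n) captures.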