Big Theta notation, denoted Θ, is used in computer science to describe the asymptotic behavior of functions, giving a tight bound on the growth rate of an algorithm's running time or space requirements. It captures both an upper and a lower bound: formally, f(n) = Θ(g(n)) means there exist positive constants c1, c2, and n0 such that c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0. In other words, f grows at the same rate as g up to constant factors, which makes Θ a precise way to express algorithm efficiency.
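As a concrete sketch of the two-sided bound, the snippet below checks a hypothetical cost function f(n) = 3n² + 5n + 2 against the candidate bound g(n) = n², using illustrative witness constants c1 = 3, c2 = 4, n0 = 6 (all chosen for this example, not taken from any source):

```python
def f(n):
    # hypothetical running-time function: 3n^2 + 5n + 2
    return 3 * n**2 + 5 * n + 2

def g(n):
    # candidate tight bound: n^2
    return n**2

# Witness constants for f(n) = Theta(n^2):
# c1 * g(n) <= f(n) <= c2 * g(n) must hold for all n >= n0.
c1, c2, n0 = 3, 4, 6

# Spot-check the inequality over a finite range (not a proof,
# but it illustrates what the constants assert).
assert all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, 10_000))
```

The lower bound holds because 3n² + 5n + 2 ≥ 3n² for all n ≥ 0, and the upper bound 3n² + 5n + 2 ≤ 4n² holds once n² ≥ 5n + 2, i.e. for n ≥ 6; a finite check like this cannot replace the algebraic argument, but it makes the definition tangible.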