Generative Pre-trained Transformers (GPT) are a class of language models that leverage unsupervised pre-training on large text corpora, learning to predict the next token in a sequence. Built on a decoder-only transformer architecture, they use self-attention to capture long-range dependencies, which lets them generate coherent and contextually relevant text. A pre-trained model can then be fine-tuned on specific tasks to improve performance in natural language understanding and generation.
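
As an illustrative sketch of this pre-train-then-generate workflow (not tied to any specific implementation discussed here), the following Python snippet assumes the open-source Hugging Face `transformers` library and loads a small pre-trained GPT-2 checkpoint; the model name `"gpt2"`, the prompt, and the sampling parameters are example choices.

```python
# A minimal sketch of autoregressive text generation with a pre-trained GPT,
# assuming the Hugging Face `transformers` library (pip install transformers torch).
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")  # byte-pair-encoding tokenizer
model = GPT2LMHeadModel.from_pretrained("gpt2")    # small decoder-only GPT-2 model
model.eval()

# Encode a prompt, then sample a continuation token by token: at each step the
# model predicts a distribution over the next token given all previous tokens,
# the same objective it was pre-trained on.
inputs = tokenizer("The transformer architecture", return_tensors="pt")
output_ids = model.generate(
    **inputs,
    max_new_tokens=30,  # how many tokens to append to the prompt
    do_sample=True,     # sample from the distribution instead of greedy argmax
    top_p=0.9,          # nucleus sampling: keep the top 90% of probability mass
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

Fine-tuning follows the same pattern in reverse: rather than only sampling from the pre-trained model, its weights are further updated on task-specific text to specialize its behavior.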