GPT-1 (Generative Pre-trained Transformer 1) was the first generative pre-trained transformer model developed by OpenAI. It demonstrated the effectiveness of pre-training on a large corpus of text followed by fine-tuning for specific tasks.
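To make the two-stage recipe concrete, here is a minimal, self-contained PyTorch sketch of the idea, not OpenAI's actual code: the `TinyGPT` class, all the layer sizes, and the random stand-in data are illustrative assumptions. Stage 1 pre-trains a small causal language model with next-token prediction; stage 2 reuses the same trunk and fine-tunes a task head on labeled data.

```python
import torch
import torch.nn as nn

class TinyGPT(nn.Module):
    """Toy decoder-only transformer; sizes are illustrative, not GPT-1's."""
    def __init__(self, vocab=100, d=64, heads=4, layers=2, ctx=32):
        super().__init__()
        self.tok = nn.Embedding(vocab, d)
        self.pos = nn.Embedding(ctx, d)
        block = nn.TransformerEncoderLayer(d, heads, 4 * d, batch_first=True)
        self.blocks = nn.TransformerEncoder(block, layers)
        self.lm_head = nn.Linear(d, vocab)  # used only during pre-training

    def features(self, x):  # shared trunk reused by the fine-tuning head
        t = x.size(1)
        h = self.tok(x) + self.pos(torch.arange(t, device=x.device))
        mask = nn.Transformer.generate_square_subsequent_mask(t).to(x.device)
        return self.blocks(h, mask=mask)  # causal self-attention

    def forward(self, x):
        return self.lm_head(self.features(x))  # next-token logits

model = TinyGPT()
opt = torch.optim.Adam(model.parameters(), lr=3e-4)

# Stage 1: unsupervised pre-training (random tokens stand in for a text corpus).
for _ in range(10):
    batch = torch.randint(0, 100, (8, 16))
    logits = model(batch[:, :-1])  # predict token t+1 from tokens <= t
    loss = nn.functional.cross_entropy(
        logits.reshape(-1, 100), batch[:, 1:].reshape(-1))
    opt.zero_grad(); loss.backward(); opt.step()

# Stage 2: supervised fine-tuning on a downstream task (binary classification).
clf_head = nn.Linear(64, 2)
opt = torch.optim.Adam(
    list(model.parameters()) + list(clf_head.parameters()), lr=1e-4)
for _ in range(10):
    batch, labels = torch.randint(0, 100, (8, 16)), torch.randint(0, 2, (8,))
    logits = clf_head(model.features(batch)[:, -1])  # last-token representation
    loss = nn.functional.cross_entropy(logits, labels)
    opt.zero_grad(); loss.backward(); opt.step()
```

The sketch mirrors the paper's setup loosely: GPT-1 fed the final token's hidden state (after an appended extraction token) into a linear head for downstream tasks, which is why the classification stage above reads the last position of the shared trunk.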
Was it GPT-1? And I think all these things were important, but the closer you get to the surface, the more you realize it's just been one of these small architectural changes, none of which individually was especially significant. The more overwhelming trend is just that we have been throwing astoundingly more compute into training these systems every single year.
""The speaker mentions GPT-1 (as GBT1) in the context of the history of AI research, discussing its role as one of the important milestones in the development of AI."