OpenAI Claims GPT-4 Can Now Be Rebuilt with Only 5 to 10 People

A New Era in GPT Model Development
Streamlined Processes at OpenAI
Creating the GPT-4 model required an extensive team effort at OpenAI, comprising hundreds of people and consuming nearly all of the company’s resources. However, with insights gained from the development of GPT-4.5, OpenAI believes it can now rebuild GPT-4 with a significantly smaller team, potentially as few as five individuals.
In a recent podcast, OpenAI’s CEO, Sam Altman, posed an intriguing question to three engineers involved in the GPT-4.5 project: What is the smallest team that could retrain GPT-4 today? Alex Paino, who led the pre-training for GPT-4.5, suggested that this process could now be done by between five and ten people. Paino noted that the smaller team size stems from lessons learned during the training of GPT-4o, a model fine-tuned from the original GPT-4 using methodologies developed in the GPT-4.5 research.
The Value of Experience
One of the engineers, Daniel Selsam, observed that having prior knowledge of what others have accomplished greatly simplifies the process of model rebuilding. “Knowing that something is possible serves as a sort of cheat code,” he remarked. This learning curve ultimately reduces the resources needed to create advanced models like GPT-4.
Advances in AI Model Development
In February 2025, OpenAI introduced GPT-4.5, labeling it the company’s most robust and capable model. Altman described it as the first AI model that genuinely feels like conversing with a thoughtful person. Paino emphasized that GPT-4.5 aims to be "ten times smarter" than its predecessor, showcasing massive scalability in AI model development.
Moving Beyond Computational Constraints
Altman also mentioned a crucial shift in their capabilities: OpenAI is no longer constrained by computing power when developing superior models. Traditionally, many AI firms faced challenges due to limited computing resources. However, given the significant investments being funneled into AI infrastructure—estimated at $320 billion this year alone—this is no longer an issue for OpenAI.
Major tech giants, including Microsoft, Amazon, Google, and Meta, are investing heavily in expanding their AI capabilities. OpenAI itself secured substantial funding in a record-breaking tech funding round, which included a $30 billion investment from SoftBank and other backers, bringing the company’s valuation to $300 billion. Such financial support is set to bolster OpenAI’s computational capacity even further.
Future Needs in AI Development
Nvidia’s CEO, Jensen Huang, has remarked that the demand for AI computing resources will only increase, as future models may require exponentially more computing power. Yet even as compute grows, there comes a point where data limitations create bottlenecks in model performance. Selsam explained that as computational power escalates, managing and presenting data effectively becomes critical.
To achieve drastic improvements and maximize the utility of existing data, algorithmic innovation is essential. As Selsam pointed out, the real challenge lies in extracting greater value from the same data set: ongoing development will require not just more data, but smarter algorithms that use that data efficiently.
The advancements in AI, as demonstrated by OpenAI’s experiences with GPT-4.5, underscore a transformative shift in how models are developed and retrained, highlighting a future filled with potential efficiencies and new breakthroughs in artificial intelligence.