
How important are foundation models?
It may seem like a silly question, but it has come up a lot in my conversations with AI startups, which are surprisingly comfortable with businesses once dismissed as "GPT wrappers," that is, companies that build interfaces on top of existing AI models like ChatGPT. These days, startup teams focus on customizing AI models for specific tasks and building interfaces around them, treating the foundation model as a commodity that can be swapped in and out. That approach was on full display at last week's BoxWorks conference, which seemed entirely dedicated to user-facing software built on top of AI models.
Part of what's driving this is the plateau in pre-training scaling, the core process of teaching AI models from massive datasets, which has long been the exclusive domain of foundation model companies. That doesn't mean AI has stopped progressing, but the primary gains from hyperscaled foundation models have hit diminishing returns, and attention is shifting to post-training and reinforcement learning as sources of future progress. If you want to build a better AI coding tool, you're better off working on fine-tuning and interface design than spending billions of dollars on servers during pre-training. As the success of Anthropic's Claude Code shows, foundation model companies are quite good at these other disciplines too, but their advantage is not as durable as it used to be.
In short, the competitive landscape of AI is shifting in ways that erode the advantages of the largest AI labs. Instead of a race toward an all-powerful AGI that can match or surpass human abilities on every cognitive task, the immediate future looks like a scramble across a set of distinct businesses: software development, enterprise data management, image generation and more. Aside from a first-mover advantage, it's not clear that building a foundation model gives you any edge in those businesses. Worse, the abundance of open-source alternatives means that if the big labs lose the competition at the application layer, they may be left with no pricing leverage at all. That would turn companies like OpenAI and Anthropic into back-end suppliers in a low-margin commodity business, or, as one founder put it to me, "like selling coffee beans to Starbucks."
It's hard to overstate how dramatic a change this would be for the AI business. For most of the current boom, the success of AI as a whole has been inseparable from the success of the companies building foundation models, most notably OpenAI, Anthropic and Google. Being bullish on AI meant believing that AI's transformative effects would make these the defining companies of a generation. You could argue about which company would come out on top, but it seemed clear that some foundation model company was bound to end up with the keys to the kingdom.
At the time, there were plenty of reasons to think that was true. For years, foundation model development was the only game in town in AI, and the pace of progress made the leaders' head start look insurmountable. And Silicon Valley has always had a deep-rooted love of platform advantages. The idea was that, however AI models ended up making money, the lion's share would flow back to the foundation model companies, which had done the work that was hardest to replicate.
Over the last year, that story has gotten more complicated. There are plenty of successful third-party AI services now, but they tend to use foundation models interchangeably. For startups, it no longer matters much whether their product sits on top of GPT-5, Claude or Gemini, and they expect to be able to switch models mid-release without end users noticing the difference. Foundation models have continued to make real progress, but it's no longer clear that any single organization can maintain enough of an advantage to dominate the industry.
We already have plenty of hints that the first-mover advantage won't amount to much. As a16z general partner Martin Casado pointed out on a recent podcast, the lab that shipped the first coding model, as well as the first generative models for images and video, went on to lose all three categories to competitors. "As far as we can tell, the technology stack for AI has no enduring moat," Casado concluded.
Of course, we shouldn't count the foundation model companies out yet. They still have plenty of durable advantages, including brand recognition, infrastructure and unimaginably large cash reserves. OpenAI's consumer business may prove harder to replicate than its coding business, and other advantages may emerge as the sector matures. Given the blistering pace of AI development, the current focus on post-training could easily be reversed in the next six months. Most uncertain of all, the race toward general intelligence could pay off in a new era of breakthroughs in pharmaceuticals or materials science, upending our ideas about what makes AI models valuable.
In the meantime, the strategy of building ever-bigger foundation models looks much less viable than it did a year ago, and Meta's multibillion-dollar spending spree this spring is starting to look extremely risky.