Multiple AI industry insiders predict that the future of AI will shift from large, expensive models like ChatGPT to smaller, specialized AI agents. These systems are designed to handle specific tasks, cost less to develop, and can even run on laptops, eliminating the need for large data centers.
According to a recent HSBC analysis, AI research company OpenAI currently claims annual revenue of $20 billion and plans to invest $1.4 trillion in building new data centers. However, even if OpenAI's revenue surpasses $200 billion by 2030, the company would still need to raise an additional $207 billion to maintain operations.
At a recent internet summit in Lisbon, more than a dozen AI industry professionals described an alternative future for AI.
They believe the future will be dominated by smaller AI systems, typically built around AI "agents" that perform specific tasks, eliminating the need for massive language models such as OpenAI's GPT, Google's Gemini, or Anthropic's Claude.
“Their valuations are based on the assumption that ‘bigger is better,’ but that’s not necessarily true,” said Babak Hodjat, chief AI officer at Cognizant.
“We do use large language models, but not the biggest ones. A model is big enough as long as it can follow instructions well in a specific domain, use tools, and communicate with other agents,” Hodjat said. “Once that threshold is reached, the scale is sufficient.”
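A minimal sketch of the kind of agent Hodjat describes, a modest model that follows domain instructions, calls a tool, and hands work to other agents, might look like the following. The placeholder model, the invoice-lookup tool, and the hand-off convention are illustrative assumptions, not details from the article.

```python
# Sketch of a task-specific agent: a small model that follows domain
# instructions, calls a tool, and can pass work to another agent.
from dataclasses import dataclass, field


def small_model(prompt: str) -> str:
    """Placeholder standing in for a compact instruction-following model."""
    if "invoice" in prompt.lower():
        return "TOOL:lookup_invoice"
    return "REPLY:I can only help with invoice questions."


def lookup_invoice(text: str) -> str:
    """Domain-specific tool the agent is allowed to call (hypothetical data)."""
    return "Invoice #1042: paid on 2025-03-01."


@dataclass
class Agent:
    name: str
    inbox: list = field(default_factory=list)

    def handle(self, message: str, peers: dict) -> str:
        # Ask the small model what to do with this message.
        decision = small_model(f"Task: {message}")
        if decision.startswith("TOOL:"):
            return lookup_invoice(message)           # use a tool
        if decision.startswith("REPLY:"):
            return decision.removeprefix("REPLY:")   # answer directly
        # Otherwise hand the task to a peer agent.
        peers["billing"].inbox.append(message)
        return "Forwarded to the billing agent."


billing = Agent("billing")
front_desk = Agent("front_desk")
print(front_desk.handle("What is the status of invoice 1042?", {"billing": billing}))
```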
DeepSeek offers one example: the model the Chinese AI company launched in January this year cost only a few million dollars to develop and triggered a sell-off in tech stocks.
Hodjat said that DeepSeek's models use far fewer parameters per request than ChatGPT yet deliver comparable capability. Once a model is scaled down far enough, it no longer requires a data center and can run directly on a MacBook.
“That’s where the difference lies, and that’s the trend,” Hodjat said.
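Why size determines where a model can run comes down to memory: the weights alone take roughly (parameter count × bytes per parameter) of RAM. The parameter counts and precisions below are illustrative assumptions, not figures from the article.

```python
# Rough memory-footprint estimate for model weights.
def weight_memory_gb(params_billions: float, bits_per_param: float) -> float:
    """Approximate GB needed just to hold the model weights."""
    bytes_total = params_billions * 1e9 * (bits_per_param / 8)
    return bytes_total / 1e9


configs = [
    ("7B model, 4-bit quantized (laptop-class)",     7,    4),
    ("70B model, 16-bit (server-class GPU)",         70,  16),
    ("~1T-parameter frontier model, 16-bit",       1000,  16),
]

for label, params_b, bits in configs:
    print(f"{label}: ~{weight_memory_gb(params_b, bits):.0f} GB of weights")

# A 4-bit 7B model needs roughly 3-4 GB, which fits in a laptop's memory;
# a 16-bit trillion-parameter model needs terabytes of data-center hardware.
```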
Many companies are focusing on AI agents or AI applications, believing that users will want to solve specific problems with dedicated apps.
Superhuman (formerly Grammarly) CEO Shishir Mehrotra stated that they operate an "AI agent app store" that can be embedded in browsers or run in thousands of Grammarly-licensed apps.
Mozilla employs a similar strategy for Firefox. CEO Laura Chambers said, "We have several AI features, such as 'shake to summarize content,' smart tagging on mobile, link previews, and translation. All of these run locally; the data doesn't leave your device and isn't shared with a large language model. We also have a sidebar where you can choose the model you want to use."
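"Runs locally" here simply means on-device inference with a small model. As a rough illustration only (using the open-source transformers library and an arbitrary small summarization model, not Mozilla's actual implementation):

```python
from transformers import pipeline

# After the one-time model download, inference happens entirely on this
# machine; the text is never sent to a remote server. The model choice is
# an illustrative assumption, not what Firefox uses.
summarizer = pipeline("summarization", model="sshleifer/distilbart-cnn-12-6")

article = (
    "Industry insiders at a Lisbon summit argued that the future of AI lies in "
    "small, task-specific agents that run cheaply, sometimes directly on a "
    "laptop, rather than in ever-larger models hosted in data centers."
)

result = summarizer(article, max_length=40, min_length=10, do_sample=False)
print(result[0]["summary_text"])
```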
Ami Badani, head of strategy and chief marketing officer at chip company ARM, revealed that the company is adopting a model-agnostic strategy.
Badani said, "We create custom extension modules on top of the large model for specific purposes because different companies often have very different needs for the model."
This approach, built from multiple highly focused AI agents operating like independent businesses, differs sharply from large, general-purpose AI platforms.
This model is attracting substantial investment. IBM Ventures has backed a number of seemingly unremarkable AI startups that fill specific business needs.
One company, Not Diamond, noted that 85% of businesses using AI employ multiple AI models. Different models excel at different tasks, making it crucial to select the best model for each task.
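Per-task model selection of this kind can be pictured as a router sitting in front of several models. The sketch below uses a crude keyword classifier, and the task categories and model names are hypothetical examples, not Not Diamond's actual routing logic.

```python
# Minimal sketch of per-task model routing: pick a model based on the request.
ROUTES = {
    "code":      "small-code-model",      # cheap model tuned for programming
    "summarize": "compact-summarizer",    # small summarization model
    "legal":     "domain-legal-model",    # specialist for contract language
}
FALLBACK = "general-purpose-model"        # larger model for everything else


def classify(request: str) -> str:
    """Crude keyword classifier standing in for a learned router."""
    text = request.lower()
    if "function" in text or "bug" in text:
        return "code"
    if "summarize" in text or "tl;dr" in text:
        return "summarize"
    if "contract" in text or "clause" in text:
        return "legal"
    return "other"


def route(request: str) -> str:
    """Return the name of the model this request should be sent to."""
    return ROUTES.get(classify(request), FALLBACK)


for req in ["Summarize this earnings report",
            "Fix the bug in this function",
            "Review the indemnification clause",
            "Plan a three-day trip to Lisbon"]:
    print(f"{req!r} -> {route(req)}")
```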
(Article source: CLS)