As Google's full-stack AI capability chain takes shape, the TPU, its core hardware component, has once again become a market focus.
On Monday, Eastern Time, Broadcom, Google's key chip partner, surged on the strength of Google's stock, closing up 11.1%, its best single-day performance since April 9.

On the news front, TrendForce reports that Google is collaborating with Broadcom to develop the TPU v7p (Ironwood), a platform optimized for training, planned for release in 2026 to replace the TPU v6e (Trillium). TrendForce predicts that Google's TPU shipments will maintain a leading position among cloud service providers (CSPs), with a projected annual growth rate of over 40% in 2026.
Multiple parties are now exploring the value of Google's TPU chips. Today, Meta and Google were reportedly in talks over a TPU collaboration: Meta plans to lease TPU computing power through Google Cloud starting in 2026 and to integrate TPUs into its own data centers by 2027. The deployment of Google's TPUs could be worth billions of dollars.
Western Securities points out that on the chip side, Google primarily uses its self-developed TPUs and has built a mature ASIC system integrating training and inference. The Gemini 3 model was trained on Google's TPU clusters, and the officially released next-generation Ironwood (TPU v7) is designed specifically for inference models, demonstrating Google's engineering advantages in large-scale, low-power inference.
In terms of performance, Google says the new generation of Ironwood TPUs can connect up to 9,216 chips in a single cluster, eliminating "data bottlenecks in the most complex models." With Ironwood, developers can also reliably and easily harness the combined computing power of tens of thousands of TPUs via Google's own Pathways software stack.
The success of the TPU has also shifted market perceptions of ASICs. Dan Ives of Wedbush Securities said the market is "rediscovering" the huge potential of ASIC chips; Google has largely spearheaded this trend, and its TPU is among the most mature ASICs on the market.
Silicon Valley giants are now all taking the "shovel-making" path. Tesla CEO Elon Musk recently announced that he has assembled a top-tier chip development team and deployed millions of self-developed AI chips in vehicle control systems and data centers. He said he would be "deeply involved" in Tesla's AI chip design, with the goal of "putting a new chip into mass production every year."
According to a TrendForce report, large cloud service providers (CSPs) worldwide are expanding purchases of Nvidia GPU rack-mount solutions, building out data centers and other infrastructure, and accelerating development of self-developed AI ASICs. In 2025, the combined capital expenditures of the eight major CSPs, including Google, Amazon Web Services, Meta, Microsoft, and Oracle, have exceeded $420 billion. Guojin Securities predicts that ASIC shipments from Google, Amazon, Meta, OpenAI, and Microsoft may see explosive growth between 2026 and 2027.
For investors and cloud vendors, ASIC chips offer an inherent cost advantage. Analysts believe that although a single AI ASIC card currently delivers less computing power than a comparable GPU, its lower cost makes it more cost-effective at commonly used inference precision levels, with lower power consumption as well. Furthermore, because ASICs are designed for specific tasks, their compute utilization may be higher.
From an investment perspective, Huajin Securities points out that, driven by artificial intelligence represented by large models such as GPT and Gemini, the scale of training data and model parameters is growing rapidly. As deep learning algorithms are optimized and models mature, the computational demands of training and inference, including inference-heavy AI agents, are increasing exponentially, fueling greater demand for customized ASICs. The firm recommends watching for investment opportunities arising from customized ASICs, including computing chips, PCBs, and optical modules.

(Source: Science and Technology Innovation Board Daily)