Is Nvidia panicking? A graphic explaining what Google TPU is.

2026-01-15 11:56:30 · #1

Recently, Nvidia's stock price has fallen continuously, reducing its total market capitalization from a high of $5 trillion to $4.38 trillion. Meanwhile, Google's stock price has risen steadily, and its market capitalization is about to surpass $4 trillion.

Nvidia's decline stems from two main factors: first, investor worries about an AI spending bubble and questions about Nvidia's "circular investments" in startups such as OpenAI, which are also its customers; and second, market concerns that Google and other tech giants are accelerating development of their own chips, which could loosen Nvidia's near-monopoly on AI chips.

Google's stock price increase is due to two factors: the impressive performance of its new models, Gemini 3 Pro and Nano Banana Pro; and the potential for its self-developed TPU chip (the chip behind Gemini 3 Pro) to take market share from Nvidia.

A recent report indicates that Google is now marketing its self-developed TPUs to outside customers, with Meta a potential client. Meta is internally discussing investing billions of dollars to deploy Google's TPUs in its data centers starting in 2027, and it also plans to lease TPU computing power from Google Cloud as early as next year. A Google Cloud executive said that expanding TPU adoption could help the company capture 10% of Nvidia's annual revenue.

Google's TPU stands for Tensor Processing Unit, a type of NPU (Neural Processing Unit). In China, Alibaba's PPU and Huawei's Da Vinci NPU also belong to this category. They are all application-specific integrated circuits (ASICs), distinct from general-purpose processors such as CPUs and GPUs.

In terms of core roles, the CPU excels at general-purpose computing, like a company's CEO, making complex decisions and managing the overall picture; the GPU excels at massively parallel computing, like the company's entire workforce, handling a large number of simple, repetitive tasks simultaneously; and the NPU excels at accelerating specific neural-network workloads, like a domain expert, efficiently solving one particular class of problem. Google's TPU, specifically, is built around a matrix multiply unit (MXU) and a dedicated chip-to-chip interconnect topology, enabling it to accelerate AI training and inference efficiently.
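To make the MXU's role concrete, here is a minimal sketch in JAX (Google's numerical library commonly used with TPUs). It is an illustration only, not Google's internal TPU code: the array sizes and the layer function are assumptions, and the snippet runs on whatever backend JAX finds (CPU, GPU, or TPU); it simply shows the kind of dense matrix multiplication that the compiler maps onto a TPU's MXU when that hardware is present.

```python
# Minimal sketch: a dense matrix multiplication of the kind a TPU's MXU accelerates.
# Assumes JAX is installed (pip install jax); the same code runs on CPU, GPU, or TPU.
import jax
import jax.numpy as jnp

# Toy "neural-network layer": activations (batch x features) times a weight matrix.
# bfloat16 is the numeric format TPUs are optimized for.
x = jnp.ones((1024, 1024), dtype=jnp.bfloat16)
w = jnp.ones((1024, 1024), dtype=jnp.bfloat16)

@jax.jit  # XLA compiles this; on a TPU the dot product is lowered onto the MXU
def layer(x, w):
    return jnp.dot(x, w)

y = layer(x, w)
print(y.shape, jax.devices()[0].platform)  # e.g. (1024, 1024) tpu (or cpu/gpu locally)
```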

Google has been developing TPUs for over a decade, but only began making them available to a select group of customers through its cloud in 2018. Now, however, Google has taken a crucial step: recommending that some customers deploy TPUs locally in their own data centers.


(Article source: Eastmoney Research Center)
