Nvidia is poised to re-enter the Chinese market it once lost. On December 9th, the Global Times, citing Reuters, Bloomberg, and other media reports, said that US President Trump announced on social media on the 8th that the US government would allow Nvidia to sell its H200 artificial intelligence chips to China, but would charge a 25% fee on each chip sold. He also stated that the U.S. Department of Commerce is finalizing the details of the arrangement, and that the same arrangement will apply to AMD, Intel, and other AI chip companies.
Today, a reporter from CBN sought confirmation from Nvidia regarding this news. Nvidia responded, "Supplying the H200 to commercial customers is a commendable move."
In July of this year, during an interview with CBN and other media outlets in China, Jensen Huang expressed his hope that more advanced chips would enter the Chinese market in the future. That expectation may now become a reality, albeit with the condition of a "fee" attached.
The H200 chip was first announced in November 2023 and began shipping in 2024, offering performance several times that of the H20. Based on Nvidia's "Hopper" architecture, the H200 was the company's first chip to adopt HBM3e (fifth-generation high-bandwidth memory). The GPU is designed for training and inference of today's generative AI models, large-scale scientific computing, and processing massive amounts of data. The H200 offers 141GB of memory at 4.8 TB per second, nearly double the capacity and 2.4 times the bandwidth of the A100, and its inference speed on Llama 2 (a 70-billion-parameter LLM) is roughly twice that of the H100.
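For readers who want to check the cited ratios, here is a minimal arithmetic sketch. It assumes the comparison baseline is the 80GB A100 variant with roughly 2.0 TB/s of memory bandwidth, which the article does not specify:

```python
# Sanity check of the H200 memory figures cited above.
# Assumption (not stated in the article): the baseline is the A100 80GB,
# with 80 GB of memory at roughly 2.0 TB/s of bandwidth.

h200_memory_gb = 141        # HBM3e capacity cited for the H200
h200_bandwidth_tbs = 4.8    # memory bandwidth cited for the H200

a100_memory_gb = 80         # assumed A100 80GB baseline
a100_bandwidth_tbs = 2.0    # assumed A100 80GB baseline

capacity_ratio = h200_memory_gb / a100_memory_gb          # about 1.76x, i.e. "nearly double"
bandwidth_ratio = h200_bandwidth_tbs / a100_bandwidth_tbs  # 2.4x

print(f"Capacity: {capacity_ratio:.2f}x, bandwidth: {bandwidth_ratio:.1f}x vs A100")
```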
Previously, Nvidia's financial report for the second quarter of fiscal year 2026 showed revenue of $46.743 billion for the quarter, of which $2.769 billion came from the Chinese market (excluding Taiwan). Compared with the $3.667 billion in revenue from the Chinese market in the second quarter of fiscal year 2025, this represents a decrease of nearly $900 million.
At a public event this October, Nvidia CEO Jensen Huang stated that US policies had cost the company one of the world's largest markets: Nvidia's market share in China plummeted from 95% to 0%, effectively ending its presence there. Huang described missing out on the Chinese artificial intelligence (AI) market as a "huge loss."
Jensen Huang also spoke about China's strength in AI, calling China a unique market whose dynamism, innovation, development momentum, and industry growth rate are unparalleled. While the unforeseen consequences and long-term impact of not participating in the Chinese market are difficult to predict, he said, the outcome would not be positive. He predicted that the Chinese AI market could reach approximately $50 billion within the next two to three years; for an American company, being shut out of that market would be a huge loss, and the Chinese AI market will progress with or without Nvidia.
Going forward, factors such as when customers in the Chinese market will place new orders, whether demand will change, and how long the supply chain will take will affect Nvidia's earnings forecast. More importantly, during Nvidia's "absence," the Chinese market has changed: domestic AI chip manufacturers have rapidly emerged, and customers have found alternative solutions. On a recent earnings call, Tencent executives stated that the company's current GPUs are more than sufficient and that resources are not particularly strained. Baidu executives recently stated that the vast majority of inference tasks within Baidu now run on its own Kunlun P800 chips. Zhou Hongyi, founder of 360 Group, also told reporters that its chip procurement has shifted to domestically produced chips.
Meanwhile, Nvidia continues to face competitive pressure from rivals developing their own AI chips. Compared with the versatility of Nvidia GPUs, ASIC (application-specific integrated circuit) chips, represented by Google's TPUs, have certain advantages, such as theoretically higher energy efficiency and performance. OpenAI competitor Anthropic announced in October a plan to use Google's computing power, deploying up to 1 million Google TPU chips to train its large AI model, Claude. These TPUs are designed specifically to accelerate machine learning workloads and are planned to come online in 2026.
(Article source: CBN)