Beyond hardware costs, the demand for electricity generation, transmission, and cooling will be a major constraint on large-scale artificial intelligence development in the coming years, and it is the main bottleneck facing data centers. In light of this, Elon Musk recently proposed a disruptive vision: deploying AI computing centers in space.
Musk's businesses include xAI, SpaceX, and Tesla: the first two develop large AI models and commercial spaceflight, respectively, while Tesla builds electric vehicles, energy storage systems, and robots, among his many other ventures. Linked together, these businesses would form a near-closed loop supporting his vision, and if it succeeds, his companies would likely be the biggest beneficiaries.
Why this vision?
Musk believes that within the next four to five years, running large-scale artificial intelligence systems in orbit will be more cost-effective than running similar systems on Earth, thanks largely to "free" solar energy and relatively easy-to-implement cooling.
He previously stated at the US-Saudi Investment Forum: "My estimate is that, even before Earth's potential energy resources are exhausted, the cost of electricity and the economics of AI in space will far outstrip terrestrial AI. I believe that even on a four-to-five-year timeframe, the lowest-cost way to do AI compute will be solar-powered AI satellites."
“I think it won’t be more than five years from now,” he added.
Musk emphasized that as computing clusters grow, the combined demand for power and cooling will escalate beyond what ground-based infrastructure can keep up with. He claimed that sustaining 200 to 300 gigawatts of computing power per year would require building massive and expensive power plants, given that a typical nuclear plant generates roughly 1 gigawatt of electricity continuously.
Meanwhile, the United States currently has roughly 490 gigawatts of continuous generating capacity (note that although Musk said "per year," he meant power sustained continuously over a given period), so most of it cannot be dedicated to artificial intelligence. Musk stated that any AI-related electricity demand approaching the terawatt level is unattainable on the global power grid.
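The arithmetic behind these figures can be sketched quickly. All inputs below are the numbers quoted in the article; the calculation itself is purely illustrative:

```python
# Illustrative arithmetic using the figures quoted in the article.
AI_DEMAND_GW = 300        # upper end of Musk's 200-300 GW estimate
NUCLEAR_PLANT_GW = 1      # typical continuous output of one nuclear plant
US_CAPACITY_GW = 490      # approximate US continuous generating capacity

plants_needed = AI_DEMAND_GW / NUCLEAR_PLANT_GW
share_of_grid = AI_DEMAND_GW / US_CAPACITY_GW

print(f"Equivalent nuclear plants: {plants_needed:.0f}")       # 300
print(f"Share of current US generation: {share_of_grid:.0%}")  # 61%
```

In other words, the upper end of Musk's estimate alone would consume over 60% of current US generating capacity, which is why he argues the demand cannot be met on the ground.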
"You can't build a power plant of that scale: say, 1 terawatt of continuous power generation, it's simply impossible. You have to do it in space. In space, you can utilize continuous solar energy , and you don't actually need batteries." Because space is always bathed in sunlight, and solar energy... " The solar panels will actually be cheaper because you don't need glass or a frame, and the cooling is just radiative cooling," he explained.

Musk's plans
Musk's core plan is to deploy 100 gigawatts of solar-powered AI satellites in orbit every year, a scale equivalent to roughly a quarter of US continuous generating capacity.
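To put the 100-gigawatt annual target in perspective, a rough estimate of the required solar-panel area can be made. The solar constant is a known physical value; the 20% conversion efficiency is an assumption for illustration, not a figure from the article:

```python
# Rough panel-area estimate for 100 GW of orbital solar power.
SOLAR_CONSTANT = 1361.0   # W/m^2, irradiance above Earth's atmosphere
PANEL_EFFICIENCY = 0.20   # assumed conversion efficiency (illustrative)

target_power_w = 100e9    # Musk's 100 GW annual deployment target

area_m2 = target_power_w / (SOLAR_CONSTANT * PANEL_EFFICIENCY)
area_km2 = area_m2 / 1e6
print(f"Panel area needed: {area_km2:.0f} km^2")  # ~367 km^2
```

Under these assumptions, each year's deployment would carry a few hundred square kilometers of panel area into orbit, which gives a sense of the manufacturing and launch cadence involved.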
He posted on November 19: "Starship should be able to send about 300 gigawatts, or even 500 gigawatts, of solar-powered AI satellites into orbit each year." He added that at this rate, the power available to orbital AI could exceed total US electricity consumption, which averages about 500 gigawatts, within a few years.
This is not just a matter of launch hardware; it is a significant step toward what Musk describes as a "Kardashev II civilization," a theoretical milestone referring to a society capable of harnessing the energy output of an entire star.
According to posts on X, Musk has repeatedly linked Starship's capabilities to this scale, pointing out that the energy levels that space-based solar power can utilize are "more than a billion times" greater than all of Earth's resources combined. This concept builds upon ideas like the "Dyson sphere," but Musk's version focuses on an artificial intelligence satellite constellation that can process data while utilizing unlimited solar energy.
However, according to Musk, "there is one key link that is holding it back." This link is likely scaling up production and orbital assembly.
Some analysts point out that these satellites would not simply drift idle; they would form a solar-powered network of computing nodes. According to a PCMag report earlier this month, the concept resembles a "Dyson sphere" composed of satellites that harvest solar energy and could even cool the Earth by blocking some sunlight, aiding climate control.
Musk previously wrote on X: "Ultimately, solar-powered AI satellites are the only way to achieve a Kardashev II civilization."
Furthermore, to reach the target launch rate of 300 to 500 gigawatts of capacity per year, Musk also suggested manufacturing on the Moon. In a post on X on November 2, 2025, he stated: "A lunar base could produce 100 terawatts of electricity per year, and the base could manufacture solar-powered AI satellites on-site and accelerate them to escape velocity using mass drivers."
It's still just a dream.
Despite Musk's extremely optimistic outlook, the road ahead is fraught with obstacles. Orbital debris, regulatory approvals, and international space policy all pose risks. Nvidia CEO Jensen Huang commented, "This is a dream."
In theory, space is an ideal place to generate power and cool electronics, since temperatures in shadow can fall to nearly -270°C. The reality is less simple: surfaces in direct sunlight can heat to as much as +120°C.
In practice, hardware in Earth orbit experiences narrower swings: roughly -65°C to +125°C in low Earth orbit (LEO), -100°C to +120°C in medium Earth orbit (MEO), -20°C to +80°C in geostationary orbit (GEO), and -10°C to +70°C in high Earth orbit (HEO).
LEO and MEO are poorly suited to hosting "space data centers" because of their unstable illumination patterns, intense thermal cycling, radiation-belt crossings, and frequent eclipses. GEO is more feasible: it receives near-continuous sunlight year-round (eclipses do occur seasonally, but they are brief) and has lower radiation levels.
However, even in geostationary orbit, building large-scale AI data centers faces significant challenges: gigawatt-class GPU clusters can reject heat only through infrared radiation, so each gigawatt-level system would need on the order of a million square meters of deployable radiator, far beyond any spacecraft structure built to date.
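An order-of-magnitude radiator estimate follows directly from the Stefan-Boltzmann law. The radiator temperature, emissivity, and two-sided radiation below are assumptions chosen for illustration, not figures from the article:

```python
# Order-of-magnitude radiator sizing via the Stefan-Boltzmann law.
SIGMA = 5.670374419e-8   # W/(m^2 K^4), Stefan-Boltzmann constant
T_RADIATOR_K = 300.0     # assumed radiator surface temperature
EMISSIVITY = 0.9         # assumed surface emissivity
SIDES = 2                # assume panels radiate from both faces

waste_heat_w = 1e9       # 1 GW of compute power dissipated as heat

# Radiated flux per square meter of panel, then total area required.
flux_w_per_m2 = SIDES * EMISSIVITY * SIGMA * T_RADIATOR_K**4
area_m2 = waste_heat_w / flux_w_per_m2
print(f"Radiator area for 1 GW: {area_m2:.2e} m^2")  # ~1.2e6 m^2
```

At a 300 K radiator temperature, each square meter of two-sided panel sheds only about 800 watts, so a 1 GW system needs over a square kilometer of radiator; running the radiators hotter shrinks the area but complicates chip cooling.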
Furthermore, launching at such a scale would require thousands of Starship-class flights, which is unrealistic within Musk's four-to-five-year target and would be extremely costly.
In addition, high-performance AI accelerators such as Blackwell or Rubin, along with their supporting hardware, cannot operate reliably in the GEO radiation environment without heavy shielding or thorough radiation-hardening. Such modifications would significantly reduce clock frequencies and/or require entirely new process technologies optimized for radiation tolerance rather than performance, further undercutting the feasibility of GEO AI data centers.
Finally, at the scale proposed, technologies such as high-bandwidth links to Earth, debris avoidance, and autonomous robotic servicing are still in their infancy. This may explain why Jensen Huang described it all as, for now, just a "dream."
(Article source: CLS)