Google’s Ambitious Plan to Build Solar-Powered AI Data Centers in Space

An artist’s impression of the STEREO Observatory spacecraft during solar panel deployment. Satellites supporting data center functions in low-Earth orbit will require precise orbital management because of their low altitude. Credit: NASA/Johns Hopkins University Applied Physics Laboratory.

Google is exploring a bold new idea: what if the most efficient place to run massive AI systems isn’t on Earth at all, but in orbit?

This question has led to Project Suncatcher, a research initiative that imagines clusters of satellites—powered by near-constant sunlight—working together as space-based data centers. It’s an intriguing attempt to rethink how the world will meet the soaring energy demands of advanced artificial intelligence.

At its core, the project is driven by one simple reality: the Sun emits more than 100 trillion times the total electrical power humanity generates, and solar panels in space can be up to eight times more productive than those on the ground. With no nighttime cycle, no clouds, and no atmosphere to block sunlight, satellites in the right orbit can collect energy almost continuously.
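As a rough sanity check on that "up to eight times" figure, here is a back-of-the-envelope comparison. The duty cycle and capacity factor below are illustrative assumptions, not figures from Google's paper:

```python
# Rough, illustrative comparison of space vs. ground solar yield.
# All inputs are assumed values for the sake of the estimate.

SOLAR_CONSTANT = 1361.0  # W/m^2, sunlight intensity above the atmosphere

# In a dawn-dusk sun-synchronous orbit, a panel can face the Sun
# nearly 100% of the time.
space_duty_cycle = 0.99

# A decent terrestrial site averages a much lower capacity factor once
# night, weather, atmosphere, and sun angle are accounted for.
ground_irradiance = 1000.0     # W/m^2, typical peak at the surface
ground_capacity_factor = 0.17  # assumed time-averaged fraction of peak

space_yield = SOLAR_CONSTANT * space_duty_cycle          # time-averaged W/m^2
ground_yield = ground_irradiance * ground_capacity_factor

print(f"space:  {space_yield:.0f} W/m^2 average")
print(f"ground: {ground_yield:.0f} W/m^2 average")
print(f"ratio:  {space_yield / ground_yield:.1f}x")  # ~7.9x
```

Under these assumptions the ratio comes out just under eight, consistent with the claim.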

Google’s idea is to build a constellation of solar-powered satellites equipped with AI processors, linked to each other through laser-based optical communication. These satellites would fly in a sun-synchronous low Earth orbit, meaning they’d remain in nearly constant daylight. This orbit dramatically reduces the need for heavy batteries while maximizing energy generation. According to Google’s researchers, this could eventually support the kind of power-hungry computing that modern AI systems require—without relying on Earth’s increasingly strained energy and cooling infrastructure.

The Challenge of Building a Data Center in Orbit

As exciting as the concept is, the engineering hurdles are enormous. A data center—whether on Earth or in space—depends on two critical performance factors: high-speed communication and reliable processing hardware. Recreating both in orbit requires some clever innovation.

AI workloads rely on many processors working together, constantly exchanging data. That means satellites need to communicate at data center–grade bandwidth, ideally in the range of tens of terabits per second. Google’s research suggests this can be achieved using dense wavelength division multiplexing and spatial multiplexing—methods for dramatically increasing optical communication capacity.
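To see how such multiplexing reaches tens of terabits per second, consider a hypothetical capacity budget. The channel counts and per-channel rates below are illustrative assumptions, not Google's actual link design:

```python
# Illustrative capacity budget for an optical inter-satellite link
# combining dense wavelength division multiplexing (DWDM) with spatial
# multiplexing. All figures are hypothetical assumptions.

per_channel_gbps = 100  # data rate carried on each DWDM wavelength
wavelengths = 40        # DWDM channels per optical beam
spatial_lanes = 10      # parallel beams / apertures between two satellites

total_tbps = per_channel_gbps * wavelengths * spatial_lanes / 1000
print(f"aggregate link capacity: {total_tbps:.0f} Tbps")  # 40 Tbps
```

Multiplying modest per-channel rates across many wavelengths and many parallel beams is how optical links scale into the tens-of-terabits range.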

But for this to work, the satellites must fly extremely close together, sometimes separated by less than a kilometer. This is not a trivial task at around 650 kilometers above Earth.

To determine whether these formations could be stable, Google ran detailed physics simulations factoring in atmospheric drag and the planet’s uneven gravitational field. The analysis suggests that, surprisingly, only modest station-keeping maneuvers would be needed. That means, with careful planning, the satellites could maintain tight formations without burning too much fuel.
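For context, a quick calculation with standard orbital mechanics shows what "extremely close together" means at this altitude. The constants are textbook values; none of these numbers come from the paper:

```python
import math

# Back-of-the-envelope orbit numbers at the ~650 km altitude
# mentioned in the article.

MU = 3.986004418e14    # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6_371_000.0  # mean Earth radius, m
altitude = 650_000.0   # m

r = R_EARTH + altitude
v = math.sqrt(MU / r)                    # circular orbital speed, m/s
T = 2 * math.pi * math.sqrt(r**3 / MU)   # orbital period, s

print(f"orbital speed:  {v / 1000:.2f} km/s")   # ~7.53 km/s
print(f"orbital period: {T / 60:.1f} min")      # ~97.6 min

# Two satellites 1 km apart along-track are separated by only a
# fraction of a second of flight time at this speed.
print(f"time gap for 1 km spacing: {1000 / v:.2f} s")
```

At more than 7.5 km/s, a sub-kilometer separation leaves almost no margin for error, which is why the drag and gravity-field simulations matter.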

Google has already conducted a bench-scale demonstration of the communication technology, achieving 1.6 terabits per second of total transmission—a promising early result and a key step toward proving the concept is feasible.

How Well Do AI Processors Survive in Space?

Space is notoriously harsh on electronics because of radiation, but Google’s testing offers encouraging news. The company evaluated its Trillium v6e Cloud TPU, a modern AI accelerator, and found that the chips can withstand a cumulative radiation dose nearly three times what a five-year mission would deliver before showing irregular behavior.

The more sensitive components were the High Bandwidth Memory (HBM) systems, which began to show issues after 2 kilorads of radiation—still significantly higher than the expected dose of 750 rads behind shielding over a five-year period. These results indicate that current TPU hardware can survive the environment long enough for multi-year missions.
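The dose figures quoted above imply a comfortable margin even for the most sensitive component, which is easy to check directly:

```python
# Margin check using the dose figures quoted in the article.
expected_dose_rad = 750         # rad, behind shielding, over five years
hbm_issue_threshold_rad = 2000  # rad (2 kilorads), where HBM irregularities appeared

margin = hbm_issue_threshold_rad / expected_dose_rad
print(f"HBM dose margin over a 5-year mission: {margin:.1f}x")  # ~2.7x
```

Even the weakest link, the HBM, tolerates roughly 2.7 times the radiation it should actually receive.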

The Cost Question: Can Space Compete with Earth?

Even if the engineering challenges are solved, the economics must also make sense. The biggest cost factor is launching the hardware into orbit. Google’s analysis shows the concept becomes financially competitive only if launch costs continue to fall.

According to the research paper, the tipping point is around $200 per kilogram, a price that could be achievable in the mid-2030s if launch providers maintain their current rate of innovation. At that price, running a space-based AI cluster could cost roughly the same as powering and cooling a large Earth-based data center—minus the massive energy footprint.
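A hedged back-of-the-envelope estimate shows why a figure like $200/kg could be a tipping point. The satellite mass-per-kilowatt value and mission lifetime below are illustrative assumptions, not numbers from the paper:

```python
# Illustrative comparison: amortized launch cost per kWh of compute power
# delivered in orbit, versus typical grid electricity prices on Earth.
# All inputs other than the $200/kg figure are assumptions.

launch_cost_per_kg = 200.0  # USD/kg, the article's tipping point
satellite_kg_per_kw = 10.0  # assumed satellite mass per kW of delivered power
mission_years = 5.0         # assumed hardware lifetime

launch_cost_per_kw = launch_cost_per_kg * satellite_kg_per_kw  # USD per kW launched
kwh_delivered = mission_years * 8760  # hours per year; kWh per kW of capacity
launch_cost_per_kwh = launch_cost_per_kw / kwh_delivered

print(f"amortized launch cost: {launch_cost_per_kwh * 100:.1f} cents/kWh")
```

Under these assumptions, the amortized launch cost lands at a few cents per kilowatt-hour, the same order of magnitude as the energy bill of a terrestrial data center.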

Why Move AI Compute Off-Planet?

The motivation isn’t just novelty. AI training and inference at scale demand unprecedented amounts of electricity and cooling infrastructure, straining grids and natural resources. Traditional data centers require land, fresh water, and enormous power draw. Moving AI compute to orbit would eliminate the need for cooling water, free up terrestrial land, and tap into uninterrupted clean energy.

There are also longer-term advantages:

  • Scalability: Orbit provides effectively unlimited expansion space.
  • Energy efficiency: Continuous solar power without environmental disruption.
  • Reduced carbon footprint: Fewer emissions from electricity generation and water usage.

Of course, rocket launches produce emissions, and orbiting hardware contributes to space debris, so the environmental trade-offs are complex.

A Broader Look at Space-Based Infrastructure

Google isn’t the first to experiment with optical communication or space computing. NASA’s OPALS experiment demonstrated the ability to send data via laser from the ISS to Earth. Private companies like SpaceX already operate enormous satellite constellations. And military organizations have explored formation flying for distributed sensing.

But Project Suncatcher stands out because it targets something much larger: full-scale AI computation in space. This concept transforms satellites from communication relays into powerful distributed computers.

If the idea works, it could inspire new industries:

  • Orbital cloud computing platforms
  • On-demand AI services powered by sunlight
  • Space manufacturing and robotic processing centers
  • Planet-scale computing fabrics spanning Earth and orbit

It’s the beginning of a potential shift in how humanity thinks about computation itself.

Technical Details Worth Noting

Key technical details from the study and accompanying reporting:

  • Satellites operate in sun-synchronous low Earth orbit to maintain constant sunlight.
  • They use laser-based optical interlinks for high-speed communication.
  • Communication requires tens of terabits per second for distributed AI tasks.
  • Google validated 1.6 Tbps in a bench demonstration.
  • Satellites must fly in very tight formations—distances of under 1 km.
  • Simulations suggest only modest station-keeping is required at the 650 km altitude.
  • Google tested the Trillium v6e Cloud TPU, which tolerated nearly three times the expected mission radiation dose.
  • HBM systems showed issues after 2 kilorads, still above the expected 750 rad dose.
  • Project viability depends strongly on launch costs falling below roughly $200/kg by the mid-2030s.

These details form the backbone of the project’s feasibility assessment.

The Road Ahead

Project Suncatcher is still in the research phase. Google’s paper emphasizes that the idea is early-stage and that many challenges remain unsolved. But the exploration is serious and grounded in physics, engineering tests, and long-term modeling.

If launch costs fall, if satellite formations can be controlled safely, and if AI workloads can run efficiently in orbit, then space-based computing could become a real part of future infrastructure.

It’s a fascinating glimpse into how AI might reshape not just the world—but the space around it.

Research Paper:
https://arxiv.org/abs/2511.19468
