When Google uses the word moonshot, it usually means something big — self-driving cars, quantum computers, or now… space-based AI. In their latest research initiative, Project Suncatcher, Google imagines a future where machine learning doesn’t just run in cloud data centers, but across a constellation of solar-powered satellites orbiting the Earth. It’s bold, bizarre, and maybe brilliant — and it could redefine what “scalable AI” really means.
Google’s Project Suncatcher: Building AI Data Centers in Space
Google Research has unveiled an ambitious new initiative called Project Suncatcher — a long-term experiment exploring whether AI computing could one day leave Earth’s surface entirely. The vision: compact, solar-powered satellites equipped with Google’s custom Tensor Processing Units (TPUs), networked together through laser-based optical links to form a massive, scalable AI infrastructure in orbit.
Why Space?
AI consumes staggering amounts of energy, and data centers on Earth are hitting physical and environmental limits. The Sun, meanwhile, delivers far more power than humanity uses, and in the right orbit a solar panel can be up to eight times more productive than the same panel on the ground: near-continuous sunlight, with no atmosphere, weather, or night in the way. A satellite constellation in such an orbit could harvest that energy almost continuously, running AI workloads 24/7 without the carbon footprint or land constraints of Earth-based systems.
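As a rough sanity check on that "eight times" figure, here is a back-of-envelope sketch. The capacity factors below are illustrative assumptions, not numbers from Google's paper:

```python
# Back-of-envelope: time-averaged solar yield in orbit vs. on the ground.
# All numbers below are illustrative assumptions, not figures from the paper.

SOLAR_CONSTANT = 1361.0   # W/m^2 above the atmosphere
GROUND_PEAK = 1000.0      # W/m^2, the standard "peak sun" at the surface

orbit_capacity_factor = 0.99    # a dawn-dusk orbit is sunlit almost all the time
ground_capacity_factor = 0.20   # hypothetical mix of night, weather, sun angle

orbit_yield = SOLAR_CONSTANT * orbit_capacity_factor    # W/m^2, time-averaged
ground_yield = GROUND_PEAK * ground_capacity_factor     # W/m^2, time-averaged

ratio = orbit_yield / ground_yield
print(f"orbit/ground yield ratio: {ratio:.1f}x")  # same ballpark as the ~8x claim
```

With slightly sunnier or cloudier ground assumptions the ratio shifts, but it stays in the mid-to-high single digits, which is where the "up to eight times" claim comes from.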
How It Works
The newly released preprint, “Towards a Future Space-Based, Highly Scalable AI Infrastructure System Design,” outlines how such a system could function and what would need to be solved to make it real. The core idea is to fly constellations of satellites in a dawn-dusk sun-synchronous low-Earth orbit, where they remain in near-constant sunlight.
Each satellite would carry TPUs and communicate with its neighbors via free-space optical links capable of tens of terabits per second. Google has already demonstrated a bench-scale version that reached 1.6 Tbps of total bandwidth, evidence that data-center-grade connectivity in space isn’t out of reach.
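Part of why such rates are plausible: for a fixed transmitter and receive aperture, free-space optical received power falls off as the square of distance, so flying satellites about a kilometer apart instead of the thousands of kilometers typical of today's inter-satellite links buys an enormous link-budget margin. A minimal sketch, with the two distances as hypothetical round numbers:

```python
import math

# Received power in a free-space optical link scales as 1/d^2 for a fixed
# transmitter and aperture. The distances below are illustrative assumptions.

typical_isl_km = 2500.0   # hypothetical spacing in today's constellations
suncatcher_km = 1.0       # satellites flying roughly a kilometer apart

power_gain = (typical_isl_km / suncatcher_km) ** 2
gain_db = 10 * math.log10(power_gain)
print(f"relative received-power gain: {power_gain:.0f}x ({gain_db:.0f} dB)")
```

That extra received power is what a link can trade for much higher data rates, which is why close formation flight and terabit-class links go hand in hand in this design.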

Keeping the Constellation Together
Flying satellites only a few hundred meters apart is a massive engineering challenge. Google’s team modeled the orbital mechanics using JAX-based simulations and Hill-Clohessy-Wiltshire equations to predict how small clusters could maintain stable formations. The takeaway: with smart design and minimal fuel use, it’s possible to keep these “mini data centers” floating in sync.
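For readers curious what the Hill-Clohessy-Wiltshire (HCW) equations describe, here is a minimal sketch of their standard closed-form solution for the relative motion of one satellite about another in circular orbit. This is not Google's simulation code (the preprint describes JAX-based modeling), just the textbook linearized model such simulations build on; the orbit period below is a hypothetical round number:

```python
import math

def hcw_state(t, n, x0, y0, z0, vx0, vy0, vz0):
    """Closed-form Hill-Clohessy-Wiltshire solution for the position of a
    deputy satellite relative to a chief in circular orbit.

    t : elapsed time [s];  n : chief's mean motion [rad/s]
    x : radial offset, y : along-track offset, z : cross-track offset [m]
    """
    s, c = math.sin(n * t), math.cos(n * t)
    x = (4 - 3 * c) * x0 + (s / n) * vx0 + (2 / n) * (1 - c) * vy0
    y = (6 * (s - n * t) * x0 + y0 - (2 / n) * (1 - c) * vx0
         + (4 * s - 3 * n * t) / n * vy0)
    z = c * z0 + (s / n) * vz0
    return x, y, z

# Example: a 100 m radial offset with zero relative velocity produces an
# along-track drift of -12*pi*100 m (about -3.8 km) per orbit.
n = 2 * math.pi / 5700.0   # mean motion for a hypothetical ~95-minute orbit
x, y, z = hcw_state(5700.0, n, 100.0, 0.0, 0.0, 0.0, 0.0, 0.0)
```

The secular along-track drift that falls out of any radial offset is exactly why the formation-keeping analysis matters: without periodic corrections, neighboring satellites in the cluster would steadily separate.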
Can TPUs Survive in Space?
To find out, Google tested its latest Trillium (v6e) TPU under a high-energy proton beam to simulate cosmic radiation. The chips held up remarkably well, showing only minor effects even after radiation doses roughly three times the total expected over a five-year mission. The takeaway: commercial TPUs are already surprisingly radiation-tolerant, and might not need major redesigns for space deployment.
The Economics of Orbit
Historically, launch costs have been the dealbreaker. But those costs are falling fast. Google’s projections suggest that by the mid-2030s, getting hardware into orbit could cost less than $200 per kilogram. At that point, the total cost of a space-based data center could be comparable to running one on Earth, especially once energy savings are factored in.
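To make that comparison concrete, here is a back-of-envelope amortization. Only the ~$200/kg launch price comes from the article; the mass-per-kilowatt, lifetime, and electricity-price figures are explicitly hypothetical:

```python
# Back-of-envelope: launch cost amortized per kW-year of delivered power.
# Only the $200/kg launch price is from the article; everything else is
# an illustrative assumption.

launch_cost_per_kg = 200.0    # USD/kg, projected for the mid-2030s
satellite_kg_per_kw = 10.0    # hypothetical satellite mass per kW delivered
mission_years = 5.0           # hypothetical operational lifetime

launch_cost_per_kw_year = launch_cost_per_kg * satellite_kg_per_kw / mission_years
print(f"amortized launch cost: ${launch_cost_per_kw_year:.0f} per kW-year")

# For scale: a terrestrial data center paying a hypothetical $0.08/kWh
# spends about $700 per kW-year on electricity alone.
terrestrial_energy_per_kw_year = 0.08 * 24 * 365
print(f"terrestrial energy cost: ${terrestrial_energy_per_kw_year:.0f} per kW-year")
```

Under these made-up but plausible inputs, the amortized launch bill lands in the same order of magnitude as a ground data center's power bill, which is the sense in which falling launch prices could make orbit cost-competitive.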
Next Steps: Testing by 2027
To bring theory into practice, Google is partnering with Planet on a prototype mission slated for 2027. Two small satellites will test how TPU hardware performs in orbit and validate optical inter-satellite links during real machine learning tasks. If it works, it could be the first step toward a new kind of cloud — one that literally floats above the planet.
A True Moonshot
Project Suncatcher is still early research, but it’s classic Google: a mix of high risk, high imagination, and serious engineering. Like their early quantum computing or autonomous vehicle work, this isn’t about what’s possible today — it’s about stretching the horizon of what’s next.
And if the idea of AI powered by sunlight, orbiting silently above the Earth, sounds like science fiction… well, that’s exactly where the future tends to start.