As artificial intelligence (AI) models keep growing and getting more power-hungry, researchers are starting to ask not whether they can be trained — but where. That’s the context behind Google Research’s recent proposal to explore space-based AI infrastructure, an idea that sits somewhere between serious science and orbital overreach.
The idea, dubbed “Project Suncatcher” and outlined in a study uploaded Nov. 22 to the preprint database arXiv, explores whether future AI workloads could be run on constellations of satellites equipped with specialized accelerators and powered primarily by solar energy.
The push to look beyond Earth for AI infrastructure isn’t coming out of nowhere. Data centers already consume a non-trivial slice of the world’s power supply: recent estimates put global data-center electricity use at roughly 415 terawatt-hours in 2024, or about 1.5% of total global electricity consumption, with projections suggesting this could more than double by 2030 as AI workloads surge.
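A quick back-of-envelope check shows those cited figures hang together. The sketch below simply runs the arithmetic implied by the estimates above; the input numbers come from the text, and the "more than double" projection is treated as a plain 2x floor:

```python
# Back-of-envelope check of the figures cited above.
datacenter_twh_2024 = 415    # estimated global data-center use, 2024 (TWh)
share_of_global = 0.015      # cited ~1.5% share of global electricity

# Implied total global electricity consumption for 2024
implied_global_twh = datacenter_twh_2024 / share_of_global
print(f"Implied global electricity use: {implied_global_twh:,.0f} TWh")

# "More than double by 2030" puts the 2030 floor at roughly:
projected_2030_twh = 2 * datacenter_twh_2024
print(f"Projected 2030 data-center floor: {projected_2030_twh} TWh")
```

The implied global total, roughly 27,700 TWh, is consistent with published estimates of worldwide electricity consumption, which is a useful sanity check on the cited percentage.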
Utilities in the U.S. are already planning for data centers, driven largely by AI workloads, to account for between 6.7% and 12% of total electricity demand in some regions by 2028, prompting some executives to warn that there simply “isn’t enough energy on the grid” to support unchecked AI growth without significant new generation capacity.
In that context, proposals like space-based data centers start to read less like sci-fi indulgence and more like a symptom of an industry confronting the physical limits of Earth-bound energy and cooling. On paper, space-based data centers sound like an elegant solution. In practice, some experts are unconvinced.
Reaching for the stars
Joe Morgan, COO of data center infrastructure firm Patmos, is blunt about the near-term prospects. “What won’t happen in 2026 is the whole ‘data centers in space’ thing,” he told Live Science. “One of the tech billionaires might actually get close to doing it, but aside from bragging rights, why?”
Morgan points out that the industry has repeatedly flirted with extreme cooling concepts, from mineral-oil immersion to subsea facilities, only to abandon them once operational realities bite. “There is still hype about building data centers under the ocean, but any thermal benefits are far outweighed by the problem of replacing components,” he said, noting that hardware churn is fundamental to modern computing.
That churn is central to the skepticism around orbital AI. GPUs and specialized accelerators depreciate quickly as new architectures deliver step-change improvements every few years. On Earth, racks can be swapped, boards replaced and systems upgraded continuously. In orbit, every repair requires launches, docking or robotic servicing — none of which scale easily or cheaply.
“Who wants to take a spaceship to update the orbital infrastructure every year or two?” Morgan asked. “What if a vital component breaks? Actually, forget that, what about the latency?”
Latency is not a footnote. Most AI workloads depend on tightly coupled systems with extremely fast interconnects, both within data centers and between them. Google’s proposal leans heavily on laser-based inter-satellite links to mimic those connections, but the physics remains unforgiving. Even at low Earth orbit, round-trip latency to ground stations is unavoidable.
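The hard floor here is the speed of light. A minimal physics sketch, using illustrative LEO altitudes rather than figures from the Suncatcher paper, shows the best-case round-trip delay for a ground station directly beneath a satellite; real-world latency would be higher once slant angles, routing, queuing and processing are added:

```python
# Lower bound on ground-to-satellite round-trip light delay.
# Altitudes are illustrative assumptions: typical LEO constellations
# fly at roughly 350-1,200 km.

C_KM_PER_S = 299_792.458  # speed of light in vacuum, km/s

def min_round_trip_ms(altitude_km: float) -> float:
    """Best-case round trip: straight up and back at the speed of light."""
    return 2 * altitude_km / C_KM_PER_S * 1000

for alt_km in (350, 550, 1200):
    print(f"{alt_km:>5} km: {min_round_trip_ms(alt_km):.2f} ms minimum round trip")
```

Even a few milliseconds each way is an eternity compared with the microsecond-scale interconnects inside a terrestrial AI cluster, which is why tightly coupled training workloads are the hardest case for orbit.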
“Putting the servers in orbit is a stupid idea, unless your customers are also in orbit,” Morgan said. But not everyone agrees it should be dismissed so quickly. Paul Kostek, a senior member of IEEE and systems engineer at Air Direct Solutions, said the interest reflects genuine physical pressures on terrestrial infrastructure.
“The interest in placing data centers in space has grown as the cost of building centers on earth keeps increasing,” Kostek said. “There are several advantages to space-based or Moon-based centers. First, access to 24 hours a day of solar power… and second, the ability to cool the centers by radiating excess heat into space versus using water.”
From a purely thermodynamic standpoint, those arguments are sound. Heat rejection is one of the hardest limits on computation, and Earth-based data centers are increasingly constrained by water availability, grid capacity and local environmental opposition.
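The radiative-cooling argument can be made concrete with the Stefan-Boltzmann law, which sets how much heat a surface can shed into the ~3 K cold of deep space. The numbers below (radiator temperature, emissivity) are illustrative assumptions, not values from the Suncatcher study:

```python
# Hedged sketch: net radiative heat rejection from a flat radiator in space,
# via the Stefan-Boltzmann law. Emissivity and temperatures are illustrative.

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

def radiated_w_per_m2(t_radiator_k: float, t_sink_k: float = 3.0,
                      emissivity: float = 0.9) -> float:
    """Net radiative flux from a surface to a cold sink (deep space ~3 K)."""
    return emissivity * SIGMA * (t_radiator_k**4 - t_sink_k**4)

# A radiator held at 330 K (warm electronics) sheds roughly 605 W per m^2:
print(f"{radiated_w_per_m2(330.0):.0f} W/m^2")
```

The catch, which the skeptics' maintenance argument compounds, is that multi-megawatt AI clusters would therefore need thousands of square meters of radiator surface, all launched, deployed and kept pointed away from the sun.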
The backlash against terrestrial AI infrastructure isn’t limited to energy and water issues; health fears are increasingly part of the narrative. In Memphis, residents near xAI’s massive Colossus data center have voiced concern about air quality and long-term respiratory impacts, with community members reporting worsened symptoms and fear of pollution-linked illnesses since the facility began operating. In other states, opponents of proposed hyperscale data center projects have framed their resistance around potential health and environmental harms, arguing that large facilities could degrade local air and water quality and exacerbate existing public health burdens.
Putting data centers into orbit would remove some of those constraints, but it would replace them with others.
Staying grounded
“The technology questions that need to be answered include: Can the current processors used in data centers on Earth survive in space?” Kostek said. “Will the processors be able to survive solar storms or exposure to higher radiation on the Moon?”
Google researchers have already begun probing some of those questions through early work on Project Suncatcher. The team describes radiation testing of its Tensor Processing Units (TPUs) and modeling of how tightly clustered satellite formations could support the high-bandwidth inter-satellite links needed for distributed computing. Even so, Kostek stresses that the work remains exploratory.
“Initial testing is being done to determine the viability of space-based data centers,” he said. “While significant technical hurdles remain and implementation is still several years away, this approach could eventually offer an effective way to achieve expansion.”
That word — expansion — may be the real clue. For some researchers, the most compelling rationale for off-world computing has little to do with serving Earth-based users at all. Christophe Bosquillon, co-chair of the Moon Village Association’s working group for Disruptive Technology & Lunar Governance, argues that space-based data centers make more sense as infrastructure for space itself.
“With humanity on track to soon establish a permanent lunar presence, an infrastructure backbone for a future data-driven lunar industry and the cis-lunar economy is warranted,” he told Live Science.
From this perspective, space-based data centers aren’t substitutes for Earth’s infrastructure so much as tools for enabling space activity, handling everything from lunar sensor data to autonomous systems and navigation.
“Affordable energy is a key issue for all activities and will include a nuclear component next to solar power and arrays of fuel cells and batteries,” Bosquillon said, adding that the challenges extend well beyond engineering to governance, law and international coordination.
Crucially, space-based computing could offload non-latency-sensitive workloads from Earth altogether. “Solving the energy problem in space and taking that burden off the Earth to process Earth-related non-latency-sensitive data… has merit,” Bosquillon said, even extending to the idea of space and the Moon as a secure vault for “civilisational” data.
Seen this way, Google’s proposal looks less like a solution to today’s data center shortages and more like a probe into the long-term physics of computation. As AI approaches planetary-scale energy consumption, the question may not be whether Earth has enough capacity, but whether researchers can afford to ignore environments where energy is abundant but everything else is hard.
For now, space-based AI remains strictly experimental. Whether it ever escapes Earth’s gravity may depend less on solar panels and lasers than on how desperate the energy race becomes.