On February 20, 2026, during an interview at the India AI Impact Summit in New Delhi, OpenAI CEO Sam Altman dismissed the idea of putting AI data centers in space as “ridiculous” for the foreseeable future.
Altman’s comments were a sharp rebuttal to the vision of his rival, Elon Musk, who recently announced plans to merge SpaceX and xAI to build orbital computing facilities.
The “Rough Math” of Space Compute
Altman argued that while space-based data centers sound futuristic, they fail basic economic and logistical tests:
- Prohibitive Costs: He pointed out that the “rough math” of launch costs versus the cost of power on Earth makes orbital centers unviable. Even with cheaper launches, the overhead of getting heavy GPU racks into orbit remains extreme.
- The Repair Problem: One of Altman’s most grounded critiques was the lack of serviceability. “It is very hard to fix a broken GPU in space,” he noted, highlighting that a single hardware failure in orbit could render millions of dollars of equipment useless.
- Scale Timeline: Altman concluded that orbital data centers are “not something that’s going to matter at scale this decade.”
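Altman did not publish his figures, but the shape of the "rough math" is easy to sketch: amortize the one-time launch cost of a GPU rack over every kilowatt-hour it will consume in orbit, and compare that overhead to the price of electricity on the ground. The numbers below (launch price per kilogram, rack mass and power, hardware lifetime, industrial power rate) are purely illustrative assumptions, not values from Altman's remarks:

```python
# Back-of-envelope comparison: launch-cost overhead per kWh of orbital
# compute vs. the cost of terrestrial electricity.
# ALL figures are illustrative assumptions for the sketch.

LAUNCH_COST_PER_KG = 1_500          # USD/kg, assumed heavy-lift launch price
RACK_MASS_KG = 1_000                # assumed mass of one GPU rack
RACK_POWER_KW = 50                  # assumed continuous power draw of that rack
LIFETIME_YEARS = 5                  # assumed useful life of orbital hardware
TERRESTRIAL_USD_PER_KWH = 0.08      # assumed industrial electricity rate


def launch_overhead_per_kwh() -> float:
    """Launch cost spread over every kWh the rack consumes in orbit."""
    launch_cost = LAUNCH_COST_PER_KG * RACK_MASS_KG          # one-time cost
    lifetime_kwh = RACK_POWER_KW * 24 * 365 * LIFETIME_YEARS  # total energy used
    return launch_cost / lifetime_kwh


if __name__ == "__main__":
    overhead = launch_overhead_per_kwh()
    print(f"Launch overhead:   ${overhead:.2f}/kWh")
    print(f"Terrestrial power: ${TERRESTRIAL_USD_PER_KWH:.2f}/kWh")
```

Under these assumptions the launch overhead alone works out to several times the cost of buying power on Earth, before counting solar panels, radiators, radiation hardening, or the unserviceable-hardware risk Altman raised. The qualitative conclusion is robust to the exact inputs: launch prices would have to fall dramatically before the comparison flips.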
The Rivalry: Altman vs. Musk vs. Pichai
The debate over where to “park” the world’s GPUs has split the tech elite into three distinct camps:
| Leader | Vision for Data Centers |
| --- | --- |
| Elon Musk (SpaceX/xAI) | Pro-Space: Wants to launch a constellation of up to one million satellites to act as orbital data centers, claiming space will be the "lowest-cost place" for AI within 3 years. |
| Sundar Pichai (Google) | Moonshot: Project Suncatcher aims to launch prototype "tiny racks" of machines by 2027 to test solar energy harvesting in orbit. |
| Sam Altman (OpenAI) | Terrestrial: Dismisses space as a distraction. Instead, he is doubling down on Earth-based infrastructure (like the $500B Stargate Project) powered by nuclear and renewable energy. |
Defending AI’s Energy Hunger
Beyond the space debate, Altman used the summit to defend the environmental impact of his Earth-based centers. He made a controversial comparison that has since gone viral:
“It also takes a lot of energy to train a human. It takes 20 years of life, and all the food you eat before that time, before you get smart.” — Sam Altman, Feb 20, 2026.
He argued that when you account for the energy consumed by the "100 billion people who have ever lived" in building up cumulative human knowledge, AI models are actually becoming more energy-efficient on a "per-task" basis than biological intelligence.
OpenAI’s Infrastructure Reality
While he mocked the space idea, Altman’s own ground-based plans are facing scrutiny. Reports from February 21, 2026, indicate OpenAI has scaled its projected compute spend through 2030 down to $600 billion (from a previously touted $1.4 trillion) as it prepares for a potential IPO and faces internal friction over the “Stargate” buildout in Texas.
