There's a big problem with AI that just isn't getting talked about enough. The
vast majority of what we read about AI is how it's going to help with every
problem, or take every job, depending on which way you look at it.
We're all going to have an AI servant, or an AI master, again depending on which
way you look at it.
The real issue, the existential problem, is that AI sucks up so much power that
it's going to black out the rest of humanity. It will never get to the point of
being really useful without truly enormous increases in power generation.
Perhaps you've seen the story that Microsoft has struck a deal to restart the
Three Mile Island nuclear power station for its AI power needs - and will get
exclusive access to all the power it can generate. There have been similar
stories around the world.
Friday, we learned that
Eric Schmidt - the former CEO of Google - has bought Relativity Space,
apparently with preliminary designs for data centers in orbit, to get access to
solar power.
We know this because Schmidt appeared before the House Committee on Energy
and Commerce
during a hearing in
April, speaking on the future of AI and US competitiveness. Among the topics
raised then was the need for more electricity—both renewable and
non-renewable—to power data centers that will facilitate the computing needs
for AI development and applications. Schmidt noted that an average nuclear
power plant in the United States generates 1 gigawatt of power.
Then he rattled off some numbers that might blow your mind or cause your eyes
to bleed.
"People are planning 10 gigawatt data centers," Schmidt said. "Gives you a
sense of how big this crisis is. Many people think that the energy demand
for our industry will go from 3 percent to 99 percent of total generation.
One of the estimates that I think is most likely is that data centers will
require an additional 29 gigawatts of power by 2027, and 67 more gigawatts
by 2030. These things are industrial at a scale that I have never seen in my
life."
AI applications consume an enormous amount of computing power. A single
ChatGPT query consumes
approximately 10 times more
energy than a Google search does. The US energy industry is not well
prepared for this kind of dramatic growth in energy demand, as power
consumption over the last decade has increased by about 0.5 percent a year.
Data centers also consume significant amounts of water for cooling.
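To see what that 0.5 percent a year means against Schmidt's 2027 estimate, here is another rough sketch. The total-generation figure is a ballpark assumption on my part, not a number from the article or the hearing:

```python
# Recent demand growth vs. projected data center demand, very roughly.
# ASSUMPTION: about 470 GW of average US electric generation (my ballpark,
# not from the article). The 0.5% a year growth rate is from the article.
US_AVG_GENERATION_GW = 470.0
HISTORICAL_GROWTH_PER_YEAR = 0.005

added_per_year_gw = US_AVG_GENERATION_GW * HISTORICAL_GROWTH_PER_YEAR
datacenter_need_by_2027_gw = 29.0   # Schmidt's estimate

years_needed = datacenter_need_by_2027_gw / added_per_year_gw
print(f"Recent pace adds about {added_per_year_gw:.1f} GW a year.")
print(f"Adding 29 GW at that pace takes about {years_needed:.0f} years, not two.")
```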
There is no way our society could add 29 gigawatts of generation, whether from
reactors, gas, oil, or anything else, built in two years (by '27). Which means
another 67 gigawatts in the three years after that just isn't happening,
either. That raises the reasonable question of how Eric Schmidt gets there.
First off, he doesn't need to.
He needs to power his own business, not the entire AI industry.
If he's going to try to build power plants and data centers in space, that
spawns a bunch of other problems, starting with how to get that much stuff
into space - which seems to explain why Schmidt bought Relativity Space.
He needs big, reusable rockets and a lot of flights of them. The biggest
rocket, although not yet operational, is Starship. I think the odds of
him buying a huge number of Starship flights aren't very good, but small as
those odds are, they're still better than the odds of him buying SpaceX itself. Yeah, no.
Blue Origin's New Glenn is smaller, not a whole lot closer to flying lots
of missions than Starship is, and Blue is also owned by a billionaire. Again, not
likely. ULA's Vulcan rocket is expensive, and it's already behind its
predicted launch manifest. Rocket Lab's Neutron vehicle is coming soon, but
it's the smallest of these and may not be large enough for Schmidt's
ambitions.
That leaves the Terran R being developed by Relativity Space,
which has appeared here
a couple of times.
If fully realized, Terran R would be a beastly launch vehicle capable of
launching 33.5 metric tons to low-Earth orbit in expendable mode—more than a
fully upgraded Vulcan Centaur—and 23.5 tons with a reusable first
stage. If you were a billionaire seeking to put large data centers
into space and wanted control of launch, Relativity is probably the only
game in town.
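To get a feel for the launch problem itself, here is a rough sketch of how many Terran R flights an orbital data center might take. The payload figures are the ones quoted above; the watts-per-kilogram figure for a solar-powered, radiator-cooled facility is purely my assumption for illustration, not anything Relativity or Schmidt has published:

```python
# Rough flight count for lofting an orbital data center on Terran R.
# Payload numbers are the figures quoted above; the specific power of the
# whole facility (solar arrays + radiators + racks) is an ASSUMPTION.
TERRAN_R_REUSABLE_T = 23.5     # metric tons to LEO with a reusable first stage
TERRAN_R_EXPENDABLE_T = 33.5   # metric tons to LEO, expendable

ASSUMED_WATTS_PER_KG = 50.0    # hypothetical end-to-end specific power
TARGET_POWER_GW = 1.0          # one "average nuclear plant" worth of compute

facility_mass_t = TARGET_POWER_GW * 1e9 / ASSUMED_WATTS_PER_KG / 1000.0
print(f"A {TARGET_POWER_GW:.0f} GW facility at {ASSUMED_WATTS_PER_KG:.0f} W/kg "
      f"masses about {facility_mass_t:,.0f} metric tons:")
print(f"  about {facility_mass_t / TERRAN_R_REUSABLE_T:,.0f} reusable flights, "
      f"or {facility_mass_t / TERRAN_R_EXPENDABLE_T:,.0f} expendable flights")
```

Even if that watts-per-kilogram guess is off by a factor of a few, the answer is hundreds of flights per gigawatt, which is exactly why owning a launch company matters here.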
I know of no visualization graphics for orbiting power farms, powered by
photovoltaics and radiating their waste heat into space, although one would be
great here.
Solving launch is just one of the challenges this idea faces, of course. How
big would these data centers be? Where would they go within an increasingly
cluttered low-Earth orbit? Could space-based solar power meet their energy
needs? Can all of this heat be radiated away efficiently in space?
Economically, would any of this make sense?
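On the heat question specifically, the physics is at least easy to sketch. Every watt a data center draws eventually becomes heat, and in vacuum the only way to shed it is radiation, governed by the Stefan-Boltzmann law. The radiator temperature and emissivity below are my assumptions, picked to keep the electronics at something like room temperature:

```python
# Radiator area needed to reject data center heat in vacuum (Stefan-Boltzmann).
# Emissivity and temperature are ASSUMPTIONS for illustration, not a design,
# and sunlight falling on the radiators is ignored.
SIGMA = 5.67e-8          # Stefan-Boltzmann constant, W / (m^2 K^4)
EMISSIVITY = 0.9         # assumed, typical of good radiator coatings
RADIATOR_TEMP_K = 300.0  # assumed radiator temperature

heat_to_reject_w = 1e9   # 1 GW of compute, all of which ends up as heat
flux_per_face = EMISSIVITY * SIGMA * RADIATOR_TEMP_K**4   # W per m^2, one face
area_m2 = heat_to_reject_w / (2 * flux_per_face)          # double-sided panels

print(f"Each square meter sheds about {flux_per_face:.0f} W per face at 300 K.")
print(f"Rejecting 1 GW needs about {area_m2 / 1e6:.1f} km^2 of double-sided radiator,")
print("on top of the solar arrays that collect the power in the first place.")
```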
These are not simple questions. But Schmidt is correct that the current
trajectory of power and environmental demands created by AI data centers is
unsustainable. It is good that someone is thinking big about solving big
problems.
Relativity Space's current computer rendering of their Terran R, full vehicle
and landed booster. Pretty much impossible to guess the size. Image Credit: Relativity Space