
This is a post about AI, whose proponents are downright messianic in describing it as the technology of the future. Maybe. But much of their advocacy seems to ignore some mundane limits to AI’s growth — limits I’ll try to illustrate by talking about a technology of the past.
I was probably 9 or 10 when my father took me to a Horn & Hardart automat. For those too young to remember — who I hope are a large fraction of my readers — these were establishments in which a variety of sandwiches and other foods were displayed behind glass doors. You would serve yourself by putting coins into a slot, which would unlock the door and let you extract your egg salad sandwich or whatever.
At the time (and at my age) it seemed wonderfully futuristic: Food service without people! In reality, of course, automats weren’t automated; each required a substantial staff to operate the kitchen and keep refilling those glass-doored compartments. And because automats weren’t all they pretended to be, they were eventually driven out of business by the rise of fast food.
Many applications of information technology are, like the automats of yore, less miraculous than they seem. True, the user experience makes you feel as if you’ve transcended the material world. You click a button on Amazon’s web site and a day or two later the item you wanted magically appears on your porch. But behind that hands-free experience lie a million-strong workforce and a huge physical footprint of distribution centers and delivery vehicles.
And the disconnect between the trans-material feel of the consumer experience and the physical realities that deliver that experience is especially severe for the hot technology of the moment, AI. We’re constantly arguing about whether AI is a bubble, whether it can really live up to the hype. We don’t talk enough about AI’s massive use of physical resources, especially but not only electricity.
And we certainly don’t talk enough about (a) how U.S. electricity pricing effectively subsidizes AI and (b) the extent to which limitations on generating capacity may nonetheless severely limit the technology’s growth.
How much generating capacity are we talking about? The Department of Energy estimates that data centers already consumed 4.4 percent of U.S. electricity in 2023, and expects that to grow to as much as 12 percent by 2028:
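To get a rough sense of what those percentages mean in absolute terms, here is a back-of-envelope sketch. The roughly 4,000 TWh figure for total annual U.S. electricity consumption, and the simplifying assumption that the total stays flat through 2028, are my own; the two shares are the DOE numbers quoted above.

```python
# Back-of-envelope scale check (assumed figures, not DOE's own arithmetic):
# take total U.S. electricity consumption as roughly 4,000 TWh per year,
# held flat for simplicity.
TOTAL_TWH = 4000

share_2023 = 0.044  # DOE estimate: data centers' share of U.S. electricity, 2023
share_2028 = 0.12   # high end of DOE's projected share for 2028

twh_2023 = TOTAL_TWH * share_2023
twh_2028 = TOTAL_TWH * share_2028

print(f"2023: ~{twh_2023:.0f} TWh/year")
print(f"2028 (high end): ~{twh_2028:.0f} TWh/year")
print(f"Implied extra demand: ~{twh_2028 - twh_2023:.0f} TWh/year")
```

Under those assumptions the high-end projection implies roughly 300 extra terawatt-hours of annual demand — on the order of the yearly output of several dozen large power plants, which is the scale of buildout the rest of this post is about.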
AI isn’t the only source of rising electricity demand from data centers. There are other drivers including, alas, crypto — which still has no legitimate use case, but now has powerful political backing. But Goldman Sachs believes that AI will account for a large fraction of rising data center demand:
With Sam Altman of OpenAI promising to spend “trillions” on data centers in the near future — and sneering at economists who, he imagines, are wringing their hands — I wouldn’t be surprised to see demand come in at the high end of the Department of Energy’s projections. True, the AI bubble might burst before that happens, with potentially ugly consequences for the wider economy. But that’s a subject for another post.
So suppose that AI really does consume vast quantities of electricity over the next few years. Where are all those kilowatt-hours supposed to come from?
America is, of course, adding generating capacity as you read this, and can accelerate that expansion if it chooses to. But there are two big obstacles to any attempt to keep up with the demand from AI.
The first is that in recent years growth in U.S. generating capacity has become increasingly dependent on growth in renewable energy. According to S&P Global, almost 90 percent of the generating capacity added in the first 8 months of 2024 came from solar and wind:
Why is this a problem? Because Donald Trump and his minions have a deep, irrational hatred for renewable energy. Not only have they eliminated many of the green energy subsidies introduced by the Biden administration, they have been actively trying to block solar and wind projects.
So even as Trump promises to make America dominant in AI, he’s undermining a different cutting-edge technology — renewable energy — that is crucial to AI’s growth.
Suppose that electric utilities manage somehow to get around Trump’s anti-technology roadblocks and build the extra generating capacity. Who will pay for all that spending? The answer, given the way we regulate these utilities — and as natural monopolies, they must be regulated — is that the cost of adding capacity to power data centers is passed on to ordinary customers who have nothing to do with AI. This is already happening: Over the past 6 months retail electricity prices have risen at a 9 percent annual rate, four times as fast as overall consumer prices.
Last week the watchdog for PJM Interconnection LLC, the nation’s largest grid, declared that this must stop: it “recommends that large data centers be required to bring their own generation.”
Indeed, requiring that the AI industry take responsibility for the costs it imposes makes a lot of sense. It would by no means end progress in AI. As the website Tech Policy notes, there are many AI applications in which smaller, more focused models can perform almost as well as the bloated, all-in-one models currently dominating the field, while consuming far less energy. Until now there has been no incentive to take energy consumption into account, but there’s every reason to believe that we could achieve huge efficiency gains at very low cost.
But will we do the sensible thing? It’s obvious that any attempt to make AI more energy-efficient would lead to howls from tech bros who believe that they embody humanity’s future — and these bros have bought themselves a lot of political power.
So I don’t know how this will play out. I do know that your future electricity bills depend on the answer.
Reprinted with permission from Substack.