The AI Boom: Three Secondary Concerns and One Primary Anxiety

Maybe it’s not a bubble on the verge of bursting, but nor is it sustainable

It’s strange to say, given that our planet has been exhaustively mapped and thoroughly explored, but we have somehow stumbled into uncharted territory.

No, I’m not talking about the life of the mind, though that realm always offers new vistas to the intrepid introvert; nor am I alluding to Elon Musk’s aspirations to colonize Mars, which will inevitably fail to meet the DOGEerer’s aggressive (delusional?) timeline.

Instead, the uncharted territory involves the profligate and seemingly ever-rising surge in spending on AI datacenters and the infrastructure that goes into them.

Analysts and historians of technology cite precedents and analogies for the remarkable phenomenon we’re currently witnessing. Alas, as I’ve said in this forum previously, analogies are necessarily slippery trinkets, sliding from our grasp just as we prepare to pocket them.

I grant that there are similarities between the past and the present — approximate patterns and trends have been known to recur — but just as doppelgängers remain the preserve of fiction, exact replays of the past are mostly tricks of the light. As we encounter something new, we try to match it with something we’ve seen before. Yet nothing that happens is ever quite the same as the events that preceded it. Fortunately, we can still learn from the past, if only to avoid making the same types of mistakes.

That’s why prediction is such a difficult hustle. We believe we can see what isn’t really there, and, distracted by the mirage, we miss what’s right in front of our eyes.

Everybody connected with the technology industry is wondering whether the AI boom is the biggest thing ever to happen in technology, or whether it’s something that will end in a bubble-bursting conflagration of epic proportions. Maybe it can be both. I don’t have a foolproof crystal ball, but I’m willing to aver that our fate will land somewhere between those polarities, neither an unmitigated disaster nor a blessed arrival in the best of all possible worlds. Nonetheless, I do envision some pain for the unwary and the unprepared.

Hubris Comes Before Humbling

I detest clichés because they’re an escape hatch that rescues us from having to engage in the effort of genuine thought, but sometimes axioms resonate like a clanging cow bell in an old rock anthem. The biblical injunction that “pride goeth before the fall” is absolutely on the nose. In my life experience, which we might characterize indulgently as ripe, my most dizzying falls occurred right after I had begun to believe that I could not put a foot wrong.

Confidence is necessary for any successful endeavor, but hubris is a passport to mortification. (A pet peeve of mine is the humblebrag that people evince when they use the word humbling when receiving adulation, awards, or plaudits. On such occasions, these people say they’re humbled, but they’re not. A humbling is what happens not when you’re feted, but when you’re fetid.)

As for why I think the AI market phenomenon is destined to receive some corrective discipline, the logic is relatively simple. It’s not that I don’t see value in AI; I see considerable long-term business value in it. AI could easily become a valuable technology for industries and businesses, and even a useful tool for many consumers, while simultaneously failing to achieve the deliriously grandiose expectations that have been set for it by self-serving entrepreneurs, opportunists, and investors.

I’m on record as saying that there will be an AI market correction, shaving 10 or so percentage points off the market valuations of a few technology titans while wiping out a slew of johnny-come-lately startup companies. That said, I don’t know exactly when the correction will happen, and I do not believe AI will be obliterated in a smoking crater of acrid bubble gum. There’s substance to AI, just not as much as the most berserk propagandists would have you believe.

We live in an age of relentless hucksters, something that should be obvious to any astute student of contemporary society. We pay for nearly everything in this life, but the purveyors of bullshit incur no production costs and face no constraint on how much product they can supply. Bullshit is free to consume, though hazardous to one’s health. There should be a health warning about bullshit on social media. Regulations are unfashionable these days, though, so bullshit and its criminal cousin, fraud, will reign supreme.

The Economist Voices Concerns

Fortunately, there are voices of reason among us. The skeptics at The Economist have raised some valid concerns about AI mania in an article titled “The murky economics of the data-centre investment boom.” It probes the similarities and differences between what we’re witnessing today and what transpired during the 1990s telecom bubble. That bubble did burst, its remnants sluiced down the drain and gutter with the rest of the dotcom detritus in the early aughts of this exhaustingly eventful millennium.

According to The Economist, we should be concerned about three baleful factors. Well, there’s actually a fourth factor that’s bigger than all the others put together, but we’ll get to that in due course. Let’s first consider the three subsidiary concerns raised in the article.

The Economist lays out its concerns as follows:

. . . the centres’ remote locations, the non-public firms financing them and the weak credit quality of some borrowers. This trifecta reminds some sceptics of the last great infrastructure debacle: the telecoms boom of the late 1990s. Yet plenty of others are holding their noses and diving in.

Allow me to offer the following fortune-cookie counsel: Holding your nose might allow your olfactory senses to escape the repellent smell of what you’ve stepped into, but you’ll still have to scrape shit off the bottom of your shoe. (Maybe that life lesson is too prolix for a fortune cookie, and perhaps too profane.) Suspension of disbelief will only take you so far in polite company, especially if the smell emanating from your shoes is driving everybody in the opposite direction.

Excrement on shoes notwithstanding, let’s review the reservations cited by The Economist. First on the list: the remote locations of datacenters. Why is this a problem? Unlike cloud datacenters, newfangled AI datacenters are constructed “in the middle of nowhere rather than in established clusters close to big sources of demand and interconnection hubs, such as northern Virginia.” The locations of AI datacenters usually have ready access to energy sources, including solar and wind power, but they are otherwise remote.

The remote nature of these facilities is necessitated by their prodigious energy requirements, which might be constrained in more populated areas. Still, the isolation of the new AI datacenters introduces investment risks, perhaps not adequately reflected in anticipated returns on investment.

Conventional cloud datacenters, for example, are financed over decades, while AI datacenters are at risk of becoming obsolete faster, partly a result of how quickly the underlying technology is evolving. If you’re stuck with a vast, obsolescent property in the middle of nowhere, you might have trouble offloading it for any amount close to what you paid for it, especially if the AI datacenter market goes south — or southeast or southwest.

Stranded Assets and the Need for Corresponding Market Demand

A related factor, at least in the current overheated market conditions, is that competitors can also find wide-open spaces for their datacenters. With land readily available (for now), a competitor with a better, cheaper design, perhaps taking advantage of lower energy costs, can render your datacenter both outdated and inefficient. This dire scenario results in the dreaded stranded asset.

Ahh, stranded assets. They are a problem. Here’s a condensed definition of the term from Wikipedia:

Stranded assets are "assets that have suffered from unanticipated or premature write-downs, devaluations or conversion to liabilities." Stranded assets can be caused by a variety of factors and are a phenomenon inherent in the 'creative destruction' of economic growth, transformation and innovation; as such they pose risks to individuals and firms and may have systemic implications.

Let’s turn our attention to the financing challenge. We’re seeing a lot of circular financing in the AI space at the moment. That’s one discomfiting development. We’re also seeing startup companies and rogues’ galleries of creditors trying to keep pace with the capital expenditures of deeper-pocketed rivals. Some debtor companies, those dealing with unconventional creditors, are also dancing on the edge of a volcano. A juddering market, one that simply fails to reach the dizzying heights projected for it, will be enough to send them plummeting into the abyss. Others will get pulled down with them.

Those issues are dwarfed, however, by the largest problem of all: whether paying customers will adopt AI services at sufficient scale, and for long enough, to cover the prodigious costs of datacenters and infrastructure. The Economist calls out that issue in the final paragraph of the article:

For now, the potential rewards are so tantalising that money is pouring in, he says. Cheerleaders such as Sam Altman, OpenAI’s boss, argue that the risks of underbuilding are at least as serious as those of overbuilding, because of the long-term economic potential of generative AI. It may be that even if there is a surplus of capacity in the most advanced data centres, it can be absorbed by running, rather than training, LLMs, Mr Sachdeva says. But that comes back to the question of when demand for generative-AI chatbots and applications catches up with the ambitions of those supplying them. That is the most bewitching uncertainty of all.

The problem isn’t that genAI, and whatever other variants of AI you want to include in the mix, isn’t attracting users and customers, some of the revenue-generating variety. The problem is that AI posits a dubious correlation between the amount (as opposed to the quality) of data processed and the value of commercial outcomes. That means training data proliferates without end, and datacenters are forced to expand to accommodate it, playing host to the latest and greatest GPUs, AI accelerators, and other costly infrastructure. Through it all, energy requirements soar.

Endlessly mounting capital expenditures are unsustainable unless paying customers arrive in numbers and spendthrift extravagance that test the bounds of credible projections. Something is going to have to change to make the AI business proposition viable at scale. Before that happens, though, something is likely to break, perhaps not so badly that it can’t be repaired, but severely enough to shake some opportunists and undercapitalized newcomers from the gravy train. As noted above, they won’t be the only ones to get hurt.
