We’re in a Strange New World
And we haven’t even figured out how to use AI gainfully yet
I try to inform and entertain here, but I’ll always err on the side of the former. It’s not that I don’t want to entertain, just that I don’t want to do it at the expense of analytical purpose.
Superlatives get the bum’s rush at this establishment. If you want breathy hyperbole and unhinged fantasy, you’ll have to seek them elsewhere. You won’t have to search long or far to find them. This venue offers serious amusements — oxymoronic, perhaps, but a step up from moronic nonetheless. I have too much respect for my readers to push fetid cheese under their noses and garish dreamscapes into their minds. Each of us receives our daily allowance of bullshit; in fact, we get more than enough to fertilize crops stretching well into the horizon. That, of course, presumes that datacenters aren’t already occupying our fields. At the rate they are being built by our AI overlords, datacenters will eventually claim every bit of land, arable or otherwise, that can be purchased and zoned for the purpose.
With that prologue in the books, let’s look at the somewhat ambiguous deal recently announced by Nvidia and OpenAI. Describing exactly what it is and what it entails is a difficult proposition, a point readily conceded by the principals, who report that the specifics have yet to be defined. The nominal amount, $100 billion, grabs all the headlines, but whether it comes to that is another matter. There’s an intentional imprecision to the transaction between Nvidia and OpenAI that makes it a sign of the times.
I remember a time, not that long ago, when deals were not announced and sent over the wire until those making the announcement felt the cake was fully baked, with all the secondary clauses and details nailed down. Back then, the principals cared about substance as much as they did about appearances. When they made their announcements, they wanted to be in a position to answer all the questions that stakeholders and media might wish to ask.
Now, though, we live in a time when appearance and the arts of performative business take precedence. Things are announced that are nothing more than exoskeletons, yet to be draped and clothed in the finery of detail. Maybe I’m being overly scrupulous, and none of this matters. Why not just go with the ever-changing flow? In the context of eternity, none of it matters anyway, right? Roll out the barrel and party while you can, sup on the heady draft of contingent financing agreements.
It’s Not the Heat, It’s the Vapidity
It’s the frenzied adverbs and adjectives that really get to me, though.
There isn’t a day that passes when I don’t shake my head several times in disbelief while reading the direct quotes of industry leaders discussing the business prospects of AI. It’s as if all reasonable restraint has been cast aside. CEOs, and sometimes even CFOs, are behaving like carnival barkers or professional-wrestling impresarios rather than senior executives with a fiduciary responsibility to investors, employees, and other stakeholders. I suppose attention spans and memories are short these days, and everybody knows it . . . or forgets it.
In the professed $100-billion deal between Nvidia and OpenAI, as well as in other recent announcements, intemperate commentary from the dais causes observers, such as this one, to wonder whether we’ve entered an alternate reality. Maybe we’re living in a simulation; or, considering that the latest Rapture passed without incident (I think) earlier this week, perhaps we’ve all been consigned to digital purgatory.
The claims coming from the executives are increasingly outlandish, making it difficult to discern the underlying reality.
Let’s start with something basic. Can you tell me exactly what OpenAI is these days? Is it still a non-profit, or a for-profit company, or an amalgam of the two, or is it something else entirely? Perhaps it has broken the corporate mold, creating a new type of entity that is overwhelmingly driven by avarice and profit while covered by a permeable carapace of surface altruism. Further, is OpenAI now a provider of genAI, agentic AI, and (aspirationally, very aspirationally) artificial general intelligence (AGI), or is it more than that? Is it also in the datacenter or cloud business? Does it now compete against Apple, Google, and others in the next-generation mobile-device space? Is it a supplier to both enterprises and consumers, or one more than the other? These questions and others are subject to speculation and change. OpenAI’s relationship with Microsoft, its hitherto big-bucks benefactor, is now also in a state of flux. Many questions seem applicable, and the answers are as elusive as time melting on surrealist clocks.
You might say I’m being a churl, a teetotaler at the bacchanal. I would object that I’m merely asking the sorts of questions that have always been asked, diligently and responsibly, by those of us whose vocation involves observing, analyzing, and making sense of industry developments.
What exactly is this deal between Nvidia and OpenAI? I see a lot of handwaving and sweeping generalities, but the specifics are thin on the ground.
“Smart People Will Get Overexcited”
OpenAI’s Sam Altman says he understands why you might think that what you’re watching unfold appears a little weird, but he advises sanguinity, and not of the kind that involves copious bloodletting. Here’s an excerpt from a CNBC report on the initiation of a major datacenter project near Abilene, Texas:
"This is what it takes to deliver AI," Altman said. "Unlike previous technological revolutions or previous versions of the internet, there's so much infrastructure that's required, and this is a small sample of it."
The biggest bottleneck for AI isn't money or chips — it's electricity. Altman has put money into nuclear companies because he sees their steady, concentrated output as one of the only energy sources strong enough to meet AI's enormous demand.
Altman led a $500 million funding round into fusion firm Helion Energy to build a demonstration reactor, and backed Oklo, a fission company he took public last year through his own SPAC.
Critics warn of a bubble, pointing to how companies like Nvidia, Oracle, Broadcom and Microsoft have each added hundreds of billions of dollars in market value on the back of tie-ups with OpenAI, which is burning cash. Nvidia and Microsoft are now worth a combined $8.1 trillion, or equal to about 13.5% of the S&P 500.
Skeptics also say the system looks like a circular financing model. OpenAI is committing hundreds of billions of dollars to projects that rely on partners like Nvidia, Oracle, and SoftBank. Those companies are simultaneously investing in the same projects and then getting paid back through chip sales and data center leases.
Friar has a different perspective, arguing that the entire ecosystem is banding together to meet a historic surge in compute needs. Big tech booms, Friar noted, have always required this kind of bold, coordinated infrastructure buildout.
Altman added that such cycles of overinvesting and underinvesting have marked every past technological revolution. Some people, he said, will surely feel the pain.
"People will get burned on overinvesting and people also get burned on underinvesting and not having enough capacity," he said. "Smart people will get overexcited, and people will lose a lot of money. People will make a lot of money. But I am confident that long term, the value of this technology is going to be gigantic to society."
Questions Outnumber Answers
Look at a few of the words excerpted above. You’ll see a liberal sprinkling of hyperbole and superlatives, implements of the inveterate salesman and the relentless huckster. Notice, however, that Altman hedges his bets. He admits that some people will win and others will lose, some will make money and others will lose their shirts (and perhaps other items of clothing). These qualifiers allow him to retain a measure of credibility should the excrement hit the fan in the months and years to come. I also notice that he flexes his temporal prerogative, arguing that in the long run — how long, he doesn’t say — it will all work out for the best. That might be cold comfort to investors and employees who are dumped on the side of the interstate after the first extravagant surge of datacenter intemperance plays out.
Resisting the wave of intoxicating euphoria, stiff-arming the party-drug high, Reuters published an article noting that Nvidia’s unconventional deal with OpenAI yielded more questions than answers. Reuters went on to enumerate some of the most pertinent questions. All the questions posed by Reuters are valid, and there are scores more we might want to ask. The hard part, of course, is getting answers: the questions are abundant, the answers in short supply.
That Reuters article contains some interesting tidbits nonetheless. Consider the following:
In an earnings call in August, Nvidia CEO Jensen Huang said that AI data centers cost about $50 billion per gigawatt of capacity to build out, with about $35 billion of that money going toward Nvidia's chips and gear.
Nvidia has committed to investing in OpenAI to help it build 10 gigawatts of data center capacity, or about $10 billion per gigawatt. That leaves about $40 billion in additional capital required for each gigawatt of capacity OpenAI plans to build. OpenAI has not signaled whether it agrees with Huang's cost estimates or, if it does, where it would procure the additional funds.
OpenAI did not return a request for comment about its funding plans.
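The arithmetic implied by the Reuters excerpt is worth making explicit. A minimal sketch, using only the figures quoted above (Huang’s ~$50 billion per gigawatt, ~$35 billion of that to Nvidia, and Nvidia’s roughly $10 billion of investment per gigawatt across 10 planned gigawatts):

```python
# Back-of-the-envelope sketch of the funding gap implied by the figures
# quoted above. Illustrative only; all inputs come from the excerpt.

COST_PER_GW = 50e9           # Huang: ~$50B to build out 1 GW of AI datacenter
NVIDIA_GEAR_PER_GW = 35e9    # ~$35B of that goes to Nvidia chips and gear
NVIDIA_INVEST_PER_GW = 10e9  # Nvidia's commitment works out to ~$10B per GW
PLANNED_GW = 10              # capacity OpenAI plans to build

gap_per_gw = COST_PER_GW - NVIDIA_INVEST_PER_GW
total_gap = gap_per_gw * PLANNED_GW

print(f"Gap per gigawatt: ${gap_per_gw / 1e9:.0f}B")        # prints $40B
print(f"Total additional capital: ${total_gap / 1e9:.0f}B")  # prints $400B
```

In other words, even if Huang’s cost estimate holds, Nvidia’s commitment covers only about a fifth of the build-out, leaving roughly $400 billion for OpenAI to source elsewhere.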
All Aboard the Money-Go-Round
For my money, which doesn’t remotely approach the investment sums Nvidia has apparently committed to OpenAI, this is the key to understanding the transaction. OpenAI needs money, and it needs Nvidia’s chips, which can be purchased with money.
Similarly, Nvidia needs to lock down a long-term source of customer revenue large enough to make up for any inherent volatility in the AI market. No less a luminary than Sam Altman foresees some industrial carnage on the horizon, and I wager that Jensen Huang sees it, too. Some of these AI purveyors are going to go up in smoke, and when they do, their demand for GPUs and AI accelerators will go with them. Some will prosper, and some will be afflicted by debilitating immiseration and eventually be interred in the graveyard of failed companies, analogous to the business failures that transpired during and after the bursting of the dotcom bubble. If Nvidia can help to get OpenAI over a not inconsiderable financing bump, the latter might survive and eventually thrive, becoming a permanent tentpole customer for Nvidia’s chips and other products well into the future.
Like OpenAI, Nvidia is in something of a valuation bind. It needs to keep its biggest customers alive and kicking, even as it faces competition in GPUs and AI accelerators and the impending loss of the Chinese market to domestic Chinese chip suppliers.
Still, this is a seemingly unprecedented arrangement, at least for the technology industry. It is, as the CNBC article suggested, a circular financing model, a distant relative to other resourceful investment confections. Nvidia will invest money in OpenAI, much of which will be spent to lease (if you wish) Nvidia’s next-generation AI accelerators (and perhaps other gear) well into the future.
I said this financing arrangement was new to the technology industry, but that doesn’t mean variations on the theme have not been seen before.
Analogies, like all comparisons, are slippery. In casting for an analogy, one can easily lose one’s footing and slide into the quagmire of mortification. The risks of a logician’s pratfall are noted, but allow me to tentatively posit that Nvidia’s arrangement with OpenAI is the tech industry’s distant relative to the petrodollar arrangement between the U.S. and OPEC oil exporters back in the 1970s. Like those oil-exporting countries, OpenAI will get funding from Nvidia, which receives only non-controlling equity (no board representation) from its stake in OpenAI. Even so, Nvidia fully anticipates perennial returns on its investment, as OpenAI is expected to spend in near-perpetuity on AI infrastructure sourced from Nvidia.
As I thought about this money-go-round deal, the celebrated boardroom scene from the movie Network came to mind. We’re now guided by the AI industry’s corporate cosmology and its immutable bylaws of business. Money is taken out only to be put back.
Corporate Cosmology: The Immutable Bylaws of Business
That’s a great clip, still relevant after all these years, though the company names and the industries have changed significantly in the past 49 years.
Obviously, as I noted, there are differences between Nvidia’s investment in OpenAI and the petrodollar arrangement between the U.S. and oil-exporting countries such as Saudi Arabia. The petrodollar pact was designed to achieve macroeconomic and geopolitical objectives, whereas Nvidia and OpenAI are forging what is essentially a gargantuan vendor-financing relationship. What the two arrangements have in common is the principle of financial reciprocity: Money goes around in a cycle, like raiments in a giant spin dryer.
Indeed, the Nvidia-OpenAI deal is a microcosm of larger issues and questions that are just now coming into focus. The implications of this deal — and others like it, which are sure to follow — are profound. They signal a clear break between what has gone before, which provides the basis for our conventional understanding of the industry and our respective places in it, and what is likely to follow.
The previous week, before unveiling its pact with OpenAI, Nvidia announced a $5-billion investment in Intel, a move that provides the latter with ballast while giving Nvidia a stake in Intel and a viable contract-manufacturing alternative to TSMC. The Intel investment was a more conventional gambit, both for Nvidia and for the tech industry in general. Nvidia’s flutter on OpenAI was a different beast entirely.
Supply-Side Infrastructure Will Need Demand-Side Support
The investment boffins at Barclays have done some back-of-the-envelope calculations and now estimate that more than $2 trillion of spending on AI infrastructure could be in the books by the end of the decade. Of that $2 trillion in expenditures, they forecast that approximately 65% to 70% will be allocated to compute and network infrastructure. Even then, they believe the numbers could go higher, potentially reaching $3 to $4 trillion.
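To put the Barclays figures in concrete terms, here is a trivial sketch of the split they describe, using only the numbers cited above (the $2 trillion base, the 65% to 70% compute-and-network share, and the $3 to $4 trillion upside):

```python
# Rough split of the Barclays estimates cited above. Illustrative only;
# the inputs are taken directly from the text.

base_spend = 2e12             # >$2T in AI infrastructure by end of decade
compute_share = (0.65, 0.70)  # share allocated to compute and network
upside = (3e12, 4e12)         # potential upper range of total spending

low, high = (base_spend * s for s in compute_share)
print(f"Compute/network slice: ${low / 1e12:.2f}T to ${high / 1e12:.2f}T")
print(f"Upside scenario: ${upside[0] / 1e12:.0f}T to ${upside[1] / 1e12:.0f}T")
```

That is, on the base case alone, roughly $1.3 trillion to $1.4 trillion would flow into compute and network infrastructure before the decade is out.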
These are gobsmacking numbers, and when you see such numbers, you always want to ask the forecaster about the rudiments of their assumptions and methodology.
Irrespective of why the estimates have been ratcheted to nosebleed altitudes, my concern is that they seem predicated entirely on supply-side market dynamics. What I mean is that these projections are based exclusively on the projected buildout of datacenters and the infrastructure that will go into them. Little, if any, consideration has gone into accurately forecasting demand for the AI applications and services that will presumably keep these datacenters running. If compelling use cases and revenue-generating enterprise and consumer demand fail to materialize, or fail to develop sufficiently to absorb the capacity of these capacious AI datacenters, any market growth on the infrastructure side will decelerate. In worst-case scenarios, the growth would stop in its tracks.
You can only get high on your own supply for so long. If there is a disconnect between supplier capacity and customer demand for the services created (whether genAI bots or agentic AI or, dream of dreams, mythical AGI), the rollercoaster will be derailed and a lot of people, including investors, will get hurt.
Eventually, a market for this stuff — and a very, very big market at that — will need to develop to sustain any long-term buildout of datacenters and infrastructure on the unprecedented scale needed by the likes of Nvidia and OpenAI and projected by some Wall Street firms. It could happen, of course, but I think some caution is advised. We might be getting ahead of ourselves on the infrastructure side of the equation.
Even if commensurate demand does materialize, problems will have to be resolved, including the specter of structural limits to growth. We’ll explore some of those issues in due course.