Presenting a Tapas Menu of Recent Tech News

Each week, we’re swamped by events and information. It’s hard to keep up. We do our best to remain apprised of what matters to us, but relevant developments can evade the notice of even the most diligent observer. 

What follows are news items that caught my attention but might have escaped yours. While I found these morsels intriguing, I wasn’t inclined to transform any single one of them into a full-course feast. Instead, I’ll serve them up as snack-size appetizers, replete with succinct commentary and interpretation. 

Let’s begin. (Yes, that’s a bit redundant, but indulge me just this once.) 

With the rise of AI, on-premises datacenter revanchists have crept furtively from the shadows for another assault on the bastions of cloud computing. These on-premises insurgents argue that AI somehow refutes the business logic that powered the migration of virtualized and containerized workloads to cloud infrastructure during the past decade. 

In a piece at VentureBeat, James Thomason evaluates an analysis from Sid Premkumar, founder of AI startup Lytix, who posited that running an open-source AI model in an on-premises datacenter could be considerably less expensive than using Amazon Web Services (AWS) for the same purpose. After reviewing Premkumar’s methodology and noting a few significant omissions, Thomason concludes his critique with the following summary:

Just as in “The Great Cloud Wars,” the cloud is already poised to emerge victorious in the battle for AI infrastructure dominance. It’s just a matter of time. While self-hosting AI models may appear cost-effective on the surface, as Premkumar’s analysis suggests, the true costs and risks of on-premises AI infrastructure are far greater than meets the eye. The cloud’s unparalleled advantages, combined with the emergence of privacy-preserving AI services, make it the clear winner in the AI infrastructure debate. As businesses navigate the exciting but uncertain waters of the AI revolution, betting on the cloud is still the surest path to success.

Rooting interests aside, Thomason’s logic and reasoning are consistent and persuasive. Just as in the pre-AI period, some organizations will have cogent rationales and sufficient in-house capabilities to run new workloads in on-premises datacenters, but they appear destined – yet again – to be among the minority. 

Continuously running an AI environment is not a trivial undertaking, and it’s likely to become more exacting as the technology gains in sophistication (perhaps more rapidly than we suspect). Budget demands alone, driven by the prices of processing systems and the attendant costs of electricity, are enough to give any CFO pause, especially given that ROIs and business cases remain unsteady works in progress. That’s why the path of least resistance, and faster results, leads most readily to the cloud. 
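To make the shape of that argument concrete, here is a minimal back-of-envelope sketch of the kind of comparison Premkumar and Thomason are debating. Every figure in it is a hypothetical placeholder, not a number drawn from either analysis; the point is simply that the answer hinges on utilization and on the operational line items that self-hosting advocates tend to leave out.

```python
# Back-of-envelope TCO sketch: on-premises GPU server vs. cloud GPU instance.
# All figures below are hypothetical placeholders, not quotes from Premkumar's
# or Thomason's analyses; plug in your own pricing to reproduce the exercise.

HOURS_PER_YEAR = 24 * 365

# --- On-premises assumptions (hypothetical) ---
server_capex = 250_000          # purchase price of a multi-GPU server, USD
amortization_years = 4          # depreciation horizon
power_draw_kw = 6.0             # average draw under load, kilowatts
electricity_per_kwh = 0.12      # USD per kWh
annual_ops_staff = 60_000       # share of datacenter/ops labor attributed to this server

onprem_annual = (
    server_capex / amortization_years
    + power_draw_kw * HOURS_PER_YEAR * electricity_per_kwh
    + annual_ops_staff
)

# --- Cloud assumptions (hypothetical) ---
instance_hourly_rate = 25.0     # on-demand rate for a comparable GPU instance, USD/hour
utilization = 0.40              # fraction of the year the instance actually runs

cloud_annual = instance_hourly_rate * HOURS_PER_YEAR * utilization

print(f"On-premises, per year:    ${onprem_annual:,.0f}")
print(f"Cloud (on-demand), per year: ${cloud_annual:,.0f}")
# The crossover is driven largely by utilization: keep the hardware busy and
# on-prem looks cheap; run it intermittently and pay-per-use wins.
```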

 Global Datacenter Blitz 

In anticipation of growing demand for cloud services, including AI, the hyperscalers continue to make multi-billion-dollar investments in new datacenters. Reuters reports that Microsoft plans to invest 6.69 billion euros ($7.16 billion) to develop new datacenters in Spain's northeastern region of Aragon, which has emerged as a hot spot (so to speak) for cloud computing in Europe. The investment will span a decade and follows a recent Microsoft announcement of a 2.1-billion-euro investment in datacenters in Madrid. In announcing its datacenter foray in Aragon, Microsoft is following the lead of AWS, which said last month that it would spend 15.7 billion euros during the next decade to build renewable-energy-powered datacenters in the region.  

It isn’t only places with ample renewable energy that are chosen as destinations for future datacenters. As recounted by the Wall Street Journal, hyperscalers are maintaining, if not accelerating, a torrid pace of datacenter buildouts. While the article in question nominally focused on Amazon’s investments in new datacenters in Taiwan, the author also mentioned AWS’s $9 billion in datacenter expenditures in Singapore, another $15 billion earmarked for cloud capacity in Japan, and more than $5 billion apportioned for datacenters in both Mexico and Saudi Arabia. The prior year, according to the same WSJ article, AWS committed to spend almost $13 billion through 2030 to expand datacenter infrastructure in India, while Microsoft went on a datacenter splurge in Southeast Asia and Google sank $2 billion into cloud investments in Malaysia. 

Earlier this month, as reported by Reuters, Microsoft announced a $3.2-billion, two-year investment to expand its cloud and AI infrastructure in Sweden. That’s just a partial glimpse into the datacenter arms race pursued by the world’s largest cloud hyperscalers. Every few weeks, a diligent reader would be able to add to the tally of announcements and buildouts. A billion here and a billion there, pretty soon we’re talking about serious money. 
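For readers keeping score at home, a quick and admittedly rough tally of just the commitments mentioned above gives a sense of the scale. The sketch below sums only the figures disclosed in those reports, converts euros at the rate implied by Reuters’ own numbers, and deliberately ignores timeframes, which range from two years to a decade.

```python
# Rough tally of the datacenter commitments cited above. Timeframes vary
# (some span a decade), so this is scale-of-spending arithmetic, not a
# like-for-like comparison. EUR figures are converted at ~1.07 USD/EUR,
# the rate implied by Reuters' $7.16B-for-6.69B-euro conversion.

EUR_TO_USD = 7.16 / 6.69  # approx. 1.07

commitments_usd_billions = {
    "Microsoft - Aragon, Spain": 6.69 * EUR_TO_USD,
    "Microsoft - Madrid, Spain": 2.1 * EUR_TO_USD,
    "AWS - Aragon, Spain": 15.7 * EUR_TO_USD,
    "AWS - Singapore": 9.0,
    "AWS - Japan": 15.0,
    "AWS - Mexico": 5.0,        # "more than $5 billion"
    "AWS - Saudi Arabia": 5.0,  # "more than $5 billion"
    "AWS - India (through 2030)": 13.0,
    "Google - Malaysia": 2.0,
    "Microsoft - Sweden": 3.2,
}
# Announcements without disclosed figures (e.g., AWS in Taiwan, Microsoft in
# Southeast Asia) are omitted, so the real total is higher still.

total = sum(commitments_usd_billions.values())
for region, amount in sorted(commitments_usd_billions.items(), key=lambda kv: -kv[1]):
    print(f"{region:<30} ${amount:5.1f}B")
print(f"{'Total (disclosed figures only)':<30} ${total:5.1f}B")
```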

There are at least two salient takeaways here: First, cloud computing remains a capital-intensive contest involving the pursuit of seemingly endless scale, with table stakes that prohibit participation from anybody other than a high roller; second, given AI’s ravenous appetite for electricity, regions possessing renewable energy, including cost-effective off-grid electricity sources, seem increasingly likely to attract more than their fair share of forthcoming datacenter buildouts. 

Paradoxically, Google and Microsoft are among the hyperscalers culling staff in their cloud units, as we’ve discussed here recently. Perhaps the industry giants will soon hire new employees to replace the jettisoned; or maybe the hyperscalers will tap intelligent automation, via AI, to facilitate – as the old chestnut goes – “doing more with less.” There’s no question that many major investors, perceiving AI as a labor-saving (eliminating?) device, are pushing for heightened productivity resulting in lower headcounts. 

Industries Pan for AI Gold 

Multiple fields and industries are kicking AI’s tires, trying to determine how much they want to spend on the new technology and which models will provide the greatest range and mileage. Agence France-Presse, via EnergyWorld, reports on the oil and gas industry’s dalliance with AI, which is the next logical step in the industry’s current use of data analytics and digital twins, the latter particularly evident in downstream operations such as refineries. 

Like other industries, the fossil-fuel camp is attracted by AI’s potential cost savings, though oil and gas concerns are also intrigued by how the technology might be used to mitigate industrial accidents and lower greenhouse-gas emissions, an inevitable and inherent byproduct of the industry’s raison d'etre.

Members of the industry ecosystem believe that AI could significantly reduce the time it takes for new employees to become proficient and to safely operate facilities at scale. What that suggests, I suspect, is that the industry believes it might be able to hire a greater number of entry-level employees while dispensing with some higher-salaried senior staff members. Nobody in the industry would say that publicly, of course, but that doesn’t mean they’re not thinking it. Regarding the mooted potential to shrink the industry’s carbon footprint, the article, to its credit, does concede that AI requires “huge amounts of electricity, mainly in datacenters.” AI giveth and AI taketh away. 

Cyber insurers are also optimistic about AI’s potential to give their businesses a boost, according to a VentureBeat article. Insurance purveyors offer policies that cover an extensive array of digital malefactions, including ransomware, phishing, and privileged-access credential attacks, many of which result in prohibitively expensive premiums for prospective clients. 

The hit parade of cyber-insurance claims in early 2024 was led by ransomware attacks, followed by supply-chain exploits and a golden oldie, business email compromise (BEC), common and persistent enough to be honored by an acronym all its own. Supply-chain attacks are conspicuously on the rise, however, incurring business costs of $46 billion last year. 

How can AI and LLMs help the cyber insurance industry? The gist is that AI might allow prospective insurance clients to gain a clearer understanding of their risk profile while reducing their exposure to threats, enabling them to qualify for policies with affordable premiums. In making themselves insurable at lower premiums, these companies would expand the pool of clients open to insurers. I believe we still refer to this phenomenon, sometimes unironically, as a “win-win scenario.” Predictive AI is also viewed as delivering greater stability to cyber insurers by reducing customer risk profiles and similarly decreasing the likelihood of simultaneous, large-scale cyber carnage. Another potential benefit is that predictive AI could result in fewer policy-application rejections as customers gain greater insight into their risk exposure and implement protective measures. 

We’ll continue to see a proliferation of articles on prospective applications of AI across a range of industries. The promise will be tantalizing, but most enterprises will likely be content to learn from the experiences, good and bad, of the earliest adopters. Like any other market, AI will have early adopters, followed by an early mainstream, a mainstream, and then a long tail that includes much-maligned laggards, characterized invariably by inveterate risk aversion and occasionally by stinginess. 

Spare a Thought for Indigent CEOs

In other news, the average annual compensation for a CEO at an S&P 500 company rose nearly 40 percent from 2017 through 2023, reaching $16.3 million last year, according to an AFP article quoting a study from consulting firm Equilar. During the same period, the average U.S. worker received a 27 percent boost in remuneration. An Equilar press release lists Broadcom’s Hock Tan as the highest-paid CEO of 2023. Tan collected total compensation of $161.8 million in 2023, outdistancing the comparatively immiserated Will J. Lansing of Fair Isaac Corporation ($66.3 million) and Tim Cook of Apple ($63.2 million). How do Lansing and Cook make ends meet? Maybe Tan, embracing the ethos of the gig economy, should consider a side hustle as an agent. 

If Elon Musk gets his way, now that Tesla’s shareholders have endorsed his imperial compensation plan, the Equilar leaderboard could look radically different next year. Much to his pique, however, Elon’s deliverance from poverty remains subject to the discretion of the courts rather than to the largesse of Tesla shareholders. 
