Why Tech Employment Is Not What It Used to Be

When major new technologies arise, the expectation is that prosperity will follow. What’s more, most observers reasonably assume that the wealth will be shared, not evenly – because that never happens – but at least broadly.

For the most part, these expectations and assumptions were realized in past information-technology booms, including the mainframe and minicomputer era, the client-server period, and the dotcom boom that followed the rise of the commercial Internet and the advent of the World Wide Web (WWW, now colloquially referred to simply as the web). After the first instantiation of the web, with its inelegant and relatively static websites and its rudimentary ecommerce, came the rise of what was called Web 2.0, which set the stage for cloud preeminence. 

As the technology industry transitioned from each era of prosperity to its successor, aggregate wealth grew. While some mainframe and minicomputer vendors perished at the dextrously lethal hands of the client-server uprising, many industries and enterprises continued to use mainframes, some of which remain in operation today, still supporting processes and jobs that were defined decades earlier. The same pattern applied to the transition from client-server to the web and the dotcom era, in which many enterprises continued to invest in the ancien régime while hanging a new shingle on the web. 

As each new period succeeded the one before, more opportunity and wealth proliferated. It was not a zero-sum game, though clearly greater growth and vitality gradually accrued to the succeeding market. New riches were bestowed by each successive wave, but that didn’t mean that what had gone before was obliterated. Even as new markets took shape and grew, enterprises continued to make investments in prior generations of technology, processes, and people, all of which were integral to various organizational functions and processes. Technologies got old and sometimes infirm, but they took a long time to die. 

That was then, this is now. 

We have entered the comparatively mature years of the cloud era. That doesn’t mean the cloud isn’t experiencing growth. The growth continues, though the law of large numbers means the growth rate merely impresses rather than staggers. Still, the cloud is not a novelty now, far from it. Its once-glistening shine has dulled, replaced by a familiar matte finish.

Yes, we also have the relative newcomer genAI. But really, is genAI revolutionizing anything right now? GenAI emits a lot of sound, I grant you; it barks with a full-throated fury. If you can disregard the circus antics, however, and instead inspect the commercial takings, genAI will be revealed as incurring enormous investments while delivering a paucity of financial returns and business value to those who provide and consume its services.

Oh, the returns will come, say the bearers of AI picks and shovels and the purveyors of genAI services. Perhaps they will come, too, but any objective observer today must conclude, after an assiduous review of the available evidence, that genAI’s financial returns and rewards have yet to make a meaningful appearance. There’s been plenty of digging with picks and shovels, on both consumer and business terrain, but can we really say that rich veins of genAI gold have been commercialized?

Costs Exceed Returns

Instead, what we see before us are a lot of parlor tricks and an abundance of meretricious kitsch in the store windows, but how much value would you put on it? How much are you willing to pay for it, as a consumer or as a business? 

Even as we ponder those questions of value creation and realization, with no resolution in sight, the cloud giants continue to spend vast sums to develop, build, deploy, and provide genAI services. The spending on datacenters, IT infrastructure, large language models, oceans of pertinent data, and qualified personnel is inordinately high, even before accounting for the daunting cost of energy to fuel the datacenters. 

Therein lies the crux of the matter. If the returns on investment are not at a point where they surpass the costs, then something has to give, right? At some point, investors will demand proof that this stuff will not only pay for itself, but will deliver the riches that were forecast so histrionically when genAI was heralded paradoxically as both the greatest technological advance in human history and the greatest threat to human existence that technology had ever produced. 

Major investors aren’t yet storming the boardrooms of the cloud giants demanding fiscal rectitude on all matters related to AI. Patience is ebbing, sure, but some sand remains in the hourglass. Perhaps investor forbearance will pay off, if not pay dividends, and the returns on genAI investment will widen and intensify from trickle to torrent. At that point, all concerned parties – the genAI infrastructure suppliers, the model developers, the real-estate concerns, the energy providers, the cloud giants, and their associated workforces – will be contented, if not necessarily happy (because happiness, in this world, is really hard to sustain indefinitely). 

In the meantime, though, something odd is happening. The technology industry, which has perennially spun attractively remunerated jobs as prolifically as mutant spiders might spin webs, has lost its job-creating mojo. It’s not only the relative and absolute number of jobs that seems to be declining, but also the quality and compensation of the jobs that remain available. 

A couple of weeks ago, I read a Wall Street Journal article indicating that the unemployment rate for U.S. information-technology workers reached 6% in August, up from 5.6% in July. The WSJ attributed the shift to “a boom in artificial intelligence” that continues to “drastically alter the tech landscape.” There’s more to the picture, of course, and I'll get to it shortly.

Travails of Tech Workers 

Interestingly, the article, citing data from Janco Associates and the U.S. Bureau of Labor Statistics, indicated that the unemployment rate among U.S. tech workers was higher than the aggregate national jobless rate, which stood at 4.2% in August. Here’s a relevant excerpt:

In August, there were 148,000 unemployed IT workers, more than the 145,000 in July, according to consulting firm Janco Associates, which based its findings on data from the U.S. Department of Labor. The IT unemployment rate has been above the national jobless rate for seven of the last eight months, Janco found. On Friday, the Bureau of Labor Statistics said the national jobless rate ticked down to 4.2% in August as the economy added 142,000 jobs.

Attempting to diagnose the tech-industry malaise, the article suggested that “part of the difficulty recently laid-off IT workers now face is a disconnect between the skills they have and how much they expect to be paid.” Job postings for software development and IT support were reported to have decreased in number, even when compared to a benchmark that preceded the pandemic. Moreover, tech is experiencing slower pay growth, according to wage data from Indeed.  

Yesterday, the Wall Street Journal published a subsequent article revealing that many technology jobs have dried up and are not expected to make an imminent return. From that article:

Once heavily wooed and fought over by companies, tech talent is now wrestling for scarcer positions. The stark reversal of fortunes for a group long in the driver’s seat signals more than temporary discomfort. It’s a reset in an industry that is fundamentally readjusting its labor needs and pushing some workers out.
Postings for software development jobs are down more than 30% since February 2020, according to Indeed.com. Industry layoffs have continued this year with tech companies shedding around 137,000 jobs since January, according to Layoffs.fyi. Many tech workers, too young to have endured the dot-com bubble burst in the early 2000s, now face for the first time what it’s like to hustle to find work. 
 
Company strategies are also shifting. Instead of growth at all costs and investment in moonshot projects, tech firms have become laser focused on revenue-generating products and services. They have pulled back on entry-level hires, cut recruiting teams and jettisoned projects and jobs in areas that weren’t huge moneymakers, including virtual reality and devices. 
At the same time, they started putting enormous resources into AI. The release of ChatGPT in late 2022 offered a glimpse into generative AI’s ability to create humanlike content and potentially transform industries. It ignited a frenzy of investment and a race to build the most advanced AI systems. Workers with expertise in the field are among the few strong categories. 

 

2024: Perhaps Not a Great Vintage for IT’s Foot Soldiers 

At the risk of solipsism and shameless self-indulgence, allow me to quote from my own post earlier this year:

As discussed previously, the frequency and number of tech layoffs at the start of this year strongly suggested that 2024 might see as many job cuts as the preceding year, and perhaps more. If you’ll recall, Reuters reported that the tech sector shed 168,032 jobs in 2023, achieving the dubious distinction of topping the leader board of industries with the highest number of layoffs, according to a report issued by Challenger, Gray & Christmas. The staffing cull was led by Alphabet (Google), Microsoft, Amazon, and Meta, who combined to jettison tens of thousands of employees.
A Wall Street Journal article, published yesterday, provides further confirmation that 2024 might not provide anxious employees with a respite from human-resource departments’ grim reapers. It remains to be seen whether 2024 will surpass 2023’s tech-layoff numbers, but current evidence and the general market trend suggest that the possibility could become a probability, and perhaps even a certainty, before we reach the fourth quarter.  

I quoted myself, but mercifully not in third person, for a reason, and here it is: If a whopping, inordinately high 168,032 jobs were slashed from the technology industry last year, according to the Reuters report cited above, and we’re already at job losses of more than 137,000 as of the end of August, I’d say there’s a high degree of probability that the industry will shed even more jobs this year than last year. Admittedly, I’m not exactly going out on a limb here.
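For the arithmetically inclined, here’s the back-of-envelope annualization behind that claim, as a minimal sketch in Python. The figures are the Reuters and Layoffs.fyi numbers cited above; the constant-pace assumption, the variable names, and the code itself are mine, purely illustrative, not a forecasting model.

```python
# Back-of-envelope annualization of 2024 tech layoffs (illustrative sketch, not a forecast).
layoffs_2023 = 168_032                  # 2023 total, per the Reuters report cited above
layoffs_2024_through_august = 137_000   # January-August 2024, per Layoffs.fyi

monthly_pace = layoffs_2024_through_august / 8   # ~17,125 job cuts per month
projected_2024_total = monthly_pace * 12         # ~205,500 if that pace simply holds

print(f"Monthly pace: {monthly_pace:,.0f}")
print(f"Naive full-year projection: {projected_2024_total:,.0f}")
print(f"On track to exceed 2023: {projected_2024_total > layoffs_2023}")
```

At a constant pace, the year would end north of 200,000 cuts, comfortably above 2023’s total, which is why that limb I’m not going out on is a sturdy one.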

The latest WSJ article also mentions the otherworldly sequence of events during COVID that led first to widespread overstaffing at tech companies and then to a contraction and extensive layoffs. There’s no question about the whipsaw effect triggered by the pandemic and its aftermath, but, as I see it, it’s not that relevant to what’s happening now, at a time when even tech’s most profitable companies are jettisoning employees. 

Indeed, the WSJ article makes the following observation regarding a trend that preceded the pandemic-related events by more than two years:

The payroll services company ADP started tracking employment for software developers among its customers in January 2018, observing a steady climb until it hit a peak in October 2019. 
The surge of hiring during the pandemic slowed the overall downward trend but didn’t reverse it, according to Nela Richardson, head of ADP Research. One of the causes is the natural trajectory of an industry grounded in innovation. “You’re not breaking as much new ground in terms of the digital space as earlier time periods,” she says, adding that increasingly, “There’s a tech solution instead of just always a person solution.” 

True, but it seems genAI, flush with cash and prioritized by boards and CEOs, still needs people to create it, to get it up and running, and to maintain it. As the article explains, prospective employees with experience and proficiency in large language models (LLMs) are candidates for jobs that pay more than $1 million annually. AI engineers are reputedly offered salaries two to four times greater than those of a “regular engineer,” an ambiguous designation that goes undefined in the article. A point that is made cogently, however, is that the extravagant spending on genAI talent is squeezing the budgets and resources allocated to other IT technologies and projects. 

Behold the New, Old Austerity 

 The reallocation of resources and investment toward genAI and away from preexisting technology initiatives is only part of the story, though. It’s also true that the technology industry, partly as a result of its continual advances in harnessing and manipulating data, is under mounting pressure to achieve ever-greater personnel productivity and business efficiencies, which take the form of leaner, meaner organizational structures and management hierarchies. 

To be sure, in the earlier Wall Street Journal article, in a discussion of why tech companies are shedding employees and reducing costs, we find the following paragraph:

The cuts aren’t only aimed at helping major tech firms shift toward generative AI, but are also part of a continued focus on efficiency and profitability. 

There you have it. 

Well before the rise of genAI, data-driven efficiencies at tech companies were prescribed by management consultants. Since then, similar cost-cutting and productivity-enhancing initiatives have been espoused wholesale by private equity, investment banks, venture capitalists, and institutional investors. All of them use analytics and metrics to closely and continuously scrutinize their investments and properties. The purpose of such rigorous oversight is to identify and eradicate all instances of inefficiency and perceived waste. This isn’t a new development by any means, but it has become standard industry practice, tantamount to a hallowed precept in the canon of modern corporate-management doctrine.

Private equity has always been a fervent believer in data-based business efficiencies, but venture capitalists now have the fanatical zeal of the recent convert, which is why you’ve seen and heard so many of them decry waste and superfluity within the ranks not only of their portfolio companies but also of the industry’s largest companies, which the VCs view as exemplars that should set the standard for optimized business practices among startups and established companies alike.

Should we be inclined to view this development as the inevitable result of the industry’s maturation? 

The once-youthful rebels of the technology industry, formerly proponents of moving fast and breaking things and of failing fast and frequently, have aged into spreadsheet jockeys and rigorous data analysts, endlessly measuring all the business metrics that matter and some that don’t. The industry is getting older not only in how long it has been around, but also in the people who lead it, with most of the CEOs and CFOs at the cloud giants well into middle age. For these now-stolid leaders, the emphasis is firmly on the concentration and consolidation of power and wealth by their companies, some of which have attained trillion-dollar valuations. At such rarefied heights, it’s understandable perhaps that a staid business conservatism should assert itself, even as the industry eternally proclaims a romanticized vision of youthful idealism. 

Technology has been big business for a while, but it’s a bigger business now. When there’s more on the line – more money, more responsibility, more pressure from concerned and concerning stakeholders – the industry’s leaders feel compelled to demonstrate a sternness and seriousness of purpose that perhaps wasn’t as necessary, or frankly as beneficial, during the swashbuckling years. 

In boardrooms throughout the tech firmament, the furrowed brow is replacing the impish grin. We should expect the numbers, including those relating to employee head count, to continue to be scrutinized diligently and austerely, no matter what comes of genAI. 
