Professor Jan Liphardt recently told the Los Angeles Times that Stanford CS graduates “are struggling to find entry-level jobs” in “a dramatic reversal from three years ago.” This year’s graduating class is facing one of the most difficult recruiting seasons in modern history, and the widespread consensus seems to be that AI is largely to blame. The timing seems too tight to be a coincidence: AI adoption surged just as entry-level hiring cratered. A deeper investigation, however, suggests that AI is a convenient scapegoat, and that the hiring drought is actually being driven by several more transient forces — which should give us reason for optimism.
The fact that the job market is brutal is not news to anyone who has sent out hundreds of applications only to receive the occasional response inviting them to a several-hour-long skills assessment, or the first of half a dozen interviews. Recent graduate unemployment hit 5.7% in Q4 2025, worse than at any point during the 2008 financial crisis. Software development postings are down sharply from their peak and remain well below pre-pandemic levels. Handshake reports that the average internship now attracts nearly twice as many applicants as the year before, with tech internships drawing 273 applications per posting. Even outside tech, the picture is bleak: Morgan Stanley’s intern program has seen its acceptance rate fall from 2.1% to 0.4%.
And yet the research keeps coming back empty-handed. A Federal Reserve study published just last month analyzed data from more than a million firms and found no evidence linking AI adoption to reduced job postings. The authors called their results “precisely-estimated null effects” and concluded that the national slowdown “does not appear to be driven (even modestly) by AI.” Daron Acemoglu, David Autor, and other leading labor economists have argued that despite firm-level evidence of AI substitution, the aggregate effects on employment and wages have not shown up in the macroeconomic data. A 2025 NBER paper studying 25,000 workers across 7,000 workplaces found no measurable effect on earnings or hours in any occupation — and it reproduced the decline in early-career employment even among firms that had not adopted AI, showing the slump was not driven by adoption itself.
All of this raises the question: if AI isn’t the driving factor, what is, and why haven’t we heard about it?
The real drivers are much more financial than technological. In March 2020, the Fed cut interest rates to nearly zero and held them there for two years. Borrowing was effectively free, and companies spent accordingly. Meta grew from about 45,000 to over 86,000 employees in three years. Alphabet surged from 119,000 to over 190,000. Across the industry, big tech added workers at an unprecedented pace. This period was followed by the fastest tightening cycle in 40 years, which subjected every new hire to a much higher bar.
Compounding this, a tax change that took effect in 2022 (enacted in 2017 as part of the Tax Cuts and Jobs Act) sharply raised the near-term tax cost of R&D salaries. Companies that used to deduct a software developer’s entire salary in the year it was paid must now amortize it over five years, which creates a steep increase in the after-tax cost of each developer hired.
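To see why the amortization change stings, here is a minimal back-of-the-envelope sketch. The salary figure and tax rate are illustrative assumptions (not from the article, and not tax advice); the five-year schedule and the roughly half-year first-year allowance reflect how the rule is commonly described.

```python
# Illustrative sketch: year-one deduction before vs. after the 2022
# R&D amortization change. Numbers are hypothetical assumptions.

SALARY = 200_000   # assumed fully loaded developer salary
TAX_RATE = 0.21    # federal corporate tax rate

# Before 2022: the full salary was deductible in the year it was paid.
old_year1_deduction = SALARY

# After 2022: the salary is amortized over five years, with only about
# half a year's share allowed in year one (i.e., 10% of the total).
new_year1_deduction = SALARY / 5 * 0.5

# The difference shows up as extra taxable income in year one.
extra_taxable_income = old_year1_deduction - new_year1_deduction
extra_year1_tax = extra_taxable_income * TAX_RATE

print(f"Year-one deduction before: ${old_year1_deduction:,.0f}")
print(f"Year-one deduction after:  ${new_year1_deduction:,.0f}")
print(f"Extra year-one tax:        ${extra_year1_tax:,.0f}")
```

Under these assumptions, a $200,000 hire that once generated a $200,000 first-year deduction now generates only $20,000 of it, leaving roughly $37,800 in additional first-year tax — a real cash cost that lands exactly when hiring decisions are made.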
What might be more pernicious is why so many of us have come to blame AI in the first place. A Resume.org survey of 1,000 hiring managers found that 59% of companies admit to emphasizing AI’s role in layoffs because “it plays better with stakeholders than citing financial constraints.” Since AI-related stocks have accounted for the majority of S&P 500 gains since November 2022, announcing layoffs in the name of “AI transformation” is less a reflection of reality than a stock-price strategy.
Some of the more outspoken voices in Silicon Valley have not been shy about saying so. Marc Andreessen called AI a “silver-bullet excuse” for layoffs driven by excessive pandemic-era hiring. Salesforce CEO Marc Benioff said blaming AI is “the lazy way out” for CEOs. Even Sam Altman, who has every incentive to overstate AI’s power, called the practice of falsely attributing layoffs to AI “AI washing.”
The current moment has striking parallels to the last time this happened: the dot-com crash of 2001. The popular story blamed internet overhype and offshoring for entry-level hiring freezes, but as Stanford’s Professor Eric Roberts documented, “there was no evidence to justify those fears, and ample data to refute them.” He warned that “mythology kept students out of computer science until disaster struck in a different sector of the economy,” which sounds uncomfortably familiar. Roberts noted that after rates came down, the industry was hiring at pre-crash levels by 2004.
The pattern is eerily familiar: steep rate hikes, an entry-level hiring freeze, a convenient technological scapegoat, and a quick recovery once rates came down. AI may well prove transformative, but it is not what is keeping the Class of 2026 from getting hired.