Some jobs are at risk of being swamped by AI, while others are more likely to be restructured to take greater advantage of AI tools and platforms. The question is, which jobs are more vulnerable? The metric employed to date – exposure to AI – may not have the right answers.
That’s the gist of a recent analysis by Ronnie Chatterji, chief economist at OpenAI, which proposes a framework to better understand and prepare for AI’s eventual impact on the labor market. “While AI capabilities are advancing very quickly, businesses, institutions, and labor markets take time to adjust,” the report states. “That lag means we must avoid two kinds of error: overstating immediate disruption and understating long-run impact.”
The report examines AI impact across more than 900 occupations covering 153.7 million jobs. About 18% were at near-term high automation risk, the analysis showed. Another 25% of jobs have high AI exposure but strong human necessity, and are likely to be reorganized rather than replaced. For 46% of jobs, little immediate change is imminent.
Another 12% of jobs could grow because of AI, as lower effective cost may increase utilization, affordability, access, or quality-adjusted output.
The framework asks four questions:
- “Can AI do a meaningful share of the work?”
- “If AI lowers the effective cost of providing the service, is demand likely to expand enough to absorb the productivity gain?”
- “If it can, for remaining tasks, is a person still central to the work’s delivery, judgment, accountability, or physical execution?”
- “Is AI already being used meaningfully for these tasks?”
The framework is built on demand elasticity, “how much demand changes when price changes,” which “is what connects productivity to employment.” Thus, if AI makes a good or service cheaper to provide, “the effect on employment in related occupations is ambiguous. When goods become cheaper, people often buy more of them, sometimes leading to an increase in employment in affected sectors.”
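The arithmetic behind that ambiguity can be sketched in a few lines. The following Python snippet is purely illustrative, not from the report: the elasticity values and the 30% cost drop are hypothetical, and it assumes the simplification that labor needed per unit of output falls in proportion to the cost reduction.

```python
# Illustrative sketch of how demand elasticity links an AI-driven
# productivity gain to employment. All numbers here are hypothetical,
# not taken from the OpenAI analysis.

def employment_change(cost_drop_pct: float, elasticity: float) -> float:
    """Approximate % change in labor demanded when AI cuts the
    effective cost of a service by cost_drop_pct percent.

    With elasticity e, a price drop of p% raises quantity demanded by
    roughly e * p%. Total labor scales with quantity demanded but
    shrinks with the productivity gain, so the net % change is
    approximately ((1 + e*p/100) * (1 - p/100) - 1) * 100.
    """
    quantity_factor = 1 + elasticity * cost_drop_pct / 100
    labor_per_unit_factor = 1 - cost_drop_pct / 100
    return (quantity_factor * labor_per_unit_factor - 1) * 100

# Inelastic demand (e = 0.5): a 30% cost drop shrinks employment.
print(round(employment_change(30, 0.5), 1))   # → -19.5

# Elastic demand (e = 2.0): the same cost drop expands employment.
print(round(employment_change(30, 2.0), 1))   # → 12.0
```

The sign flip is the whole point of the framework’s elasticity question: the same productivity gain cuts jobs where demand is saturated, and adds them where cheaper output unlocks new demand.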
Using this framework, the “least-elastic” occupations include firefighters and home health aides, the analysis shows. “Somewhat-elastic” occupations include physical therapists, editors, and dental hygienists. Some of the “most-elastic” occupations include graphic designers and software developers.
Overall, in those jobs with the highest automation potential, AI could do about 90% of the tasks, though actual usage is only at about 24% at this time, OpenAI estimates.
OpenAI’s economist admits that predictions about AI job displacement are not an exact science; based mainly on technical formulations, they may not stand up when they encounter organizational culture and dynamics on the ground. OpenAI calls the gap between automation potential and actual usage “a result of institutional friction and adoption lag,” said Darlene Newman, managing partner of Ivy Captech Advisors, in a LinkedIn post. “I’d call it something else. It’s debt – knowledge debt.”
No matter how advanced or modern the platform being adopted, issues arise with ungoverned or limited data, “and the intelligence layer living in spreadsheets,” she explained. “That’s an organizational knowledge problem. And it exists everywhere. You see it most clearly during major incidents. It can take twenty or more people hours just to understand what a system actually did and why.”
Complicating matters, the “system itself is opaque, even to the organization that built it. Rules were encoded by people who are no longer there. Logic exists that no one can fully explain. Processes were never written down because they lived in someone’s head. Now point an agent at that environment.”
A large language model in such an environment “isn’t reasoning from a clean knowledge base,” Newman cautions. “It’s pattern-matching against ambiguity. And unlike the humans on that incident call, who know they don’t know, the model doesn’t flag its own uncertainty. This is why people will remain in the loop longer than most projections suggest.”
Launching AI on top of an “unresolved knowledge debt doesn’t prove readiness,” she said. “It just makes the debt harder to see – until it surfaces in an output nobody can audit or stand behind.”