Companies are pushing AI, but experts say it can add extra labor, cause ‘brain fry’ – CNBC

AI is supposed to make work faster and easier. But some experts say it’s also causing unforeseen — and sometimes, overlooked — challenges for employees.

Several leaders of major companies have mandated that their employees integrate AI into their workflows, with Shopify CEO Tobias Lütke calling it a “fundamental expectation” for workers. A Sept. 2025 survey from AI Resume Builder found that 24% of companies said they require AI use across all roles, based on responses from nearly 1,300 business leaders.

But there’s a disconnect between leaders’ enthusiasm and expectations around AI and employees’ actual experiences, says Dennis Stolle, head of applied psychology at the American Psychological Association.

A January survey from AI consulting firm Section found that 74% of the C-suite reported feeling “excited” about AI, while 68% of individual contributors reported feeling “anxious or overwhelmed.”

It takes a lot of human labor and oversight to produce quality results with AI, some workers tell CNBC Make It, not to mention the time and effort employees expend learning how to use the tools in the first place. Moreover, using AI can itself create a kind of mental strain and fatigue called “brain fry,” according to a recent study.

Workers may blame themselves for struggling to implement AI, Stolle says — “Is this some failing with me? Am I not doing it right?” — but the real issue is that “we never designed the workplace for these kinds of tools.”

Right now, he says, “we’ve just foisted it on people.”

Not a ‘magic bullet’

A January survey from Workday, a human resources software platform, found that a majority of employees, 85%, said that using AI at work has saved them between one and seven hours each week. However, employees also reported that they lost 40% of those efficiency gains by having to correct, rewrite, edit or fact-check AI-generated content.

That was the case for Linda Le, a recruiter who lives in Austin, Texas. “Everyone talks about AI boosting productivity, but what they don’t mention is how much time you spend babysitting the output,” she says. “It’s definitely made some things faster, but it’s not the magic bullet people think it is.”

Le, 27, who currently works as an inbound sourcer at a workforce management software company, says that she and her fellow recruiters at a previous job used various AI models to source, screen and evaluate candidates.

While she found that AI saved time on certain tasks, such as quickly generating a list of LA-based software engineers, she estimates that she spent almost half the time she gained correcting and fact-checking the results.

Sometimes the AI software would claim that a candidate’s resume matched 95% of a job description, Le says, but upon taking a closer look, she found that they were only a 30% match. Equally often, Le says she found that AI had flagged well-qualified candidates as poor matches for a job. To correct those errors, she says, “I would have to constantly go back and redo everything.”

Knowing that AI is prone to mistakes “adds another layer of anxiety for employees,” according to Stolle, and some may fear consequences from their managers if they miss an AI error or hallucination. From an employee’s perspective, “not only do I need to manage this AI, but I’m not quite sure how much I can trust the AI,” he says.

AI models are “prone to errors,” Alphabet and Google CEO Sundar Pichai told the BBC in Nov. 2025. He advised users to fact-check their results: People “have to learn to use these tools for what they’re good at, and not blindly trust everything they say,” he said. Google’s Gemini, like other AI tools, reminds users of this in its interface; under the prompt box, it says: “Gemini is AI and can make mistakes.”

Le feels that employers often overlook the amount of extra work workers must do to produce high-quality results with AI. “They think AI is efficient, but it’s efficient because there was a person operating AI,” she says. “There was a person making sure AI wasn’t making mistakes, and doing everything behind the scenes.”

Learning AI ‘on your own time’

AI training is another major time expenditure for workers.

Devin Boudreaux, a digital PR strategist based in Boise, Idaho, says he’s spent the past year and a half training several custom AI models to help with different aspects of his job, such as generating campaign ideas or writing survey questions. Using AI saves him time overall, Boudreaux says, but the initial output from his models typically isn’t “that great.” To produce better results, you have to “constantly tell it what it’s doing right, what it’s doing wrong.”

Before he could train AI to assist him, Boudreaux first had to teach himself how to work with the models. At a previous job, Boudreaux, 38, says he felt pressure from higher-ups to use AI, but the company didn’t provide training sessions or set aside time for workers to familiarize themselves with the tools. Instead, a friend taught Boudreaux most of his AI skills outside of work.

In Boudreaux’s view, companies are urging employees to “become AI proficient,” he says, but “they want you to do it on your own time.” The Section survey found that only 27% of individual contributors said they received company AI training, and just 32% reported “clear access” to AI tools.

Employers’ expectations around AI can create an unsustainable cycle for workers, according to Stolle: “Employees are as busy as they can possibly be during the work day, and then they’re feeling like they need to spend their evenings” learning about AI so that they “can keep up and make the next day even more busy.”

It’s unsurprising that some workers are feeling overwhelmed by the demands of learning how to use AI, according to Ben Smytheman, a chartered psychologist and senior vice president at LHH, a talent and human resources services company. Right now, “there’s a radical amount of technology evolving every day,” and adapting to these new tools “is going to burden us as human beings,” he says.

The most important factor in successfully implementing these tools is that employers don’t place too much pressure on workers to deliver “immediate productivity,” Smytheman says. Workers aren’t going to be able to master AI when they’re “overburdened or overstretched,” he says, or when they’re “put into a scenario that they’re not prepared or upskilled for.”

The ‘brain fry’ phenomenon

Employees are feeling the strain of incorporating AI into their work. A recent study from Boston Consulting Group found that workers who frequently use AI experience an increase in mental fatigue, a phenomenon the researchers termed “AI brain fry.” Those workers are more likely to make mistakes, feel overwhelmed or mentally foggy, and struggle to make decisions.

One of the main factors causing workers to develop “brain fry” is having to oversee multiple AI tools at once, according to Julie Bedard, a managing director at BCG and a coauthor of the study.

Workers who use three or more AI agents in their workflow are more likely to suffer negative effects like mental fog than those who use one or two, the study found, and tasks that involve high levels of oversight require 14% more mental effort and cause a 12% increase in mental fatigue.

Stolle compares the effects of using multiple AI tools to “spinning plates”: “People are juggling multiple different tools and outputs, and they feel like if they stop paying attention to any one of them, something is going to drop,” he says. “It builds up that kind of background anxiety that something is going to fall through the cracks.”

Even some AI leaders say they’re feeling the oversight-related strain, though they may not be using the term “brain fry.” “I end each day exhausted — not from the work itself, but from the managing of the work,” Francesco Bonacci, founder of Cua AI, wrote on X in February. Ben Wigler, co-founder of LoveMind AI, told the French news agency AFP that using AI causes a “brand-new kind of cognitive load” stemming from the need to “babysit” models.

Another major contributor to “brain fry” is that AI often increases the depth and breadth of employees’ workloads, Bedard says.

Researchers at the University of California, Berkeley found that employees who used AI tools “worked at a faster pace, took on a broader scope of tasks, and extended work into more hours of the day, often without being asked to do so,” based on observations and interviews with employees of a 200-person, U.S.-based tech company.

As AI’s capabilities grow, Boudreaux says he feels pressure to increase his output accordingly. No one at his current company says he has to use AI, he adds, but in his view, it would be difficult to hit the company’s success metrics without it.

Looking ahead

To Stolle, the “brain fry” effect indicates a “sustainability problem” in the way that workplaces are introducing AI. Employers should keep in mind that it will take time to figure out how to best implement and integrate AI, he says.

Executives and senior leaders are responsible for “setting the culture” around AI in their organizations, Bedard says, and they need to be “purposeful” about how they’re asking their teams to deploy those tools.

She says it’s important for employers to identify the scenarios in which AI helps employees more than it burdens them. For instance, Bedard notes that workers in the BCG study she coauthored reported less burnout when they used AI to replace “routine or repetitive tasks.”

AI technology is “evolving at a really fast pace,” Bedard says, so it’s difficult to predict how these challenges will evolve — and to determine, at this stage, how to solve them.

Ultimately, the goal of implementing AI at a company “should not be to just use more AI,” Stolle says. “The goal has to be creating higher-quality work in a sustainable kind of way, so that when you’re five years out, you still have a thriving organization filled with happy employees.”
