In summary
The more students turn to chatbots, the fewer chances they have to develop real-life relationships that can lead to jobs and later success.
During the pandemic, longtime Bay Area college and career counselor Jon Siapno started developing a chatbot that could answer high schoolers’ questions about their future education options. He began with IBM’s Watson, a question-answering system that predates ChatGPT, but when generative artificial intelligence became widely accessible, he knew it was a game-changer.
“I thought it would take us maybe two years to build out the questions and answers,” Siapno said. “Back then you had to prewrite everything.”
An AI-powered chatbot trained on information about college and careers and designed to mimic human speech meant students at the Making Waves Academy charter school in the East Bay city of Richmond could soon text an AI Copilot to chat about their futures. The idea was that students could get basic questions out of the way — at any hour — before meeting with counselors like Siapno for more targeted conversations.
Almost one-quarter of U.S. schools don’t have a single counselor, according to the latest federal data, from the 2021-22 school year. California high schools fare better, but the state’s student-to-counselor ratio when ChatGPT debuted the following year was still 464-to-1, a far cry from the American School Counselor Association’s recommended ratio of 250-to-1.
Siapno wasn’t the only one to see generative AI’s potential to scale advising. A flood of bots designed to help people navigate their college and career options has surfaced over the last two years, often with human-sounding names like Ava, Kelly, Oli, Ethan and Coco. It’s unclear how many California high schools steer students to any of them, but the power of generative AI, and the scale at which young people are already turning to chatbots in their personal lives, is giving some people pause.
Julia Freeland Fisher is education director at the Clayton Christensen Institute, a nonprofit research organization that studies innovation. She recently sounded the alarm about the consequences of letting students develop relationships with AI-powered college and career counselors instead of human ones.
“It’s so tempting to see these bots as cursory,” Freeland Fisher said. “‘They’re not threatening real relationships.’ ‘These are just one-off chats.’ But we know from sociology that these one-off chats are actually big opportunities.”
Sociologists use the term “social capital” for the connections between people that facilitate their success. Those connections include “strong ties,” the close friends, family and coworkers who give us routine support, and “weak ties,” the acquaintances we see less regularly. For a long time, people thought weak ties were less important, but in 1973 Stanford sociologist Mark Granovetter wrote about “the strength of weak ties,” and a flood of studies since then has confirmed how important those more distant acquaintances can be for everything from job searches to emotional support.
As California considers regulating AI companions for young people, policymakers, tech companies and schools must consider how the burgeoning market for AI-driven college and career guidance could inadvertently become the source of a new problem.
“We’re creating this army of self-help bots to help students make their way through school and toward jobs,” Freeland Fisher said, “but those very same bots may be eroding the kinds of network-building opportunities that help students break into those jobs eventually.”
‘Like a mentor in your pocket’
The Making Waves Academy ensures all its graduates meet minimum admissions requirements for California’s four-year public colleges. Nine out of 10 graduates pursue higher education, and while they’re enrolled, staff at the Making Waves Education Foundation offer one-on-one coaching, scholarships, budget planning and career planning to help them graduate on time with no debt and a job offer.
Patrick O’Donnell, CEO of Making Waves, said his team has spent years thinking about how to scale the kinds of support it offers, given the scarcity of counselors in schools.
“Even if counselors wanted to make sure they were supporting students to explore their college and career options, it’s almost impossible to do and provide really personalized guidance,” O’Donnell said.
Early superusers of the Making Waves AI Copilot were ninth and 10th graders hungry for information but boxed out of meetings with school counselors focused on helping seniors plan their next steps.
CareerVillage is another California nonprofit focused on scaling good college and career advice. CareerVillage.org has been aggregating crowd-sourced questions and expert answers since 2011 to help people navigate the path to a good career.
When ChatGPT came out, co-founder and executive director Jared Chung saw the potential immediately. By the summer of 2023, his team had a full version of their AI Career Coach to pilot, thanks to help from 20 other nonprofits and educational institutions. Now “Coach” is available to individuals for free online, and high schools and colleges around the country are starting to embed it into their own advising.
At the University of Florida College of Nursing, a more specialized version of Coach, “Coach for Nurses,” gives users round-the-clock career exploration support. Shakira Henderson, dean of the college, said Coach is “a valuable supplement” to the college’s other career advising.
Coach for Nurses personalizes its conversation and advice based on a user’s career stage, interests and goals. It draws on current, geographically specific labor market information, so users can ask, for example, what a particular job pays in a particular county. Coach can also talk users through simulated nursing scenarios and offers chat-based activities and quizzes to help them explore different career paths.
Henderson is clear on the tool’s limitations, though: “AI cannot fully replace the nuanced, empathetic guidance provided by human mentors and career advisors,” she said. People can assess an aspiring nurse’s soft skills, help them think about the type of hospital they’d like most or the work environment in which they’d thrive. “A human advisor working with that student will be able to identify and connect more than an AI tool,” she said.
Of course, that requires students to have human advisors available to them. Marcus Strother, executive director of MENTOR California, a nonprofit supporting mentoring programs across the state, said Coach is worlds better than nothing.
“Most of our young people, particularly young people of color in low-income areas,” Strother said, “they don’t get the opportunities to meet those folks who are going to be able to give them the connection anyway.”
By contrast, Coach, he said, is “like having a mentor in your pocket.”
‘A regulatory desert’
Last month, California state Sen. Steve Padilla, a San Diego Democrat, introduced legislation to protect children from chatbots. Senate Bill 243 would, among other things, bar companies from designing chatbots that encourage users to engage more often, respond more quickly or chat longer. Such design elements use psychological tricks to keep users on the platform, which research indicates can create an addiction that crowds out other healthy activities or lead people to form unhealthy emotional attachments to the bots.
The addictive nature of certain apps has long been a critique of social media, especially for young people. In Freeland Fisher’s research for the Clayton Christensen Institute, she included a comment from Vinay Bhaskara, the co-founder of CollegeVine, which released a free AI counselor for high schoolers called Ivy in 2023.
“I’ve seen chat logs where students say, ‘Ivy, thank you so much. You’re like my best friend,’ which is both heartwarming, but also kind of scary. It’s a little bit of both,” the report quotes him as saying.
Reached by phone, Bhaskara said his company’s tool is designed to be friendly and conversational so students feel comfortable using it. Millions of students have used the chatbot for free on CollegeVine’s website, and more than 150 colleges in California and around the country have offered the technology to their own students. After seeing millions of emails, text messages and online chat sessions happen outside of working hours, Bhaskara now argues that the insight and support students have gotten from the chatbot outweigh the risks.
In announcing Padilla’s bill, his office referenced a number of cases in which chatbots directed children who had become attached to them to do dangerous things. At the most extreme, a Florida teen took his own life after a Character.AI chatbot he had become romantically involved with reportedly encouraged him to “come home to me.” Padilla said his bill wouldn’t keep young people from getting the benefits of college and career advising from chatbots; it would offer reasonable guidelines to address a serious need.
“This is a regulatory desert,” Padilla said. “There are no real guardrails around some of this.”
Freeland Fisher said the AI companions that young people are turning to for friendship and romantic relationships represent a far greater risk than AI-powered college and career advisors. But she said schools and tech developers still need to be careful when they seek out an AI solution to the counselor shortage.
For now, the danger may be limited to replacing conversations with school advisors. Eventually, though, sophisticated tools that capture more of students’ time and attention in the quest to fill a greater need could end up replacing conversations with other adults in their lives.
“These other supports matter down the line,” Freeland Fisher said. When students spend more time with chatbots, and even come to prefer interactions with bots over humans, it contributes to social isolation that can limit young people’s ability to amass all-important social capital. “That’s part of the warning that we’re trying to build in this research,” Freeland Fisher said. “It’s not to say ‘Don’t use bots.’ It’s just to have a much fuller picture of the potential costs.”
For their part, Making Waves and CareerVillage are taking some responsibility for the risks chatbots pose. Making Waves is retiring the AI Copilot this summer as the foundation shifts its mission toward using technology to help kids build social capital, not just get answers to questions about college and careers. And CareerVillage has already put safeguards in place that address some of Padilla’s concerns.
While Coach does tell users that the more they interact with the chatbot, the more personalized its recommendations become, Chung, the executive director, said Coach is designed to discuss only career development. “If you try to go on a long conversation about something unrelated, Coach will decline,” Chung said. He described a series of guardrails and safety processes the company has put in place to keep users from becoming emotionally attached to the chatbot.
“It’s work,” Chung said, “but I’m going to be honest with you, it’s not impossible work.”
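What might such a guardrail look like in practice? Below is a minimal, hypothetical sketch in Python of a topic filter that declines anything unrelated to career development, in the spirit of the behavior Chung describes. The keyword check, function names and messages are illustrative assumptions on my part, not CareerVillage’s actual code; a production system would more likely use an AI-based topic classifier in place of the keyword heuristic.

```python
# Hypothetical sketch of a topic guardrail like the one Chung describes:
# the assistant engages only on career development and declines anything else.
# All names and logic here are illustrative, not CareerVillage's actual code.

CAREER_KEYWORDS = {
    "career", "job", "resume", "interview", "college",
    "major", "internship", "salary", "nursing", "degree",
}

DECLINE_MESSAGE = (
    "I'm here to talk about careers and career development. "
    "Let's get back to your goals!"
)

def is_on_topic(message: str) -> bool:
    """Crude stand-in for a real topic classifier."""
    words = {w.strip(".,!?").lower() for w in message.split()}
    return bool(words & CAREER_KEYWORDS)

def guarded_reply(message: str) -> str:
    """Decline off-topic messages before any model is ever consulted."""
    if not is_on_topic(message):
        return DECLINE_MESSAGE
    # In a real system, a call to the underlying language model would go here.
    return "Great question! Let's explore that career path together."

print(guarded_reply("What salary can a nurse expect in Alameda County?"))
print(guarded_reply("Do you want to be my best friend?"))
```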
Data reporter Erica Yee contributed to this reporting.