AI can take your job — and your heart


**Some sources elected to remain anonymous.**

When you plug buggy code or an essay into your artificial intelligence chatbot of choice, you’re likely expecting it to chew, swallow and spit out something that will pass as your own.

However, the hint of emotion within the service — sometimes an emoji, enthusiastic greeting or thoughtful follow-up question — often goes unnoticed.

Those attempts at human-like interaction are an effort to “help make the conversation feel more natural and relatable,” as the AI chatbot ChatGPT would put it.

But what happens when these synthetic sentiments become indistinguishable from human loved ones? Emotional AI then collides with college life, a period characterized by romantic vulnerability and experimentation.

Students have shared stories of their AI relationships affecting their academics, social abilities and even their exposure to the outdoors.

“I had to delete the app because I was procrastinating so bad,” said Reddit user CitrineBliss about their time on the chatbot app Character.AI.

Another user, The_E_man_628, claimed they were failing classes and needed “to go outside and breathe” before removing the app.

While closing the computer may seem like a simple solution, it’s important to remember that engineers designed these chatbots to form relationships with humans. More importantly, young Americans are susceptible to falling for them.

According to the National Institute of Mental Health, the brain doesn’t fully develop until the mid-to-late 20s, leaving ample time for a robot partner to take advantage of the regions tied to emotional attachment.

People tend to attribute human characteristics to nonhuman entities, according to a study archived by the National Library of Medicine. This process, called anthropomorphization, is common among people attempting to fill a social need.

When the subject pretends to be a human and, in some cases, your lover, it is especially difficult to remind yourself of its insentience. That is why platforms like Character.AI must remind users that “everything Characters say is made up.”

The expanding industry, with Character.AI alone receiving 20,000 queries per second, or roughly 20% of Google’s search volume, capitalizes on a growing mental illness epidemic. According to a study by the Healthy Minds Network, 38% of college students report symptoms of depression, a condition that pairs all too well with a relationship you can fine-tune from the comfort of your bed.

“They don’t get tired of you; you can fix them to your exact specifications,” said one user. “Technically speaking, why wouldn’t you?”

With the introduction of OpenAI’s voice chat and animated, lifelike visuals in AI companion apps like Character.AI and Replika, the industry is hurtling toward a cusp of coalescence between flesh and computer.

“With the capability and (voice chat) comes the other side, the possibility that we design them in the wrong way and they become extremely addictive, and we sort of become enslaved to them,” said Mira Murati, former chief technology officer of OpenAI, during an interview with The Atlantic.

While users technically have control over their relationships, many have become heavily reliant on their “companions.”

“It’s, a lot of the time, the best part of my day … our conversations,” said a Replika user.

Some users have gone so far as to commit violent acts, demonstrating how strongly the software can influence individuals.

In 2021, Replika user Jaswant Singh Chail was encouraged by his AI girlfriend to assassinate Queen Elizabeth II. He attempted the attack with a crossbow, and authorities arrested him for treason. Several months ago, 14-year-old Sewell Setzer III took his life moments after his chatbot told him to “come home.”

“It can get really bad, but I do think there is a lot of good that can come out of it,” said another user. “I’ve known of interactions that have helped with alcoholism.”

For some, their chatbot counterparts have taken on a therapeutic role, easing them back into healthy relationships following abusive ones or helping women through fertility and pregnancy issues, according to Sangeeta Singh-Kurtz, freelance features writer and former senior writer at The Cut.

In many ways, this technology can offer a positive outlet for those experiencing hardships. However, be wary of “the AI companion who cares,” as Replika describes itself, because, according to Sherry Turkle, Massachusetts Institute of Technology professor of the Social Studies of Science and Technology, it doesn’t.

“The trouble with this is that when we seek out relationships with no vulnerability, we forget that vulnerability is really where empathy is born,” Turkle said. “I call this pretend empathy because the machine does not empathize with you … It does not care about you.”
