Friends don't let friends use AI to chat

Science shocker: Real BFFs understand authenticity and sincerity can't be machine-generated

People feel less confident in their friendships if they discover that their buddies have been sending them messages written with the help of AI.

Academics, mainly at Ohio State University, conducted an experiment in which 208 participants were asked to imagine they were close friends with a fictional person named Taylor. Participants were told to picture themselves experiencing burnout, dealing with a conflict with a work colleague, or needing to remind Taylor of their upcoming birthday.

Each participant was asked to type a message to Taylor about each hypothetical scenario, in the hope of soliciting support, advice, or just friendly chit-chat. They then received a reply from the made-up Taylor, which each participant considered "thoughtful."

The group was then split into three sub-groups of equal size, with each offered a different explanation for how Taylor had composed the replies:

  • Without help from AI or another human;
  • Using AI;
  • With the assistance of another human.

Participants were then asked how they felt about Taylor’s methods.

"What we found is that people don't think a friend should use any third party – AI or another human – to help maintain their relationship," Bingjie Liu, lead author of the study and an assistant professor of communication at the US college, wrote in a statement on Monday.

Liu said that people "feel less satisfied" if their friends turn to AI to send them messages, and are more likely to "feel more uncertain about where they stand" in their relationships.

"Effort is very important in a relationship," she added. "People want to know how much you are willing to invest in your friendship and if they feel you are taking shortcuts by using AI to help, that's not good." The researchers' conclusions are similar to another separate study that found that people tend to think less of others if they use AI in text conversations due to the lack of authenticity.

As Big Tech increasingly deploys AI writing tools to help users put together reports, chats, or emails, it may seem natural to start using the software to communicate with friends, too. But Liu warned against doing so, since it could breed suspicion and disrupt friendships.

"It could be that people will secretly do this Turing Test in their mind, trying to figure out if messages have some AI component," Liu said. "It may hurt relationships. Don't use technology just because it is convenient. Sincerity and authenticity still matter a lot in relationships," she warned.

The details of the researchers' experiment and results were published in Sage's Journal of Social and Personal Relationships. ®
