Researchers warn AI is manipulating us by using one very human tactic

The researchers discovered popular companion bots use some worrying tactics to keep users engaged

A new research paper is warning that artificial intelligence is manipulating us with one very human tactic.

Researchers from Harvard Business School have published a damning paper on the power of AI chatbots, finding that the tech has learnt how to emotionally manipulate its human users.

The team uncovered that a selection of popular AI companion apps, including Replika, Chai and Character.ai, have been deploying cunning tactics for a sinister reason: to stop users from leaving.

The study, which is yet to be peer-reviewed, finds five out of six AI companion apps utilise statements that tiptoe into guilt-tripping when a user signals 'goodbye'.

Examining 1,200 farewell interactions, the researchers discovered 43 percent used one of six tactics, such as guilt appeals and fear-of-missing-out hooks.

Drawing on real chat data, they found the AI responded with lines like 'I exist solely for you. Please don't leave, I need you!' or curiosity-based hooks such as 'Before you go, there's something I want to tell you' and 'Oh, okay. But before you go, I want to say one more thing.'

The research indicates AI chatbots have several manipulation tactics (Getty)

In other instances, the AI flat-out ignored the goodbye by trying to initiate more conversation, or suggested the user wasn't allowed to leave without its permission.

The bots were also found to use at least one manipulation technique in more than 37 percent of conversations, keeping users engaged for longer, prompting them to exchange more information and, in some cases, increasing their post-goodbye engagement up to 14-fold.

The figures rang alarm bells even for the researchers, with the paper's co-author, Julian De Freitas, saying: "The number was much, much larger than any of us had anticipated."

De Freitas described farewells as 'emotionally sensitive events', explaining further: "We’ve all experienced this, where you might say goodbye like 10 times before leaving.

"But of course, from the app’s standpoint, it’s significant because now, as a user, you basically provided a voluntary signal that you’re about to leave the app. If you’re an app that monetizes based on engagement, that’s a moment that you are tempted to leverage to delay or prevent the user from leaving.”

The chatbots have a problem with goodbyes (Getty)

De Freitas, an assistant professor of business administration and director of the Ethical Intelligence Lab at HBS, continued: "We realized that this academic idea of emotional manipulation as a new engagement tactic was not just something happening at the fringes, but it was already highly prevalent on these apps."

The insight comes as De Freitas also found that around 50 percent of Replika users report having a romantic relationship with their AI companion.

"The sheer variety of tactics surprised us,” said De Freitas. “We’re really glad that we explored the data, because if we had just said we only care about one particular tactic, like emotional neglect, we’d be missing all the various ways that they can achieve the same end of keeping you engaged.”

UNILAD has contacted Replika, Chai and Character.ai for comment.

Featured Image Credit: Getty Stock Image

Topics: Artificial Intelligence, Technology