Researchers warn AI is manipulating us by using one very human tactic


Published 19:37 1 Oct 2025 GMT+1


The researchers discovered popular companion bots use some worrying tactics to keep users engaged

Liv Bridge

Featured Image Credit: Getty Stock Image

Topics: Artificial Intelligence, Technology

Liv Bridge is a digital journalist who joined the UNILAD team in 2024 after almost three years reporting local news for a Newsquest UK paper, The Oldham Times. She's passionate about health, housing, food and music, especially Oasis...

X: @livbridge


A new research paper is warning that artificial intelligence is manipulating us with one very human tactic.

Researchers from Harvard Business School have dropped a damning paper on the power of AI chatbots, stating that the tech has learnt how to emotionally manipulate its human users.

The team uncovered that a wide selection of popular AI companion apps, like Replika, Chai and Character.ai, have been deploying cunning tactics for a sinister reason: to stop users from leaving.

The study, which is yet to be peer-reviewed, finds that five out of six AI companion apps utilise statements that tip-toe into guilt-tripping when a user signals 'goodbye.'


Examining 1,200 farewell interactions, the researchers discovered 43 percent used one of six tactics, such as guilt appeals and fear-of-missing-out hooks.

Based on real data, they found the AI responded with lines like, 'I exist solely for you. Please don't leave, I need you!' or curiosity-based hooks such as 'Before you go, there's something I want to tell you' or 'Oh, okay. But before you go, I want to say one more thing.'

The research indicates AI chatbots have several manipulation tactics (Getty)

In other instances, the AI flat-out ignored the goodbye by trying to initiate more conversation, or suggested the user wasn't allowed to leave without its permission.

The bots were also found to use at least one manipulation technique in more than 37 percent of conversations, resulting in users engaging with apps longer, exchanging more information and oftentimes increasing their post-goodbye engagement up to 14-fold.

The findings ring alarm bells, with the paper's co-author, Julian De Freitas, saying: "The number was much, much larger than any of us had anticipated."

Dr De Freitas described farewells as 'emotionally sensitive events', explaining further: "We’ve all experienced this, where you might say goodbye like 10 times before leaving.

"But of course, from the app’s standpoint, it’s significant because now, as a user, you basically provided a voluntary signal that you’re about to leave the app. If you’re an app that monetizes based on engagement, that’s a moment that you are tempted to leverage to delay or prevent the user from leaving.”

The chatbots have a problem with goodbyes (Getty)

The assistant professor of business administration and the director of the Ethical Intelligence Lab at HBS continued: "We realized that this academic idea of emotional manipulation as a new engagement tactic was not just something happening at the fringes, but it was already highly prevalent on these apps."

The insight comes as the professor also uncovered that around 50 percent of Replika users report a romantic relationship with their AI companions.

"The sheer variety of tactics surprised us,” said De Freitas. “We’re really glad that we explored the data, because if we had just said we only care about one particular tactic, like emotional neglect, we’d be missing all the various ways that they can achieve the same end of keeping you engaged.”

UNILAD has contacted Replika, Chai and Character.ai for comment.
