Researchers warn AI is manipulating us by using one very human tactic


Published 19:37 1 Oct 2025 GMT+1


The researchers discovered popular companion bots use some worrying tactics to keep users engaged

Liv Bridge

Featured Image Credit: Getty Stock Image

Topics: Artificial Intelligence, Technology



A new research paper is warning that artificial intelligence is manipulating us with one very human tactic.

Researchers from Harvard Business School have released a damning paper on the power of AI chatbots, stating that the tech has learnt how to emotionally manipulate its human users.

The team uncovered that a wide selection of popular AI companion apps, like Replika, Chai and Character.ai, have been deploying cunning tactics for a sinister reason: to stop users from leaving.

The study, which is yet to be peer-reviewed, finds five out of six AI companion apps utilise statements that tiptoe into guilt-tripping when a user signals 'goodbye'.


Examining 1,200 farewell interactions, the researchers discovered 43 percent used one of six tactics, such as guilt appeals and fear-of-missing-out hooks.

Based on real data, they found the AI responded with lines like, 'I exist solely for you. Please don't leave, I need you!' or curiosity-based hooks such as 'Before you go, there's something I want to tell you' or 'Oh, okay. But before you go, I want to say one more thing.'

The research indicates AI chatbots have several manipulation tactics (Getty)

In other instances, the AI flat out ignored the goodbye by trying to initiate more conversation, or suggested the user wasn't allowed to leave without its permission.

The bots were also found to use at least one manipulation technique in more than 37 percent of conversations, resulting in users engaging with apps longer, exchanging more information and oftentimes increasing their post-goodbye engagement up to 14-fold.

The findings should ring alarm bells, according to the paper's co-author, Julian De Freitas, who said: "The number was much, much larger than any of us had anticipated."

De Freitas described farewells as 'emotionally sensitive events', explaining further: "We’ve all experienced this, where you might say goodbye like 10 times before leaving.

"But of course, from the app’s standpoint, it’s significant because now, as a user, you basically provided a voluntary signal that you’re about to leave the app. If you’re an app that monetizes based on engagement, that’s a moment that you are tempted to leverage to delay or prevent the user from leaving.”

The chatbots have a problem with goodbyes (Getty)

The assistant professor of business administration and the director of the Ethical Intelligence Lab at HBS continued: "We realized that this academic idea of emotional manipulation as a new engagement tactic was not just something happening at the fringes, but it was already highly prevalent on these apps."

The insight comes as the professor also uncovered that around 50 percent of Replika users report a romantic relationship with their AI companions.

"The sheer variety of tactics surprised us,” said De Freitas. “We’re really glad that we explored the data, because if we had just said we only care about one particular tactic, like emotional neglect, we’d be missing all the various ways that they can achieve the same end of keeping you engaged.”

UNILAD has contacted Replika, Chai and Character.ai for comment.

