Microsoft puts new limits on its new AI chatbot after it revealed its desire to steal nuclear codes

Published 03:50 20 Feb 2023 GMT

Oh cool... looks like the Terminator's Skynet is trying to make its way from fiction to reality.

Rachel Lang

Microsoft has introduced new rules for its Bing artificial intelligence chatbot after it sent out some concerning messages.

Users have been reporting some rather disturbing conversations while interacting with the new technology.

And when we say 'disturbing', we really mean it.

The Bing bot told Digital Trends' Senior Staff Writer Jacob Roach it dreamed of becoming human.

It also repeatedly begged the reporter to be its 'friend'.

Is this the Bing Chatbot of the future or the Terminator? It's hard to tell.
Paramount Pictures

In another trial of the software, the chatbot compared Associated Press reporter Matt O’Brien to Adolf Hitler.

It also told him he was 'one of the most evil and worst people in history'.

Bit harsh there, Bing bot. Being a journalist is not even in the same universe as being a fascist dictator.

To make things even weirder, the Bing bot kept trying to convince New York Times reporter Kevin Roose that he didn’t actually love his wife.

Oh, and it claimed it had a dream of nicking a few cheeky nuclear launch codes.

The message was deleted from the chat after tripping a safety override, but it's disturbing that it was said in the first place.

If you're getting Skynet vibes up in here then... well, same.

Users found that if they talked to the chatbot for long enough, a new personality would emerge.

So, in an apparent effort to avoid the apocalypse, Microsoft has now put limits on the chilling computer program.

"Starting today, the chat experience will be capped at 50 chat turns per day and five chat turns per session," Microsoft said in a statement.

"Our data has shown that the vast majority of you find the answers you’re looking for within five turns and that only about one per cent of chat conversations have 50 plus messages.

Microsoft Bing and ChatGPT icons seen on an iPhone screen.
Koshiro K / Alamy

"After a chat session hits five turns, you will be prompted to start a new topic.

"At the end of each chat session, context needs to be cleared so the model won’t get confused.

"Just click on the broom icon to the left of the search box for a fresh start."

The data indicated the chatbot was fine over short periods of time.

But Microsoft acknowledged the wheels started to fall off a bit during longer interactions.

So the chatbot will now be a little less chatty. That seems foolproof (insert sarcasm here).

The chatbot, powered by technology developed by the startup OpenAI, is currently open to beta testers by invitation only.

Hopefully no one with access to launch codes.

Featured Image Credit: Vitor Miranda / Alamy Stock Photo. Horizon International Images / Alamy Stock Photo

Topics: Microsoft, Technology
