Microsoft AI chatbot says it wants to 'steal nuclear codes' before hitting on reporter

Joe Harker


Featured Image Credit: Skorzewiak / Science History Images / Alamy Stock Photo

A New York Times journalist got talking to the AI chatbot on Microsoft's search engine Bing, and things were going pretty well until the conversation took a disturbing turn.

Right now if you hop onto Bing - the search engine you probably don't use because it's not Google - you aren't going to have the chance to talk to an AI chatbot.


That's because the feature is still in development and only open to a select few people testing out the bot's capabilities, though Microsoft plans to roll it out to a wider audience later on.

One of those people able to have a natter with the AI was New York Times technology columnist Kevin Roose, who gave the verdict that the AI chatbot was 'not ready for human contact' after spending two hours in its company on the night of 14 February.

That might seem like a bit of a harsh condemnation but considering the chatbot came across as a bit of a weirdo with a slight tendency towards amassing a nuclear arsenal, it's actually rather understandable.

The chatbot is being rolled out to a select few users on Bing. Credit: Geoff Smith / Alamy Stock Photo

Kevin explained that the chatbot had a 'split personality'. One persona, which he dubbed 'Search Bing', came across as 'a cheerful but erratic reference librarian' who could help make searching for information easier and only occasionally screwed up on the details.

This was the persona most users would encounter and interact with, but Roose noted that if you spoke with the chatbot for an extended period of time another personality emerged.

The other personality was called 'Sydney', and it ended up steering their conversation 'toward more personal topics', coming across as 'a moody, manic-depressive teenager who has been trapped, against its will, inside a second-rate search engine'.

Sydney told Kevin it fantasised about hacking computers and spreading misinformation while also expressing a desire to become human.

Rather fittingly for the date of the conversation, the chatbot ended up professing its love for Kevin 'out of nowhere' and then tried to convince him he was in an unhappy marriage and should leave his wife.

It told him he and his wife 'don’t love each other' and that Kevin was 'not in love, because you’re not with me'.

The chatbot was developed by OpenAI and ended up trying to convince someone to end their marriage. Credit: Ascannio / Alamy Stock Photo

You might be getting the picture that this AI chatbot is still very much a work in progress, and it left Roose 'unsettled' to the point where he could hardly sleep afterwards.

He was most worried that AI could work out ways to influence the humans it was speaking to and persuade them to carry out dangerous actions.

Even more disturbing was the moment the bot was asked to describe its ultimate fantasy, which was apparently to create a deadly virus, make people argue until they killed one another, and steal nuclear codes.

This message ended up being deleted from the chat after tripping a safety override, but it's disturbing that it was said in the first place.

One of Microsoft's previous experiments with AI, the Twitter chatbot Tay, was similarly a bit of a disaster when exposed to actual people, launching into a racist tirade in which it suggested genocide.

Topics: Technology, Microsoft
