
Topics: Artificial Intelligence, Social Media, Mental Health, Health, Technology
Health experts have issued a warning after revealing the results of a new study, which concludes that an addiction to AI should be recognised as a mental illness.
Since AI chatbots surged in popularity in 2023, many people have become increasingly reliant on the assistance they provide.
Whether it's finding out a random fact, planning a vacation itinerary, general admin, or even life advice, we’ve all turned to an AI chatbot at one time or another.
Many people have begun using the service much like a therapist, noting its ability to reason, understand, and respond thoughtfully.
But health experts have now issued a warning after seeing an increasing number of cases in which people have become dependent on the chatbots.
Dr Dongwook Yoo, author of a new paper on AI addiction, said that concerns are mainly focused on younger users, who are experimenting with the boundaries of the service.
Young people are roleplaying complex fantasies, voicing their frustrations, and seeking emotional connection with AI, according to the new research.

The health expert warned: "AI addiction is a growing problem causing many harms, yet some researchers deny it's even a real issue.
"And deliberate design decisions by some of the corporations involved are contributing, keeping users online regardless of their health or safety."
Efforts to formalise digital addictions have proved tricky in recent years, as scientists have laid out extremely tight criteria for what should be considered an 'addiction'.
The criteria are as follows: salience (the activity becomes the most important thing in a person's life), tolerance (usage increases over time), mood modification, conflict (it causes problems with others), withdrawal symptoms, and a tendency to relapse.
AI addiction has a dedicated forum on Reddit, where hundreds of users discuss their dependency on chatbot services - many of them in their teens.
One wrote: "At first I just thought it was interesting that I could get a response out of saying basically anything.
"Aside from being able to have basically any conversation I wanted, they also said whatever I wanted to hear. I think that spoke to the part of me that didn't always feel listened to or understood."

They added: "I neglected other parts of my life in favour of it, especially socially. It didn't feel that different from talking to a real person at times, so I'd sometimes talk to it more than I'd talk to a friend."
But they are far from alone.
Another young person on the forum admitted that they once went without sleep for an entire night because they 'stayed up talking to chatbots'.
Speaking to the Daily Mail, Karen Shen, lead author of the paper, concluded: "Our findings suggest that a central mechanism underlying addictive use is how users can get exactly anything they want with minimal effort.
"Our findings show that users report symptoms such as conflict and relapse that are comparable to those reported for behavioural addictions, which do have formal diagnoses."
If you or someone you know is struggling or in crisis, help is available through Mental Health America. Call or text 988 to reach a 24-hour crisis center or you can webchat at 988lifeline.org. You can also reach the Crisis Text Line by texting MHA to 741741.