Concern Grows Over Apple’s Decision To Scan Messages And Photos For Child Abuse

Cameron Frew



Apple has been criticised for its decision to scan users’ devices for child sexual abuse material.

As part of a planned update, photos will be scanned for indecent material (known as CSAM, or child sexual abuse material) before they are stored on iCloud, with each image compared against a database of known child abuse imagery provided by safeguarding organisations. If a match is detected, Apple will manually review the content and report it to the authorities if it is deemed abusive.
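In outline, the matching step described above can be sketched in miniature. This is purely illustrative: Apple's system uses a proprietary perceptual hash called NeuralHash, whereas the stand-in below uses a plain cryptographic hash, and every name and value here is hypothetical.

```python
import hashlib

# Hypothetical database of fingerprints of known abuse imagery,
# standing in for the hashes supplied by safeguarding organisations.
KNOWN_ABUSE_HASHES = {
    hashlib.sha256(b"known-image-1").hexdigest(),
    hashlib.sha256(b"known-image-2").hexdigest(),
}

def fingerprint(image_bytes: bytes) -> str:
    """Reduce an image to a fingerprint before iCloud upload.

    A plain SHA-256 is used here for illustration; Apple's real
    NeuralHash is a perceptual hash designed to survive resizing
    and recompression, which an exact hash does not.
    """
    return hashlib.sha256(image_bytes).hexdigest()

def should_flag_for_review(image_bytes: bytes) -> bool:
    """True if the image matches the known-abuse database,
    which would trigger the manual-review step."""
    return fingerprint(image_bytes) in KNOWN_ABUSE_HASHES

print(should_flag_for_review(b"known-image-1"))  # matches the database
print(should_flag_for_review(b"holiday-photo"))  # no match
```

The point of the design, as described, is that only matches against the pre-supplied database are surfaced for human review, rather than the content of every photo being inspected directly.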

The upcoming versions of iOS and iPadOS, expected to drop later this year, will have ‘new applications of cryptography to help limit the spread of CSAM online, while designing for user privacy’.

iPhones will be scanned for abusive material. (Pexels)

The move has drawn the concerns of WhatsApp head Will Cathcart. ‘I read the information Apple put out yesterday and I’m concerned. I think this is the wrong approach and a setback for people’s privacy all over the world. People have asked if we’ll adopt this system for WhatsApp. The answer is no,’ he tweeted.

While Cathcart supports the principle behind the tech, he wrote in a thread that ‘the approach they are taking introduces something very concerning into the world’.

‘Instead of focusing on making it easy for people to report content that’s shared with them, Apple has built software that can scan all the private photos on your phone – even photos you haven’t shared with anyone. That’s not privacy.

‘This is an Apple built and operated surveillance system that could very easily be used to scan private content for anything they or a government decides it wants to control. Countries where iPhones are sold will have different definitions on what is acceptable.

‘What will happen when spyware companies find a way to exploit this software? Recent reporting showed the cost of vulnerabilities in iOS software as is. What happens if someone figures out how to exploit this new system?’

