YouTube channels spreading QAnon conspiracy theories are deleting their own videos to evade punishment, researchers have discovered.
The videos, which spread misinformation and other harmful content that likely violates YouTube’s terms of service, are often left up for several days, racking up tens of thousands of views, before being deleted by their uploaders to avoid triggering suspensions.
This newly discovered tactic is just another example of the trouble YouTube and its parent company Alphabet are facing as they race to prevent extremist content from spreading on the platform. According to an investigation by CNET, hundreds of videos have been deleted by a ‘spam network’ comprising more than 40 YouTube accounts and far-right channels, many of which are based internationally while falsely claiming to be American.
One of the videos, discovered by CNET, spread a number of conspiracy theories based on ‘Q drops’ – often vague pieces of information supposedly posted by the anonymous figure behind the QAnon movement – including naming lawmakers as ‘primary targets in DC’ and claiming that President Biden has been secretly executed and replaced by a double.
First uploaded on April 27 by a YouTube channel called ‘It’s Time To Tell The Truth,’ the video gained more than 50,000 views before it was deleted eight days later. An error message on the video confirms it was taken down by its uploader. Some of the most popular videos that were later deleted received over 150,000 views.
Experts believe the new tactic is a deliberate attempt by these channels to avoid being suspended or permanently banned under YouTube’s ‘three-strike’ misinformation policy. Gideon Blocq, CEO of VineSight, a company that uses AI to analyse social media disinformation, told CNET, ‘This is being done for the purpose of not being kicked off the platform. It’s to avoid detection.’
Deleting videos is just one of an arsenal of tricks used by far-right channels to trip up YouTube’s system for detecting violators. Other evasive tactics include using voiceovers and distorted images that are harder for the site’s artificial intelligence safeguards to detect. Noah Schechter, a Stanford University student who first flagged the new method, told CNET the tactic was probably intended to allow the channels to continue to generate revenue through YouTube adverts.
A spokesperson for YouTube said the site was removing the channels found to be using these tactics, confirming in a statement, ‘After careful review, we have terminated the channels flagged to us by CNET for violating our spam policies.’
If you have a story you want to tell, send it to UNILAD via [email protected]