Editorial Note: As usual, all the information in this investigation comes from open sources. However, Bellingcat has decided not to link to content or profiles of people promoting hatred or disinformation, and has only named those who could already be considered prominent public figures. Given the many possible pitfalls of covering far-right communities, we tried to ground our reporting and writing of this story in the principles laid out in Data & Society’s report “The Oxygen of Amplification”.
Another white supremacist YouTube video draws to a close. In this 21-minute monologue, a Brit—with 131,000 subscribers and 9.2m total views—advocates for a white ethnostate based on a racist caricature of black inferiority. In other videos, he stereotypes migrants as rapists and orders his followers to “fight or die”.
YouTube suspended, then reinstated, his main channel in 2019. He has since had 2.5m more views. His accounts aren’t monetised, so he isn’t paid for clicks. Instead, he uses YouTube—and Twitter—to direct people to other platforms where he can profit from his white supremacist conspiracy theories.
In video descriptions, he links to fundraising pages on SubscribeStar, PayPal, and Teespring; keys for cryptocurrency donations; and accounts on Telegram, Minds, BitChute and Gab—social media platforms popular among the far-right.
“Your support makes my work possible!” he writes.
A Bellingcat investigation into the online ecosystem sustaining popular figures on Britain’s far-right has found that many are using YouTube and other mainstream platforms—even from restricted accounts—to funnel viewers to smaller, lower-moderation platforms and fundraising sites, which continue to pay out.
This investigation was based on a database Bellingcat compiled—over three months—of popular personalities on the British far-right. It comes as the British government prepares key legislation to compel tech companies to make the internet a safer space.
Like other prominent international far-right influencers, Brits promoting hate and conspiracy theories are capitalising on the global reach of the biggest platforms to gain followers and money elsewhere. They are doing so despite a recent wave of deplatforming, and despite the fact that much of their content violates the rules of the websites involved: from YouTube, Facebook and Twitter to fundraisers PayPal, SubscribeStar, Patreon and Ko-fi.
Some of the sites further down this funnel have been co-opted by far-right communities. Other “alt-tech” alternatives actively court them. Many of these platforms, styling themselves as free-speech advocates, were used to agitate for—and stream—the storming of the US Capitol in January.
Tech platforms’ amplification of harmful ideologies with real-world consequences is undeniable, yet they…
Read More: The Websites Sustaining Britain’s Far-Right Influencers