Political extremists are using YouTube to monetize their toxic ideas

YouTube has become a breeding ground for political extremism. Anadolu Agency/Getty Images
If you search for “Federal Reserve” on YouTube, one of the first videos to surface is titled “Century of Enslavement.” Using archival footage and the authoritative male voiceover familiar from countless historical documentaries, the 90-minute video espouses the idea that the Federal Reserve was formed in secret by powerful, often Jewish, banking families in the early 20th century, sending America spiraling into debt.
With over 1.6 million views, the video is categorized as “News and Politics.” It was created by a channel called the Corbett Report, which also boasts documentaries touting conspiracy theories that 9/11 was staged by the U.S. government and that global warming is a hoax. The video quickly leads a user down a rabbit hole of “recommended videos” about Illuminati conspiracy theories and videos blaming Israel for 9/11.
The incendiary Federal Reserve video, flagged by MSNBC host Chris Hayes earlier this month, is just one of many examples of how political extremists have mastered YouTube’s algorithms and monetization structure to spread toxic ideas, ranging from conspiracy theories to white supremacy. The video “Why Social Justice is CANCER,” for instance, appears after searching for “social justice.”
The site has also become home to live-streamed, difficult-to-moderate debates on topics such as “scientific racism,” in which the two sides predictably reach agreement and no alternative position is discussed. Over 10,000 active viewers watched a January debate between white nationalist Richard Spencer and a conspiracy theorist who goes by the name Sargon of Akkad. The discussion topic was whether science proved that whites were more intelligent because of genetic superiority. There was little disagreement: both sides agreed on the superiority of the white race. Within weeks, nearly half a million people had viewed it.
YouTube’s parent company, Google, had only a limited presence at July’s congressional hearings on social media’s role in propagating misinformation and political bias, where Facebook and Twitter were the primary targets, and YouTube itself has managed to avoid much scrutiny. With nearly 2 billion unique users a month, the platform is used regularly by 94 percent of 18-to-24-year-olds, and one in five members of this group goes first to YouTube for news, according to the Pew Research Center. With such a large audience and so little oversight, political extremists have turned to YouTube to spread their ideas and make money doing it.
In a report released today by the tech research institute Data &amp; Society, titled “Alternative Influence: Broadcasting the Reactionary Right on YouTube,” media manipulation researcher Becca Lewis examines how political extremists have created a deeply connected “alternative influence network” on YouTube. They have used collaborations, such as staged debates, to tie together users who promote a range of political positions, some of which are extremist.

YouTube posters like “Roaming Millennial” co-opt language from the left to try to radicalize users toward more extreme views like white supremacy. YouTube
Lewis, a PhD candidate in communication at Stanford University, spent a year analyzing more than 80 YouTube channels that connected users with positions ranging from mainstream liberalism to white supremacy. What she found was that YouTube hasn’t just provided a platform for ideas formerly relegated to anonymous internet forums like 4chan; it has helped monetize them.
So-called “intellectual dark web” figures—like anti-feminist psychology professor Jordan Peterson—are connected with more radical ideologues like white nationalist Richard Spencer to broadcast what Lewis describes as “extremely harmful, in many cases racist and sexist content.” Content creators adopt the posture of marginalized cultural underdogs while reaching an audience of millions. By appearing on each other’s channels and in staged debates with one another, these alternative content creators have created an intricate network that makes political extremism easily accessible on YouTube.
Mother Jones spoke to Lewis about her report and how YouTube has become a breeding ground for political extremism.
Mother Jones: It seems as if YouTube gets far less attention than Facebook and Twitter when we talk about political extremism and misinformation. Why is that?
Becca Lewis: We’ve gotten a really clear picture of the types of fake news that disseminate on Facebook, created by Macedonian teenagers. We’ve gotten a really clear idea of what can happen when Donald Trump retweets a tweet that originated from an anti-Semitic meme on an anonymous forum. We don’t have as clear a picture of what’s happening on YouTube and Google. It is important to bring to the fore some illustrations of the problems that do exist on these platforms. I’m trying to show that there are fundamental issues we need to be addressing with YouTube in the same way that we have recognized fundamental issues with Facebook and Twitter.
MJ: YouTube is also one of the only platforms that offers financial incentives to creators through its Partner Program, which allows them to make money off ads on their videos. Can you explain how this incentivizes more extreme behavior?
BL: One of the troubling implications of the report is that these issues can’t be fixed with a simple tweak here or there, because they are built into the monetization structure of YouTube. One thing that makes YouTube so appealing to influencers and viewers alike is the fact that viewers can interact directly with the people who are making content and have a more intimate relationship than viewers or readers have with mainstream news outlets.
“One of the troubling implications of the report is that these issues can’t be fixed with a simple tweak here or there, because they are built into the monetization structure of YouTube.”
At the same time, an influencer who is making content, in most cases, is also trying to make money off of it. So when they have viewers telling them to keep making more and more extremist content, they have a direct financial incentive to do so. It speaks to the larger culture of metrics in newsrooms and the emergence of clickbait, but it’s particularly pronounced in a very specific way on YouTube. And I think you see people going down these [ideological] paths that they might not otherwise because they’re financially incentivized to do so.
MJ: One way these posters define themselves is by saying they are underdogs under attack from mainstream society. Do you have any thoughts on how to de-platform or de-monetize these creators when they can just turn around and point to those efforts as examples of the very discrimination they claim, bolstering their narrative?
BL: That’s a fundamental question that has been plaguing academics and tech firms alike. My interpretation is that this framing of social-underdog paranoia thrives when content moderation and de-platforming happen inconsistently and without clear explanation. And the fact is that if [extremists] were being consistently de-platformed, they wouldn’t be able to make content about it.
Someone who has talked a lot about censorship is Paul Joseph Watson, a YouTube influencer affiliated with Infowars. Most recently, after Alex Jones was removed from YouTube, Watson made a video called “(((Censored))),” which signals anti-Semitic themes while discussing alleged conservative bias on social media platforms. But at the same time, he has over a million subscribers on YouTube, and as part of the YouTube Partner Program he has received the plaque YouTube gives influencers who pass 1 million subscribers. So here he is able to push that narrative of censorship while getting influencer treatment from the platform.
MJ: There’s been a lot of discussion in Congress about how best to regulate social media platforms. Do you have any thoughts on what actions tech companies or lawmakers should take to rein in this problem?
BL: Up until now, these platforms have largely been given carte blanche; they have evaded regulation to a large extent. So even the shifting nature of the conversation, the fact that these platforms are now facing external pressure, is a promising sign. Even though you could debate how much actually came out of the congressional hearings, I think it’s a promising sign that they have started. In terms of solutions, we need to approach these problems on multiple tracks. I absolutely think reassessing the algorithms [that surface extremist content] is one step that needs to be taken. Assessing what government regulation options are available is absolutely worthwhile, and then thinking about how YouTube’s monetization structures incentivize certain behaviors is something that needs to be done. It needs to be a multi-pronged solution.