Social media is a breeding ground for hateful content, and few corners of the internet are as rife with extremism as YouTube, which The New York Times called “The Great Radicalizer.” The platform has taken up the herculean task of cleaning up its act.

The latest step, a ban on supremacist content, prohibits any video that argues for the superiority of one group in order to justify discrimination based on age, gender, race, caste, religion, sexual orientation, or veteran status. The move is expected to result in thousands of videos and channels being removed.

“As with other outlets before it, YouTube’s decision to remove hateful content depends on its ability to enact and enforce policies and procedures that will prevent this content from becoming a global organizing tool for the radical right,” the Southern Poverty Law Center (SPLC), an anti-discrimination watchdog, said in a statement.

But that enforcement is already being called into question, thanks to the platform’s response to conservative commentator Steven Crowder. Crowder repeatedly directed racist and homophobic slurs at Vox reporter Carlos Maza, calling him, among other things, a “lispy queer.” Maza was also doxxed by Crowder’s fans.

YouTube initially defended Crowder and took no action over his treatment of Maza. When the platform did act, it temporarily demonetized Crowder not over the harassment, or over his homophobia and racism, but only until he stopped selling homophobic shirts. Crowder makes nearly $2 million annually from his channel’s monetization.

Big tech companies have long struggled to deal with hateful content. Some of their efforts have been laudable, but how policies are enforced matters as much as, if not more than, the policies themselves. Nowhere is that more true than on YouTube, whose recommendation algorithms actively push viewers toward extremist content.

“It has taken Silicon Valley years to acknowledge its role in allowing this toxic environment to exist online,” the SPLC’s statement continued. “Whether this rabbit hole of dangerous rhetoric is due to a flaw in the algorithm or that the algorithm is too easily manipulated, the end result is the same.”

And even when action is taken against offending YouTubers, it often feeds a sense of victimhood among those abusing the platform’s Terms of Service to spread radical, hateful content. Crowder lamented that a reporter from a “big media company,” namely Vox, was trying to curtail his freedom of speech, echoing the approach conservative conspiracy theorist Alex Jones took when YouTube acted against him.

That is not how free speech works; the First Amendment restrains the government, not a private platform enforcing its own rules. But claiming censorship is an easy tactic for those abusing social media when earned punishments finally arrive.

“The ad revenue isn’t the problem. It’s the platform,” Maza said in a series of tweets. “The problem is that @YouTube allows monsters and bullies to become superstars, break YouTube’s rules, build an army of loyal, radicalized followers, and then make millions selling them merch that sustains their work.”

What impact YouTube’s new policy will have remains to be seen, but the Crowder episode is good reason to take it with a grain of salt.

“Tech companies must proactively tackle the problem of hateful content that is easily found on their platforms before it leads to more hate-inspired violence,” the SPLC warned.


Katelyn Kivel is a contributing editor and senior legal reporter for Grit Post in Kalamazoo, Michigan. Follow her on Twitter @KatelynKivel.

