YouTube has just made three important changes to its policies to strengthen its efforts against hate. These moves are warmly welcomed by the Online Hate Prevention Institute, which has noted the use of YouTube as a preferred channel for the promotion of certain types of hate, in part due to YouTube’s unwillingness to remove such content. We’ll wait to see the practical effects of these policy changes, but they have the potential to make a significant difference.
We also welcome YouTube reaching out to us after the changes were made. The e-mail from YouTube says in part:
Today, we’re taking another step in our hate speech policy by specifically prohibiting videos alleging that a group is superior in order to justify discrimination, segregation or exclusion based on qualities like age, gender, race, caste, religion, sexual orientation or veteran status. This would include, for example, videos that promote or glorify Nazi ideology, which is inherently discriminatory. Finally, we will remove content denying that well-documented violent events, like the Holocaust or the shooting at Sandy Hook Elementary, took place.
We appreciate your expertise and engagement with us around these issues — and look forward to continuing the conversation.
While the e-mail also lists previous efforts by YouTube to tackle online hate speech, as we’ve just discussed with SBS News, these steps had many gaps, some of them glaringly obvious. In other cases the policy was there, but the implementation, or the willingness to follow through based on the policy, was lacking. We hope the new changes, as well as the way they are implemented, will significantly change this.
YouTube is a private company. The First Amendment means the US government can’t make a law forcing YouTube to remove hate speech, but neither can it make a law forcing YouTube to publish such content. OHPI’s CEO, Dr Andre Oboler, explains:
The choice is up to YouTube. The law in many places outside of the United States, and public opinion around the world – including in the United States – has shifted and no longer sees hate speech as something which should be tolerated. We’ve seen how such speech spreads the poison of hate; we’ve seen how it can sometimes lead to incitement to violence and then to acts of terrorism. While people still have freedom of opinion and freedom of speech, giving them the power of social media to spread that hate is a step too far.
The new rules will close some of these gaps. It is, for example, Holocaust denial videos on YouTube that we have been using to demonstrate our hate reporting tool at the meeting of the International Holocaust Remembrance Alliance this past week, simply because they have been so easy to find. Under these new rules, we hope to see that change.
We look forward to further discussion with YouTube on ways of implementing these policies that will see them succeed. These are positive steps, but we need to see them actually take effect.
You can share your thoughts and comments in reply to this Facebook post.