The Online Hate Prevention Institute participated in the Inter-Parliamentary Task Force on Online Antisemitism’s first meeting with experts. Our CEO, Dr Andre Oboler, briefed members of the Australian Parliament, US Congress, UK Parliament, Canadian Parliament, and Israeli Knesset on the changes to online antisemitism over recent years and the pipeline from hate to terrorism. He also answered questions about regulation, penalties and the role platforms play in the spread of hate.
Thank you to Josh and David for inviting me to present, and thank you to all of you for your time. This is a very important forum.
The fight here needs your leadership. There have been a number of forums moving this forward over the years, and I have been involved in quite a few of them with Rabbi Cooper and others who are here as experts. The world has changed since then. The internet has changed since then. The culture we are dealing with, the position of the platforms, and most importantly the position of the public and the attitudes about what we'll accept, are different now to what they were then. This creates new threats and new opportunities for us.
In 2008 at the Global Forum to Combat Antisemitism I first raised the concern about antisemitism in social media, what I called Antisemitism 2.0. Even then the danger was highlighted as the normalisation of the hate, first within online communities, then reflecting back into society.
What we’ve seen is that this online hate has become a pipeline into radicalisation. We’ve seen how neo-Nazis on their own forums, and I’ll mention Stormfront in particular, not only promoted hate themselves but then infiltrated other forums. It was Stormfront that infiltrated 4chan and 8chan, particularly /pol/, and that led to the terrorist attacks Justin just spoke about. There was a deliberate effort of radicalisation. There was a change in culture that was pushed from the outside by neo-Nazis.
The attacks, again I won’t repeat them, Justin mentioned them, but the most recent one in that series, the attack in Halle in Germany, was covered in a report the Online Hate Prevention Institute put out just before Christmas. A tip for all the other experts: don’t release a major report the day before Christmas, nobody sees it. But that report is out there and has a large number of recommendations, in particular for members of Parliament, so I commend the report to you, please do look through it. It also has recommendations for the major platforms, and a number of them engaged with us as we put it together.
One of the gateways to this hate that we’ve seen has been Holocaust denial, and I want to flag that in particular. Facebook’s change has already been mentioned by others, but people may not be aware that just in the last 24 hours a letter to Jack Dorsey was put out by the American Jewish Congress, another organisation I’ve been assisting, highlighting the flaws in Twitter’s policies. It highlights actual examples of content that are falling between the gaps, and shows how the policies Twitter has are actually allowing radicalisation networks to continue sharing content privately there.
The hate groups are moving from mainstream platforms to smaller, fringe platforms, while at the same time continuing to use the major platforms through coded messages, through links, through other resources that drag people into this pipeline to extremism. AI is not going to pick up these things because of the way they are coded and hidden. It takes the work of experts. It takes the connection between our experts and our Members of Parliament, our Members of Knesset, our Members of Congress, to make sure we don’t monitor things and say, ok, if we get 90% of it, 99% of it, we’re doing enough. That 1% is a large volume of content, and if that 1% radicalises one person and leads to an attack, that’s exactly what we need to avoid.
Our time is up for Australia, so I yield, thank you very much.
The video above continues with the Q&A session which followed.