Address to the United Nations

On Friday 20 November our CEO, Dr Andre Oboler, served as the panelist for Civil Society in the UN's 13th Session of the Forum on Minority Issues. The session this year focused on "Hate Speech, Social Media and Minorities". A clip of Dr Oboler's presentation is included below. The full two-hour session can be viewed on UN TV.

Dr Oboler’s Address

(Recommendations are in bold)

In considering how we address online hate, one thing is clear: wishful thinking is not enough.

Neither are memes and videos opposing hate, positive initiatives that bring people and communities together, education programs for schools, public advocacy or public education.

Counter speech is not enough. Empowering young people is not enough. These have been the dominant approaches of recent years; they are what we have promoted even as the hate has continued to rise.

If we want to get on top of this problem, we need to take it more seriously. We need to invest in real solutions, both economically and through political capital.

I am going to share a few positive responses I have been involved in and highlight lessons we can draw from them.

I have provided a longer paper with other initiatives, further details and references, which I believe will be made available.

In 2008 Israel convened the Global Forum for Combating Antisemitism, a gathering of civil society, academics, politicians, and community leaders from around the world. It included a panel on Internet Antisemitism where I released my work on Antisemitism 2.0 and the normalisation of hate through social media.

Governments can convene forums and give a platform to civil society.

Together with David Matas from Canada, I was honoured to co-chair the working group on Internet and Media antisemitism from 2009 until 2018. Each time we met, we compiled a report from participants' input, giving a snapshot of the state of play.

With the right structure, a broad range of experts and secretarial support, a solid picture can be created even as technology and threats rapidly evolve.

A key challenge listed in 2009 was the lack of metrics, or even an approach, for measuring antisemitism in social media. In 2011 a proposal was put forward for software to crowdsource data; the Working Group provided feedback and discussion.

Defining grand challenges spurs work on solutions.

In 2012 we took a project focused on online antisemitism and turned it into a standalone civil society organisation, the Online Hate Prevention Institute, with a remit to tackle all forms of online hate. There was ground-breaking work in 2012 on "Aboriginal Memes and Online Hate" and in 2013 on "Islamophobia on the Internet", while work on antisemitism identified blind spots in platform policies.

Much of what we learn tackling one form of online hate is equally applicable to other forms of online hate. Supporting all minorities makes methodologies and tools more effective more quickly.

It takes deep technical expertise to do this work, particularly to keep up with new platforms and technological changes. There is a real skills gap. We can't expect every minority community to fill that gap; we need dedicated civil society organisations with the appropriate skills and a broad perspective working for all.

We need collaboration between specialist civil society organisations and the people who live the reality of the online hate and the organisations that represent them.

The software we built, used in one study of over 2,000 items of antisemitism and another of over 1,000 items of Islamophobia, highlighted significant gaps in enforcement by platforms.

It highlighted the need for reporting outside the platforms so data could be shared between experts.

To engage in tackling online hate, local civil society organisations need access to sophisticated reporting systems. They also need staff with the time and expertise to engage on the topic of online hate.

Social media platforms are certainly profiting from online hate. I know this simply from the amount the Online Hate Prevention Institute has had to pay to ensure our content addressing this problem gets seen. This is perverse. Specialist civil society organisations working on online hate are not only acting as janitors for these mega corporations; we are paying them for the privilege.

There is a shortage of deep civil society expertise in online hate and a huge demand. We need plans and resources to increase civil society’s capacity.

On the topic of fake news, misinformation and disinformation: hate is often based on ignorance, and ignorance is compounded by misinformation and insinuation.

The history of antisemitism highlights this, from Blood Libels to the Protocols of the Elders of Zion to Holocaust denial.

Governments are increasingly adopting the IHRA Working Definition of Antisemitism; it is rapidly becoming a de facto standard. Those who have not yet adopted it should do so.

Social media companies should clearly and explicitly ban both antisemitism and Holocaust denial. They should commit to using the IHRA Working Definition of Antisemitism and the IHRA Working Definition of Holocaust Denial and Distortion as tools to assist them in recognising and responding to online antisemitism.

Similar definitions and broad consensus are needed on the hate affecting many other minorities. The work on antisemitism provides a model that should be applied to protect other communities as well.

IHRA itself has recently adopted a Working Definition of anti-Roma hate, again something for Governments and Social Media Platforms to adopt and put into practice.

There is much more work to be done to protect many more minorities, but a trail has been broken and there is a path ahead of us.

Detailed Paper behind this talk

The talk presented to the UN is based on a paper prepared for the Forum, which has been shared with the secretariat. A copy of this paper can be downloaded.