The CEO of the Online Hate Prevention Institute, Dr Andre Oboler, presented to a regional forum on October 20th 2020 and outlined four key recommendations for addressing online hate speech to create a safer space for minorities.
The Asia Pacific Regional Forum on Hate Speech, Social Media and Minorities was a two-day online event convened by the Tom Lantos Institute and the United Nations Human Rights Special Rapporteur. The Forum brought together experts and specialists from human rights institutes and organisations, civil society, social media platforms and other stakeholders from the Asia Pacific region.
Insights gathered from the Forum are to be fed into the thematic work of the Special Rapporteur on minority issues for his report to the 46th session of the UN Human Rights Council in March 2021.
“Towards a safer space for minorities: positive initiatives to address online hate speech: The role of NHRIs, human rights organizations, civil society and other stakeholders” was the final thematic session of the Forum. Dr Andre Oboler presented alongside:
- Mariya Salim – Independent activist, writer and researcher & Former Mentor and Fellow, Minority Fellowship Program, OHCHR
- James Gomez – Regional Director, Asia Centre
- Kathleen Reen – Senior Director, Public Policy for Asia-Pacific, Twitter
The session was moderated by Taisuke Komatsu – Under-Secretary General, International Movement Against All Forms of Discrimination and Racism (IMADR)
Presentation & Recommendations – Dr Andre Oboler
Historic Overview: In 2003 I noted that online hate speech was being circulated on university campuses. The hate speech took the form of printouts from websites which some student groups used as political activism, not realising the antisemitic nature of some of the content or of the sites they had sourced the material from.
In 2007 I noted Facebook, then a young upstart seeking to compete with MySpace, being used to spread hate speech. Dedicated spaces on the platform promoted hate while also asserting they were against hate and in compliance with platform policies. Often these statements occurred side by side.
I published a paper on this work titled “Online Antisemitism 2.0. ‘Social Antisemitism’ on the ‘Social Web’” in early 2008 and presented on the topic to the Global Forum for Combating Antisemitism. Until this point, hate speech on social media was really not on the radar.
In 2009 I was appointed to co-chair the Global Forum’s Working Group on Antisemitism on the Internet and in the Media. The first thing we did was call for an approach to measure antisemitism on social media. By 2011 we had designed an approach and software to implement it; by 2015 we had built the system and used it to gather data; and in 2016 we reported on longitudinal monitoring of platform responses.
Our work empirically highlighted the huge gap between community expectations and the online reality. It showed complete inconsistency: the effectiveness of responses to user reports varied between platforms, and within each platform depending on how the hate speech was expressed.
Recommendation 1
Our focus on monitoring and transparency, on making technical recommendations to platforms to improve their products, and on providing in-depth explanations of why content was hate speech was different to the approach of other organisations. Some organisations focused on education in schools, some on the wording of platform policies, some on law reform, and some on international agreements and standards.
RECOMMENDATION: Civil society contributes to creating safer online spaces for minorities in many different ways, through many different organisations and approaches. Governments, platforms and NHRIs should therefore consult, work with, and fund a broad range of civil society organisations and approaches.
Recommendation 2
During this journey I realised that the work being done on antisemitism provided lessons equally applicable to combating hate against many other minorities. In January 2012 the Online Hate Prevention Institute was created in Australia as an independent charity. Its remit was to tackle all forms of online hate and extremism.
The institute’s major works include the first report into antisemitism on YouTube in 2012, the first report into racist Aboriginal Memes in 2012, the first report into antisemitism on Facebook in 2013, the first report into Islamophobia on Facebook in 2013, and so on. We sought to work with the communities who were targeted by the hate. Our reports carry endorsements from many of them. In some cases we involved them throughout the process, consulting them regularly and sharing drafts.
RECOMMENDATION: Many minorities are significantly impacted by online hate, and while they are generally well represented through civil society organisations, tackling online hate requires particular expertise which these organisations often lack. Civil society organisations with expertise in online hate should be encouraged, supported and funded to tackle hate against all minorities, and to do so in partnership with the civil society organisations that represent the specific minorities.
Recommendation 3
My work began with a struggle to explain why social media was important and the impact it would have on society. I struggled to explain why the normalisation of hate on social media was as dangerous as, or more dangerous than, the spread of hate in the media.
As time went on, the danger began to be accepted, and governments and platforms began to respond. They funded youth-focused projects to deliver positive messages against hate and in support of harmony in society. As valuable as this is for changing attitudes in the future, it did nothing to tackle the hate that was already spreading like wildfire. It did nothing to support the work we were doing, which by then was also tackling the use of social media by terrorists to spread hateful extremism.
Our young people should be engaged and empowered; they have more knowledge of social media than many politicians, but they are not trained experts in either hate speech or technology. Yes, they can help, but only in some areas. We cannot place the full burden of the problem on them, nor can we expect them to have the answers to complex policy problems which shape the future of society or, when required, the knowledge that experts spend years studying to acquire.
RECOMMENDATION: Platforms and governments often urge civil society to focus on counter speech and education as the solution to hate speech. Funding is often restricted to supporting counter speech and education, with education itself often focused only on schools. This is just one part of the solution, and civil society also has an important role to play in many other areas. The focus on civil society engagement, and the support for civil society, should be broadened beyond counter speech and education to include work at the sharper end of hate speech, including monitoring.
Recommendation 4
It may be superfluous, but I want to make one more recommendation, one which adds depth to my previous three.
RECOMMENDATION: The work of civil society across many facets of tackling online hate should be recognised and supported. This includes: monitoring online hate; supporting victims of online hate; monitoring the responsiveness of platforms to reports of online hate; monitoring the response of governments to complaints about online hate; identifying new manifestations of online hate; tracking threats and alerting relevant stakeholders including government; supporting law enforcement by providing data for investigations; supporting other civil society organisations by providing specialist capacities when needed; providing information and education to policy makers, platforms, educators, law enforcement, NHRIs, and others; supporting public education through programs and media engagement; and other approaches.
Summation: The Online Hate Prevention Institute has provided world-class leadership in this field. We’ve moved ahead of current trends, breaking new ground, and tackled issues before others were aware of them. For years we have sought a partnership with government, and funding to secure the work, but without success. Our latest work has focused on COVID-19, and we’ve warned of the long-term impact the flood of online hate in recent months will have on society, attitudes towards minorities, and international relations. Even at this point of crisis, our requests for urgent support were met only with requests for us to provide more support to government for free.
Comments and Support
You can comment on this article in this Facebook thread.
You can support our work with a donation to the Online Hate Prevention Fund via credit card or PayPal (Australia and international) through the PayPal Giving Fund (receipts are sent automatically), or in Australia by making a direct bank transfer to: Account name: Online Hate Prevention Fund, BSB: 083-088, Account: 73-337-6910. If you make a direct transfer to our bank, please e-mail us at ohpi AT ohpi.org.au so we can send you a receipt manually.