
On August 11th 2025 our CEO, Dr Andre Oboler, presented on a panel at the G20 Interfaith Forum in Cape Town, South Africa, on the topic of “Positives and Negatives of Modern Communications: Addressing Disinformation and Related Social Tensions”.
Panel Description:
Social media is revolutionizing community relations, including within and among religious communities. This session will explore its power for good, reinforcing knowledge and social cohesion, but also the potential disinformation and divisiveness. How are religious communities affected? What actions are being taken or could be taken to enhance media understanding and literacy and to counter the ill effects of other negative manifestations? Where are there tensions on how the problem is defined, recognized, and responded to, and how are these tensions specific to religious communities?
Moderators:
- Ambassador (Rabbi) David Saperstein, Senior Adviser on Strategy and Policy, Union for Reform Judaism, United States
- Sara Gon, Director, Free Speech Union of South Africa, South Africa
Panelists:
- Dr. Andre Oboler, CEO, Online Hate Prevention Institute, Australia
- Liesl Pretorius, Head of Legal, Freedom of Religion South Africa, South Africa
- Lynn Swain, CEO, Symbiota Leadership Institute, South Africa
- Yashika Singh, Head of Religion Genre Content Hub Video Entertainment, SA Broadcasting Corporation-TV Division, South Africa (Manager, Religion Department for SABC Television)
- Sibu Szymanowska, Co-Founder, The Hybrid Tours, United States
- Rosalind I J Hackett, Chancellor’s Professor Emeritus, Professor Emeritus, Department of Religious Studies University of Tennessee, Knoxville, USA/ Extraordinary Professor, Desmond Tutu Centre for Religion and Social Justice, University of the Western Cape, South Africa
Dr Oboler’s Remarks
I’ve been tackling hate online for over 20 years, and on social media specifically for the last 18 years. That was a time when Facebook was fairly new, growing rapidly, and trying to catch up to Myspace.
Since the start, social media has had a flawed model. People’s personal data and their attention are commodities. Social media users are the product, not the customer.
It is increasingly hard to get content seen on social media unless you become an actual customer, paying to promote it. There are two exceptions to this:
- Content which farms outrage and other extreme, usually negative, emotions.
- People who actively engage with your content may see more of it. This works for established communities but may not help them grow.
In the remainder of my remarks I will focus on the content that spreads virally because it generates anger, outrage, and hate.
Online Hate
Online hate is a form of pollution from the social media industry. Like pollution from other industries, those that profit from it should be responsible for cleaning it up.
Like other forms of pollution, cleaning it up costs money and reduces profitability. It only becomes cost effective if the alternative, such as government fines, is worse.
Europe has led the way in this space, and the Digital Services Act (DSA) is the latest effort to reform the industry. Some countries, including my own, are restricting social media use to those over 16 to limit its harm.
For a time, companies were working to make social media safer. Trust and Safety teams were growing. Investment was occurring into AI moderation. Since Covid we have seen a significant pulling back, and this has been accelerating.
It is harder than ever to get action taken even on clearly harmful content that promotes hate. Religious communities are increasingly the targets of this hate. It is being normalised online and leading to real world violence.
Old propaganda is recirculating. This is particularly true of antisemitism, but also of Islamophobia.
Echo chambers of normalisation
What’s most concerning is that people are in echo chambers where hate is normalised.
People who used to be involved in interfaith activity and building bridges are now on the frontlines of moral outrage, posting content so extreme that those they used to work with simply can’t engage with them.
Recovering that dialogue is very hard when hurtful content continues to be posted. Much of that hurtful content is hate propaganda, re-shared without reflection. When the nature of the content is pointed out, people get defensive, and their echo chamber reassures them it is ok.
We’re building a society of isolated and heavily defended communities. A return in many ways to the dark ages, where at least some of the hate originates.
My organisation tackles all forms of online hate. Our response is to measure the hate over time and across multiple platforms, so we can at least understand this dangerous environment better. Only then can we really discuss it and start to address it.
We’ve seen a surge in content attacking Jews for killing Jesus. We’ve seen dehumanisation of Muslims dramatically increase. We’ve seen attacks firebombing places of worship dismissed as “false flags” or the targeted communities themselves blamed for the attacks.
Faith communities are not unique in being targeted. We see the same occurring to other communities, to women’s groups, and to minority ethnic groups; racism is rife.
When it comes to faith communities, however, there is a long history of disinformation to draw upon. A long history of content used to incite persecution. This content is being revived.
I spoke on disinformation in 2023 to a technology audience and the entire talk was on the blood libels and their reemergence online. Historic church art and sculpture depicting allegations of blood libel were incorporated into YouTube videos and presented as evidence the blood libels were real. We are dealing not just with the failings of new technology, but also with the revival of historic lies designed to spread hate and justify religious persecution.