Trolling is defined as writing something with the intention of provoking an angry or emotional response out of your target — and, in the process, attracting attention and reframing a debate in a way that is more favourable to one’s own viewpoint.
Within reason, trolling is not hate. Unfortunately, given the power that anonymity gives to trolls, their behaviour often becomes irresponsible. To get an emotional response out of the victim, trolls target their weakest point, which is frequently their race, religion, gender or sexual orientation. When this happens, trolling crosses the line into online hate.
Anonymity also encourages them to irresponsibly attack and try to damage the professional and/or personal reputation of their victims. Such behaviour can amount to cyberbullying and OHPI combats such behaviour too.
You can also access the reports and briefings published by OHPI on the subject here.
In addition, we share external resource materials with our Facebook supporters. These include research findings, news and opinion articles, and resources on how to fight trolls. Below is a list of all the articles we have shared on Facebook on the subject. (Some of these overlap with online misogyny and cyberbullying, as trolls are often responsible for such behaviour.) You can read the articles and the discussions that followed the posts.
Vice Magazine has written a wonderfully helpful article on trolling, cyberbullying and what you can do to fight it. It includes an interview with the Online Hate Prevention Institute‘s CEO Dr Andre Oboler and also mentions our online hate reporting system, FightAgainstHate.com, calling it an “accountability system” where social media platforms are held accountable. You can follow our Facebook discussion here.
In this article, Brianna Wu, the head of development at game development company Giant Spacekat, discusses why law enforcement needs to take online threats of violence more seriously. As she says:
“Tech journalist Peter Cohen quite correctly called the actions of Gamergate “emotional terrorism,” the idea being to intimidate, bully and silence anyone speaking out for diversity in games until they quit.” You can access our Facebook post here.
“Just for lulz” may be one of the most irresponsible terms ever invented. It has been used to explain away all forms of cruel commentary on social media.
Here’s a song that uses comedy to shine a mirror on what kind of unempathetic people we are becoming. You can read our Facebook discussion here.
Twitter has made it easier for people to identify and block known trolls. It now allows people to share their list of blocked Twitter users with others. You can access our Facebook post here.
Just out! Australian cartoonist David Blumenstein has an excellent comic out today telling the story of Ben Garrison Cartoons, the Online Hate Prevention Institute‘s Cartoonist in Residence, and his six-year struggle against online trolls.
The story includes an exposé of OHPI’s (mostly) behind-the-scenes role in supporting Ben. Now that it’s out there, we’d also like to publicly thank Big Man Tyrone for his support, which he was very happy to provide once he heard the real story behind the trolling of Ben.
You can access our Facebook post here.
Here’s an article on how trolling and international diplomacy have become interconnected. It doesn’t relate to online hate directly, but it is interesting and has a good definition of what a troll is.
“To troll, by definition, is to write something that provokes a target into an angry or emotional response — and, in the process, to attract attention and reframe a debate in a way that is more favorable to one’s own viewpoint.”
You can access our Facebook post here.
Social media has given us a great weapon to vent. Whenever we feel our concerns are not being heard, we use social media to amplify our voice and gain support for our cause. A recent book, “So You’ve Been Publicly Shamed”, looks at how such vigilantism can have completely disproportionate effects on the life of the accused.
But given the quicksilver nature of social media, and the fact that once a piece of information goes viral it is very difficult to retract or control, we should be careful about what we say and whom we accuse of wrongdoing.
The above article looks at how a woman wrongly accused a man of being a pedophile on her Facebook page, leading to the man receiving death threats.
We should understand that online vigilantism of this kind is a form of cyberbullying.
Yes, social media is a great tool to reach out to people. But let’s use it carefully and appropriately.
You can access our Facebook post here.
Twitter is testing an algorithm that will hunt down trollish Tweets.
We find that moderation is a tricky exercise, because language is so alive, pulsating and evolving. Judgement is needed to decide what constitutes offence. There are so many slip-ups even when human beings make such decisions that leaving it to algorithms is bound to lead to mayhem.
Still, we are glad that Twitter is finally taking the issue of abuse on its platform seriously.
You can access our Facebook post here.
Quora’s trolls-gone-wild state is a perfect example of why user-based websites need to change the way they think about targeted users, this article finds, quoting several examples from the website.
We couldn’t agree more with the author. User-based websites have a responsibility towards their users. They should not allow abuse, harassment and threats against specific target groups to go unchecked. You can access our Facebook post here.
Even the President of the United States is not immune to cyber harassment. When President Obama made his Twitter debut last week, he was inundated with hateful and racist Tweets.
We hope that the harassment faced by President Obama will shed light on how easy it is to target people on Twitter with abuse and harassment. It is a problem that has lately come into sharp focus, with Twitter making several changes to its reporting and harassment management strategies. Read our briefing: http://ohpi.org.au/steps-taken-by-twitter-to-combat-abuse/. Read our post here.
A study by the group Women, Action and the Media reveals some of the problems people face when reporting harassment to Twitter. Some of these we have pointed out before.
Here are some of the issues they observed:
– Tweet and delete: people tweet harassing messages and then delete them, so no web address can be found to report.
– Dogpiling: many accounts harassing a person simultaneously.
– False reporting: people filing false reports as a form of trolling.
Twitter has been working on improving its responses to harassment. We recently compiled the steps it has been taking to fight the problem (some of which address the problems discussed above), but of course there is still a long way to go. http://ohpi.org.au/steps-taken-by-twitter-to-combat-abuse/. You can read our Facebook post and discussion here.
Online misogyny is rampant on social media. Here are six social media accounts that are fighting misogyny and online harassment. You can read our Facebook discussion here.
Over the past two months, Twitter has taken considerable steps towards improving its moderation and reporting policies. For OHPI, the steps couldn’t have come sooner.
This article by an Australian researcher discusses the steps it has taken towards better policing of the platform. You can follow our Facebook discussion here.
Twitter has rolled out a new reporting tool that allows users to generate an automatic report of harassing Tweets for the police. It seems it isn’t operating in Australia yet (we checked), but it is something we can look forward to in the future.
It is just one of the steps being taken by the social media platform to counter the rampant harassment and abuse that takes place on it. You can follow our Facebook discussion here.
Monica Lewinsky gives a TED Talk on #Cyberbullying. She uses her own experience to discuss the cruelty of online shaming and public humiliation, and calls for a greater emphasis on empathy and compassion.
As she says: “Online we have a compassion deficit and an empathy crisis.” We agree. You can follow our Facebook discussion here.
The UK is cracking down on Internet trolls. In the last three years, it has charged or cautioned more than 6,000 trolls. You can read our Facebook discussion here.
This article interviews two convicted trolls, who have since been released and now admit that trolling is wrong.
As their interviews indicate, they had nothing specific against their victims. They simply sent a torrent of abuse their way because the topic was trending and they were bored and drunk. It shows how open Twitter is to abuse.
We feel the platform should take more responsibility for protecting its users, particularly when it profits from user-generated content. You can read our Facebook discussion here.
Why do people troll? This article speaks to a few trolls to find out, and different themes emerge. But what ties different trolls together is their apparent lack of empathy towards others. You can read our Facebook discussion here.
Internet trolling with the express purpose of spreading hate is one of the most controversial aspects of online culture. In Sweden, a group of journalists have made it their mission to expose trolls and their ugly shenanigans. You can follow our Facebook discussion here.
TrollBusters is a new app to identify trolls targeting and harassing women. For our Facebook discussion go here.
No one can deny the success with which ISIL and similar extremist groups have used social media to spread their hateful message and gain recruits. Can trolling ISIL on social media be an effective way to counter their hate messages? This Bloomberg article discusses the different trolling alternatives that can be used.
However, before we jump into trolling ISIL, consider this viewpoint.
“Jayne Huckerby, an associate professor of clinical law at Duke University who advises institutions on countering violent extremism, warns that reaching the right audience with the proper countermessage is hard. She is particularly concerned about people sending Islamophobic tweets or online messages, which may simply contribute to the feeling of alienation that has driven many recruits into Islamic State’s fold. A single meme thrown into the pro-Islamic State echo chamber might do more harm than good.” You can follow our Facebook discussion here.
In an unusual move, a US family whose murdered daughter was being impersonated by online trolls has trademarked their daughter’s name so they can take legal action against the trolls.
A form of cyberbullying involves online trolls impersonating dead people on social media to harass their family. Often, the content posted is offensive and hurtful. Usually, the victim is someone whose death was reported in the media for some reason. In this case, the victim was one of the teachers murdered during the Sandy Hook school massacre in 2012.
It is sad that in this case, the family has been forced to take the extreme measure of trademarking their daughter’s name. It shows the failure of social media platforms to take responsibility for how their tools are used.
A recent article in the Guardian “What happened when I confronted my cruellest troll” described trolling as gratuitous online cruelty.
“Trolling is recreational abuse – usually anonymous – intended to waste the subject’s time or get a rise out of them or frustrate or frighten them into silence. Sometimes it’s relatively innocuous (like asking contrarian questions just to start an argument) or juvenile (like making fun of my weight or my intelligence), but – particularly when the subject is a young woman – it frequently crosses the line into bona fide, dangerous stalking and harassment.” You can read our Facebook discussion here.
A leaked Twitter memo reveals that the Twitter CEO Dick Costolo plans to start fixing the problem of trolling and abuse on the platform. About time, we say. You can read our Facebook discussion here.
OHPI has said repeatedly that online hate destroys freedom of speech by silencing segments of the community and forcing them offline. This has an impact on their ability to participate in modern society, and on their ability to play a role in democracy. When making this point we usually refer to cyber-racism. The article linked to here makes the same point about cyberbullying. You can read our Facebook discussion here.
Trolls, hate speech and comment threads: A South African news portal explores the problem, possible solutions and the issues they face with comment moderation. You can read our Facebook discussion here.
As the author writes: “Nothing is out of bounds, no matter how malignant: Children tell each other to kill themselves, mothers ridicule other parents’ toddlers, racists rant, misogynists send rape threats, and anyone who dares to hold an opinion is often targeted — not with evidence-based argument, which is the exercise of free speech, but in hate-filled, grammar-less personal attacks punctuated with profanity. Never before has the complexity — and simplicity — of the human mind been more on display than in social media.” You can read our Facebook discussion here.
The Guardian has done a great article on trolls in the wake of the PewDiePie controversy. PewDiePie, the world’s most famous YouTuber, has decided to ban trolls from his platform, leading to a storm over freedom of expression. You can read our Facebook discussion here.
This free Facebook App searches its user’s Facebook friends for people who are using phony identities, commonly called “Catfish.” CatfishNet not only finds the Catfish but it also helps keep predators from remaining “friends” with the user. You can read our Facebook discussion here.
Is there a right and a wrong way to reply to a racist tweet? This Daily Beast writer thinks there is.
“Issuing threats to a kid who made a single dumb racist comment doesn’t make you better than the kid. It makes you doubly worse, since you’re using a moral standpoint to justify an immoral action. Our anger and outrage shouldn’t dictate our public response, especially when using platforms like social media that anyone else can see.” You can read our Facebook discussion here.
A great article on why Australia needs tougher laws on cyberbullying. As the author points out, “the current system fails to provide victims with what they need most: speedy and effective removal of offending material from cyberspace. Quick access to such remedies is best way to begin repairing the damage, where even the shortest delays can have the direst consequences.” You can read our Facebook discussion here.
What is the role of anonymity in spreading extreme thoughts and behaviour? Research on online misogyny on anonymous college message boards finds that while anonymous message boards are not directly responsible for sexist culture, they have become an important part of a larger system that normalizes misogyny. For students who contribute to and frequent the boards, sexist language becomes commonplace, shaping group styles and mapping the contours of acceptable rhetoric. You can follow our Facebook discussion here.
Here’s an interesting perspective on the free speech vs hate speech debate.
Research has found that social media users resist online policing by platforms until they imagine themselves as victims of abuse and cyberbullying. Only then do they start supporting greater intervention by the platform in their social media interactions.
Many people think that online misogyny is just a harmless bit of fun. But in this article, a female journalist gives an account of how threatening, vile and intimidating online misogyny can get.
Does it sound like a harmless bit of fun? Would such words uttered in an offline environment be deemed “fun”? They would be considered unacceptable. Yet they flourish unchecked online in the name of free speech and freedom of the Internet. You can read our Facebook discussion here.
Ask.fm has become one of the most popular social media sites with teenagers around the world. But with its anonymous nature, it has also become notorious for the cyberbullying that takes place on it. Some reports suggest that nearly 16 teenage suicides have been triggered by cyberbullying on the network.
In this interview, the brothers who founded and run the website give their side of the story. According to them, bullying is a societal problem, not an Internet problem, and anonymity encourages many people to share what they would otherwise not share.
What we wonder is at what point anonymity reaches a tipping point: when do the drawbacks overtake the benefits? Does a social media platform ever have a responsibility for its users’ safety? Can any organisation ever be safety-neutral towards its users?
Trolls are attempting to create disharmony among progressives through hashtags and stolen identities, The Guardian reports. You can read our Facebook discussion here.
Yesterday The Age published an important story on online trolling. OHPI received a brief mention where we highlighted the need for greater police attention in this area. Read our Facebook discussion here.
What’s the best way for societies to deal with trolls spewing venom all over the internet? (a) Jail them, (b) ignore them, (c) engage with them and try to change their minds, or (d) unmask them? This article gives some guidance. Read our Facebook discussion here.
The tragic passing of TV personality Charlotte Dawson has in some ways highlighted the issue of online bullying, or cyberbullying. We will continue to advocate for social media companies to do more to protect people against online bullying and trolling, encourage governments to create better legislation in relation to cyberbullying, and continue to educate people, young and old, about recognising and responding effectively to online bullying. You can read our Facebook discussion here.
An OHPI report has been discussed by SBS News – have a read and share.
You can access our Facebook post here.