news.com.au Staff, “The Virginia shooting exposed the flaws of social media, as Facebook and Twitter rely on users to report violence, not patrol for crime clues”, August 28, 2015, news.com.au. (http://www.news.com.au/technology/the-virginia-shooting-exposed-the-flaws-of-social-media-as-facebook-and-twitter-rely-on-users-to-report-violence-not-patrol-for-crime-clues/story-e6frfrnr-1227502880087)

THE world’s biggest social networks do not actively patrol for signs of impending violence or terrorist acts, even though they are often the first platforms where such content appears.

Instead, with almost two billion active users between them, Facebook and Twitter rely on users to report violent threats, abuse, or hate speech, leaving internet experts calling for “urgent reform”.

Plus, new figures show violent content may not be removed even after it has been reported, with posts still on Facebook five months after they were reported, and 86 per cent of violent content still on Twitter.

Former Virginia journalist Vester Lee Flanagan, who shot and killed two of his former colleagues during a live broadcast this week, uploaded video of the murders to Facebook after the crime, and posted complaints about his victims to Twitter.


(Lives lost … Reporter Alison Parker and Adam Ward were shot while doing a live report in Moneta, Virginia. Source: Facebook)

Both accounts were suspended within minutes of the videos’ appearance, but only after the footage had reportedly been shared as many as 500 times on Facebook.

Social media has also carried warning signs of other crimes. New York police killer Ismaaiyl Brinsley posted a photo of a gun on Instagram with a caption about “putting wings on pigs” last year, and British soldier killer Michael Adebowale talked about his “intent to murder a soldier in the most graphic and emotive manner” before following through with his threats in 2013.


Facebook does not actively look for abusive or threatening messages, photos or videos, however, relying instead on reports from other users.

The network hosts more than 1.44 billion monthly users, and on Monday one billion people, or one in every seven in the world, logged on in a single day for the first time.

Users are encouraged to report abusive posts with the drop-down menu beside them, using the option “I don’t think it should be on Facebook”.


(Shocked … Kimberly McBroom, the station’s morning anchor reacts to a live broadcast during which reporter Alison Parker and cameraman Adam Ward were fatally shot. Picture: WDBJ-TV7 news via AP Source: AP)

Facebook has also come under fire for refusing to hand over users’ details to law enforcement agencies, though a New York court forced it to hand over details in a social security fraud case last month.

Social network Twitter also relies on user reports to identify abusive tweets among the 500 million messages sent on its service daily, though the company recently introduced an option to report “abusive or harmful” messages.


(Memorial to the fallen … Dale and Edith Bryant look over a memorial for the two slain journalists in front of the studios of WDBJ-TV7 in Roanoke. Picture: AP Source: AP)

Online Hate Prevention Institute chief executive Andre Oboler said social networks relied on other users to report violent content due to the “volume of content” they encountered, but even when items were reported they were rarely taken down promptly.

“There is a need for urgent reform by the social media companies, and if they don’t start to do better on their own, it’s likely governments will step in,” he said.

“The real problem is that once an item of hate is reported, the most likely outcome is that report will be rejected and the content will remain online. There is no transparency on the social media companies’ complaint-handling systems.”

Mr Oboler said videos of the shootings in Virginia would only have been taken down in response to media coverage rather than “going through the usual internal processes” for threat reports, which were “ineffective”.


(Virginia memorial … Amora DeVries, 7, Afira DeVries, Eva DeVries, 9, and Brenda Keeling at a candlelight vigil in front of the studios of WDBJ-TV in Roanoke, Virginia. Picture: AP Source: AP)

He said new research from the OHPI showed most content promoting antisemitic violence remained online five months after it was reported to the social network that hosted it.

Half of the reported content remained on Facebook after five months, he said, 74 per cent of the content remained on YouTube, and 86 per cent stayed on Twitter.

Despite the statistics, online threats could be one of the few cues delivered by disturbed individuals, according to Sydney forensic psychiatrist Dr Julian Parmegiani, who said they were often loners who were aggressive and difficult, with few friends or close family connections.

“They are loners. That is the key,” he said. “Their biggest problem is lack of personal insight.

“People who act out always blame someone else. If they’re fired they don’t recognise it’s because of something they’ve done. They blame everyone else, they see the world as a hostile place.”

Dr Parmegiani, who has worked with serious offenders and in correctional facilities, said employers who dismiss these types of individuals should warn the targets of any threats and notify police, who can take out apprehended violence orders restricting the individual’s access to firearms.