Rebutting The Conversation on online harm

We don’t normally publish replies to articles in the press, but a recent article in The Conversation is so out of date in both its facts and its thinking that we felt the record had to be set straight. Frankly, we expect more from The Conversation. This reads like a sponsored advertorial about the trust we should place in big business.

Here are some of the problems with this article:

“The government intends to create an “e-safety commissioner”, which will have powers to ask (but not order) organisations such as Facebook to remove bullying content targeted at Australian children.”

The Government’s e-Safety Commissioner will have the power to fine major social media companies $17,000 per day of non-compliance with orders to remove specific content. That’s more than just asking, and rightly so. OHPI was involved in consultations over the development of this policy, and fines for non-compliance were always the intention.

“Jokes about rape, for example, are often allowed on Facebook, but photographs of breastfeeding women have in the past been banned for violating the standards.”

Facebook reversed its position on breastfeeding back in June; they now allow it in photographs. Back in May 2013 Facebook apologized for the position they had held for two years, under which “rape jokes” were allowed to remain on the platform on the basis that they were “funny”. While there was a time when breastfeeding was banned and content promoting rape was permitted, that time was not December 2014. These facts are well known to all who are engaged in the online policy debate; it’s a wonder The Conversation published an article with such a flaw.

“There are problems with making intermediaries such as Facebook legally liable for content posted by their users. It can be unfair, because they’re often not really at fault. It creates real uncertainty and risk, which can decrease investment in new services and technologies — or push them to move offshore.”

It’s hard to know where to start with this. Perhaps with the basic facts? The large social media companies ARE offshore; that is how the social media industry works. They pay next to no tax in Australia, avoiding GST on the millions of dollars of advertising revenue they make here, and they employ next to no local staff. Complaints made by Australians are not assessed by staff in Australia or judged against Australian values. There is therefore no risk that regulation would push the companies offshore. If anything, social media has been left totally unregulated, unlike every other sphere of society. The companies have been on a gravy train: no tax, no regulation, lots of revenue. The article is right that there is a growing public expectation that more be done to combat problems in social media, but it is wrong to try to absolve the companies of all responsibility.

There was a time, some years ago, when one could argue (as this article does) that social media companies had an ethical responsibility to remove hate but not a legal one. There was a time when one could argue the industry was new and that any regulation would prevent innovation and progress (as this article also does). That same argument was once used against regulating traditional industry as it produced pollution that made people sick, and it is still used in some developing countries today. It is not an argument that holds when it comes to a company like Facebook, whose value crossed the $200 billion mark in September this year, making it one of the world’s biggest companies.

The public has a right to demand that social media companies make reasonable efforts to reduce online hate, cyber-bullying and other online content which can harm individuals and society. Governments have a right to legislate obligations and minimum standards. Yes, this will eat into the companies’ profits, just as filters and other pollution controls eat into the profits of traditional industry, and just as the cost of fact checking eats into the profits of news organisations. These are the costs of doing business. The richest companies in the world may wish to have no regulation, no taxes and no obligations. They may wish for us to extend them goodwill because they are innovators. That line of argument wears thin fast when those companies are not pulling their weight, are not doing what they can to minimize the negative effects of their business, and are not contributing to society through taxation.

The article raises the concern that holding companies liable for content their users post may negatively impact freedom of speech by placing too much responsibility on the companies. This seems highly unlikely when their very business model relies on speech. The issue is not dissimilar to journalism, where the paper and the journalists can be sued for defamation; that doesn’t stop the news going to print. We don’t suggest fining companies for posts their users make, but we do suggest holding them responsible if they fail to provide a reasonable reporting system so other users can bring problem content to their attention. We also suggest penalties if the companies fail to remove unlawful content within a reasonable time frame.

The article argues that:

“In creating rules about acceptable content, we need to be careful to take account of the legitimately different views and expectations of the diversity of human beings who use these services to connect and share content.”

We agree. This is why the rules of self-regulation, such as community standards, need to be clear and consistently enforced. It is also why the law needs to be clear and consistently enforced. If a platform is notified by users that certain content appears to be unlawful, and they decide not to remove it, that is their right. That right comes with the obligation to pay the price if they get it wrong. We don’t suggest every case needs to go to court, but the government should have access to complaints which a company has rejected where the content remains visible within its borders. If the government chooses to apply fines, the companies must have the choice of either paying the fine and removing the content, or fighting the allegation in court. If a court rules the content was not unlawful, we are one step closer to clarifying the law and to knowing what is and is not permitted.

Lawrence Lessig warned in 2000 that “so obsessed are we with the idea that liberty means ‘freedom from government’ that we don’t even see the regulation in this new space”. He was speaking about the internet and the regulation that takes place through software. Today that regulation is under the control of some of the world’s largest companies. Only regulation by government, with laws put in place by those we elect and applied and interpreted by the courts, can serve as a counterweight. If we can handle the speeding ticket problem, where only the exceptions actually take up the time of the courts, then we can handle the problem of online abuse with only the exceptions ending up before a judge. The field of technology is wonderful, but being good at technology does not make one good at governance or justice. We have inherited a tradition of democracy and the rule of law. Let’s not throw that out in favour of regulation by, and at the discretion of, private companies.

As a final thought, social media does change the act of publishing. An individual puts content online, but it is the social media platform that first provides the mechanism to take the content viral, and then refuses to take it down when users start to complain. Social media companies are also the only ones making money out of this expression. That money comes from the advertising that appears alongside the users’ content. Social media companies are not like telephone companies, strictly neutral in the conversation. They are an actor with a stake in the content, and financially speaking, usually the biggest stake of all. We need to stop treating them like fragile magical toys and start treating these multi-billion dollar companies like the businesses they are. They have teams of lawyers, policy advisers, political lobbyists and strategists. They each have a virtual monopoly in a slightly different sphere, and they have so far avoided significant regulation while lives have been destroyed and people have died.

The companies seek to take as much advantage as they can, with as little regard for the public as they can get away with. This is not cruelty; it is what companies exist to do: to return value to their shareholders, not to advance the public interest like a charity. All that stands in their way are our demands for social responsibility and the ability of those who represent us to regulate in the national and public interest. Let’s not give the companies a free ride by focusing only on moral responsibilities. Those who cause harm to the public while profiting must share the burden of limiting the harm they cause, especially when they are in a position to directly limit that harm. That’s all we can ask for, and we must demand nothing less.