Facebook’s New Reporting Tool

Facebook has updated its image reporting tool. The new tool has more steps, and by default the “report to Facebook” option isn’t selected. It also prompts users to check the Community Standards before reporting content. Most interesting is the final message after something has been reported: it apologises for the user’s experience.

This new tool seems designed primarily to make reporting more time-consuming and to discourage the use of the reporting functionality. That is not necessarily a bad thing if the number of false reports is too high. The main problem, however, isn’t a flood of reports causing delays; it is that Facebook’s reviewers aren’t able to correctly identify hate speech. Until that problem is fixed, no amount of tweaking the software will solve it.

Here are the steps you now go through to report an image (note that reporting a profile picture is a little different and involves far fewer steps).

Step 1: The user is asked to designate whether the problem relates to Privacy, Harassment, Inappropriate Content or Spam.

Step 2 (for inappropriate content): A diversion where the user is encouraged to deal with the problem themselves and not to bother Facebook.

Step 3 (assuming you tick “report to Facebook”): The user is again asked if the problem might be Spam. Some other options are provided as well; these don’t quite match the terms of service, but they are close. The terms of service say: “You will not post content that: is hate speech, threatening, or pornographic; incites violence; or contains nudity or graphic or gratuitous violence”.

Step 4: The user is given the option to NOT make the report. Are you sure you have read Facebook’s Community Standards? This screen could be greatly improved if it gave a summary of what exactly was being reported. That said, there is really no reason for this step to exist at all, other than making the process more onerous and perhaps deterring a few pesky users who might have encountered content that breaches the terms of service and now want to waste Facebook’s time by reporting it.

Step 5: If the user has not been put off by now, Facebook attempts to placate them. Of course, reports are taken most seriously by Facebook. Facebook also commits to removing the photograph if it breaches the Community Standards.

The Community Standards are guidelines that explain the “Statement of Rights and Responsibilities”, which is the real terms of service, that is, the contract between Facebook and its users. Facebook really should be judging things against the terms of service, not the Community Standards document. The Community Standards document itself makes this pretty clear when it refers to the “terms”. The Statement of Rights and Responsibilities lives at http://www.facebook.com/legal/terms and says: “This Statement of Rights and Responsibilities (“Statement,” “Terms,” or “SRR”) derives from the Facebook Principles, and is our terms of service that governs our relationship with users and others who interact with Facebook.”

Facebook needs to ensure reporting is quick and easy, but most importantly, it needs to ensure that reports are properly handled and that the management of reports is subject to a quality control system. A petition calling for these changes can be signed here.